Category Archives: Linux

Remembering the twins

I waited this long to recap the two Dell Latitude LMs I had as guests last month because I couldn’t find the pictures I took while I was poking and prodding. Losing photos like that isn’t something that usually happens to me, but I did find one leftover on my camera, taken the day they left.

2014-11-13-84kbn-dsl

That at least proves it wasn’t a dream. :roll:

The picture, of course, shows the ironclad and eternally trustworthy DSL running in its most basic form on the prettier of the two machines, replete with a wireless network connection courtesy of an old WPC11v3 b-only card. That was not my most successful attempt — and I really wish I could find those other pictures :evil: — but DSL did at least tell me that the guts of the machine were working.

I don’t have a picture of DSL’s graphical desktop on that unit because I never got one. DSL isn’t picky when it comes to hardware, but I have seen more than one computer over the years that is less-than-visual. In this case, both the vesa and VGA attempts, in every variation, resulted in a scrambled video display.

Some of my other attempts were also less than successful, but a few bore fruit. I had better luck with early, early versions of Debian and Fedora, but some very bad experiences with … anything after 2002 or so (which thanks to this machine, did not come as a surprise). :\ And of course, I managed to get a blinking cursor on a Crux 2.4 installation, which I count as a flawless victory. :lol:

The biggest difficulty in working with these machines (I say “these,” but I did almost everything on the one you see in the picture) was twofold: First, these computers were not intended to boot from CD — only the primary hard drive or the floppy drive, of which I had none (and by that I mean the owner had none). Don’t even talk to me about a USB port. You know better than that. >:(

That’s a huge complication, but not something I haven’t had to work with before. I’m not above creating an entire system in an emulator, writing out the image file to a hard disk, and transplanting it physically into a target machine.
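
For the curious, the write-out step is nothing exotic. Here’s a sketch of it, demonstrated against scratch files so nothing real gets clobbered — in practice the target would be the actual drive from the caddy (something like /dev/sdX, a hypothetical name you’d verify with lsblk first, because dd forgives nothing):

```shell
# A sketch of the image-transplant step. Both file names here are stand-ins:
# the "image" would be the emulator's disk image, and the "target" would be
# the real drive pulled from the laptop.
img=$(mktemp)       # pretend this is the emulator's disk image
dd if=/dev/zero of="$img" bs=1M count=4 2>/dev/null
target=$(mktemp)    # pretend this is the drive from the caddy
dd if="$img" of="$target" bs=4M conv=fsync 2>/dev/null
cmp -s "$img" "$target" && echo "copy verified"
```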

In that sense, these are great designs for that task. I’ve run into machines that were a bit curmudgeonly in that respect, but the drives on these laptops pull out of the front left corner like a drawer, connect firmly in a dedicated tray, and are more or less exchangeable in seconds. What’s more, there’s plenty of space in the tray for an IDE-to-whatever converter, which in my case was an SDHC adapter.

I did run into an additional mystery though, which constitutes Biggest Difficulty Part Two: a hesitance to boot from some systems, and I’m not sure why.

It may have been some sort of partitioning inconsistency between the BIOS and the installation. Occasionally a system I had written out via dd wouldn’t boot, but other times preinstalled or original installations wouldn’t boot either.

I don’t suspect hardware issues; instead, I suspect either (a) the old BIOS drive dimension limit, cropping up decades after its relevance and causing problems in its passive-aggressive way of suggesting you should get a new computer, or (b) some misalignment between the way GRUB or LILO worked two decades ago and what the BIOS expected.

I’ve seen machines — in fact, this EzBook 800 has it — that have a BIOS-switchable option for Windows-esque drive arrangements, with the other option as … “other.” :\ I know of one or two machines in my past that couldn’t boot an “other” system if the BIOS was set to Windows-ish, or vice versa. This old warrior was one of those.

I don’t have any way to document that, and I don’t know how or why it happens, but that’s my underlying suspicion. Since the BIOS in these Latitudes doesn’t have an option to switch, it was a crap shoot to see what would boot and what wouldn’t.

Both of these issues, and their underlying problems, are magnified by the glacial pace of working at 133MHz, and by the added time of swapping drives and bouncing between drive caddies. Plus there’s the constant risk of snapping or bending a 20-year-old pin array, or of the natural brittleness of aging plastic giving way. … I imagine even that caused a little hesitation on my part.

I can say with some honesty, if these were my personal machines, I’d probably be a little more aggressive in seeing what they were capable of. I tend to be a little antsy around other people’s computers though, for no other reason than general courtesy.

In any case, I gave them back a few weeks ago after giving each one a quick cleanup, and they returned to their home of record.

The irony of their departure is that the owner, when he came to pick them up, hinted that I might be able to keep them if I were inclined, and if my offer came within the range of what he thought they were worth.

I declined politely, partly in fear of bringing more wayward laptops into the house on a permanent basis, but also because I know he feels the pair together, with the power supply and a Dell-branded PS/2 ball mouse (woohoo!), are worth close to US$100. I put them around a quarter of that, maybe a little more. I doubt we could come to a compromise, even if he were a little more realistic.

But if I were to find one of these in the recycling dump, I wouldn’t pass it up. It would be almost impossible (by my cursory research) to find replacement parts now, and even if you did, you’d likely be paying incredibly inflated amounts for something worth a fraction of the price tag. So you’d have to find one complete, unadulterated and in pristine condition to really appreciate it.

There are better machines of this era to experiment with. But with hardware this old treading the 20-year mark, perhaps this exhibit machine and its “scavengee” comrade are a good investment. Maybe his offer isn’t far off the mark after all. :|

(And in a worst-case scenario, it’s reassuring to think that Dell actually still has Windows drivers for machines of this pedigree. That, in itself, is amazing, even if the nightmare of running Windows 95 on those machines is only partially massaged by the thought of rehashing a few sessions of Age of Empires.)

Who knows? Junk — ahem, I mean, vintage computing is nothing if not an unpredictable hobby. :mrgreen:

Tricks of the trade

I had to remind myself the other day that I’ve been using Arch Linux for more than eight years. I did my first trial installs early in 2006, and while I cut my teeth on Ubuntu, as time wore on, Arch eventually became more like home.

With the exception of strict i586 machines, I’m more likely to install Arch on any given computer, with Linux Mint coming in a close second. The logic there is that I can get it running faster with Arch, but there are some features that I use so rarely that I’d rather a distro like Mint take care of them — things like CD burning or encrypted file systems.

I can do those things in Arch, but it’s rare that I need them, and Mint usually sets them up quicker and better than I do.

Over the years I’ve jotted down a few minor notes on one-shot commands or quick-step loops to tackle oddball tasks. I keep them in a flat text file called “tricks,” and when I need to get to one of them, I just grep through it until the command I want appears. Adjust for minor differences or filenames, fire, and forget.

For example, a while back I found a command to list all the packages that are installed on your system, in alphabetical groups. I modified it a little bit, to:

for i in {a..z} ; do echo -e $(pacman -Qq | grep ^${i}) >> packages.txt ; echo >> packages.txt ; done ; fmt -w $(tput cols) packages.txt

The original post (which I don’t have a link to any more :( ) split them out differently, and words broke at the end of the line, sometimes making them hard to read. I solved that with fmt, which is more than happy to wrap text like a word processor, but likes to run together lines. Hence the extra echo. Oh, and it doesn’t seem to like to have text piped into it, so the external file was necessary.

I think the original post caught some flak for 26 iterations of pacman, but I don’t have a problem with that. Might as well put the system to use for a little bit. If it annoys you, feel free to adjust it.
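
If the 26 pacman calls do bother you, here’s one way I might rewrite it — a sketch that calls pacman once and lets awk handle the grouping. The helper function and its name are my own invention:

```shell
# group_by_letter: read a sorted package list on stdin and print a blank
# line whenever the first character changes, mimicking the alphabet groups.
group_by_letter() {
  awk '{ c = substr($0, 1, 1) }
       NR > 1 && c != last { print "" }
       { print; last = c }'
}

# Demonstrated with a canned list; against a real system it would be:
#   pacman -Qq | group_by_letter > packages.txt
#   fmt -w "$(tput cols)" packages.txt
printf 'alpha\nawk\nbash\nbc\n' | group_by_letter
```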

Some of the “tricks” were my own creation. Back in September I got the (stupid) idea that I would dump all the executables from bsd-games, util-linux, coreutils and even binutils into my list of text-based software, just to be sure I hadn’t missed any hidden gems. :roll:

It turned out to be a real hassle, and the results of it were one of my biggest lists of oddball commands in the history of that blog or this. In any case, this was the command that got me in trouble:

for j in {util-linux,binutils,coreutils,bsd-games} ; do LEN=$(echo ${j} | wc -c) ; for i in $(yaourt -Ql "${j}" | grep bin | cut -c$(($LEN+10))- ) ; do echo "${j}" >> temp/"${i}".wiki ; done ; done

Most of that was done to trim off the extra stuff that appears with grep’s output; since the length of the name of each package was different, I had to check where the title ended and the actual binary name began. wc takes care of that, with the -c flag. And to keep this from polluting my home directory, it dumps everything into temp/. The .wiki suffix is just for the benefit of vimwiki. ;)

Not everything is Arch-specific in that file. Here’s one that I use more often than I thought I would: Taking a folder of files, and moving each one to prepend the original name with its date stamp:

for i in * ; do mv "${i}" "$(stat -c %y "${i}" | cut -d' ' -f1)-${i}" ; done

stat comes through this time, as a way of generating the timestamp of the file. cut that down to the first field only — the date — and voila, moved to a new name, listed by date. Suddenly your folder has organization.
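
A close cousin I sometimes reach for, assuming GNU date is available: date -r reads a file’s modification time directly, so the stat-and-cut pair collapses into one call. Sketched here against a scratch directory so nothing real gets renamed:

```shell
# Same rename trick with date -r, which reads a file's mtime directly;
# %F formats it as YYYY-MM-DD. Done in a throwaway directory for safety.
dir=$(mktemp -d)
touch -d '2014-11-13 10:00' "$dir/notes.txt"
( cd "$dir" && for i in * ; do mv -- "$i" "$(date -r "$i" +%F)-$i" ; done )
ls "$dir"    # notes.txt is now 2014-11-13-notes.txt
```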

I use yaourt on a daily basis, and with good reason. With such an abysmally slow Internet connection, I have a tendency to hoard software packages, which is not generally an advisable habit with Arch. Occasionally I make a mistake and wipe out my cache, or just slip up after a failed upgrade, and need to pull in a fresh copy.

And so … download a new copy of everything that’s installed on your computer. This one took on a new meaning when I realized you could pipe yes through another program, if you wanted to automatically feed it a positive response:

yes | yaourt -Sw $(yaourt -Q | grep -v local | cut -d'/' -f2 | cut -d' ' -f1 | tr '\n' ' ')

Get a full list of packages from yaourt, which will be preceded by repo names. Filter out anything from “local,” since that’s built and not just downloaded. First cut off the repository name at the slash, then cut off the version at the space. Finally, tr substitutes every newline for a space, so we can give one giant command to the outermost yaourt, which will download the entire list. Better leave that one overnight. :|

If you don’t use yaourt, you’ll need to adjust that a little bit, since pacman -Q does not show repository names. It also means your “local” packages will be mixed in with your downloadables — although pacman -Qqn will list just the names of native (repo) packages, leaving the locally built ones out. Just so you know.

Since we’re on the topic, yaourt lets you do some funky things with your cached packages, and also with the materials it downloads to build local software. If you look in /etc/yaourtrc, you’ll find:

EXPORT=2           # Export to 1: EXPORTDIR or PKGDEST
                   # 2: pacman cache (as root)
EXPORTSRC=1        # Need EXPORT>0 to be used
#EXPORTDIR=""      # If empty, use makepkg's configuration (see makepkg.conf)

I’ve adjusted those values to do a few things.

First, EXPORT=2 will send the built package to pacman’s cache, which is a wise move if you ask me. By default yaourt builds everything in /tmp, and it evaporates when you’re not looking. I’ve lost more than one hard-built package by not moving the final product out of that directory. :'(

EXPORTSRC=1 does the same thing for the source files and git trees that it downloads. This too can be a lifesaver if you lose the completed package. EXPORTSRC will send everything to EXPORTDIR, or in the absence of that, to the destinations listed in /etc/makepkg.conf. And what are those?

#-- Destination: specify a fixed directory where all packages will be placed
PKGDEST=/home/packages
#-- Source cache: specify a fixed directory where source files will be cached
SRCDEST=/home/sources
#-- Source packages: specify a fixed directory where all src packages will be placed
SRCPKGDEST=/home/srcpackages

By default, all those folders are commented out, so makepkg’s configuration leaves everything in the same folder where it was made. Change the values above, and it will shuffle packages and source files to those destinations. It will also create a symlink to the package it just built, so you’re not wasting space.

Let yaourt use those directories, and it will follow those rules, and tar up its $srcdir as well, before sending it off to that destination. In that case, yaourt will tidy up its efforts and leave you all the pieces you need to do it all over again.

And yaourt is generally smart enough to check those directories for source files before re-downloading them. Or so it seems, in most cases. :\

Two more smaller non-secrets I should share:

yaourt -G pkgname

will create a folder named “pkgname,” download its PKGBUILD and all the patches or installation scripts for that package, ready for building. It’s another reason I use yaourt, to be honest. And:

pacman-optimize

It’s not often that your pacman database will need optimizing, but I can vouch for it on slower machines as a way to speed up searching and filtering. Again, just so you know. ;)

Clearing out the bookmarks … again

I did it again: I collected a mass of bookmarks that I figure I’ll need at some time in the future. Maybe I used them and will again … and maybe not. Either way, they may still prove useful. I do refer back to this site when I can’t remember a page or a topic, you see. :roll:

So here we go again: More links for future reference.

  • I sometimes keep links to pages that have instructions for lightweight systems for old distributions; here’s one for Debian Lenny and one for Crux 2.7 (in the i686 flavor, which doesn’t really matter). That might seem counterintuitive, but I will fall back on old distros when working with old hardware, before making the leap to current flavors of Linux. For an example, peek here.
  • Along those same lines, I found a fairly coherent tutorial on how to install Exherbo. I had a link to another one, but apparently the author took it down. :( I have been wanting to spend a little more time with Gentoo (and possibly Exherbo) but I’m always attracted to the way Crux handles things. That being said, Crux dropped i586 support years ago, and hasn’t had i686 ISOs (unless they’re hiding) for a year or two at least. :( Story of my life. …
  • I use dd a lot, not just to blank drives or scramble the contents of files, but for other things too. To that end, a speed comparison at different block sizes is actually very useful. Of course, I’ve seen some posts on StackExchange that might offer different solutions.
  • Along those same lines, this page gave me a little insight on how to mount a specific partition in a disk image. It saved me a little time with a copy of an old 10GB hard drive, since I didn’t have to write it back out to a drive to get at the files I wanted. On the downside, counting out all those offsets was a trick. I’m surprised Linux hasn’t thought up a more straightforward way to do that. …
  • I used to be real nit-picky about fonts, but these days I don’t really mind. I did find a good collection of font suggestions for Arch on Reddit, but I’m not the kind of person who installs two dozen font packages just to see a few extra characters in my terminal emulator. Now if we were talking about fonts for virtual consoles, I’d be much more interested. …
  • Since I’m in fix-it mode, here are a few pages about
    • installing python programs to different directories with pip, which is interesting because I’ve thought for a long time that there is no setup.py uninstall;
    • checking to see if directories exist with bash, which came in handy just a day or two ago;
    • how to install Arch from within an existing Linux installation, which I want to try sometime, just to see if it works; and
    • the difference between single brackets and double brackets to bash, which I never knew but explains why some of my long-ago scripts didn’t work as expected.
  • emacs fans would probably love to run just emacs on a Linux kernel with nothing else, and this post can tell you how. It reminds me of my long-ago attempt to trap Midnight Commander within a tty session, much like what could once be done with rtorrent.
  • I should take the time to set up mutt with multiple GMail accounts, like this. I don’t dislike alpine, but I only keep it around because I’m too lazy to set things up. :\
  • From the Department of Ancient Awesomeness come three flashbacks that just made me nod my head: one on the best distros of the year 2000, another on the best window managers of the year 2000, and perhaps best of all … a complaint from 2008 about how Firefox is utter bloat. The more things change, the more they stay the same. …
  • I watch the Debian systemd soap opera with only a little interest. I’ve been using Arch for quite some time now, and I have no complaints about the newcomer. All the same, if you’re wondering where you’ll stand when the revolution comes, raymii’s chart from earlier this month might be helpful for you, as might this systemd vs. sysvinit cheatsheet. Neither page will convince you one is better than another, but might help you understand how they each handle the startup task. Knowledge is power. :twisted:
  • You won’t hurt my feelings if you find some Linux know-how somewhere else; even I found this list of tech podcasts rather interesting. I don’t really get into podcasts much, but from time to time I will grab one and spin it up.
  • Finally, from the Completely Unrelated to Anything Else Department, here’s an interesting project: An Android browser that displays web pages (believe it or not) by way of relaying the content through SMS messages. O_o Now I’ve seen everything.
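
On that partition-mounting trick in the list above: the offset arithmetic boils down to the partition’s start sector times the sector size. A sketch, where the sector number is hypothetical — in practice you’d read the real one from fdisk -l:

```shell
# Mounting a single partition out of a whole-disk image by byte offset.
# The sector number here is made up; it comes from 'fdisk -l disk.img'
# in practice, and 512 is the usual sector size on drives this old.
start_sector=2048
sector_size=512
offset=$((start_sector * sector_size))
echo "mount -o loop,offset=${offset} disk.img /mnt"   # run as root against a real image
```

Newer util-linux sidesteps the arithmetic entirely: losetup -P -f --show disk.img creates /dev/loopN plus a /dev/loopNpM node for each partition, and each of those can be mounted directly.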

And now I’ve listed everything. If those are at all useful to you, please bookmark them in your own system. Hold on to them for about four months, and then yell “I gotta do something about these bookmarks!” and offload them to your own blog. It seems to work for me. … ;)

Text-based gamers can afford to be picky

I just finished up 10 days riffling through a score of console games, and the experience reinforced something I wrote about a month ago: that text-based gamers can afford to be picky.

And I should probably be clear about what that means. I’m not suggesting text-based games are somehow superior to graphical games — that, after all, would depend on what we were comparing. There are some very good games in both camps, but unless we’re comparing direct renditions, both sides will have obvious winners.

No, what struck me about the past 10 days — and the past two years, really — was the sheer depth and breadth of the field, and how many of those were actually very good games. Not necessarily technical or visual triumphs, but games that grabbed my attention and engaged me for a serious amount of time.

Pick an interface, a game or a genre, and there will be more than a handful of games that fit it. Limit yourself to text-based games, and there are still many, many titles that will impress you.

Money is no object. It’s no joke either.

Here’s a good example: Starlanes, which I just learned about a week ago, and now love to death.

Starlanes won’t wow you with its “immersive” 3D interface or tickle your retinas with intricate shader effects. It’s just a straightforward game that has precise economic rules, and those rules permit some wild, but completely logical, turns of events.

In the “sci-fi-strategic-economic-territorial-market-acquisition” genre, Starlanes pretty much dominates the field — and has, for the past decade or two, considering it’s not a new game. You might have to look outside the text-only field to find another title that competes, or compares, with it.

But don’t think that I’m backing the only horse in the race. Starlanes is a powerful game in its own right, even if you think it’s just a big fish in a small pond.

War is hell, even in ASCII.

Here’s another one that follows a basic format but has excellent rules: Curse of War, which also makes the leap from turn-based to real-time strategy.

Guide the growth of your civilization toward geographic goals, and dominate your opponents by sheer force of demographics. Set flags and your population migrates toward them; fortify your hold on resources by improving structures. It has all the flavor of most real-time strategy games, with tiny splashes of Life, Populous, StarCraft and even Civilization, to a small degree.

Starlanes and Curse of War don’t compete by any stretch. Starlanes is turn-based, and has a strategic appeal that is reinforced by some simple territory annexation and market rules. Curse of War is real-time and allows less precise geographic control, but offers its own set of rules that govern migration, zones of control and even fortifications. Different, but also a very good game.

That’s me, running from the Americans again.

So economic simulations and population strategy games aren’t to your liking? How about a tall ships simulator?

I found Sail a few months ago, hiding in the ancient bsd-games package, and the level of detail made my jaw drop. It should be enough to say that Sail was intended as a conversion of a decades-old Avalon Hill tabletop pen-and-paper, map-and-counter game, but if that doesn’t ring your bell, let me say this: Imagine Sid Meier’s Pirates!, cross-bred with any version of Microsoft Flight Simulator, and strained through an ASCII filter. It’s almost frightening.

The majesty in something like Sail isn’t in its uniqueness; after all, pirate games have been around since kids were invented. Sail wins points for pulling in an established rule base, automating the more cumbersome points like firing angles or wind calculations, and creating a game that’s just as fun as the tabletop game … and still in a text-only format.

The best analogue I can think of for Sail would be the classic Star Trek: Starfleet Command PC games of a decade ago, which converted the original Star Fleet Battles tabletop games into a fully graphical environment. I played the original Star Fleet Battles a long time ago, and let me tell you, the PC rendition was a gift straight from Allah. :|

But simulations aren’t everyone’s cup of tea either, and I don’t find fault if it’s not your favorite. How about we go the opposite direction, and pick up something simpler? Something old, and yet new?

Eat your heart out, Gameboy.

If I must be honest, Yetris is not my choice for the tip-top Tetris clone available in chunky block characters. That honor belongs to vitetris, as I’ve mentioned many times in the past. It’s definitely not for lack of trying though, and as you can see, Yetris takes the Tetris precept and spins it at a fever pitch.

Yetris stands out to me because it doesn’t accept the terminal environment as a limitation, full stop. vitetris offers network play and a few other smaller fillips, but Yetris takes command of the terminal, sets its own rules for visual appeal and space usage, and does not back down.

At the start of this post I said most of these games wouldn’t be visual triumphs; Yetris is. If half the games I’ve seen in the past two years approached the text-only medium with the same level of aggressiveness, there would never have been a migration to the graphical desktop. We’d all be playing fantastic Yetris-esque versions of World of Warcraft. :???:

2014-10-26-6m47421-angband

Papa was a troll ‘n stone …

And so long as we’re on the topic, if I had to pick out a text-only game that might satisfy the fantasy RPG crowd, I’d have to think for a minute. When I opened the field at Inconsolation to more roguelike titles, I realized I was going to be drawing in dozens upon dozens of games that followed much the same format, but offered their own small personal tweak.

I saw a lot of them — a lot of good roguelike games — last week, and the two at the top of the stack were definitely ADOM and Angband. It would be impossible to try them all, so you could argue that I missed out on [insert title here] which was clearly superior :roll:, but those two are what I remember most.

ADOM took the dungeoneer out of the dungeon, and turned him/her loose on a world above. Angband refined the classic moria format derived from hack and rogue, injected healthy doses of Tolkien and Gygax, and arranged everything with better color and a better layout. And both games are very good indeed.

Hail to the king, baby.

But still — after years of poking and prodding the Internet and seeing what falls out — the game that keeps pulling me back for hours on end is Dungeon Crawl: Stone Soup.

I don’t even know if I could tell you what makes Crawl better than any of the others, except that it manages almost every task efficiently and expertly. It handles dungeon generation, targeting, ammunition, autoexploring, autofighting, automapping, spellcasting, religion, regex object searches, skills, proficiencies and specialties, poisons, mutations, races, subraces, classes, subclasses, divine blessings, diseases, vampirism, draconism, encumbrance, hunger … I can’t list them all.

Truth be told, I stopped installing Crawl a long time ago — because I can get to it through ssh, and that’s even better than installing it if you’re on an old machine. All that intricate stat management might take a toll on my dear old Pentium. ;)

There’s only one game I know of that approaches the same level of detail and comprehensiveness as Crawl while still working in a text-based environment. But we have to switch atmospheres to peek at it.

Of course I died right after this screenshot.

For my money, Cataclysm: Dark Days Ahead is the pinnacle of text-based games, regardless of the genre. I realize that’s a bold statement, but I think I’m in a position to defend it. I’ve done my share of research.

Just about everything that I’ve mentioned in Crawl is also available in Cataclysm, understanding of course that Crawl is a fantasy RPG and Cataclysm is a zombie apocalypse survival epic. So of course, some parts don’t overlap.

But I knew I had found a winner when I realized I could actually create primitive explosives in Cataclysm, by scrounging through stacks of other garbage left over in the remains of my world. And then I discovered I could rearrange my layers of clothes to provide for more pockets. :shock:

And then came the moment I started thinking through my defenses and resources, rather than just wandering around the countryside like an idiot. Staking out an abandoned power station and devising barricades and gauntlets for zombie onslaughts suddenly made perfect sense. …

This is where I have to stop. This post has already taken me about two days to assemble, because every time I mention a game, I reinstall it, and then I play it, and from there … half the day is gone. … Not that I’m complaining, of course. :P

But don’t ever let it be said that there’s nothing to do for fun in a text-only environment. There is a ton of great entertainment available that doesn’t require a graphical environment, and probably more importantly, doesn’t require a quad-core with dual SLI cards and 12GB of memory … just to install. ;)

Ghosts of the machines

I haven’t taken the time to update much here, but I seem to still devote most of my time to following trends in text-based software. It consumes a considerable amount of the day.

There are a few things I should mention though, even just as updates to my trials and tribulations with outdated hardware and modern software.

First, the CTX EzBook 800 I mentioned a few months ago is not much closer to a full working state, but it was never very far off. Most distros, aside from Crux, either miss a beat with the hard drive or the optical drive, or sometimes both.

An added complication is that there seems to be no response from the PCMCIA port whatsoever. Every attempt to even acknowledge a card there comes up dead, regardless of card or distro.

As a troubleshooting measure — strictly for troubleshooting, I swear — I swallowed my pride and installed Windows 98SE from a friend’s CD (try and find one of those these days :shock: ). No life, no lights, no response.

Which leads me to believe the port is damaged or dead. It’s reassuring in a way, since it means it’s not necessarily a configuration error on my part, so much as a hardware defect that may or may not be fixable.

In any case, I could conceivably use it un-networked, as some sort of offline data storage device, and transfer files on and off via USB. It’s not an appealing option, but it’s possible.

Second, in the Thinkpad realm, I’ve allowed the ’41s to move on to new owners. I enjoyed my time with them but I am overburdened with laptops these days and need to make space.

The T41 will be a low-strain home PC for a local friend, but — even better — the X41 is going to be part of a business IT department, monitoring server performance. How exciting! :mrgreen:

The aforementioned Inspiron 4000 turned out to have far deeper problems than I suspected. A new power supply wasn’t … supplying power :roll: for some reason, so I finally pulled the entire business apart to see what could be done.

And it appears someone else had already had that idea, and “repaired” it with cyanoacrylate. The machine must have been dropped at some point, and the cracks and seams resealed. Most of the casing was in shards by the time I could get to the motherboard, at which point I made a command decision and pronounced a time of death. Its usable parts are now awaiting transplants into other hosts. A sad ending. :|

Next, I feel obligated to mention that I’ve run through a stream of Dell machines in the past month or so. I spent a short time with an Inspiron 5150, which was an interesting experience. On almost every front it outstripped my in-house 8200 machine, but was almost dull by comparison. That machine has moved on to a Windows fan, who wanted a native XP environment for classic 3D games of the 2002-2005 era (one of the FIFA games, I think).

I also came across two D610s, one in mediocre condition but the other in pristine shape, to include the carrying bag, CDs, cables, batteries, etc. It was a very nice gift.

D610s are strictly business though, and most of the internal hardware is unappealing — a lot of Broadcom network interfaces, which I hold in high disdain. Both are viable Linux candidates, but would probably require so much in supporting hardware (i.e., replacement MiniPCI wireless cards or PCMCIA network cards) that it might not be worth my effort to keep them.

I understand that the 3.17 kernel has better Broadcom support, so I might keep them around until that reaches the Arch core repos, and see if it’s true. I’ve been promised that before though, and in this day and age, there’s no need for me to cling to a Broadcom-based machine.

I also should mention a rather battered D810 that made its way to my doorstep. It was more a curiosity than the D610s, because the hardware seemed comparable, but the widescreen aspect threatened to bog down the desktop. Perhaps 1680×1050 is a bit big for a 32MB ATI X300 card. …

What else … ? A couple of Thinkpad R50p’s, which together were in such bad shape that there wasn’t enough left to make a single working computer out of them. And one had a password lock at the BIOS, and I’m not going through the trouble of reading EEPROMs just to start up a 10-year-old laptop. >:(

I also got an old Gateway 6518GZ that was DOA … an HP dv6000 with an unseated video card that probably would have needed a reflow to bring to life. … A couple of other lesser creatures. … :|

But the real score of the month was permission to tinker with a pair — not just one, but a pair — of truly ancient Dell Latitude LM machines. I can’t keep them but I’m allowed to poke, prod and pick at them for a while, provided I return them in original condition. Here’s one in action, with its original Windows installation.

2014-10-20-84kbn-win98

These are true 133MHz Pentium machines with 40Mb of RAM apiece, NeoMagic video cards and 1Gb hard drives (one completely error-free, after all these years!). Otherwise, standard arrangements of PCMCIA slots and sound cards, which are undoubtedly ISA components. I haven’t had a challenge of this level since … oh, probably this machine.

I have to return them to their owner in a couple of weeks, but I’m enjoying the opportunity. The biggest threat at this point seems to be their complete inability to boot from CD, and neither has a working floppy drive. Luckily they were designed with the hard drive in a pull-out tray at the front of the machine, accessible with only two screws. And even better — my IDE-to-SDHC adapter works well. :D
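For the record, the transplant trick boils down to one byte-for-byte copy: build the system as a raw disk image elsewhere, then pour it onto the drive from the tray. A minimal sketch with placeholder names; the real device depends on how the adapter shows up under lsblk:

```shell
# Copy a raw system image (built in an emulator, say) onto the drive
# pulled from the tray. It works the same against an ordinary file,
# which makes for a safe dry run.
image_to_drive() {
    # $1 = raw image file, $2 = target device (or file, for a dry run)
    dd if="$1" of="$2" bs=4M conv=fsync 2>/dev/null
}

# Real use would look like: image_to_drive latitude.img /dev/sdX
# (triple-check the device name first -- dd does not forgive typos)
```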

I’ll put up some more pictures and maybe a full post sometime in the next week or two, if I can. They have to go back to their owner at the end of the month but I have free rein for a while yet. If I can make any progress, I’ll make a note of it here.

Odd, the things I think are fun. … :???:

P.S.: A very special thank-you to my online donor, who provided some of the machines mentioned above, but asked to remain anonymous. ;)

One existential crisis at a time, please

Not everything I keep around the house is an absolute winner. I do feel like I can pick and choose the machines that stay with me, and which ones go on to new owners and new lives. But sometimes there are machines that really test my principles.

Here’s one. This is a lowly Dell Inspiron 4000. And it’s definitely not a model specimen.

2014-09-14-insp-4000

Quite to the contrary. This machine is a veritable best-of list for everything that can possibly go wrong with an old laptop. When it came to me,

  1. It had no memory.
  2. It had no battery, and no power supply.
  3. It had a CD player, but the door mechanism is broken, and if you don’t hold it in, it doesn’t read the CD.
  4. It had a floppy drive, but I shook so much dust out of it that I’m seriously concerned about jeopardizing one of my few remaining floppy disks by testing the drive.
  5. It has no rubber feet left, and what remains causes it to rock on a flat surface.
  6. The screen is in good shape, but takes a while to warm up. Until then, the display has a red tint to it.
  7. It has one — only one — USB port, and that’s a version 1.1 port, so it’s phenomenally slow. To make matters worse, it feels like the port is losing its grip on the motherboard, because the port flexes when you push in a drive. Scary.
  8. About a fourth of the keys — mostly in the upper right quadrant — don’t work. Either the keyboard is on the fritz, or the signal isn’t being caught by the machine. I guess the former.
  9. The CMOS battery is dead, so you have to set the date each time the machine boots. Which is tricky, because again, a quarter of the keys don’t work.
  10. It has no built-in network port, or rather, this particular model has a plastic shield over the ethernet port, which usually was a sign that the board didn’t carry that port.
  11. It has more than its share of cracks, dings, scrapes, gouges, split seams, broken corners, busted lips, scuffs and scratches.

It looks a great deal cleaner now than it did when I got it. It still needs a complete disassembly and scrubbing — if it stays, of course.

And that’s where the existential crisis comes in. Because in spite of all that damage and all those deficiencies, it still works. Its saving grace is the fact that it belongs to the Dell C-series, which means laptops from about five or six years before it and five or six years after it all used compatible parts — this one included.

So, after calling in some favors for a 256Mb stick of PC100, then borrowing the battery and a modular DVDRW drive from the 8200, I turned it on, and it came back to life. The Windows 2000 installation was still in place and functional, even if it was hideously slow. The touchpad is in good shape. And the screen is clear and free of flaws.

I gave it an Atheros-based PCMCIA wireless card, and started it up with a PLOP CD and the Arch Linux install ISO on USB. From there I could ssh into it and work up a system, for as long as the battery would last. And as you can see, after some slight delays, it’s functional again.
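The only tedious part of working that way is knowing when the live environment is finally ready to accept a login. A little polling helper smooths that over; this is my own concoction, not anything shipped on the Arch media, and the host and port below are placeholders:

```shell
# Poll until a TCP port answers (bash's /dev/tcp trick), so a script can
# pause until the target's sshd is reachable instead of guessing.
wait_for_ssh() {
    # $1 = host, $2 = port, $3 = max attempts (default 60, one per second)
    local host=$1 port=$2 max=${3:-60} tries=0
    until (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; do
        tries=$((tries + 1))
        [ "$tries" -ge "$max" ] && return 1
        sleep 1
    done
    return 0
}

# e.g.: wait_for_ssh 192.168.1.50 22 && ssh root@192.168.1.50
```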

But from here it becomes a question of worth, because at its core, it’s still a 600MHz Celeron, with only 256Mb of memory, a lowly 30Gb hard drive … and barely functional overall. Sure, it has all-Intel guts and an ATI Mobility card. But it’s not something your day-to-day computer user, circa 2014, wants to take home to meet the family.

So the jury is still out on this machine. I still haven’t tried it with a proper power supply, and I need to know for sure that the keyboard issues are just in the keyboard. I have a feeling that it will cost me more than the value of the machine just to find that out, which is why I’m debating disassembly for parts.

I hate doing that, but sometimes you have to make difficult decisions. :|

Linux desktop hate, and the profit in yellow journalism

I’m going to give you two links today, but I don’t want you to click on them.

Usually when I have links I don’t want you to see, I just withhold them altogether. It’s safer that way, and I can generally give you an idea of what’s there without inflating your blood pressure by sending you to those pages.

This time both articles are critical of Linux, and if you’re reading this you’re either familiar with Linux or a proponent of it. The first is Matt Asay’s insistence that Linux abandon efforts toward a desktop, and the other is John C. Dvorak pulling the plug on Linux’s viability at the desktop.

I’m not sure why the Linux “desktop” is getting so much hate these days, but then again, I’m not really sure what the Linux “desktop” is. If there is a concerted effort to corral the efforts of every free software project out there, and herd the masses toward the “desktop,” I wasn’t aware of it. The Linux desktop has always just “been there” for me, and so maybe I take it for granted.

But it’s worth looking at both articles, for wider reasons that actually move beyond the scope of this site.

Matt Asay should be a name you’re familiar with, if you’ve been around the Ubuntu fan club for a year or two. Matt is a former company officer with Canonical, apparently has links to Novell, and did some academic work on open source licensing.

It might be easy to see why a former Canonical headman might prefer that the Linux “desktop” expire. For half a decade now, Ubuntu has been trying to convince me that my computer is actually a cellphone, with no success. Unity’s glaring shortcomings aside, it’s easy to see how someone who drank so deeply of the post-2010 Ubuntu Kool-Aid might walk away insisting that Linux abandon the “desktop” and embrace its smartphone/server renditions.

Mr. Dvorak is another matter, with a slightly longer history in the tech industry … including insisting as far back as 1984 that there was nothing appealing about a computer mouse, that Apple should jettison the iPhone, and that the iPad would end up in the dead zone of tablet computing.

Perhaps with such a track record for faulty divination, his dismissal of the Linux “desktop” for its lack of a killer app might actually be a good sign.

I’m not going to criticize either gentleman on the grounds of their technical or academic backgrounds, mostly because my own resume doesn’t include a CS degree, or any computer, electronic or technical expertise beyond “hobbyist.” Asay is a career corporate officer, Dvorak is a history and chemistry major, and my own academics are similarly distant from technology. We all found our way here somehow.

But here are a couple of thoughts for you, before the topic widens.

Matt Asay’s rant appears on TechRepublic. CNET bought TechRepublic in 2001. CNET is part of the holdings of CBS Interactive and subsequently CBS Corporation.

John C. Dvorak posted his casual dismissal of the Linux “desktop” on PCMag.com. PC Magazine is published by Ziff Davis, which has sold off some media assets to QuinStreet but has a parent company in j2 Global.

That’s no great feat of investigative journalism on my part; it’s really just following links through Wikipedia or About pages. I hope, though, that it shows a trail of bread crumbs back to news and information corporations.

And this is when the word “clickbait” should spring to your mind … and hopefully now, you can see why I didn’t want you to visit those links.

I worked in journalism for a long time, which was a mixed blessing. Paste-up print media faded and graphical page design took over at about the same time that journalism on the whole began to decay.

It would be easy to blame technology and the Internet for that, but that’s not completely the case. Newsprint in particular never had a sky-high profit margin, and even in the golden days of 50 or 60 years ago, a lot of journalists were in the field because of a sense of social responsibility, or out of respect for the tradition.

If I had to pick one point in time, I’d say things changed with 60 Minutes, which showed that the news could turn a profit. It didn’t matter that 60 Minutes, even into the 80s, was at times an exceptionally well written and well researched program — in other words, good journalism. The profit was there, and some smelled the potential for more.

From then on — roughly 20 or 25 years ago — the news was no longer a business held for generations by liberal-leaning family-owned corporations. Decades of thin profits earned through a “noble” pursuit of news were hacked down to increase the amount of money moving upward.

If newspapers were slipping by the 1990s, the Internet probably greased the slope. Even so, newspapers and media corporations in particular were eager to throw out the paper model, and I can recall editors foaming at the mouth when the prospect of going all-digital appeared. Ad men and editors alike were all too eager to drop a physical medium for an electronic one.

But with that came a corresponding drop in quality — after all, if you can skimp on the medium, you can skimp on the message. It was easy to slap a story onto a web page, and it was even easier to hire someone off the street to concoct a rambling 36-inch story about fly-fishing, pieced together without ever leaving the office. The natural tendency for reporters to plant themselves in front of computer monitors and dredge up a few quotes off the Internet became the norm.

I can recall a particularly painful moment when a city editor and I ransacked an editor-in-chief’s office one night, looking for the application materials for a writer who had been on our staff for about a month. We were dumbfounded that such a horrible writer had gotten the job; when we saw his application test we realized he couldn’t string two words together to save his life. But he worked cheap and had ten fingers, so they hired him.

The corollary: There are no good reporters, only good editors. Remember that, and you’ll do fine in life.

But in a nutshell, that’s how we find ourselves where we are today. Asay and Dvorak are just the latest in a trend of yellow journalism that publishes uninformed or poorly researched news material in the hopes of winning a visit from you. In the old days, circulation, single-copy sales or viewership determined how a newspaper or television station was performing; these days your click is one out of a million, but they all add up to revenue.

You too can post a profit with one inciteful (but not necessarily insightful) writer and a pay-per-visit contract with an ad company. And the Linux audience is no different, as Asay and Dvorak have shown.

I could go on about this for hours, but no one is served by it. It is my hope that the next time you see a particularly vitriolic article deriding any point on the social continuum — be it the Linux “desktop” or otherwise — you pause just long enough to follow the bread crumbs back to the corporation that’s making the money from your visit.

It’s always easier to recognize a marionette when you can see who’s holding the strings.

Postscript: If for some bizarre reason this topic is interesting to you, Paul Steiger wrote a long but terrific memoir in 2007 of his days on the Wall Street Journal that encapsulates the less-than-graceful shift from paper to Web site. Alessandra Potenza’s defunct but exceptional investigation into journalism in Italy, Europe and America is also worth visiting. You can compare those to the perspective of a younger journalist who joined the profession at the crest of the digital wave, and see how the focus shifts away from social values and toward the technological element. Some of that can be attributed to experience, some to inexorable sea changes. You be the judge.