Category Archives: Linux

If thou beest he — but Oh how fallen! how changed. …

Day by day things get a little more busy, and the month of March will be a real grind for me. And so as small tragedies unfold and develop, I have to make quick mental notes and hope to tack them here as time allows.

For example, a week or two ago, this machine suffered the same fate as this one, with the simple act of opening the lid resulting in a thousand tiny shards of plastic bursting in a thousand directions.

And to add insult to injury, the torque of the splintering bent the LCD, and a fine prismatic web enveloped the lower right quadrant. So not only were the casing and chassis on the verge of complete collapse, but the display was suddenly worthless too. Fate — or the gods or whatever deity you prefer — had made its decision, and I was left with the pieces.

I blame the natural deterioration of plastic, which must, I suppose, slowly decay with age and give way to the laws of physics. It doesn’t dry my tears, but that is the way of all flesh, I suppose.

Ironically, not a day later, I’m handed a 700MHz Thinkpad T22 with most of its components in working condition. A little elbow grease, a few memory chips, and the EzBook is mostly forgotten. Or at least transplanted into a new host. 🙄

And call me crazy, but just about any Thinkpad from pre-2012 or so is now a definite keeper, and that’s the real point of this little speech today. The golden era of the Thinkpad came to a thundering halt only days ago, with the revelation that Lenovo was preinstalling laptops with the faulty certificate tied to Superfish.

Any crudware so bad that the U.S. Department of Homeland Security tells you to uninstall it is an honest-to-god, plain-Jane screwup of colossal proportions. Not that the U.S. government has any real integrity on the subject of computer security, but you couldn’t sell me a new Lenovo now for holding shut the screen door to my back yard. I don’t think it could handle that much responsibility.

Of course, the slow decline of Lenovo machines was something I had sensed earlier than this week. The newest Thinkpad I have now is a T410 that’s already pushing four or five years old, and it’s only so-so in my mind. The newer Lenovo-branded laptops I’ve seen were even more disappointing.

A friend’s Y560 had so much bounce in the keyboard plate that I thought I was typing on a trampoline. And when it was barely two years old, he’d already lost sound in one speaker, torqued the earphone jack so badly he couldn’t insert a plug, and had a lot of play in the hinges.

I’ve seen a few Yogas and some high-end “Thinkpads” in the past year too, and nothing enthused me. Most were not far off from mid-grade department store machines.

Sad, really, for the line that once held the mighty T61p, which was a solid full-size performer with fantastic graphics at a fantastic price. Or the X60, which is still available in resale and is probably one of the best portable dual-core full-feature laptops ever made. I’ve had one of those and an X61 too, and I’d be willing to part with cash — actual cash — to find another one in decent condition.

And that from a person who gets most of their machines for free or as trade.

I’d even be willing to reach back to the X120e, or the X31 or even the i-Series for positive experiences. In fact, I can think of only one Thinkpad I’ve had in the past decade that didn’t leave me smiling, and that one dated back to the early days of Pentiums, and the rise of the brand. From my perspective, the arc of the Thinkpad started around the Pentium II, tapered off four or five years ago, and came to a crashing halt last week.

Of course that does mean any Thinkpad you’re holding that wasn’t built post-2012 is a winner, at least in my book. I really don’t mind if it’s a Pentium, a II, a III or a 4; what you have is a bona-fide slice of computing history, made at a time when the Thinkpad was trustworthy, dependable, flexible and powerful. I’ve got more than my share right now, and I’ve had an even broader array in the past — and I’d still be inclined to invest in one that had potential.

I have to draw the line when the plastic shatters and tiny black splinters fly across the room, though. 😐 That is the way of all flesh, I suppose.

Reviewing the review, the reviewer and the reviewee

I had an interesting conversation by e-mail last week, around the role of software reviews, within the context of that last post when I said Linus was right. My correspondent wanted to know if my process or stance had changed since Gamergate, in the way I approach software, phrase my posts or communicate with developers.

I had to look it up. The only context I had for gamergate was leftover from my primary school biology classes. :\ But after I knew what the hubbub was, the short answer was no.

The long answer went something like this: No, because I wouldn’t call myself a gamer — or at least, not a player of any game that’s likely to still be played on a large scale — so the events of the last six months were a minor surprise. It’s hard for me to relate to triple-A titles on the market now when I still think a few hours of Deus Ex is a good day of gaming. 🙄

So whatever swirl of sewage has engulfed the gaming market on any given day is a bit distant for me. I have no immediate context for the latest clickbait drama.

And I wouldn’t call myself a reviewer either. Sometimes people ask me specifically to “review this program” or “review that application,” and I don’t split hairs over the use of the verb. But “reviewer” to me implies the ability to apply a standard of measurement, which in turn implies an equitable level of prowess, which in turn implies a quality of wisdom and the ability to evaluate through comparison. And when it comes to writing and designing software, I have … hmm, let’s see … uh, okay … none of those things. 😐

So what do I bring to the party? Only a perspective built on practical use, learned the hard way over a decade with Linux … and over a decade more with day-to-day computer use in general. It’s true, I draw on experiences that reach back to the dark days of mainframes and dumb terminals, but I don’t count that as any special talent for “reviewing.”

I also have the luxury of working with free software, and removing the money element from the equation changes things drastically. The author of a poorly designed, poorly executed script posted as a freebie on GitHub is unlikely to take umbrage if I tell the world it’s poorly designed and poorly executed. Chances are it does whatever the author wanted, and everyone else (including me) can go get stuffed.

But it also means people aren’t usually reliant on their work as a source of income, and that deflates the situation. So again, if I point out that text editor X doesn’t actually edit text, no one has really lost financially … except for me, in the sense that time is money, and I lost time to find out that it didn’t work. πŸ™„

And in a smaller sense, I do this for free. It’s the last vestige of a hobby that evolved from a diary that kept track of an experience which turned out to be life-changing, and was never monetized, and never will be. So both parties are on equal footing, in terms of money flow.

That’s not the case with big-name, high-budget software — the developer is always looking for money, and the public is always trying to keep it. A negative review might wave off a few potential buyers and put a dent in the corporate balance sheet, or it might wave off a few potential buyers and prevent someone from wasting their hard-earned dough on a real thumbsucker. I don’t have to worry about that here.

There’s one more thing though, and this might be the most important: You don’t know me, and I don’t know you. It’s not personal. There has never been a name — a real name, mind you — attached to “K.Mandla,” and there never will be. If I say something crass about your magnum opus, it’s not directed at you. It’s directed at your work, your product, and its overall usability.

It may be that I don’t have a frame of reference for what you want to do, and I usually acknowledge those cases. It may be that our systems are incompatible or improperly configured, and I acknowledge that too, when I suspect it. Or it might just be that your program is crap, and it’s plainly obvious that it’s crap, and everyone can see that it’s crap. But it’s got nothing to do with you as a person.

I don’t care if you are a man or a woman, just like you probably don’t care if I am a man or a woman. I don’t care if you’re an ancient Linux guru, or a 14-year-old programming kernel modules out of your dad’s basement. I don’t care if your skin is black, white, brown or green with flecks of purple that sparkle in the moonlight. I don’t even care if you’re actually a dog, and nobody knows it.

None of that really matters if your software is crap. Telling you, and whoever else among the thousand people who might wander past this page in the course of a day, is really only pointing out a deficiency that’s plain to see. If it damages our relationship … well, we never really had one to start with.

I do try to be understanding. If someone is obviously learning, I can make allowances for that. If a program is obviously incomplete, I make allowances for that too. People occasionally write back to me and say, “You know, that program was never really intended for wider distribution. I just posted it to keep it handy.” That’s fine too, and if I know that, I try to be fair.

But there are times — oh brother, are there ever times — when software is junk, and we all know it’s junk. And you’re just going to have to deal with that fact. Me saying it here, and perhaps wounding your ego in the process, is only drawing a line under a fact that we all see.

So no, Gamergate had no real impact on the way I do business here. I don’t soft-shoe my posts or sugar-coat bad news, and I don’t think I take responsibility for program failures more often than I should. If your program is good — and we’ve already seen a lot of those this year — I try to say so, and we all can applaud it. And if it’s crap, I try to say so, and you can just deal with it.

Once again, welcome to the real world.

Linus was right

I’ve written and rewritten this post about four times this week, trying to get my thoughts on to the screen in the same way they appear in my head. I’m not having much luck, so I’m just going to blurt it out: Linus was right.

About a week ago Linus told a crowd in Australia “I don’t care about you,” which was dutifully reported by Ars Technica as though it were an on-air vulgarity uttered by the pope. A lot of people held that comment up to the light alongside his record for (ahem) strong language on mailing lists, and wept a little for the future of open source.

I’m not one of those people. I don’t worry about the future of Linux after last week; in fact, if anything, I’m reassured by the event.

Let’s be serious: Who do you want at the helm of a linchpin open-source project like the kernel? A dainty hand-holder who says a contributor’s code is “special in its own way,” or a zealot so focused on the project and its viability that the slightest mistake is going to prompt a royal beating?

Do you really want someone to pat a developer on the head and say, “Let’s see if we can do better next time?” Or should faulty and substandard code be met with a red-hot tongue-lashing and public shaming? Should developers be special little snowflakes? or elite programmers who have to consistently meet standards to contribute? Who do you want writing the code that runs your laptop? or your router? or your server? 😐

I think it’s easy to isolate that one clause — “I don’t care about you” — and paint it in a difficult light. The public likes that: It’s drama, and the press wins a few more clickthroughs as a result. We’ve talked about that.

What’s more important though, was the rest of Linus’ sentence: “I don’t care about you. I care about the technology and the kernel — that’s what’s important to me.”

And that’s the part that reassures me — not just as a humble desktop end-user, either. It’s satisfying to know that the person in charge does it out of passion, and brooks no incompetence. If that trickles down to his contributors as coldness or indifference, and they somehow bristle at it, then they need to find new lines of work.

There’s another reason this little soap opera needs attention, and I’ll say it in brief because I’ve talked it over before: You can’t let the whim (or the whine) of the masses steer your project.

I mentioned this five long years ago under different circumstances but I stand by what I said then, and it applies even more now: You set the terms for your project, and if someone doesn’t like it, they can move on.

Personally I see that in a lot of the way Linus handles kernel development — “This is what we do. Contribute or fork, but don’t waste our time.” This is where the bus is going. Ride with us, or get off and walk. Our itinerary is not open to discussion. If you don’t like it, there’s the code, build your own. Don’t bother us with your mockups and wishlists. Get busy or get lost.

I can’t say it’s the best management style, but I think given the scope — both in technical and geographic terms — it’s probably the best solution. I’ve never managed an open-source project specifically, but there are definitely similar situations where an abrupt and off-putting management style is the best solution.

If you don’t like it, if it damages your sense of self-worth, or if it invalidates your image as a special and unique person capable of attaining your dreams in whatever form and fashion you desire … well, what can I say? Welcome to the real world.

The seven-year itch

I haven’t posted much here lately, mostly because of the holiday season but also because the flux of hardware through my house is either feast or famine — either I’m suddenly swamped with four or five new machines that depart equally quickly, or I sit and tap my fingers at the lack of something new to try.

But a week or two ago I thought through some small difficulties that arose with this machine, and made a change that is … rather unconventional.

If you know or remember much about the 8000 line of Inspirons, you’ll understand that the 8200 was really the capstone of the C-series. You could argue that the C840 and the M50 were its counterparts, and I wouldn’t disagree. But the home market (by my estimation) embraced the 8200 in a way that I feel outstripped the Precision or Latitude versions, even if they were compatible down to the BIOS.

In that sense, the nVidia GeForce4 440 Go in its 64MB renditions was about the best video card you could implant into an 8000-line machine. I’m not counting the Quadro4 700 Go GL, mostly because I’ve never been able to find one — I saw one for sale on eBay early last year, and the price reached a level that could only be described as ludicrous. It’s too rare to compare, so to speak.

Point being, the nVidia card has always — even almost a decade ago, when I was rocking Crux Linux in a souped-up 1GHz 8000 machine — been the card of choice. And even more so now, since the cards themselves are under US$20 used, and the lesser cards are basically giveaways.

There was an analogue in the ATI Radeon Mobility 9000. It had 64MB of memory as well, but as we all probably remember from our bygone days as lowly Windows users, the mantra was: if you want good graphics, you have to stick with nVidia. Or at least that was the rule a decade ago. Nowadays … I couldn’t tell you. It may or may not be the case. I’ve been out of the loop for a while, and I still think Neverwinter Nights is great fun in my spare time. 🙄

Back in October I think, I started having difficulties updating the kernel and keeping it in sync with the now dusty 96xx proprietary driver from nVidia. I’m not a newb when it comes to that driver; I’ve been building and installing it by hand pretty much since 2007, and while I can’t call myself an expert, I at least know when I’m up against a wall.

And that was the case for more than a few days, particularly when the switch to 3.17 came into Arch core (I think that was October, but I might be wrong). Nothing would build. Errors on anything after 3.17. 3.16.4 would build fine for me in Arch, but I wasn’t having much luck after that point. Something had changed, but I couldn’t see what.

The Internet, despite its unimpeachable pedigree as a clean and honest repository of truth, information and justice 😑 🙄 , wasn’t much help. It may be that I am the last surviving user of a 440 Go card who prefers Linux, and I’d be comfortable with that. nVidia of course wouldn’t be interested in my issues, and if there were others in the same boat, I couldn’t find them. Or a solution.

So I did what any competent Arch user would do when confronted with a seemingly insurmountable inconsistency between hardware and new kernel — I added linux and linux-headers to the list of ignored packages in /etc/pacman.conf, and went about my business with the older versions I knew would work.
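For the record, the hold-back amounts to two lines in /etc/pacman.conf, shown here only as a sketch (the rest of the [options] block will vary from install to install):

```
[options]
# keep pacman -Syu from touching the kernel or its headers
IgnorePkg = linux linux-headers
```

Anything listed against IgnorePkg is skipped during upgrades until you take it back off the line.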

I should add a caveat to that though: I say “work,” but I should say “work acceptably.” Even with the last happy combination of proprietary driver and kernel, I would sometimes see tearing in gradients, corrupted edges on images or other tiny graphic defects. The nouveau driver, in case you’re wondering, was always worse — there, I got corrupted icons (icons? why only icons? 😕 ), horrible redraws and a host of other issues. The proprietary driver wasn’t perfect, but it was the better of the two.

And add one other thing to that — something that I didn’t even notice until last week: System resources were almost always pegged. Fans at full bore except at the lowest of moments, system load hovering around 50 percent even when untaxed, and a lagging sensation at the best of times. It was never a dealbreaker, but the issue was there, and I hardly realized it because that’s the way it had always been.

I mentioned before that the underlying motive behind the push for online-everything is what will eventually drive a stake into the heart of any contemporary PC. This was different. I can compare the machine’s performance with others from the same era, and see a drag or a burden that wasn’t present in its peers. Something just wasn’t working right.

Fast forward to about two weeks ago. I saw the jump to the 3.18 kernel and thought, maybe my time has come. Booted to a console environment, updated the kernel, tried to rebuild the video module and … was left with the same error messages that had been cropping up for months.

At that point, I had a little soul-searching to do. I like the machine very much. I’m comfortable using older kernels, but there must come a time when other issues begin to build because I’m stranded on a three- or four-month-old kernel. And the ancillary software (and here I admit I’m thinking about things like systemd) is growing at a breakneck pace, so at some point I’ll have to untangle other issues that are related to hanging back at an earlier kernel. Maybe not today and maybe not tomorrow, but someday soon, and for the rest of its life.

I could switch distros. But other distros generally performed worse with that hardware combination. And in any case, I’d be stepping back in software unless I went with something totally wild. So I’d do just as well to … not update at all.

It was the nuclear option really, but not a terrible idea. Still, it seemed overblown to run a machine (a perfectly usable machine) deliberately on out-of-date software (perfectly usable software) because of a failing in the proprietary software that runs the hardware.

And then the answer appeared to me, in a blinding spray of light, sort of like a Hollywood action movie. And I went to an online auction, and dropped about US$19 on the 440’s counterpart, the Mobility 9000.

It got here last week, and life changed immediately.

No more proprietary drivers. No more rebuilds at every kernel update. No more convoluted xorg.conf files. xf86-video-ati does everything I need, and with no more effort than typing out the name and hitting enter.

No more taxing the system resources. No more torn gradients or sluggish page draws. Yes, that Internet drag is still there, and it’s still annoying, but there’s nothing to be done about that. It’s like death and taxes — regardless of your machine, eventually bad code and deliberately obtuse web content will engulf and extinguish it.

But it’s amazing, really, and I wish I had changed the card out seven years ago. The logic is bulletproof: I don’t need the decade-old edge of nVidia-over-ATI any more, since I don’t play Windows games on this machine and desperately seek out that sliver of advantage in framerates. The 3D acceleration that might have tipped the nVidia cards a dozen years ago is pointless and irrelevant to me now, in a different operating system and with different needs.

And so there you have it — the epiphany of my past decade. After trading out one-dollar network cards at the local recycling shop because they had Broadcom chipsets, or setting aside entire decade-old laptops because they required too many screws to open conveniently, or abandoning a machine to the wheels of fate over something as trivial as a noisy fan … I finally set aside that ingrained prejudice against the second-place finisher in the graphics card duels of more than a decade ago.

And the world is a better place for it. Who would’ve thought. 😉

P.S.: 60fps with glxgears, and I’m satisfied with that. 😀

Remembering the twins

I waited this long to recap the two Dell Latitude LMs I had as guests last month because I couldn’t find the pictures I took while I was poking and prodding. It’s not often that I lose photos of something, but I did find one leftover on my camera, taken the day they left.

[image: 2014-11-13-84kbn-dsl]

That at least proves it wasn’t a dream. 🙄

The picture, of course, shows the ironclad and eternally trustworthy DSL running in its most basic form on the prettier of the two machines, replete with a wireless network connection courtesy of an old WPC11v3 b-only card. That was not my most successful attempt — and I really wish I could find those other pictures 👿 — but DSL did at least tell me that the guts of the machine were working.

I don’t have a picture of DSL’s graphical desktop on that unit because I never got one. DSL isn’t picky when it comes to hardware, but I have seen more than one computer over the years that is less-than-visual. In this case, both the vesa and VGA attempts, in every variation, resulted in a scrambled video display.

Some of my other attempts were also less than successful, but a few bore fruit. I had better luck with early, early versions of Debian and Fedora, but some very bad experiences with … anything after 2002 or so (which thanks to this machine, did not come as a surprise). :\ And of course, I managed to get a blinking cursor on a Crux 2.4 installation, which I count as a flawless victory. 😆

The biggest difficulty in working with these machines (I say “these,” but I did almost everything on the one you see in the picture) was twofold: First, these computers were not intended to boot from CD — only the primary hard drive or the floppy drive, of which I had none (and by that I mean the owner had none). Don’t even talk to me about a USB port. You know better than that. 😑

That’s a huge complication, but not something I haven’t had to work with before. I’m not above creating an entire system in an emulator, writing out the image file to a hard disk, and transplanting it physically into a target machine.
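That workflow can be rehearsed safely with plain files before a real drive is involved; in this sketch a scratch file stands in for the SDHC adapter, and the file names are only placeholders:

```shell
set -e
# toy stand-in: a 1MB "disk image" and a file standing in for the target drive
dd if=/dev/urandom of=latitude.img bs=1024 count=1024 2>/dev/null
# write the image out, as you would to the IDE-to-SDHC adapter
dd if=latitude.img of=target.img bs=4096 conv=fsync 2>/dev/null
# verify the copy is bit-for-bit identical before trusting it in the laptop
cmp latitude.img target.img && echo "image copied intact"
```

With real hardware the second dd points at the device node instead of target.img, and a final cmp against the device is cheap insurance at 133MHz speeds.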

In that sense, these are great designs for that task. I’ve run into machines that were a bit curmudgeonly in that respect, but the drives on these laptops pull out of the front left corner like a drawer, connect firmly in a dedicated tray, and are more or less exchangeable in seconds. What’s more, there’s plenty of space in the tray for an IDE-to-whatever converter, which in my case was an SDHC adapter.

I did run into an additional mystery though, which constitutes Biggest Difficulty Part Two: a hesitance to boot from some systems, and I’m not sure why.

It may have been some sort of partitioning inconsistency between the BIOS and the installation. Occasionally a system I had written out via dd wouldn’t boot, but other times preinstalled or original installations wouldn’t boot either.

I don’t suspect hardware issues; instead, I suspect either (a) the old BIOS drive dimension limit, cropping up again decades after its relevancy, causing problems again in its passive-aggressive way of suggesting you should get a new computer, or (b) some misalignment between the way GRUB or LILO worked two decades ago, and what the BIOS expected.

I’ve seen machines — in fact, this EzBook 800 has it — that have a BIOS-switchable option for Windows-esque drive arrangements, with the other option as … “other.” :\ I know of one or two machines in my past that couldn’t boot an “other” system if the BIOS was set to Windows-ish, or vice versa. This old warrior was one of those.

I don’t have any way to document that, and I don’t know how or why it happens, but that’s my underlying suspicion. Since the BIOS in these Latitudes doesn’t have an option to switch, it was a crap shoot to see what would boot and what wouldn’t.

Both of these issues, and their underlying problems, are magnified by the glacial pace of working at 133MHz, and by the added time of swapping drives and bouncing between drive caddies. Plus the constant risk of snapping or bending a 20-year-old pin array, or of the natural brittleness of aging plastic giving way. I imagine even that caused a little hesitation on my part.

I can say with some honesty, if these were my personal machines, I’d probably be a little more aggressive in seeing what they were capable of. I tend to be a little antsy around other people’s computers though, for no other reason than general courtesy.

In any case, I gave them back a few weeks ago after giving each one a quick cleanup, and they returned to their home of record.

The irony of their departure is that the owner, when he came to pick them up, hinted that I might be able to keep them if I were inclined, and if my offer came within the range of what he thought they were worth.

I declined politely, partly in fear of bringing more wayward laptops into the house on a permanent basis, but also because I know he feels the pair together, with the power supply and a Dell-branded PS/2 ball mouse (woohoo!), are worth close to US$100. I put them around a quarter of that, maybe a little more. I doubt we could come to a compromise, even if he were a little more realistic.

But if I were to find one of these in the recycling dump, I wouldn’t pass it over. It would be almost impossible (by my cursory research) to find replacement parts now, and even if you did, you’d likely be paying incredibly inflated amounts for something worth a fraction of the price tag. So you’d have to find one complete, unadulterated and in pristine condition to really appreciate it.

There are better machines of this era to experiment with. But with hardware this old nearing the 20-year mark, perhaps this exhibit machine and its “scavengee” comrade are a good investment. Maybe his offer isn’t far off the mark after all. 😐

(And in a worst-case scenario, it’s reassuring to think that Dell actually still has Windows drivers for machines of this pedigree. That, in itself, is amazing, even if the nightmare of running Windows 95 on those machines is only partially massaged by the thought of rehashing a few sessions of Age of Empires.)

Who knows? Junk — ahem, I mean, vintage computing is nothing if not an unpredictable hobby. :mrgreen:

Tricks of the trade

I had to remind myself the other day that I’ve been using Arch Linux for more than eight years. I did my first trial installs early in 2006, and while I cut my teeth on Ubuntu, as time wore on, Arch eventually became more like home.

With the exception of strict i586 machines, I’m more likely to install Arch on any given computer, with Linux Mint coming in a close second. The logic there is that I can get it running faster with Arch, but there are some features that I use so rarely that I’d rather a distro like Mint take care of them — things like CD burning or encrypted file systems.

I can do those things in Arch, but it’s rare that I need them, and Mint usually sets them up quicker and better than I do.

Over the years I’ve jotted down a few minor notes on one-shot commands or quick-step loops to tackle oddball tasks. I keep them in a flat text file called “tricks,” and when I need to get to one of them, I just grep through it until the command I want appears. Adjust for minor differences or filenames, fire, and forget.
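A miniature version of that grep-the-notes workflow, with a made-up two-line entry (the keyword, comment and file name here are only illustrations):

```shell
# a "tricks" entry is a comment tag followed by the command itself
printf '%s\n' '# datestamp: prepend mtime to every filename' \
  'for i in * ; do mv "$i" "$(stat -c %y "$i" | cut -d" " -f1)-$i" ; done' > tricks
# grep for the keyword; -A1 also prints the command on the following line
grep -A1 'datestamp' tricks
```

One keyword per entry keeps the grep output short enough to copy straight back into a terminal.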

For example, a while back I found a command to list all the packages that are installed on your system, in alphabetical groups. I modified it a little bit, to:

for i in {a..z} ; do echo -e $(pacman -Qq | grep ^${i}) >> packages.txt ; echo >> packages.txt ; done ; fmt -w $(tput cols) packages.txt

The original post (which I don’t have a link to any more 😦 ) split them out differently, and words broke at the end of the line, sometimes making them hard to read. I solved that with fmt, which is more than happy to wrap text like a word processor, but likes to run lines together. Hence the extra echo. Oh, and it doesn’t seem to like having text piped into it, so the external file was necessary.
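fmt’s habit of running lines together is easy to demonstrate, and it shows why that extra blank line between groups matters; a paragraph break is the only thing fmt won’t merge (the file name and words here are arbitrary):

```shell
# two lines in one paragraph, then a blank line, then a second paragraph
printf 'alpha\nbeta\n\ngamma\n' > groups.txt
# fmt joins lines within a paragraph, but leaves the blank line alone
fmt -w 40 groups.txt
# prints "alpha beta", a blank line, then "gamma"
```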

I think the original post caught some flak for 26 iterations of pacman, but I don’t have a problem with that. Might as well put the system to use for a little bit. If it annoys you, feel free to adjust it.

Some of the “tricks” were my own creation. Back in September I got the (stupid) idea that I would dump all the executables from bsd-games, util-linux, coreutils and even binutils into my list of text-based software, just to be sure I hadn’t missed any hidden gems. πŸ™„

It turned out to be a real hassle, and the results of it were one of my biggest lists of oddball commands in the history of that blog or this. In any case, this was the command that got me in trouble:

for j in {util-linux,binutils,coreutils,bsd-games} ; do LEN=$(echo ${j} | wc -c) ; for i in $(yaourt -Ql "${j}" | grep bin | cut -c$(($LEN+10))- ) ; do echo "${j}" >> temp/"${i}".wiki ; done ; done

Most of that was done to trim off the extra stuff that appears with grep’s output; since the length of the name of each package was different, I had to check where the title ended and the actual binary name began. wc takes care of that, with the -c flag. And to keep this from polluting my home directory, it dumps everything into temp/. The .wiki suffix is just for the benefit of vimwiki. 😉

Not everything is Arch-specific in that file. Here’s one that I use more often than I thought I would: Taking a folder of files, and moving each one to prepend the original name with its date stamp:

for i in * ; do mv "${i}" "$(stat -c %y "${i}" | cut -d' ' -f1)-${i}" ; done

stat comes through this time, as a way of generating the timestamp of the file. cut trims that down to the first field only — the date — and voila, each file is moved to a new name, prefixed by date. Suddenly your folder has organization.
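Run in a scratch directory, the loop behaves like this (GNU stat assumed; the directory and file names are only examples):

```shell
# set up a scratch directory with one file in it
mkdir -p demo && touch demo/notes.txt
# run the rename loop inside the scratch directory
( cd demo && for i in * ; do mv "${i}" "$(stat -c %y "${i}" | cut -d' ' -f1)-${i}" ; done )
ls demo    # something like 2015-03-01-notes.txt
```

Since the date lands at the front of the name, a plain ls thereafter sorts the folder chronologically for free.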

I use yaourt on a daily basis, and with good reason. With such an abysmally slow Internet connection, I have a tendency to hoard software packages, which is not generally an advisable habit with Arch. Occasionally I make a mistake and wipe out my cache, or just slip after a failed upgrade, and need to pull in a fresh copy.

And so … download a new copy of everything that’s installed on your computer. This one took on a new meaning when I realized you could pipe yes through another program, if you wanted to automatically feed it a positive response:

yes | yaourt -Sw $(yaourt -Q | grep -v local | cut -d'/' -f2 | cut -d' ' -f1 | tr '\n' ' ')

Get a full list of packages from yaourt, which will be preceded by repo names. Filter out anything from “local,” since that’s built and not just downloaded. First cut off the repository name at the slash, then cut off the version at the space. Finally, tr substitutes every newline for a space, so we can give one giant command to the outermost yaourt, which will download the entire list. Better leave that one overnight. 😐

If you don’t use yaourt, you’ll need to adjust that a little bit, since pacman -Q does not show repository names. It also means your “local” packages will be mixed in with your downloadables. Just so you know.
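To see what that chain of cut and tr actually emits, here it is fed a few canned lines in the yaourt -Q style — the repo/name pairs and versions are invented for the demonstration:

```shell
# Run the same filter chain against fake `yaourt -Q` output
# instead of the real thing. The "local" line gets dropped, the
# repo prefix and the version get cut away, and tr flattens the
# survivors onto one line.
printf '%s\n' 'core/bash 5.2-1' 'extra/vim 9.1-1' 'local/yaourt 1.9-1' |
  grep -v local | cut -d'/' -f2 | cut -d' ' -f1 | tr '\n' ' '
# prints: bash vim
```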

Since we’re on the topic, yaourt lets you do some funky things with your cached packages, and also with the materials it downloads to build local software. If you look in /etc/yaourtrc, you’ll find:

EXPORT=2           # Export to 1: EXPORTDIR or PKGDEST
                   # 2: pacman cache (as root)
EXPORTSRC=1        # Need EXPORT>0 to be used
#EXPORTDIR=""      # If empty, use makepkg's configuration (see makepkg.conf)

I’ve adjusted those values to do a few things.

First, EXPORT=2 will send the built package to pacman’s cache, which is a wise move if you ask me. By default yaourt builds everything in /tmp, and it evaporates when you’re not looking. I’ve lost more than one hard-built package by not moving the final product out of that directory. πŸ˜₯

EXPORTSRC=1 does the same thing for the source files and git trees that it downloads. This too can be a lifesaver if you lose the completed package. EXPORTSRC will send everything to EXPORTDIR, or in the absence of that, to the destinations listed in /etc/makepkg.conf. And what are those?

#-- Destination: specify a fixed directory where all packages will be placed
PKGDEST=/home/packages
#-- Source cache: specify a fixed directory where source files will be cached
SRCDEST=/home/sources
#-- Source packages: specify a fixed directory where all src packages will be placed
SRCPKGDEST=/home/srcpackages

By default, all those folders are commented out, so makepkg’s configuration leaves everything in the same folder where it was made. Change those values above, and it will shuffle packages and source files to those directories. It will also create a symlink to the package it just built, so you’re not wasting space.

Let yaourt use those directories, and it will follow those rules, and tar up its $srcdir as well, before sending it off to that destination. In that case, yaourt will tidy up its efforts and leave you all the pieces you need to do it all over again.

And yaourt is generally smart enough to check those directories for source files before re-downloading them. Or so it seems, in most cases. :\

Two more small non-secrets I should share:

yaourt -G pkgname

will create a folder named “pkgname,” download its PKGBUILD and all the patches or installation scripts for that package, ready for building. It’s another reason I use yaourt, to be honest. And:

pacman-optimize

It’s not often that your pacman database will need optimizing, but I can vouch for it on slower machines as a way to speed up searching and filtering. Again, just so you know. πŸ˜‰

Clearing out the bookmarks … again

I did it again: I collected a mass of bookmarks that I figure I’ll need at some time in the future. Maybe I used them and will again … and maybe not. Either way, they may still prove useful. I do refer back to this site when I can’t remember a page or a topic, you see. πŸ™„

So here we go again: More links for future reference.

  • I sometimes keep links to pages that have instructions for lightweight systems for old distributions; here’s one for Debian Lenny and one for Crux 2.7 (in the i686 flavor, which doesn’t really matter). That might seem counterintuitive, but I will fall back on old distros when working with old hardware, before making the leap to current flavors of Linux. For an example, peek here.
  • Along those same lines, I found a fairly coherent tutorial on how to install Exherbo. I had a link to another one, but apparently the author took it down. 😦 I have been wanting to spend a little more time with Gentoo (and possibly Exherbo) but I’m always attracted to the way Crux handles things. That being said, Crux dropped i586 support years ago, and hasn’t had i686 ISOs (unless they’re hiding) for a year or two at least. 😦 Story of my life. …
  • I use dd a lot, not just to blank drives or scramble the contents of files, but for other things too. To that end, a speed comparison at different block sizes is actually very useful. Of course, I’ve seen some posts on StackExchange that might offer different solutions.
  • Along those same lines, this page gave me a little insight on how to mount a specific partition in a disk image. It saved me a little time with a copy of an old 10Gb hard drive, since I didn’t have to write it back out to a drive to get at the files I wanted. On the downside, counting out all those offsets was a trick. I’m surprised Linux hasn’t thought up a more straightforward way to do that. …
  • I used to be real nit-picky about fonts, but these days I don’t really mind. I did find a good collection of font suggestions for Arch on Reddit, but I’m not the kind of person who installs two dozen font packages just to see a few extra characters in my terminal emulator. Now if we were talking about fonts for virtual consoles, I’d be much more interested. …
  • Since I’m in fix-it mode, here are a few pages about
    • installing python programs to different directories with pip, which is interesting because I’ve thought for a long time that there is no setup.py uninstall;
    • checking to see if directories exist with bash, which came in handy just a day or two ago;
    • how to install Arch from within an existing Linux installation, which I want to try sometime, just to see if it works; and
    • the difference between single brackets and double brackets to bash, which I never knew but explains why some of my long-ago scripts didn’t work as expected.
  • emacs fans would probably love to run just emacs on a Linux kernel with nothing else, and this post can tell you how. It reminds me of my long-ago attempt to trap Midnight Commander within a tty session, much like could be done a long time ago with rtorrent.
  • I should take the time to set up mutt with multiple GMail accounts, like this. I don’t dislike alpine, but I only keep it around because I’m too lazy to set things up. :\
  • From the Department of Ancient Awesomeness come three flashbacks that just made me nod my head: one on the best distros of the year 2000, another of the best window managers of the year 2000, and perhaps best of all … a complaint from 2008 about how Firefox is utter bloat. The more things change, the more they stay the same. …
  • I watch the Debian systemd soap opera with only a little interest. I’ve been using Arch for quite some time now, and I have no complaints about the newcomer. All the same, if you’re wondering where you’ll stand when the revolution comes, raymii’s chart from earlier this month might be helpful for you, as might this systemd vs. sysvinit cheatsheet. Neither page will convince you one is better than another, but might help you understand how they each handle the startup task. Knowledge is power. 😈
  • You won’t hurt my feelings if you find some Linux know-how somewhere else; even I found this list of tech podcasts rather interesting. I don’t really get into podcasts much, but from time to time I will grab one and spin it up.
  • Finally, from the Completely Unrelated to Anything Else Department, here‘s an interesting project: An Android browser that displays web pages (believe it or not) by way of relaying the content through SMS messages. O_o Now I’ve seen everything.
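Since the partition-in-a-disk-image trick above is the kind of thing I always have to look up again, here's a sketch of the offset arithmetic. The start sector is invented; fdisk -l reports the real one for your image:

```shell
# fdisk -l disk.img lists each partition's start in 512-byte
# sectors, but mount wants the offset in bytes — so multiply.
START_SECTOR=206848               # made-up value; read yours from fdisk
OFFSET=$(( START_SECTOR * 512 ))
echo "$OFFSET"                    # prints 105906176

# Then, as root, something like:
# mount -o loop,offset=$OFFSET disk.img /mnt
```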

And now I’ve listed everything. If those are at all useful to you, please bookmark them in your own system. Hold on to them for about four months, and then yell “I gotta do something about these bookmarks!” and offload them to your own blog. It seems to work for me. … πŸ˜‰

Text-based gamers can afford to be picky

I just finished up 10 days riffling through a score of console games, and the experience reinforced something I wrote about a month ago: that text-based gamers can afford to be picky.

And I should probably be clear about what that means. I’m not suggesting text-based games are somehow superior to graphical games — that, after all, would depend on what we were comparing. There are some very good games in both camps, but unless we’re comparing direct renditions, both sides will have obvious winners.

No, what struck me about the past 10 days — and the past two years, really — was the sheer depth and breadth of the field, and how many of those were actually very good games. Not necessarily technical or visual triumphs, but games that grabbed my attention and engaged me for a serious amount of time.

Pick an interface, a game or a genre, and there will be more than a handful of games that fit it. Limit yourself to text-based games, and there are still many, many titles that will impress you.

Money is no object. It’s no joke either.

Here’s a good example: Starlanes, which I just learned about a week ago, and now love to death.

Starlanes won’t wow you with its “immersive” 3D interface or tickle your retinas with intricate shader effects. It’s just a straightforward game that has precise economic rules, and those rules permit some wild, but completely logical, turns of events.

In the “sci-fi-strategic-economic-territorial-market-acquisition” genre, Starlanes pretty much dominates the field — and has, for the past decade or two, considering it’s not a new game. You might have to look outside the text-only field to find another title that competes, or compares, with it.

But don’t think that I’m backing the only horse in the race. Starlanes is a powerful game in its own right, even if you think it’s just a big fish in a small pond.

War is hell, even in ASCII.

Here’s another one that follows a basic format but has excellent rules: Curse of War, which also makes the leap from turn-based to real-time strategy.

Guide the growth of your civilization toward geographic goals, and dominate your opponents by sheer force of demographics. Set flags and your population migrates toward them; fortify your hold on resources by improving structures. It has all the flavor of most real-time strategy games, with tiny splashes of Life, Populous, StarCraft and even Civilization, to a small degree.

Starlanes and Curse of War don’t compete by any stretch. Starlanes is turn-based, and has a strategic appeal that is reinforced by some simple territory annexation and market rules. Curse of War is real-time and allows less precise geographic control, but offers its own set of rules that govern migration, zones of control and even fortifications. Different, but also a very good game.

That’s me, running from the Americans again.

So economic simulations and population strategy games aren’t to your liking? How about a tall ships simulator?

I found Sail a few months ago, hiding in the ancient bsd-games package, and the level of detail made my jaw drop. It should be enough to say that Sail was intended as a conversion of a decades-old Avalon Hill tabletop pen-and-paper, map-and-counter game, but if that doesn’t ring your bell, let me say this: Imagine Sid Meier’s Pirates!, cross-bred with any version of Microsoft Flight Simulator, and strained through an ASCII filter. It’s almost frightening.

The majesty in something like Sail isn’t in its uniqueness; after all, pirate games have been around since kids were invented. Sail wins points for pulling in an established rule base, automating the more cumbersome points like firing angles or wind calculations, and creating a game that’s just as fun as the tabletop game … and still in a text-only format.

The best analogue I can think of for Sail would be the classic Star Trek: Starfleet Command PC games of a decade ago, which converted the original Star Fleet Battles games into a fully graphical environment. I played the original Star Fleet Battles games a long time ago, and let me tell you, the PC rendition was a gift straight from Allah. 😐

But simulations aren’t everyone’s cup of tea either, and I don’t find fault if it’s not your favorite. How about we go the opposite direction, and pick up something simpler? Something old, and yet new?

Eat your heart out, Gameboy.

If I must be honest, Yetris is not my choice for the tip-top Tetris clone available in chunky block characters. That honor belongs to vitetris, as I’ve mentioned many times in the past. It’s definitely not for lack of trying though, and as you can see, Yetris takes the Tetris precept and spins it at a fever pitch.

Yetris stands out to me because it doesn’t accept the terminal environment as a limitation, full stop. vitetris offers network play and a few other smaller fillips, but Yetris takes command of the terminal, sets its own rules for visual appeal and space usage, and does not back down.

At the start of this post I said most of these games wouldn’t be visual triumphs; Yetris is. If half the games I’ve seen in the past two years approached the text-only medium with the same level of aggressiveness, there would never have been a migration to the graphical desktop. We’d all be playing fantastic Yetris-esque versions of World of Warcraft. πŸ˜•

Papa was a troll ’n stone …

And so long as we’re on the topic, if I had to pick out a text-only game that might satisfy the fantasy RPG crowd, I’d have to think for a minute. When I opened the field at Inconsolation to more roguelike titles, I realized I was going to be drawing in dozens upon dozens of games that followed much the same format, but offered their own small personal tweak.

I saw a lot of them — a lot of good roguelike games — last week, and the two at the top of the stack were definitely ADOM and Angband. It would be impossible to try them all, so you could argue that I missed out on [insert title here] which was clearly superior :roll:, but those two are what I remember most.

ADOM took the dungeoneer out of the dungeon, and turned him/her loose on a world above. Angband refined the classic moria format derived from hack and rogue, injected healthy doses of Tolkien and Gygax, and arranged everything with better color and a better layout. And both games are very good indeed.

Hail to the king, baby.

But still — after years of poking and prodding the Internet and seeing what falls out — the game that keeps pulling me back for hours on end is Dungeon Crawl: Stone Soup.

I don’t even know if I could tell you what makes Crawl better than any of the others, except that it manages almost every task efficiently and expertly. It handles dungeon generation, targeting, ammunition, autoexploring, autofighting, automapping, spellcasting, religion, regex object searches, skills, proficiencies and specialties, poisons, mutations, races, subraces, classes, subclasses, divine blessings, diseases, vampirism, draconism, encumbrance, hunger … I can’t list them all.

Truth be told, I stopped installing Crawl a long time ago — because I can get to it through ssh, and that’s even better than installing it if you’re on an old machine. All that intricate stat management might take a toll on my dear old Pentium. πŸ˜‰

There’s only one game I know of that approaches the same level of detail and comprehension as Crawl, but still works in a text-based environment. But we have to switch atmospheres to peek at it.

Of course I died right after this screenshot.

For my money, Cataclysm: Dark Days Ahead is the pinnacle of text-based games, regardless of the genre. I realize that’s a bold statement, but I think I’m in a position to defend it. I’ve done my share of research.

Just about everything that I’ve mentioned in Crawl is also available in Cataclysm, understanding of course that Crawl is a fantasy RPG and Cataclysm is a zombie apocalypse survival epic. So of course, some parts don’t overlap.

But I knew I had found a winner when I realized I could actually create primitive explosives in Cataclysm, by scrounging through stacks of other garbage left over in the remains of my world. And then I discovered I could rearrange my layers of clothes to provide for more pockets. 😯

And the deal was sealed when I started thinking through my defenses and resources, rather than just wandering around the countryside like an idiot. Staking out an abandoned power station and devising barricades and gauntlets for zombie onslaughts suddenly made perfect sense. …

This is where I have to stop. This post has already taken me about two days to assemble, because every time I mention a game, I reinstall it, and then I play it, and from there … half the day is gone. … Not that I’m complaining, of course. πŸ˜›

But don’t ever let it be said that there’s nothing to do for fun in a text-only environment. There is a ton of great entertainment available that doesn’t require a graphical environment, and probably more importantly, doesn’t require a quad-core with dual SLI cards and 12Gb of memory … just to install. πŸ˜‰

Ghosts of the machines

I haven’t taken the time to update much here, but I seem to still devote most of my time to following trends in text-based software. It consumes a considerable amount of the day.

There are a few things I should mention though, even just as updates to my trials and tribulations with outdated hardware and modern software.

First, the CTX EzBook 800 I mentioned a few months ago is not much closer to a full working state, but it was never very far off. Most distros, aside from Crux, either miss a beat with the hard drive or the optical drive, or sometimes both.

An added complication is that there seems to be no response from the PCMCIA port whatsoever. Every attempt to even acknowledge a card there comes up dead, regardless of card or distro.

As a troubleshooting measure — strictly for troubleshooting, I swear — I swallowed my pride and installed Windows 98SE from a friend’s CD (try and find one of those these days 😯 ). No life, no lights, no response.

Which leads me to believe the port is damaged or dead. It’s reassuring in a way, since it means it’s not necessarily a configuration error on my part, so much as a hardware defect that may or may not be fixable.

In any case, I could conceivably use it un-networked, as some sort of offline data storage device, and transfer files on and off via USB. It’s not an appealing option, but it’s possible.

Second, in the Thinkpad realm, I’ve allowed the ’41s to move on to new owners. I enjoyed my time with them but I am overburdened with laptops these days and need to make space.

The T41 will be a low-strain home PC for a local friend, but — even better — the X41 is going to be part of a business IT department, monitoring server performance. How exciting! :mrgreen:

The aforementioned Inspiron 4000 turned out to have far deeper problems than I suspected. A new power supply wasn’t … supplying power πŸ™„ for some reason, so I finally pulled the entire business apart to see what could be done.

And it appears someone else had already had that idea, and “repaired” it with cyanoacrylate. The machine must have been dropped at some point, and the cracks and seams resealed. Most of the casing was in shards by the time I could get to the motherboard, at which point I made a command decision and pronounced a time of death. Its usable parts are now awaiting transplants into other hosts. A sad ending. 😐

Next, I feel obligated to mention that I’ve run through a stream of Dell machines in the past month or so. I spent a short time with an Inspiron 5150, which was an interesting experience. On almost every front it outstripped my in-house 8200 machine, but was almost dull by comparison. That machine has moved on to a Windows fan, who wanted a native XP environment for classic 3D games of the 2002-2005 era (one of the FIFA games, I think).

I also came across two D610s, one in mediocre condition but the other in pristine shape, to include the carrying bag, CDs, cables, batteries, etc. It was a very nice gift.

D610s are strictly business though, and most of the internal hardware is unappealing — a lot of Broadcom network interfaces, which I hold in high disdain. Both are viable Linux candidates, but would probably require so much in supporting hardware (i.e., replacement MiniPCI wireless cards or PCMCIA network cards) that it might not be worth my effort to keep them.

I understand that the 3.17 kernel has better Broadcom support, so I might keep them around until that reaches the Arch core repos, and see if it’s true. I’ve been promised that before though, and in this day and age, there’s no need for me to cling to a Broadcom-based machine.

I also should mention a rather battered D810 that made its way to my doorstep. It was more a curiosity than the D610s, because the hardware seemed comparable, but the widescreen aspect threatened to bog down the desktop. Perhaps 1680×1050 is a bit big for a 32Mb ATI X300 card. …

What else … ? A couple of Thinkpad R50p‘s, which together were in such bad shape that there wasn’t enough left to make a single working computer out of them. And one had a password lock at the BIOS, and I’m not going through the trouble of reading EEPROMs just to start up a 10-year-old laptop. 😑

I also got an old Gateway 6518GZ that was DOA … an HP dv6000 with an unseated video card that probably would have needed a reflow to bring to life. … A couple of other lesser creatures. … 😐

But the real score of the month was permission to tinker with a pair — not just one, but a pair — of truly ancient Dell Latitude LM machines. I can’t keep them but I’m allowed to poke, prod and pick at them for a while, provided I return them in original condition. Here’s one in action, with its original Windows installation.

2014-10-20-84kbn-win98

These are true 133Mhz Pentium machines with 40Mb of RAM apiece, NeoMagic video cards and 1Gb hard drives (one completely error-free, after all these years!). Otherwise standard arrangements of PCMCIA slots and sound cards which are undoubtedly ISA components. I haven’t had a challenge of this level since … oh, probably this machine.

I have to return them to their owner in a couple of weeks, but I’m enjoying the opportunity. The biggest threat at this point seems to be their complete inability to boot from CD, and neither has a working floppy drive. Luckily they were designed with the hard drive in a pull-out tray at the front of the machine, accessible with only two screws. And even better — my IDE-to-SDHC adapter works well. πŸ˜€

I’ll put up some more pictures and maybe a full post sometime in the next week or two, if I can. I have free rein until they go back to their owner at the end of the month, and if I can make any progress, I’ll make a note of it here.

Odd, the things I think are fun. … πŸ˜•

P.S.: A very special thank-you to my online donor, who provided some of the machines mentioned above, but asked to remain anonymous. πŸ˜‰

One existential crisis at a time, please

Not everything I keep around the house is an absolute winner. I do feel like I can pick and choose the machines that stay with me, and which ones go on to new owners and new lives. But sometimes there are machines that really test my principles.

Here’s one. This is a lowly Dell Inspiron 4000. And it’s definitely not a model specimen.

2014-09-14-insp-4000

Quite to the contrary. This machine is a veritable best-of list for everything that can possibly go wrong with an old laptop. When it came to me,

  1. It had no memory.
  2. It had no battery, and no power supply.
  3. It had a CD player, but the door mechanism is broken, and if you don’t hold it in, it doesn’t read the CD.
  4. It had a floppy drive, but I shook so much dust out of it that I’m seriously concerned about jeopardizing one of my few remaining floppy disks by testing the drive.
  5. It has no rubber feet left, and what remains causes it to rock on a flat surface.
  6. The screen is in good shape, but takes a while to warm up. Until then, the display has a red tint to it.
  7. It has one — only one — USB port, and that’s a version 1.1 port, so it’s phenomenally slow. To make matters worse, it feels like the port is losing its grip on the motherboard, because the port flexes when you push in a drive. Scary.
  8. About a fourth of the keys — mostly in the upper right quadrant — don’t work. Either the keyboard is on the fritz, or the signal isn’t being caught by the machine. I guess the former.
  9. The CMOS battery is dead, so you have to set the date each time the machine boots. Which is tricky, because again, a quarter of the keys don’t work.
  10. It has no built-in network port, or rather, this particular model has a plastic shield over the ethernet port, which usually was a sign that the board didn’t carry that port.
  11. It has more than its share of cracks, dings, scrapes, gouges, split seams, broken corners, busted lips, scuffs and scratches.

It looks a great deal cleaner now than it did when I got it. It still needs a complete disassembly and scrubbing — if it stays, of course.

And that’s where the existential crisis comes in. Because in spite of all that damage and all those deficiencies, it still works. Its saving grace is the fact that it belongs to the Dell C-series, which means that laptops from about five or six years before and five or six years after it all used compatible parts — including this one.

So, after calling in some favors for a 256Mb stick of PC100, then borrowing the battery and a modular DVDRW drive from the 8200, I turned it on, and it came back to life. The Windows 2000 installation was still in place and functional, even if it was hideously slow. The touchpad is in good shape. And the screen is clear and free of flaws.

I gave it an Atheros-based PCMCIA wireless card, and started it up with a PLOP CD and the Arch Linux install ISO on USB. From there I could ssh into it and work up a system, for as long as the battery would last. And as you can see, after some slight delays, it’s functional again.

But from here it becomes a question of worth, because at its core, it’s still a 600Mhz Celeron, with only 256Mb of memory, a lowly 30Gb hard drive … and all-over barely functional. Sure, it has all-Intel guts and an ATI Mobility card. But it’s not something your day-to-day computer user, circa 2014, wants to take home to meet the family.

So the jury is still out on this machine. I still haven’t tried it with a proper power supply, and I need to know for sure that the keyboard issues are just in the keyboard. I have a feeling that it will cost me more than the value of the machine just to find that out, which is why I’m debating disassembly for parts.

I hate doing that, but sometimes you have to make difficult decisions. 😐