The trailing edge of the wave: The CTX EzBook 800

For as many times as I’ve introduced old laptops on this blog, you’d think I’d have a formula or a template page tucked away somewhere.

But I don’t, and here we are again with another underdog to report. I hope it’s not too dull for you; if it’s any consolation, I have three or four other laptops that I haven’t bothered to mention at all, for fear of boring you twice over.

This one though, I feel is noteworthy. Not because it’s a cherished acquisition, like this one is, but because it’s such a curmudgeon that I have a feeling someone, somewhere down the line — probably me — will need information about it in the future. So I put it here, to avoid slogging through all the quirks again. And because that’s what this site was originally for. ;)


This is a CTX EzBook 800, the top-of-the-line model for EzBooks of 15 years ago. It’s a pure K6 machine, meaning it lacks a lot — and I mean a lot — of the features most people took for granted in the computers of a decade ago, let alone now.

I got this as a castoff from a friend, who is also a bit of a technophile and prefers to work with out-of-date machines for a number of reasons. My friend is primarily a Windows person though, and I have a feeling this was such an underperformer that he was glad to see it go. I know he considered putting Linux on it and even asked a few questions online, but was out of his depth and didn’t see much future in it.

Apparently he paid about $1 in an online auction for it, plus the cost of a new power adapter. Not bad.

This is not my first EzBook, and that was one of the reasons I agreed to adopt it. I have had 700 and 700E models in the past, and if I remember right, that 700E was one of my first test runs with Linux. It didn’t go well, but I lacked the experience then to make it work.

And it seems that I still lack some experience now, given my rather lackluster success at getting the 800 version to sing along. Not that I have terrifically high expectations, but I do have a reputation to preserve. :???:

Here’s a rundown on the guts, and I can explain the implications later.

00:00.0 Host bridge: Integrated Technology Express, Inc. IT8330G (rev 03)
00:10.0 VGA compatible controller: Neomagic Corporation NM2160 [MagicGraph 128XD] (rev 01) (prog-if 00 [VGA controller])
00:12.0 ISA bridge: Integrated Technology Express, Inc. IT8330G (rev c1)
00:12.1 IDE interface: Integrated Technology Express, Inc. IT8330G (rev 11) (prog-if 0a [SecP PriP])
00:12.2 USB Controller: Integrated Technology Express, Inc. Unknown device 1234 (rev 03) (prog-if 10 [OHCI])
00:18.0 CardBus bridge: Texas Instruments PCI1131 (rev 01)
00:18.1 CardBus bridge: Texas Instruments PCI1131 (rev 01)

The hard drive is a Fujitsu MHD2032AT, and the optical drive is a TEAC CD-220EA. My friend maxed out the memory at 128MB, which complements the 300MHz K6 quite nicely. I’ve had good success with NeoMagic cards (better than the Tridents, that’s for sure >:( ), and having USB ports on a machine this old makes it an absolute treasure. Phoenix made the BIOS, which is important because the USB ports and a few other things are enabled or disabled through it.

There are some critical points in there, if you’re fighting with a similar machine or one from this era. Please bear with me, and I’ll work through them slowly.

My friend said he could get no modern version of Linux to work on it, and even though I suggested both Slackware and Debian, he still claimed no success. I can attest to that now: Both Debian 7.x and Slackware 14 ran into problems locating the CDROM, the hard drive, or both. You can add these to that list:

  1. Alpine Linux 2.7 for x86, which boots and will configure itself to the live CLI environment, but can’t find the hard drive.
  2. Puppy Linux, slacko in the non-PAE version, which spit out errors demanding a CPU with cmov.
  3. TinyCore, in its newest version, which reached text mode but couldn’t find the hard drive or CDROM.

In most cases, those were dealbreaker attempts, because the live or installation environment couldn’t find hardware I would need to move forward. Here are some others that fell flat, but for slightly different reasons.

  1. Crux Linux 2.7, which was the last i586 rendition. Refused to boot past connecting to the CDROM and ended in the jaws of the mythical “can’t access tty; job control turned off” error.
  2. Debian 5.1, which installed but boots into a soft lockup and seems content to spend eternity reporting its hopelessly frozen state at 90-second intervals.
  3. *buntu versions after 6.10, which usually didn’t get so far as Debian 5.1, and reported no hard drive or no CDROM or both.
  4. Slitaz, the 4.0 release, which booted into text mode and would allow me to install, but locked on boot.

Just out of curiosity, I also tried:

  1. ReactOS 0.3.16, the live rendition, which amazingly worked better on that machine than any other I’ve tried in recent years. I reached a Windows-esque blue desktop and a brief show of some wallpaper, but then it hung and became unresponsive. That may have been a low-memory complication.
  2. FreeDOS 1.1, which took an exceptionally long time to install, and would boot with the assistance of the installation CD. From there it would need the obvious additions of useful software and perhaps a graphical desktop.
  3. Clonezilla in recent 486 versions couldn’t find the hard drive, which is only important because it means any system I build on there will have to be dd’d off via USB1.1 for backups. :shock: Oh well, it’s not the first time. …
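
Since the subject came up, the dd-over-USB routine I mean is nothing fancy. This is a sketch demonstrated on a scratch file so it can run anywhere; on the real machine the input would be the raw disk, /dev/hda or similar (the device name is an assumption — always check dmesg first):

```shell
# Image the disk and compress on the fly; here a 4MB scratch file stands in
# for the real drive so the routine can be tested anywhere.
dd if=/dev/zero of=/tmp/fakedisk.img bs=1M count=4 2>/dev/null
dd if=/tmp/fakedisk.img bs=64k 2>/dev/null | gzip -c > /tmp/fakedisk.img.gz

# Restoring is the mirror image:
gunzip -c /tmp/fakedisk.img.gz | dd of=/tmp/restored.img bs=64k 2>/dev/null
cmp -s /tmp/fakedisk.img /tmp/restored.img && echo "images match"
# prints: images match
```

Over USB1.1 the same thing takes hours instead of seconds, of course, but the principle is identical.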

The real plot twists come here:

  • Ubuntu 6.06.1 and Xubuntu 6.06, both of which would find the hard drive and CD drive, and install over the course of an hour or so. The resulting desktop was forced into 800×600 (on a 1024×768 screen), and was marginally useful. I tried hand-editing the xorg.conf file but only managed to bork the display so badly as to require starting over. No network access through the PCMCIA port, which sounds familiar.
  • DSL 4.4.10 would of course work, but I ran aground again with the system freeze on wireless insert bug, which I blame on the 2.4 kernels. I used to suspect the PCMCIA-to-CardBus switchover for that, but it seems even CardBus PC cards inserted into a CardBus bridge will trigger it. My only orinoco-based card just doesn’t respond with DSL. :(
  • Crux 2.4 for the i586, which includes an older kernel by default but could have a newer one implanted. Booted, found the CDROM, found the hard drive, and installed without major incident.

For me, what is at issue here is the evolution of PC hardware away from ISA-based components to the standards that are more common now. Along with that, there was the kernel’s shift away from the old IDE driver code for PATA hard drives to the newer libata code written with SATA in mind. Add to that an ATAPI CD drive, and it’s easy to see why some distros just didn’t work, and others worked reasonably well.

You can almost pick out a month and year when the trailing edge of the wave fell away. This machine seems to have ridden the far edge of that crest, and as a result finds itself drifting on the other side. :sad:

My proof for this is in the kernel configuration for Crux 2.4, where the old-style ATA options are enabled and all the drives are found. That should correspond to the mid-2000s versions of Ubuntu, where the last support for those same drives is found. After 6.10 or so, the machine falls off again.
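
From memory, the split looks something like this in kernel configuration terms; these are the mainline symbol names, quoted as an approximation and not from Crux’s actual .config:

```
# Old-style IDE/ATA driver stack -- what Crux 2.4's stock kernel enables;
# drives appear as /dev/hda, /dev/hdc and so on:
CONFIG_IDE=y
CONFIG_BLK_DEV_IDE=y
CONFIG_BLK_DEV_IDECD=y        # ATAPI CD drives, like the TEAC here

# The newer libata stack that post-2007 distros moved to; drives appear
# as /dev/sda instead, and this chipset apparently isn't covered:
# CONFIG_ATA=y
# CONFIG_ATA_GENERIC=y
```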

I can’t account for Lenny’s soft lockups though, and I don’t see much help online for that particular issue. I tried the old noacpi gimmicks from a decade ago, but whatever plagued the 5.x versions of Lenny persists.

But all is not lost. If I absolutely gut Crux’s kernel, I can compile it in about 45 minutes at 300MHz, and best of all, I can boot to a graphical desktop with blackbox, which comes by default. (Now you understand my recent affection for blackbox. ;) )

In fact, short of getting a CardBus network adapter to respond, the entire machine works fine.

And depending on how many CDs I’m willing to burn, I could conceivably hopscotch my way up from 2007 to circa 2011. Most of those packages are precompiled and available on the ISOs, with the exception of the contrib ports. And I have time these days to babysit it, as it churns away at the code.

There’s a little voice in my head that keeps telling me to yank the hard drive and install it externally, and then replace it. Usually there’s another little voice right after that one though, that says I’m too clumsy to get the case open without cracking or scratching the body somehow, and it’s too pretty as it is. And of course, there are no service manuals online any more. … :(

So while all is not lost, this is definitely on the verge of falling through the cracks. And let’s be clear: I have no aspirations of bringing this machine into the 21st century, or for that matter, playing a YouTube video with it. Those days are over, friends. We have the Internet to blame for that.

I can’t deny it’s a terrific challenge though, and I am enjoying smacking my head against the screen for hours on end. But it does feel good when I stop. ;)

Poor man’s SSD: A cryptic twist

If you’ve suffered through this site over the years, you will recall there was a time in the previous decade when a little idea paid off big, and an ancient laptop got a nifty upgrade.

Fast-forward to this year, and again, a little crablike thinking seems to have paid off.

Let’s start at the beginning. Remember this machine? It’s humming along nicely, and with only a few shortcomings, I expect it will last quite a while into the future.

Among those shortcomings are a lack of USB2.0 ports, and nothing to interface with SD cards. For some reason Dell never swapped out the USB1.1 ports that were part of the early 8000 line for the higher-speed ports that were more common with Pentium 4 machines. Design flaw, or programmed obsolescence? You decide.

Regardless, the obvious solution is a PCMCIA-to-USB2.0 card, which costs all of about US$2 these days. They’re literally recycle store giveaways, to be honest.

Which means the two omissions — USB2.0 and an SD card reader — are related in an odd way: From PCMCIA to USB2.0, to USB-to-SD reader, to an SD card. It’s not as ungainly as it sounds, and really, I’ve done much worse in the past. At roughly US$6, something like this was well worth the price.

And with a lot of leftover SD cards lying around — mostly from the same camera I’ve owned for about seven years now — this is a good way to pick up a little extra storage space, in oddball sizes.

Now shift gears for a little bit, to a larger, grander scale. Online privacy is something that I think about a lot more these days, and I hope you do too. Knowing that most anything transmitted unprotected is likely to be archived somewhere, by someone, for some time has, in short, caused me to retract just about everything I kept on the web — down to the lowliest .conf file — and either keep it locally or repost it encrypted.

A few months ago I decided the best way toward physical security for that data was to dedicate one entire machine to the prospect of data storage. Starting with the operating system, I wanted something that could encrypt without excessive entanglement, require several passwords to access, be more or less impervious to environmental issues, and be self-sufficient, with no need for network access or frequent updates.

No, I’m not Edward Snowden. I just have a hope of protecting electronic documents, and I don’t think I’m too far from the target.

Hopefully the picture in your mind at this point is about the same as the one I had in mine. I decided to use one of my many leftover machines for the purpose, and even went so far as to purchase a small, inexpensive SSD to avoid the pitfalls of data errors or drive crashes. I installed Linux Mint 17, encrypted the entire drive, encrypted the home partition, and even password-protected grub. I turned the wireless switch off, put it in a generic black laptop sleeve and set it on a bookshelf next to a copy of Walden and a can of compressed air.

And then I got to thinking: Now I’m dependent not just on that drive, but on all the components that keep it running. Why did I lock myself into that particular computer? Just because it was available? Only the drive is important.

So I took it back down off the shelf, unscrewed the case and took out the drive, and put the drive back on the shelf between Walden and the canned air.

And then I got to thinking again: Now I have to put that drive into the computer, every time I decide to move a file on or off of there. That’s terrifically inconvenient.

So I took it back down off the shelf, took out an old USB drive enclosure, dropped it in, and started screwing it back together.

And then I got to thinking, and this was the last time: I only really need about 20GB of space, for family photos and scanned documents. The drive is three times as big as that, and the remainder will basically go unused.

I have SD cards that are plenty big for that. And while some machines won’t boot from a card reader, almost anything after 2002 will boot from a USB port. And I have a USB-to-SD card adapter. Why couldn’t I just reinstall everything to an SD card?

It’s much more portable. It’s easier to back up. And prices on SD cards are falling. A 128GB SD card, at the time of this writing, was only about US$60. That’s about the value of the Vista-era computer I was using, and almost as much as I spent on the SSD.

You can figure out the rest of the story. I re-ran the entire installation and encryption process on a leftover 64GB SD card I got from a family member last year, and it works like a champ. I transferred all my sensitive files onto the SD card, put it back in its teeny-tiny plastic case, and put it on the shelf between Walden and the can of compressed air.

I’ve tried booting that same SD card on a half-dozen machines now, the fastest being a 2.4GHz Core 2 Duo Penryn-based machine, and the slowest being an old, 1.6GHz non-PAE Pentium M (a good reason to rely on 32-bit versions). Perfect performance, every time.

Great security too: Knowing the grub password might grant access to recovery mode, but doesn’t give you access to the drive, and knowing the drive password doesn’t give you access to the privileged user’s home folder. And if I encrypt anything inside there, that will be one last small measure of prevention.
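
For reference, the grub lock is only a couple of lines if you ever do it by hand, assuming GRUB 2; the username here is illustrative, and the hash comes out of grub-mkpasswd-pbkdf2:

```
# Appended via /etc/grub.d/40_custom, then regenerate grub.cfg
# (update-grub, or grub-mkconfig -o /boot/grub/grub.cfg):
set superusers="kmandla"
password_pbkdf2 kmandla grub.pbkdf2.sha512.10000.<hash from grub-mkpasswd-pbkdf2>
```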

And no hardware issues, since Mint is smart enough to adjust itself to the hardware of the host machine, no questions asked.

I know there are some downsides. It takes a little while longer to boot across a USB port, particularly on that Pentium M. And there’s the rumor that SD cards have limited read-write lifespans … whatever that happens to be. :roll: And besides: I might start up from that card once a week at most, probably less. I’m not really concerned about lifespans right now.

But I’m satisfied at present with this arrangement. It streamlines the entire process and doesn’t lock me to one particular machine into the future. I can drop that card in my pocket, I can dd between two cards and have a duplicate in a matter of hours, I can mail it cross-country without worrying about someone intercepting it, and I can lose it without fear of anyone picking through my 2011 vacation photos. :roll:

So there it is: The poor man’s SSD strikes again. Perhaps I shall sit around for a little while again today, and try to dream up new uses for old ideas. :D

blackbox, as seen in the wake of Openbox

I haven’t posted much here lately, and that’s a sure-fire sign that I’ve been quite busy. You probably know what with. ;)

Yes, after literally years of baby-stepping through the alphabet, I finally finished that ginormous list of applications for the console. The one that I discovered four years ago, dragged around for another couple of years, and finally dissected over the course of the last 18 months.

So, yes, in that alone I’ve been quite busy.

On top of that though, there have been some recent hardware adoptions that I’ll show off later. A couple of them are real prizes, and some might be … curses. More on that in the days to come.

Today though, I needed to come to grips with blackbox, for reasons that will be clear in the future. Suffice to say that it was worth learning on my “production machine,” and sounded vaguely like fun after May’s run-in with twm.


That’s Arch Linux again, for no particular reason other than it was easier to strip down my existing Arch-based desktop, and build it back up again with blackbox.

I first remember blackbox from my very earliest days with Ubuntu, and I daresay I tried blackbox before I ever got into Openbox. I even tracked down the original how-to I used to set it up, almost a decade ago.

Some of what’s in that post doesn’t really apply though; there are small differences between what bluevoodoo1 was doing then and what you can do with Arch now, eight years later. Most of those changes are not deal-breakers.

I have to give blackbox credit for being infinitely easier to configure by hand than Openbox. The menu system is strictly brackets-parentheses-braces, for the command-title-executable, and that’s a huge advance over Openbox’s XML-based configuration files. Yes, I know I’m a weakling for complaining about XML. I’ve learned to live with my shortcomings.
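
A scrap of a menu file shows the pattern; the entries and commands here are my own examples, not anything blackbox ships with:

```
# ~/.blackbox/menu -- [command] (label) {executable}
[begin] (blackbox)
  [exec] (terminal) {urxvt}
  [exec] (alpine) {urxvt -e alpine}
  [submenu] (editors)
    [exec] (vim) {urxvt -e vim}
  [end]
  [restart] (restart)
  [exit] (exit)
[end]
```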

Configurations, if you can believe this, are mostly done through the right-click menu. There are quite a lot of settings that will require you to edit your .blackboxrc file — especially the path to your styles (think: themes) — but I’d guess 90 percent of blackbox’s setup is handled through the right-click menu … a la Fluxbox.

And since I mentioned it, blackbox “styles” are fairly easy to handle too. I don’t hold theming against Openbox since that’s generally a single file that needs attention. And part of that can be managed through obconf.

From a ground-zero setup I’d have to say blackbox was quite manageable. I had it up and working in a matter of minutes, and configured to my liking over the course of an hour or so, while I allowed myself to be distracted by events on television.

Once it’s in place, it plays a lot like Openbox, with obvious additions and subtractions here and there. blackbox has its built-in “toolbar”; I don’t recall seeing anything like that in Openbox. blackbox has a “slit” that I generally ignore; I don’t think Openbox uses a slit (Fluxbox did, last time I checked).

Openbox can do a few things blackbox can’t, of course. Most painful to me is the loss of programmable hotkeys — Super_L+1 for alpine, Super_L+2 for Midnight Commander, and so on. If I understand things right, there was a bbkeys utility, a half dozen years ago, that could handle keystrokes like that, but it has since faded away. AUR can’t build it, and Debian dropped it.
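
One possible stand-in I’ve been eyeing is xbindkeys, which handles keystrokes from outside the window manager entirely; this is an assumption on my part, since I haven’t actually wired it in yet:

```
# ~/.xbindkeysrc -- start xbindkeys from ~/.xinitrc before the window manager;
# Mod4 is the Super key
"urxvt -e alpine"
    Mod4 + 1
"urxvt -e mc"
    Mod4 + 2
```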

On the purely aesthetic front, it would be nice to insert proper separators into right-click menus. All my menus in blackbox look like hastily scrobbled lists of programs mashed up against each other. And since I can’t relegate them to key presses, the list is longer and scrobblier than ordinary.

I do admire blackbox’s austere default schemes though. As you can see above I removed some of the frills that remained and came up with a very flat, very rectangular desktop … that springs into life like a 1950s American housewife on prescription methamphetamines.

So in spite of reaching maturity at a time when dual core machines were just mirages in the desert, blackbox has managed to win a few points with me. It definitely shows a degree of greater usability than twm, even if it never approached the feature-completeness of Openbox.

But really: Yes, it doesn’t have all the bells and whistles that Openbox has learned over the past decade, but it does manage all the fundamentals without becoming overburdened with XML gobbledygook or bogged down in the need for endless ancillary graphical tools.

I never thought I’d say it, but in that sense, I prefer this to Openbox. :shock:

A social experiment, and a pill of disappointment

I have a daily habit of perusing a few forums that are bent toward Linux distributions that I prefer. I don’t choose a username associated with this blog, mostly because it draws attention. So don’t go looking for a “K.Mandla” out there. ;)

I don’t contribute as much as I could (or probably should), but I still consider myself a newbie when it comes to 90 percent of the things I see. That’s just the way I am.

There’s nothing lost in that, since most pleas for help are generally met with a correct response. So there’s no call for my bizarre viewpoint on software usability, or my scant practical knowledge of post-2006 hardware, to pollute things.

A few days ago I noticed that there was a pattern to almost any hardware help request: Regardless of the topic or issue, invariably the first replies chimed in to demand core information from tools like lspci or dmesg.

No harm in that. In most cases, the request was warranted, particularly if compatibility was the topic. If you’re going to help someone troubleshoot specific technology, then it’s helpful to know what the computer is reporting.

Except in a lot of cases, there was no real answer or reply to the original question. Random passersby were simply shouting out “What’s your dmesg say?”, then disappearing back into the ether without ever approaching the original issue. If the original poster was lucky, someone with practical experience or expertise appeared, and coached them through it.

But occasionally the question was just left to rot in the sun. A few spattered demands for information, and then the inexorable slide down the page into obscurity.

But what was worse, random passersby occasionally hounded the poster on seemingly insignificant details that couldn’t possibly contribute to the issue. “Not lspci -vv, lspci -vvv” was one. What really irritated me was a demand for lsusb -t instead of just lsusb, which didn’t really make a difference in that case.

It seemed what was happening was a spattered pecking for information, without any real direction or guiding impulse. Rather than actively answering a question, other forum posters were just parroting the same demands they had heard in the past, perhaps hoping some sort of shotgun effect would lead toward an answer. Not often did that seem to happen.

And another post drifted down the page and into digital history.

Seeing this, and coming to the point of this blabber, I decided to try a little social experiment. I picked a fairly straightforward issue of hardware compatibility, and one that could be solved with what (I guess) was a low-to-intermediate level of expertise. Nothing exotic or esoteric. Just an issue of inserting a particular module, and a USB trinket should work again.

I opened a new account on a site, picked a forum intended for beginners and hardware issues, and asked the question.

And as you probably have guessed, within a few minutes I had the first requisite demand for dmesg, which I supplied. No real answer came.

A few minutes later, right on cue, came a request for the output of lsusb, and then tail /var/log/Xorg.0.log. But still no suggestion of lsmod, or anything to do with modprobe.

Another poster wanted to know the maker of the USB toy, and chided me for not mentioning it in the original question.

An hour later I had a very persistent reply, in a somewhat belittling tone, asking which ISO I had used, if I had done the md5sum check, and if the installation had finished cleanly. I replied in the affirmative to each of these, still wondering if this was leading up to the issue of the missing module.

After that, the same person asked which site I had downloaded it from O_o and if I had done a CD integrity check. Bemusedly, I linked specifically to the torrent, wondering how we had strayed so far from the original problem. Ultimately I got a suggestion to reinstall from USB instead.

I stopped checking the post after about six hours. By then it had fallen off the first page of the help site, and I doubted anyone with expertise would find it so far down the list. In all, I got about 12 replies, only one or two of which were helpful, and none of which mentioned reinserting a module. Like a bad day fishing. :/

I know that won’t happen every time. And I know it depends very much on who is available at any given time, and what they know, and how inclined they are to contribute.

But it also speaks to the caliber of help available on some support sites. Well-meaning but underqualified attempts don’t really solve problems, and in some cases they can cause more frustration and confusion than they cure.

You get what you pay for, I suppose. To a large degree we are all learning this as we go, picking up bits and pieces of knowledge when and where they happen to appear. When you’re genuinely new to the Linux landscape, it’s hard to judge quality.

And for those who know the difference, it’s sometimes disappointing to watch in action. :|

Links for later … I hope

One of my family members refuses to bookmark anything, saying it’s a given that whatever he saves and tells himself, “I’ll read that later,” gets pushed aside forever. I think he may be right.

I’ve been hoarding quite a few links over the past few weeks, and most of them I told myself I would read later. I’ll drop some of them here, in the hopes that I get the time and urge to actually do something with them.

  • A long time ago I found a site which does a much better job of showing the available console fonts than the meager version I compiled years ago. Much nicer arrangement and a better rundown on many of the obscure ones too. Well done, sir.
  • I am one of those people who learns better from examples than instructions; to that end, Linux Command Examples is a much better resource for me than just a man page or help flags. Submit your own snippets for the benefit of assimilators like me.
  • I name all my computers after the serial number on the bottom; I get and give computers at a rate that really doesn’t allow for meaningful relationships and full names. :roll: I must be doing it right though, because unless I’m mistaken, my paradigm falls well within the suggestions of RFC 1178. ;)
  • AnnaSagrera put together a nice post about using mplayer with some of the video output tweaks, most notably the aa and caca drivers. I don’t agree with her on every point, but she does have quite a few screenshots of what the combinations can yield, on contemporary hardware. I eagerly await a post on jamming Quake 2 through the aa libraries. …
  • Still in graphical mode, I have heard of tools that dim or torque monitor light output, according to the time of day. Ideally this should correspond to your diurnal cycle, and prevent late-night browsing sessions from preventing restful sleep. Ricardo Catalinas Jiménez offers one specific to Xorg, free of external software. Bah, who am I kidding? I’ll sleep when I’m dead.
  • I was digging around for renaming utilities a few months ago and came across a rock-and-roll solution that only relies on find, sed and mv to get the job done. I tried it with a few test files and it seemed to do the trick, but it’s a little too Wild West for me. I like to have some warning about the damage I’m about to do. :roll:
  • I don’t bother much altering the base colors in a terminal, mostly because I’m just not so idle that I have time to tinker with color schemes these days. If you are among the blissfully stressless, you might enjoy this page. No experience necessary; it does all the heavy lifting for you.
  • Of course you know about the years-long project to consolidate text-based software into some sort of navigable list. I do occasionally get submissions that are not really programs but probably worthy of mention. As an example, add this to your .bashrc when you get a chance:
    function http() {
        # query httpcode.info for the meaning of a status code; any similar
        # lookup service would do just as well
        curl -s "http://httpcode.info/$1"
    }

    Now you have an online reference at your shell prompt for all those weird HTTP error codes. I only know 404 and 504, and you can look those two up as a test of your new gimmick. Alma sent me the tip, via a reddit post.

  • I mentioned Pale Moon the other day as a replacement for the tumor people call Firefox; imagine my chagrin when the Linux maintainer deserts it only days after my feeble endorsement. :oops: I’ve been watching this thread to see if a new champion steps forward. Even if one doesn’t, there have been posts to that thread explaining how to build your own, and it might come to that yet.
  • Actual memory usage is a concept that I see misrepresented a lot online. One of the best explanations I know of is here. Not only does it step through the venerable free -m trick, but also takes the time to explain page caching and other memory management. Worth a read, even just briefly.
  • I mentioned a long time ago that it is possible to configure an entire computer to work as a router, but that I hadn’t ever met anyone who actually did it. Here’s a very recent Ask Ubuntu page that talks about the process, and comes to a conclusion of sorts. Don’t throw out that Pentium II, friend. …
  • Last but not least, if you ever needed a quick rundown on some basic Linux console commands, this page is at once one of the most concise and most useful arrangements I have seen. It’s simple, lacks thick graphics and is easy to navigate in a text-only environment. Bookmark that in elinks and save yourself a lot of trouble.
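
That find-sed-mv rename trick, by the way, boils down to something like this; a reconstruction from memory rather than the exact snippet, run here on throwaway files that swap spaces for underscores:

```shell
# Rename every file with a space in its name, replacing spaces with underscores.
mkdir -p /tmp/rename-test && cd /tmp/rename-test
touch "photo 1.jpg" "photo 2.jpg"
find . -name '* *' -type f | while read -r f; do
    mv "$f" "$(printf '%s' "$f" | sed 's/ /_/g')"
done
ls -1
# photo_1.jpg
# photo_2.jpg
```

The Wild West part is that there is no dry run and no confirmation; one stray pattern in the sed expression and the damage is done.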

That’s it for now. I have some others but they will probably require further effort before they are suitable for public mastication. We will post no link before its time. … ;)

betty and the cult of personality

There’s a case to be made for making things easier — particularly for newcomers. And of course, there’s a case to be made for keeping things as they are, and expecting everyone — newcomers included — to learn the basics and gain some ground-level proficiency.

I’ve seen more than a few web sites and forums drop links to betty over the past couple months, most touting it as a way to use natural (English) language patterns at the console. For Linux newcomers or CLI-o-phobes, betty is probably a godsend.

As I understand it, betty interprets a request and attempts to link it to a standard Unix-ish command. I like the idea; it suggests one could send instructions to a computer, using natural language (or perhaps even speech-to-text), and expect an intelligible answer or appropriate action.

Usually betty does a pretty good job, so long as she (I’ll just call her “she” for convenience ;) ) can figure out what you want.


And that’s the real trick: Making sure what you want is what betty understands. For example, she has no issue at all with this:

kmandla@6m47421: ~/downloads$ betty how many words are in this directory

She dutifully replies with:

Betty: Running find . -type f -exec wc -w {} \; | awk '{total += $1} END {print total}'

which in this case, was correct. Unfortunately, ask

kmandla@6m47421: ~/downloads$ betty how many files are in this directory

and betty returns:

Betty: I don't understand. Hopefully someone will make a pull request so that one day I will understand.

It’s odd to me that betty can tear apart a directory to count out individual words, but gets confused when asked how many files there are. Is word counting in a directory a command used so frequently that it gets taught to betty? I honestly don’t recall ever needing that before, although I daresay I could piece it together on my own, if I had to.
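
For the record, the pipeline betty emits does work; replayed on a scratch directory of throwaway files, it adds up correctly:

```shell
# Recreate betty's word-count pipeline on two small test files.
mkdir -p /tmp/betty-test && cd /tmp/betty-test
printf 'one two three\n' > a.txt    # 3 words
printf 'four five\n' > b.txt        # 2 words
find . -type f -exec wc -w {} \; | awk '{total += $1} END {print total}'
# prints 5
```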

Moreover, is it really more convenient to type out “betty whats my username” — and here, it’s actually important to deviate from correct English punctuation, because the apostrophe would throw bash into a tailspin — than just to use whoami? whoami is shorter, and given that it’s just a contraction of the natural English words “who am i”, I don’t see how betty’s way is an improvement.

betty’s git page has a long list of precise commands she understands, and can reply to. I have an even longer list of precise commands that betty has no knowledge of, and can’t seem to comprehend — most of which are just one-word changes, like above.

It’s my unfortunate opinion that betty is no more effective or efficient than a mile-long .bashrc jam-packed with aliases for specific commands. If betty doesn’t actually pick apart and interpret a command, in the same way a valid artificial intelligence might, then what betty actually does is obfuscate things: She turns most basic commands, some of which were derived from natural language, into longer commands that carry their own eccentricities.
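To make the comparison concrete, here is what that mile-long .bashrc might look like in miniature. The alias names are my own invention, not anything betty ships with: one fixed phrase per command, with no interpretation anywhere.

```shell
# Hypothetical .bashrc fragment: fixed phrases mapped to commands,
# which is roughly all betty does absent real language parsing
alias how-many-words='find . -type f -exec wc -w {} \; | awk "{total += \$1} END {print total}"'
alias whats-my-username='whoami'
```

It works, as far as it goes; but the moment the phrase drifts to “how many words are in here,” the alias, like betty, shrugs.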

In other words, betty is the anti-alias. :shock:

The entire business reminds me of a time a few years ago, when I accompanied our CEO on a survey of a local school building in Japan. In the lull between our arrival and meeting the school representative, my boss showed me his smartphone, and demonstrated how it could interpret his speech and return a map showing directions to the school.

Except it didn’t work. He tried four or five times, rephrasing his question a different way each attempt, and the closest he got was the home page for the school. The arrival of the representative saved him the embarrassment of admitting it wasn’t as great as he’d hoped, and me the embarrassment of pointing out that he could have gotten the same information directly, 10 minutes earlier, if he had taken charge of the situation and sought out the map himself.

Shortcuts and gee-whiz tools aren’t really improvements if they don’t work in the way people think and behave. Expecting someone to type out the exact line “betty please tell me what is the weather like in London” (which is supposedly a valid command but returned an error from the git version I installed in Arch) is not an improvement for anyone who instinctively asks, “betty what is the weather in London” or “betty what is the weather report for London”.

On the other hand, learning whoami and its syntax means you can probably navigate almost any variation on “betty whats my username” … with or without the punctuation errors.

I didn’t intend for this to be a hate-fest against betty; as I said above, I like the idea, and I think were it to actually parse and deconstruct language and find a likely solution, it would be near-genius.

My complaint is partly in the fact that, as it is, it’s no improvement — in fact, it’s quite the opposite. And partly in the fact that, as it is, I can get the same results from a long, long list of aliases.

And partly in the fact that, as things are, Linux users seem too eager to join the cult of personality of any tool that “streamlines” console lifestyle, without really taking into account what the tool does … or doesn’t do. :|

Never the twain shall meet

There’s a joke that says there are 10 kinds of people in the world: those who can count in binary, and those who can’t.

There’s another way to divvy up the population of the earth, in technophile terms: those who absolutely fret over minute details of performance in their games, and those who don’t.

I belong to the latter group. My last real gaming addiction was Neverwinter Nights, and I’m not ashamed to admit that I still occasionally start it up again for a day or two at a time. That game is at least a decade beyond its prime, and hardly going to stress any of the hardware available today.

It’s true, I do occasionally lose a few hours of my life slapping around the computer in an eight-way free-for-all in Warzone 2100, or dabbling with 0ad. But I’m no hardcore gamer, not by a long shot. Heck, the games I play most these days rely on text and telnet.

I’ve also had the unfortunate experience of shepherding more than one gamer into the shallow waters of Linux adoption. I say “unfortunate” because I would guess that 90 percent of the time, if not more, it ends in a dissatisfied customer.

And the reason again goes back to those two kinds of people, and how much they’re willing to sacrifice in performance terms. Players who measure performance by fractions of frames-per-second are, in my experience, generally unwilling to make the leap if it means suffering through 198 frames per second with Wine, instead of 206 in Windows 8. Even though your eye probably can’t see the difference. :roll:

In my meager opinion, what’s at play here is actually a sidelong game of geek poseur, and suffering an 8-frame-per-second hit is a blackball to the upper echelons. That doesn’t surprise me.

But I know enough now, and I’ve seen enough failed converts, to realize that there’s no point in offering up enlightenment if the applicant is going to measure their satisfaction in hundredths of frames per second. I don’t sell life stories to have them divided up into individual words, and rejected as statistically insignificant.

I’ve even gone so far these days as to make a complete 180-degree turn on the issue, and shy people away from Linux if I suspect they belong to that first group of people. I know, the revolution needs converts, but something tells me the candidate’s heart isn’t in the game. Or maybe it is, and that’s the problem.

But it wouldn’t be fair for me to give advice I wouldn’t follow myself, so I make a clear division between the games I’ll play in Linux, and the ones I devote to a leftover Windows machine.

Yes, I keep a castoff XP laptop in a closet, and when the time comes, I bring it out and dust off the cover. It’s not networked, it holds no personal information, I don’t worry about anti-virus software, and I’ve stripped out everything but the essential drivers. I’m comfortable with that.

My point is, at this stage of life, it’s most convenient to me to have a Linux side of the house, and an abandoned XP machine in the closet for those rare times when I can dawdle with a game of Deus Ex. And never the twain shall meet.

That may be a solution for you too. There’s no shame in it. I said a long time ago, if you want to eat toast, you buy a toaster. And I’m guessing your kitchen has more than just a toaster in it. ;)