Author Archives: K.Mandla

Poor man’s SSD: A cryptic twist

If you’ve suffered through this site over the years, you will recall there was a time in the previous decade when a little idea paid off big, and an ancient laptop got a nifty upgrade.

Fast-forward to this year, and again, a little crablike thinking seems to have paid off.

Let’s start at the beginning. Remember this machine? It’s humming along nicely, and with only a few shortcomings, I expect it will last quite a while into the future.

Among those shortcomings are a lack of USB2.0 ports, and nothing to interface with SD cards. For some reason Dell never swapped out the USB1.1 ports that were part of the early 8000 line for the higher-speed ports that were more common with Pentium 4 machines. Design flaw, or programmed obsolescence? You decide.

Regardless, the obvious solution is a PCMCIA-to-USB2.0 card, which costs all of about US$2 these days. They’re literally recycle store giveaways, to be honest.

Which means the two omissions — USB2.0 and an SD card reader — are related in an odd way: From PCMCIA to USB2.0, to USB-to-SD reader, to an SD card. It’s not as ungainly as it sounds, and really, I’ve done much worse in the past. At roughly US$6, something like this was well worth the price.

And with a lot of leftover SD cards lying around — mostly from the same camera I’ve owned for about seven years now — this is a good way to pick up a little extra storage space, in oddball sizes.

Now shift gears for a little bit, to a larger, grander scale. Online privacy is something that I think about a lot more these days, and I hope you do too. Knowing that most anything transmitted unprotected is likely to be archived somewhere, by someone, for some time has, in short, caused me to retract just about everything I kept on the web — down to the lowliest .conf file — and either keep it locally or repost it encrypted.

A few months ago I decided the best way toward physical security for that data was to dedicate one entire machine to the prospect of data storage. Starting with the operating system, I wanted something that could encrypt without excessive entanglement, require several passwords to access, be more or less impervious to environmental issues, be self-sufficient and not need network access or frequent updates.

No, I’m not Edward Snowden. I just have a hope of protecting electronic documents, and I don’t think I’m too far from the target.

Hopefully the picture in your mind at this point is about the same as the one I had in mine. I decided to use one of my many leftover machines for the purpose, and even went so far as to purchase a small, inexpensive SSD to avoid the pitfalls of data errors or drive crashes. I installed Linux Mint 17, encrypted the entire drive, encrypted the home partition, and even password-protected grub. I turned the wireless switch off, put it in a generic black laptop sleeve and set it on a bookshelf next to a copy of Walden and a can of compressed air.
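
(If you’ve never password-protected grub, the gist is to generate a password hash and declare a superuser. A sketch, assuming Mint’s grub2, with my username standing in and the hash left as a placeholder:)

    # generate a PBKDF2 hash of your chosen password
    grub-mkpasswd-pbkdf2

    # then, in /etc/grub.d/40_custom, add lines like these,
    # with your own user and hash:
    #   set superusers="kmandla"
    #   password_pbkdf2 kmandla grub.pbkdf2.sha512.10000.<hash>

    # and regenerate the menu
    sudo update-grub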

And then I got to thinking: Now I’m dependent not just on that drive, but on all the components that keep it running. Why did I lock myself into that particular computer? Just because it was available? Only the drive is important.

So I took it back down off the shelf, unscrewed the case and took out the drive, and put the drive back on the shelf between Walden and the canned air.

And then I got to thinking again: Now I have to put that drive into the computer, every time I decide to move a file on or off of there. That’s terrifically inconvenient.

So I took it back down off the shelf, took out an old USB drive enclosure, dropped it in, and started screwing it back together.

And then I got to thinking, and this was the last time: I only really need about 20GB of space, for family photos and scanned documents. The drive is three times as big as that, and the remainder will basically go unused.

I have SD cards that are plenty big for that. And while some machines won’t boot from a card reader, almost anything after 2002 will boot from a USB port. And I have a USB-to-SD card adapter. Why couldn’t I just reinstall everything to an SD card?

It’s much more portable. It’s easier to back up. And prices on SD cards are falling. A 128GB SD card, at the time of this writing, was only about US$60. That’s as much as the value of the Vista-era computer I was using, and I spent almost as much on an SSD.

You can figure out the rest of the story. I re-ran the entire installation and encryption process on a leftover 64GB SD card I got from a family member last year, and it works like a champ. I transferred all my sensitive files onto the SD card, put it back in its teeny-tiny plastic case, and put it on the shelf between Walden and the can of compressed air.

I’ve tried booting that same SD card on a half-dozen machines now, the fastest being a 2.4GHz Core 2 Duo Penryn-based machine, and the slowest being an old, 1.6GHz non-PAE Pentium M (a good reason to rely on 32-bit versions). Perfect performance, every time.

Great security too: Knowing the grub password might grant access to recovery mode, but doesn’t give you access to the drive, and knowing the drive password doesn’t give you access to the privileged user’s home folder. And if I encrypt anything inside there, that will be one last small measure of prevention.

And no hardware issues, since Mint is smart enough to adjust itself to the hardware of the host machine, no questions asked.

I know there are some downsides. It takes a little while longer to boot across a USB port, particularly on that Pentium M. And there’s the rumor that SD cards have limited read-write lifespans … whatever that happens to be. :roll: And besides: I might start up from that card once a week at most, probably less. I’m not real concerned about lifespans right now.

But I’m satisfied at present with this arrangement. It streamlines the entire process and doesn’t lock me to one particular machine into the future. I can drop that card in my pocket, I can dd between two cards and have a duplicate in a matter of hours, I can mail it cross-country without worrying about someone intercepting it, and I can lose it without fear of anyone picking through my 2011 vacation photos. :roll:
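
(The dd step is as simple as it sounds. A sketch, assuming the source card shows up as /dev/sdb and the blank one as /dev/sdc; check with lsblk before pulling the trigger:)

    # clone one card onto the other, block for block, then flush caches
    sudo dd if=/dev/sdb of=/dev/sdc bs=4M && sync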

So there it is: The poor man’s SSD strikes again. Perhaps I shall sit around for a little while again today, and try to dream up new uses for old ideas. :D

blackbox, as seen in the wake of Openbox

I haven’t posted much here lately, and that’s a sure-fire sign that I’ve been quite busy. You probably know what with. ;)

Yes, after literally years of baby-stepping through the alphabet, I finally finished that ginormous list of applications for the console. The one that I discovered four years ago, dragged around for another couple of years, and finally dissected over the course of the last 18 months.

So, yes, in that alone I’ve been quite busy.

On top of that though, there have been some recent hardware adoptions that I’ll show off later. A couple of them are real prizes, and some might be … curses. More on that in the days to come.

Today though, I needed to come to grips with blackbox, for reasons that will be clear in the future. Suffice to say that it was worth learning on my “production machine,” and sounded vaguely like fun after May’s run-in with twm.

[Screenshot: blackbox running on Arch Linux]

That’s Arch Linux again, for no particular reason other than it was easier to strip down my existing Arch-based desktop, and build it back up again with blackbox.

I first remember blackbox from my very earliest days with Ubuntu, and I daresay I tried blackbox before I ever got into Openbox. I even tracked down the original how-to I used to set it up, almost a decade ago.

Some of what’s in that post doesn’t really apply though; there are small differences between what bluevoodoo1 was doing then and what you can do with Arch now, eight years later. Most of those changes are not deal-breakers.

I have to give blackbox credit for being infinitely easier to configure by hand than Openbox. The menu system is strictly brackets-parentheses-braces, for the command-title-executable, and that’s a huge advance over Openbox’s XML-based configuration files. Yes, I know I’m a weakling for complaining about XML. I’ve learned to live with my shortcomings.
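
(For the uninitiated, a minimal menu sketch; the programs are just examples:)

    [begin] (blackbox)
        [exec] (terminal) {xterm}
        [exec] (browser) {palemoon}
        [submenu] (tools)
            [exec] (editor) {gvim}
        [end]
    [end]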

Configurations, if you can believe this, are mostly done through the right-click menu. There are quite a lot of settings that will require you to edit your .blackboxrc file — especially the path to your styles (think: themes) — but I’d guess 90 percent of blackbox’s setup is handled through the right-click menu … a la Fluxbox.
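
(The .blackboxrc lines in question look something like this; the paths are examples, not gospel:)

    session.menuFile:    ~/.blackbox/menu
    session.styleFile:   /usr/share/blackbox/styles/Gray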

And since I mentioned it, blackbox “styles” are fairly easy to handle too. I don’t hold theming against Openbox since that’s generally a single file that needs attention. And part of that can be managed through obconf.

From a ground-zero setup I’d have to say blackbox was quite manageable. I had it up and working in a matter of minutes, and configured to my liking over the course of an hour or so, while I allowed myself to be distracted by events on television.

Once it’s in place, it plays a lot like Openbox, with obvious additions and subtractions here and there. blackbox has its built-in “toolbar;” I don’t recall seeing anything like that in Openbox. blackbox has a “slit” that I generally ignore; I don’t think Openbox uses a slit (Fluxbox did, last time I checked).

Openbox can do a few things blackbox can’t, of course. Most painful to me is the loss of programmable hotkeys — Super_L+1 for alpine, Super_L+2 for Midnight Commander, and so on. If I understand things right, there was a bbkeys utility a half dozen years ago that could handle keystrokes like that, but it has since faded away. AUR can’t build it, and Debian dropped it.

On the purely aesthetic front, it would be nice to insert proper separators into right-click menus. All my menus in blackbox look like hastily scrobbled lists of programs mashed up against each other. And since I can’t relegate them to key presses, the list is longer and scrobblier than ordinary.

I do admire blackbox’s austere default schemes though. As you can see above I removed some of the frills that remained and came up with a very flat, very rectangular desktop … that springs into life like a 1950s American housewife on prescription methamphetamines.

So in spite of reaching maturity at a time when dual core machines were just mirages in the desert, blackbox has managed to win a few points with me. It definitely shows a degree of greater usability than twm, even if it never approached the feature-completeness of Openbox.

But really: Yes, it doesn’t have all the bells and whistles that Openbox has learned over the past decade, but it does manage all the fundamentals without becoming overburdened with XML gobbledygook or bogged down in the need for endless ancillary graphical tools.

I never thought I’d say it, but in that sense, I prefer this to Openbox. :shock:

A social experiment, and a pill of disappointment

I have a daily habit of perusing a few forums that are bent toward Linux distributions that I prefer. I don’t choose a username associated with this blog, mostly because it draws attention. So don’t go looking for a “K.Mandla” out there. ;)

I don’t contribute as much as I could (or probably should), but I still consider myself a newbie when it comes to 90 percent of the things I see. That’s just the way I am.

There’s nothing lost in that, since most pleas for help are generally met with a correct response. So there’s no call for my bizarre viewpoint on software usability, or my scant practical knowledge of post-2006 hardware, to pollute things.

A few days ago I noticed that there was a pattern to almost any hardware help request: Regardless of the topic or issue, invariably the first replies were demands for core information from tools like lspci or dmesg.

No harm in that. In most cases, the request was warranted, particularly if compatibility was the topic. If you’re going to help someone troubleshoot specific technology, then it’s helpful to know what the computer is reporting.

Except in a lot of cases, there was no real answer or reply to the original question. Random passersby were simply shouting out “What’s your dmesg say?”, then disappearing back into the ether. No one ever answered or approached the original issue. If the original poster was lucky, someone with practical experience or expertise appeared, and coached them through the issue.

But occasionally the question was just left to rot in the sun. A few spattered demands for information, and then the inexorable slide down the page into obscurity.

But what was worse, random passersby occasionally hounded the poster on seemingly insignificant details that couldn’t possibly contribute to the issue. “Not lspci -vv, lspci -vvv” was one. What really irritated me was a demand for lsusb -t instead of just lsusb, which didn’t really make a difference in that case.

It seemed what was happening was a spattered pecking for information, without any real direction or guiding impulse. Rather than actively answering a question, other forum posters were just parroting the same demands they had heard in the past, perhaps hoping some sort of shotgun effect would lead toward an answer. Not often did that seem to happen.

And another post drifted down the page and into digital history.

Seeing this, and coming to the point of this blabber, I decided to try a little social experiment. I picked a fairly straightforward issue of hardware compatibility, and one that could be solved with what (I guess) was a low-to-intermediate level of expertise. Nothing exotic or esoteric. Just an issue of inserting a particular module, and a USB trinket should work again.
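
For reference, the entire fix amounts to a pattern like this; I’ll keep the module name as a placeholder, since the pattern is the point:

    # confirm the module isn't loaded yet ...
    lsmod | grep -i <module>
    # ... then insert it, and the trinket springs back to life
    sudo modprobe <module>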

I opened a new account on a site, picked a forum intended for beginners and hardware issues, and asked the question.

And as you probably have guessed, within a few minutes I had the first requisite demand for dmesg, which I supplied. No real answer came.

A few minutes later, right on cue, came demands for the output of lsusb, and then for tail /var/log/Xorg.0.log. But still no suggestion of lsmod, or anything to do with modprobe.

Another poster wanted to know the maker of the USB toy, and chided me for not mentioning it in the original question.

An hour later I had a very persistent reply, in a somewhat belittling tone, asking which ISO I had used, if I had done the md5sum check, and if the installation had finished cleanly. I replied in the affirmative to each of these, still wondering if this was leading up to the issue of the missing module.

After that, the same person asked which site I had downloaded it from O_o and if I had done a CD integrity check. Bemusedly, I linked specifically to the torrent, wondering how we had strayed so far from the original problem. Ultimately I got a suggestion to reinstall from USB instead.

I stopped checking the post after about six hours. By then it had fallen off the first page of the help site, and I doubted anyone with expertise would find it so far down the list. In all, I got about 12 replies, only one or two of which were helpful, and none of which mentioned reinserting a module. Like a bad day fishing. :/

I know that won’t happen every time. And I know it depends very much on who is available at any given time, and what they know, and how inclined they are to contribute.

But it also speaks to the caliber of help available on some support sites. Well-meaning but underqualified attempts don’t really solve problems, and in some cases they can cause more frustration and confusion than they cure.

You get what you pay for, I suppose. To a large degree we are all learning this as we go, picking up bits and pieces of knowledge when and where they happen to appear. When you’re genuinely new to the Linux landscape, it’s hard to judge quality.

And for those who know the difference, it’s sometimes disappointing to watch in action. :|

Links for later … I hope

One of my family members refuses to bookmark anything, saying it’s a given that whatever he saves and tells himself, “I’ll read that later,” gets pushed aside forever. I think he may be right.

I’ve been hoarding quite a few links over the past few weeks, and most of them I told myself I would read later. I’ll drop some of them here, on the hopes that I get the time and urge to actually do something with them.

  • A long time ago I found a site which does a much better job of showing the available console fonts than the meager version I compiled years ago. Much nicer arrangement and a better rundown on many of the obscure ones too. Well done, sir.
  • I am one of those people who learns better from examples than instructions; to that end, Linux Command Examples is a much better resource for me than just a man page or help flags. Submit your own snippets for the benefit of assimilators like me.
  • I name all my computers after the serial number on the bottom; I get and give computers at a rate that really doesn’t allow for meaningful relationships and full names. :roll: I must be doing it right though, because unless I’m mistaken, my paradigm falls well within the suggestions of RFC 1178. ;)
  • AnnaSagrera put together a nice post about using mplayer with some of the video output tweaks, most notably the aa and caca drivers. I don’t agree with her on every point, but she does have quite a few screenshots of what the combinations can yield, on contemporary hardware. I eagerly await a post on jamming Quake 2 through the aa libraries. …
  • Still in graphical mode, I have heard of tools that dim or torque monitor light output, according to the time of day. Ideally this should correspond to your diurnal cycle, and prevent late-night browsing sessions from preventing restful sleep. Ricardo Catalinas Jiménez offers one specific to Xorg, free of external software. Bah, who am I kidding? I’ll sleep when I’m dead.
  • I was digging around for renaming utilities a few months ago and came across a rock-and-roll solution that relies only on find, sed and mv to get the job done. I tried it with a few test files and it seemed to do the trick, but it’s a little too Wild West for me. I like to have some warning about the damage I’m about to do. :roll: (A sketch of the general shape appears after this list.)
  • I don’t bother much altering the base colors in a terminal, mostly because I’m just not so idle that I have time to tinker with color schemes these days. If you are among the blissfully stressless, you might enjoy this page. No experience necessary; it does all the heavy lifting for you.
  • Of course you know about the years-long project to consolidate text-based software into some sort of navigable list. I do occasionally get submissions that are not really programs but probably worthy of mention. As an example, add this to your .bashrc when you get a chance:
    # look up an HTTP status code, e.g.: http 404
    function http() {
        curl "http://httpcode.info/$1"
    }

    Now you have an online reference at your shell prompt for all those weird HTTP error codes. I only know 404 and 504, and you can look those two up as a test of your new gimmick. Alma sent me the tip, via a reddit post.

  • I mentioned Pale Moon the other day as a replacement for the tumor people call Firefox; imagine my chagrin when the Linux maintainer deserts it only days after my feeble endorsement. :oops: I’ve been watching this thread to see if a new champion steps forward. Even if one doesn’t, there have been posts to that thread explaining how to build your own, and it might come to that yet.
  • Actual memory usage is a concept that I see misrepresented a lot online. One of the best explanations I know of is here. Not only does it step through the venerable free -m trick, but also takes the time to explain page caching and other memory management. Worth a read, even just briefly.
  • I mentioned a long time ago that it is possible to configure an entire computer to work as a router, but that I hadn’t ever met anyone who actually did it. Here’s a very recent Ask Ubuntu page that talks about the process, and comes to a conclusion of sorts. Don’t throw out that Pentium II, friend. …
  • Last but not least, if you ever needed a quick rundown on some basic Linux console commands, this page is at once one of the most concise and most useful arrangements I have seen. It’s simple, lacks thick graphics and is easy to navigate in a text-only environment. Bookmark that in elinks and save yourself a lot of trouble.
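
About that find-sed-mv renaming trick a few bullets back, the general shape is something like this. A sketch only, one that lowercases .JPG extensions with no confirmation whatsoever, which is exactly the Wild West part:

    # rename every *.JPG under the current directory to *.jpg
    find . -name '*.JPG' | while read f; do
        mv "$f" "$(echo "$f" | sed 's/\.JPG$/.jpg/')"
    done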

That’s it for now. I have some others but they will probably require further effort before they are suitable for public mastication. We will post no link before its time. … ;)

betty and the cult of personality

There’s a case to be made for making things easier — particularly for newcomers. And of course, there’s a case to be made for keeping things as they are, and expecting everyone — newcomers included — to learn the basics and gain some ground-level proficiency.

I’ve seen more than a few web sites and forums drop links to betty over the past couple months, most touting it as a way to use natural (English) language patterns at the console. For Linux newcomers or CLI-o-phobes, betty is probably a godsend.

As I understand it, betty interprets a request and attempts to link it to a standard Unix-ish command. I like the idea; it suggests one could send instructions to a computer, using natural language (or perhaps even speech-to-text), and expect an intelligible answer or appropriate action.

Usually betty does a pretty good job, so long as she (I’ll just call her “she” for convenience ;) ) can figure out what you want.

[Screenshot: betty answering at the terminal]

And that’s the real trick: Making sure what you want is what betty understands. For example, she has no issue at all with this:

kmandla@6m47421: ~/downloads$ betty how many words are in this directory

She dutifully replies with:

Betty: Running find . -type f -exec wc -w {} \; | awk '{total += $1} END {print total}'
100

which in this case, was correct. Unfortunately, ask

kmandla@6m47421: ~/downloads$ betty how many files are in this directory

and betty returns:

Betty: I don't understand. Hopefully someone will make a pull request so that one day I will understand.

It’s odd to me that betty can tear apart a directory to count out individual words, but gets confused when asked how many files there are. Is word counting in a directory a command used so frequently that it gets taught to betty? I honestly don’t recall ever needing that before, although I daresay I could piece it together on my own, if I had to.

Moreover, is it really more convenient to type out “betty whats my username” — and here, it’s actually important to deviate from correct English punctuation, because the apostrophe would throw bash into a tailspin — than just to use whoami? whoami is shorter, and given that it’s just a contraction of the natural English words “who am i”, I don’t see how betty’s way is an improvement.

betty’s git page has a long list of precise commands she understands, and can reply to. I have an even longer list of precise commands that betty has no knowledge of, and can’t seem to comprehend — most of which are just one-word changes, like above.

It’s my unfortunate opinion that betty is no more effective or efficient than a mile-long .bashrc jam-packed with aliases for specific commands. If betty doesn’t actually pick apart and interpret a command, in the same way a valid artificial intelligence might, then what betty actually does is obfuscate things: She turns most basic commands, some of which were derived from natural language, into longer commands that carry their own eccentricities.
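
To illustrate, the one betty trick shown above collapses into a single line of .bashrc. A sketch, with an alias name of my own invention:

    # does the same as 'betty how many words are in this directory'
    alias wordcount="find . -type f -exec wc -w {} \; | awk '{total += \$1} END {print total}'"

And the alias doesn’t shrug when you rephrase it.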

In other words, betty is the anti-alias. :shock:

The entire business reminds me of a time a few years ago, when I accompanied our CEO on a survey of a local school building in Japan. In the lull between our arrival and meeting the school representative, my boss showed me his smartphone, and demonstrated how it could interpret his speech and return a map showing directions to the school.

Except it didn’t work. He tried four or five times, rephrased his question four or five different ways each time, and the closest he got was the home page for the school. The arrival of the representative saved him the embarrassment of admitting it wasn’t as great as he’d hoped, and saved me the embarrassment of pointing out that he could have gotten the same information directly, 10 minutes earlier, if he had taken charge of the situation and sought out the map himself.

Shortcuts and gee-whiz tools aren’t really improvements if they don’t work in the way people think and behave. Expecting someone to type out the exact line “betty please tell me what is the weather like in London” (which is supposedly a valid command but returned an error from the git version I installed in Arch) is not an improvement for anyone who instinctively asks, “betty what is the weather in London” or “betty what is the weather report for London”.

On the other hand, learning whoami and its syntax means you can probably navigate almost any variation on “betty whats my username” … with or without the punctuation errors.

I didn’t intend for this to be a hate-fest against betty; as I said above, I like the idea, and I think were it to actually parse and deconstruct language and find a likely solution, it would be near-genius.

My complaint is partly in the fact that, as it is, it’s no improvement — in fact, it’s quite the opposite. And partly in the fact that, as it is, I can get the same results from a long, long list of aliases.

And partly in the fact that, as things are, Linux users seem too eager to join the cult of personality of any tool that “streamlines” console lifestyle, without really taking into account what the tool does … or doesn’t do. :|

Never the twain shall meet

There’s a joke that says there are 10 kinds of people in the world: those who can count in binary, and those who can’t.

There’s another way to divvy up the population of the earth, in technophile terms: those who absolutely fret over minute details of performance in their games, and those who don’t.

I belong to the latter group. My last real gaming addiction was Neverwinter Nights, and I’m not ashamed to admit that I still occasionally start it up again for a day or two at a time. That game is at least a decade beyond its prime, and hardly going to stress any of the hardware available today.

It’s true, I do occasionally lose a few hours of my life slapping around the computer in an eight-way free-for-all in Warzone 2100, or dabbling with 0ad. But I’m no hardcore gamer, not by a long shot. Heck, the games I play most these days rely on text and telnet.

I’ve also had the unfortunate experience of shepherding more than one gamer into the shallow waters of Linux adoption. I say “unfortunate” because I would guess that 90 percent of the time, if not more, it ends in a dissatisfied customer.

And the reason again goes back to those two kinds of people, and how much they’re willing to sacrifice in performance terms. Players who measure performance by fractions of frames-per-second are, in my experience, generally unwilling to make the leap if it means suffering through 198 frames per second with Wine, instead of 206 in Windows 8. Even though your eye probably can’t see the difference. :roll:

In my meager opinion, what’s at play here is actually a sidelong game of geek poseur, and suffering an 8-frame-per-second hit is a blackball to the upper echelons. That doesn’t surprise me.

But I know enough now, and I’ve seen enough failed converts to realize, that there’s no point in offering up enlightenment if the applicant is going to measure their satisfaction in hundredths of frames per second. I don’t sell life stories to have them divided up into individual words, and rejected as statistically insignificant.

I’ve even gone so far these days as to make a complete 180-degree turn on the issue, and shy people away from Linux if I suspect they belong to that first group of people. I know, the revolution needs converts, but something tells me the candidate’s heart isn’t in the game. Or maybe it is, and that’s the problem.

But it wouldn’t be fair for me to give advice I wouldn’t follow myself, so I make a clear division between the games I’ll play in Linux, and the ones I devote to a leftover Windows machine.

Yes, I keep a castoff XP laptop in a closet, and when the time comes, I bring it out and dust off the cover. It’s not networked, it holds no personal information, I don’t worry about anti-virus software, and I’ve stripped out everything but the essential drivers. I’m comfortable with that.

My point is, at this stage of life, it’s most convenient to me to have a Linux side of the house, and an abandoned XP machine in the closet for those rare times when I can dawdle with a game of Deus Ex. And never the twain shall meet.

That may be a solution for you too. There’s no shame in it. I said a long time ago, if you want to eat toast, you buy a toaster. And I’m guessing your kitchen has more than just a toaster in it. ;)

imlib2 and @my_libs@

Nostalgia struck this morning and I was pining after the wmhdplop and wmforkplop dock apps, from … gosh, more than seven years ago.

If you don’t know what those are, or if you don’t remember, the home pages are here and here, and in action, they look like this:

[Screenshot: wmhdplop and wmforkplop dock apps]

That’s them, up in the corner. Yes, they are quite obtrusive. I know. :roll:

Dock apps are my few extravagances, and these two are really the only ones that have survived over the years. Imagine my surprise today when the tried-and-true PKGBUILDs out of the AUR spat out ugly little gcc errors instead.

 gcc: error: @my_libs@: No such file or directory 

This is odd. And what is that “@my_libs@” stuff? I don’t ever recall seeing a gcc error like that before. Of course, I have all the compiling prowess of a dusty brick, but still. …

For once, rather than just yammer mindlessly about it on this site, I put on my deerstalker cap and went investigating. :|

To make a ridiculously long and uninteresting story into a short and only a little more interesting story, the culprit is … imlib2, of all things. Apparently that “@my_libs@” string appears in version 1.4.6 of imlib2, which dates back to around Christmas.

I don’t know what exactly @my_libs@ is supposed to mean there; apparently the packaged imlib2-config.in file is some sort of configuration script. It could be stereo instructions for all I know. :roll:

But the Linux From Scratch gang, with the electric precision that earned them their reputation, decided that it was best excised from the source code. With extreme prejudice.

I’m all for that. In short, if you build imlib2 from scratch, either with ABS or yaourt or whatever, do this first:

sed -i 's/@my_libs@//' imlib2-config.in
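
If you’re building through ABS, the same one-liner slots neatly into the PKGBUILD. A sketch, assuming the stock imlib2 1.4.6 source layout:

    prepare() {
      cd "${srcdir}/imlib2-${pkgver}"
      # strip the stray @my_libs@ token before configure runs
      sed -i 's/@my_libs@//' imlib2-config.in
    }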

You should end up with an installable version of imlib2 that doesn’t trigger gcc errors when you attempt to build wmhdplop and wmforkplop. :???: Which was the original goal. :D

The resulting package, to the best of my knowledge, is perfectly compatible on every other front, and perhaps even an improvement in that it doesn’t cause problems. I think. :shock:

I’m debating if this is bug-worthy for the Arch version; it seems like this occurs waaay up the food chain, back to the Enlightenment crew. I’ll at least leave a note on the AUR pages for whoever stumbles past them. ;)

twm, because what’s old is what’s new

I saw a screenshot of twm last week, and it inspired me enough to swing past it for a couple of days. This was my inspiration:

[Screenshot: the twm desktop that served as inspiration]

What I managed to create was:

[Screenshot: my twm setup, with rhapsody, Pale Moon and vim]

That’s Arch Linux again, and it was not the terribly long and uphill journey you might imagine. If you’ve ever worked with Openbox, you’ll have no problem getting twm to do something similar. And in some ways, twm has an edge.

For one, twm is incredibly fast. Window moving and resizing are exceptionally quick, even on decade-old hardware. htop says it’s taking up less than 0.3 percent of the 1GB of memory available (a few megabytes at most) while managing five or six windows. To the side of that machine I have a D830 with 4GB of memory in it running Musca, and the difference between the window managers is trivial.

Also on the plus side: Most everything you could want to do with twm — colors, borders, menus and submenus — is done in one configuration file, and in a straightforward arrangement.

You can specify per-application settings, desktop-wide color schemes, and exact per-window cursor behavior. You can even jam theme settings directly into your menu, and twm handles it with grace.
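
To give you a taste of the format, here is a minimal sketch, with colors and programs chosen mostly at random:

    Color
    {
        BorderColor     "slategrey"
        TitleForeground "white"
        TitleBackground "grey20"
    }

    # a per-application setting: no title bar on xclock
    NoTitle { "xclock" }

    Menu "apps"
    {
        "Applications"  f.title
        "terminal"      f.exec "exec xterm &"
        "editor"        f.exec "exec gvim &"
    }

    Button3 = : root : f.menu "apps"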

It’s simple, and doesn’t have too many frills to distract you. It keeps things clean and fast, but doesn’t become the tiling window manager du jour.

Of course, there are some things I don’t like about twm, or things that I’m used to that I find it difficult to work without. To wit:

  • First and most obvious, twm’s focus model. I realize this dates back a human generation, but it’s immensely irritating. A window has focus when the mouse moves over it. But that does not raise it, which means grazing the touchpad or bumping the mouse while you type sends commands into the next window … or into the ether.

    Similarly (and this is a little hard to explain), it also means when you spawn windows, they are not necessarily focused. Start an application and you get a ghost frame that you can maneuver into place, then click to drop. It’s a good idea, and makes things very fast for older hardware.

    However, if you click and don’t shift the mouse back over the same window, it doesn’t have focus. Which means you have to learn the habit of spawn-shift-drop-then-mouse-over, to actually use the program.

  • The rules to raise windows are a bit strange too. You can give focus to a window by mousing over it; that we already discussed. You can click on a window, and again, it has focus. But a window doesn’t raise to the top layer until you click specifically on the title bar.

    Unfortunately, that means if you’re like me and you have a tendency to “lose” applications in the stack on your desktop, you might end up shuffling things around to find out what the heck you’ve been typing, and see the whole application. Needless to say, it takes some getting used to.
  • There are no layering options for pinning windows to the top, or trapping them at the bottom. Those are features from IceWM and Openbox that, believe it or not, I need on a daily basis.
  • I’m not a fan of the icon manager. I can’t explain that any more than to say try it, and see if it strikes you as cumbersome too. I’m used to a middle-click in Openbox that shows every window, minimized or not, so you can jump straight to any of them. Since the icon manager is not quite a panel, it behaves more like a hidden application that holds the icons of everything running, and it has to be unminimized before you can unminimize anything else. :???:
  • As best I can tell, twm can’t do xft fonts in borders. I might be mistaken, and maybe there’s a patch out there, but I found nothing to suggest otherwise. Of course, that may be part of what makes it fast. And of course, your applications can still use xft fonts, so it’s not a hindrance.
  • There are plenty of options for custom mouse clicks, but I had difficulty getting Mod4+Key hotkeys set up. I don’t think twm was ever really meant to spit out a file manager when I press Mod4+2, or trigger gmrun with Alt+F2. I should really try that again, though; a sketch of what I was attempting follows this list.
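
On that last point, the man page suggests key bindings along these lines. This is a sketch of what I was fumbling with, not a guarantee, and pcmanfm merely stands in for a file manager:

    # in .twmrc: Mod4+2 spawns a file manager, Alt+F2 triggers gmrun
    "2"  = m4 : all : f.exec "exec pcmanfm &"
    "F2" = m  : all : f.exec "exec gmrun &"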

I know some of these things could be corrected, or at least sidestepped, with a little more time and a little more trial-and-error in my .twmrc file. After a while though, I grew disinterested. I am ashamed. :oops:

I have some other minor complaints, but I don’t want you to get the impression that twm was a bad experience. If you run your Openbox sessions close to the bone, or if you can live without all the doodads that come bundled in IceWM, or if you like tiling window managers but you’re homesick for something with a title bar on it, twm is not a bad option. I might use it again, on something with less muscle.

For what it’s worth, and for my own reference later, I’ll drop my .twmrc file here, after the more tag. If you need better ideas or a few other options to spur your interest, try Graham’s TWM page, which helped me build my .twmrc much quicker than picking through the man page. Oh, and of course, xwinman.org has a page on twm and its cousins. Oddly enough, the Arch Wiki page on twm is a bit scant. Perhaps I should do something about that. :|

Express your individuality, but within these confines

My own personal Two Minutes Hate that came in the wake of the Firefox 29 release last week subsided in just about that much time. I saw enough to know that for my purposes, Firefox no longer fit the bill, and immediately sought out a different solution.

From what I’ve seen around the web, I was not the only one disappointed by the Chrome-like redesign. I’ve tried Chrome, and even used it in the office for a short period a couple of years ago, but I wouldn’t ever adopt it at home.

Chrome, and a lot of modern web-based tools, are quickly slanting toward smartphone users, and I don’t belong to that trend. I’ve had touchscreens and ultralight laptops. I even tried a tablet computer once, but I know what I like. To each his own.

It doesn’t really bother me if Firefox tries to meld with Unity or some other cellphone-ish desktop, or if dropping the window size down below 800×600 causes the WordPress.com backend to contort and grow giant thumb-sized buttons.

I don’t even mind it terribly when someone sends me to a mobile Wikipedia page. It’s an oversight and a slight hassle, but I don’t care. It’s just not something that is aimed at me, so I don’t sweat it.

But I’m not part of that crowd. I do my work at glorious 1600×1200, and smartphones don’t appeal to me. Carving the interface to fit a phone’s dimensions tells me where Mozilla’s priorities are, and I can nod and walk away calmly.

That nonchalance isn’t shared in all corners though. At more than one site that I peruse (under a pseudonym, of course), most questions or complaints about FF29 were met with a lot less tolerance.

People seeking alternatives were told to simply learn to use it. Quit your complaining. Shake it off and move on. I even saw a few threads locked and removed by moderators, who apparently were unhappy with the angle of the criticism.

All that is beside the point, since it’s a sad day for Linux when someone with a legitimate request for a substitute is told to stop complaining and “learn to use it.” Haul in a potential Windows convert, and suggestions on alternate software rain from the sky like manna from heaven.

But express distaste over an arbitrary interface change, and you’ll just have to get used to it. It’s not that big a difference. This is progress. It’ll grow on you. Keep your mouth shut. Conform. By all means, express your individuality, but keep within these confines.

That’s human nature I guess, and I blame biology more than culture or society. But out of deference to those who might prefer something else, I collected quite a few links on how to fix What Mozilla Hath Wrought.

Provided your machine has adequate muscle, you have the option to pull in a couple of extensions that supposedly give better control over the interface.

  • The Classic Theme Restorer not only allows you to resculpt the UI, it gives you a lot of control that is otherwise only available through about:config. In that sense, it’s worth installing even if you like the new arrangement. This might be your best option overall, especially since Mozilla itself suggests it in their support pages.
  • Australis Slimmr also gives you control over spacing in titlebars and so forth, plus a few other options. I found I didn’t need this as much as I initially thought I would.
  • Tabs on Bottom restores the bottom tab arrangement, which, to be honest, I’d be shocked if anyone still used.
  • Status-4-Evar is another extension worthy of including even if you like the new interface. This adds options for your status bar that you probably didn’t even know existed.

There are lots more, but I won’t list them because honestly, my hardware begins to suffer if I bog down Firefox, which is already a pig, with too many extensions. Start times grow longer, basic functions start to lag, and overall performance takes a hit.

Short of overloading Firefox with corrections to Mozilla’s tectonic drift, you could downshift and stick with Firefox 28 — and that was my initial reaction, if I must be honest.

Reinstalling in Arch is as simple as pointing pacman -U at the old package, and from what I hear, life is even easier with Debian and Iceweasel. Ubuntu users should probably read through this for advice.
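
(A sketch of the downgrade, assuming the old package is still in your pacman cache; the exact version string will differ:)

    # reinstall the cached Firefox 28 package ...
    sudo pacman -U /var/cache/pacman/pkg/firefox-28.0-1-x86_64.pkg.tar.xz
    # ... and optionally add 'IgnorePkg = firefox' to /etc/pacman.conf
    # to hold it there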

It’s also worth thinking about complete alternatives. I gave Opera a try during my exodus from Firefox-land, and to be honest, it was quick, snappy and replete with features. If I had taken the time to find analogues for Disconnect.me, HTTPS Everywhere and Adblock Edge, I daresay I would have stayed there.

But in the end, the browser that shyly held up its hand and stole my patronage was Pale Moon. Everything I had used in FF28 was (more or less) supported in Pale Moon, and I didn’t have to do much more setup than I usually do with a fresh copy of pre-29 Firefox. Extensions all worked, bookmarks obediently shuffled into place, and the UI was what I knew and wanted. Don’t bother wrapping it up, I’ll wear it out of the store.

Best of all, it seems lighter and snappier than FF28 was, on the same machine. I call that a bonus.

Arch users can get this with the AUR package; others might have to wrangle a little bit to wedge it into place. From my perspective, it’s worth it.

So there it is then. You’re allowed to have choices, and you’re allowed to ask for directions when the status quo becomes untenable. Nobody will shout you down or tell you to live with the changes.

And maybe if you’re lucky, the new kid in town will end up being an improvement. ;)

Animated gif previews of video files

A few months ago I won some bonus points with the boss, in a way that deserves keeping a note here.

Our office keeps a large collection of training videos on a networked drive, as part of the orientation program. New employees walk through them in sequence as part of their introduction, and I expect other companies may do something similar.

The files are named with a complex numbering system though, and the names give no hint of the actual topic of each video. We’ve mentioned that to the training staff and IT specialist, but it’s rather far down the list.

I took the initiative one afternoon and following a lead by Prashanth Ellina, found a way to thumbnail a video frame, and use that as a preview.

Prashanth’s command works like a charm for run-of-the-mill AVI files.

ffmpeg  -itsoffset -4  -i test.avi -vcodec mjpeg -vframes 1 -an -f rawvideo -s 320x240 test.jpg

My only addition would be the -y flag, which overwrites old images; that really only comes into play while you’re testing the output though. ;)

A picture is worth a thousand words, so a six-frame animated gif is probably worth about 6,000 — and that’s my contribution to Prashanth’s trick. You’ll need imagemagick for this part:

for i in {10..60..10} ; do ffmpeg -itsoffset -${i} -i test.avi -vcodec mjpeg -vframes 1 -an -f rawvideo -s 320x240 test-${i}.jpg ; done ; convert -delay 75 -loop 0 test-* test.gif

And the end result is a simple, six-frame looping gif that shows the content of the first minute or so of the video.

Since our videos all incorporate the same 10-second intro before showing a title screen and opening the lecture, this gives us a snapshot opening frame, followed by a few moments of content. It’s easy to see what the topic is, and remember where you are in the sequence.

You could adjust that loop to start at the 60 second mark and snap a frame every minute, or however you like. It’s convenient and flexible, and the end result can be seen in a browser or an image viewer, so it doesn’t rely on specific software.

The only downsides that I can see involve how ffmpeg tracks to those points in the video: it runs out to the 10-second mark, then snaps. Then restarts, spins out to the 20-second mark, and snaps. Then 30, 40 and so forth, taking a longer time to track out each time.

I don’t know if there’s a way to adjust Prashanth’s original command and just loop through once, and snap at every 10-second interval. I leave that to higher minds than mine.
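
(One candidate, for what it’s worth: newer ffmpeg builds carry an fps filter that can grab the frames in a single pass. A sketch, assuming your build includes it:)

    # one pass: a frame every 10 seconds, six in all, then the same gif
    ffmpeg -i test.avi -vf "fps=1/10,scale=320:240" -frames:v 6 test-%02d.jpg
    convert -delay 75 -loop 0 test-* test.gif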

But it might be inconvenient if you have a lot of videos to gif-ize. In our case it was about 50-60 videos, which was easy to loop through but took about half an hour to process. The end results were worth it though.

This worked for me in Arch, but I don’t see how it wouldn’t work with almost any other Linux flavor that includes ffmpeg. And since Prashanth’s command dates back to 2008, I think earlier releases would be fine as well. For what it’s worth, I’ve used this with Flash videos too.

And that … is everything. For now. :mrgreen: