An unlikely hero: Xubuntu

Part of that unfortunate rant from a day ago came about after spending a day or two in Xubuntu, after spending an equal amount of time in Kubuntu.

Originally my foray back into the *buntus was meant to give fair time to alternative renditions of Ubuntu, and to avoid tainting the entire set with tirades against the flagship.

On the one hand, it was important to do that. And it has been a while since I’ve used some of these versions, even if I still feel a tinge of disappointment when I try them.

 

I’m not going to dwell long on my failed relationship with Xubuntu, mostly because it’s ancient history. I stopped using it, that’s about all.

It appears to be working in a lighter direction though, so I will give it that. The default desktop, unless I am mistaken, is more “traditional” than straight Ubuntu 11.04, and relies only on the native XFCE compositor for shadow effects.

It may just be that the desktop “style” is a cycle or two behind what vanilla Gnome does though. Like I suggested, I don’t follow the outlying desktops so I don’t know the plan.

It has a few peculiarities though. The pop-up notification boxes for networking or volume control don’t seem to vanish automatically for me, which means they completely block anything underneath until they’re explicitly closed. Perhaps I just don’t wait long enough.

The pop-up launcher bar at the bottom of the screen is vaguely clever, in that it uses the standard XFCE panel and adjusts its settings to behave like wbar or AWN.

Speedwise, I can’t tell you if it’s necessarily an improvement over any *buntu, or even a past version of itself. The computer I tried it on is really too fast to make a comparison.

Granted, it’s carrying a lot of Gnome already. But that just means the way it works and behaves is a little more conventional.

I can tell you that adjusting it to a slightly more traditional XFCE arrangement, like you see above, did make me more comfortable, and made the desktop a little easier to manage.

Which means that ultimately — and it’s strange to say this — if the new Ubuntu desktop proves too cumbersome or counterintuitive for you, like it did for me, Xubuntu might be an answer.

Xubuntu to the rescue. Who would’ve thought? 🙄

Two small additions

I apologize if your blood pressure is up; I managed to stir up the pot quite neatly with that last post.

I don’t ever troll though: The questions were legitimate and quite clear to me when I wrote them yesterday.

And they still are. I feel no differently about the issue than I did a day ago, even with the swirl of opinions that came forth.

I do have a couple of ancillary points that I want to touch on, before I drop the topic for a while.

First, invariably someone mentions video editing as a task that requires beefy hardware and consequently beefy software.

I suppose in its current state, desktop video editing does require a certain measure of supporting software to get the job done. To the satisfaction of casual computer user Joe Public? Sure.

But remember that video editing was not invented in 2005. Steve Jobs and iWhatever didn’t build the niche from scratch. In fact, Steve and Co. are johnny-come-latelies, if anything.

Chew on this for a little bit: Way, way back in 1990 it was possible to do professional-grade video editing on a machine that was running at just over seven megahertz with a cruel baseline of only 512 kilobytes — expandable to a meager nine megabytes.

Pause for effect.

Of course, that machine was the almighty Amiga 2000 paired with Video Toaster and some external hardware. I don’t need to tell you the pure, distilled genius of that ensemble. Amiga fans can take over from here.

Yes, your desktop video editing system might do more now and at a better pace. But please don’t hold out that singular task as some sort of validation of desktop software bloat. I don’t buy it.

And invariably someone mentions Game X running on System Y as a convoluted justification for increased desktop ballast.

Linux users aren’t as noisy in that category, but I did get one or two e-mails saying, “Ja, you’re right, but I play Game X which needs System Y which won’t run on anything less than Hardware Z.”

The mind boggles. Let me introduce you to my little friend.

That’s Oolite, an open-source OpenGL rendition of Elite, a game that holds more No. 1 spots on best-of-all-time lists than you knew existed.

This is a game that incorporated wire-frame 3D graphics, a trading and economics system that includes fluctuating market prices, a series of eight galaxies all with 256 planets each having discrete and predetermined characteristics, spaceflight and physics models, radar and target tracking systems, weapons arrays, ranking systems … you name it.

The original was released in 1984, and was initially written to fit into 14 kilobytes of machine code.

You don’t need to convince me. I am sure and completely confident that Game X is so, so, so much better than Elite ever was.

My point is that there is a history to computer software that goes back decades, and each generation did the same if not better with less.

The fact that you need hardware strong enough to run a certain grade of software that lets you do that same sort of thing as a decade ago … well, that’s where the logic fails for me.

And that’s where I’ll put a cap on this for a while. I have a lot of other things that need note, and I don’t care to spend too much time debating whether 1984 is an improvement over 2011. 🙄

A failure of logic

Here’s a legitimate question, and one you should consider: If your CPU is 20 times faster than hardware from a decade ago, why does it take the same amount of time — sometimes longer — to go from a cold start to online and reading e-mail?

In light of desktop advances over the past few months, and ones that are due over the next few more, it becomes more and more curious.

Unity, KDE 4.6, Gnome 3 … are they all improvements, if they’re requiring the same amount of time, but more powerful hardware?

If I rephrase the question, it becomes easily blurred: If you have thirty to forty times the memory your computer had ten years ago, why does it require a proportionate — or perhaps even greater — share of those resources to manage day-to-day tasks?

At this point, conventional wisdom says, “Well, it’s easy and cheap to bolster the amount of resources available to a computer, so the question is moot.”

Quite to the contrary. Jamming a PC to the brim with the fastest processor and biggest hard drive and most memory does not erase the fact that the software it runs is becoming less efficient.

Which is a point asserted by the original question.

Which, by the way, was not mine. It’s from a Linux Journal article.

From nearly ten years ago. 😯

It’s sad to think that, over the course of nearly a decade, the issue is still lurking. And oddly enough (or perhaps not), Marco Fioretti’s rationale for the RULE project is still lurking too.

But I’m just an end-user. I don’t code — I haven’t the time or the skills at this point in my life to make a meaningful and considerable contribution.

So perhaps pointing out the incongruity — the perversity, almost — of relying on stronger hardware to run heavier software to do the same tasks as a decade ago … well, maybe that’s rude.

But I am just an end-user, and that means I have the option of throwing my meager weight behind projects that don’t follow that trend.

It does not serve my interests to use or endorse software that needs additional hardware, not because of the financial implications, or ecological self-righteousness, or because of underprivileged communities elsewhere on the planet.

It’s because logically, at its core, the situation and popular prescription make no sense.

So no, I won’t be buying additional hardware to meet the demands of Unity or KDE 4.6 or Gnome 3, nor would I with Windows 7 or Mac OS … whatever Mac is up to these days.

When software and desktops follow a curve that suggests speed and efficiency over glitz and gluttony, I will be on board.

But until then, I am doing fine with a 15-year-old Pentium and a few razor-sharp console programs. To each his own.

Quick interlude: AssaultCube

Just a brief note right now: I’ve had a quick run at AssaultCube, and found it worthy.

I like that the original Cube/Sauerbraten game has gotten a makeover and gotten a little more … focused. It retains its speed and playability, and adopts a much more … focused … theme.

Give it a try. 🙂

Kubuntu, for better or worse

My two days with the Kubuntu 11.04 beta have been a mixed bag, with some trivialities that needed addressing, but mostly positive experiences.

I can honestly say that after about 48 hours of learning what equates to what in k-series software, there’s quite a bit I like, and some things I don’t.

For one thing, I feel a preference for KPackageKit over the obtuse Add/Remove software tool in Gnome Ubuntu, although I don’t really know if they equate or not.

KPackageKit seemed to be better prepared to find and install things, while the Gnome add-remove tool used to be vaguely useful, but took a giant step backward a year or two ago. Nowadays I just go straight to Synaptic.

Rekonq, if that’s the right name, almost supplanted Firefox for me, except that I had problems with its ad blocking system, and I unfortunately prefer an ad-less Internet.

I did, for a short while, use them side-by-side though, and didn’t find either one to be tangibly superior.

On the other hand, I had major problems with the screensaver in Kubuntu. Any screensaver I enabled would behave as normal until I moved the mouse or otherwise tried to waken the desktop.

At that point the machine would fall back to a tty screen, flicker twice, then return to the login screen — killing everything that was running in the background, as well as the network connection.

After the first or second stunt, I just disabled the screensaver altogether.

Every desktop comes with its eccentricities though, and my homemade ones are no exception. And years of experience have taught me that 90 percent of these weirdnesses come at my own hand.

I still think, as I have long thought, that anything KDE comes up with is head and shoulders above Gnome in terms of attractiveness and flexibility.

And so long as the Gnome philosophy says I can’t be trusted, then I’ll probably continue to avoid it. That, and its unnecessary weight, are the least attractive points about it.

So Kubuntu wins points this time around. Next, I’m going to revisit an old acquaintance.

I am cringing, even as I type.

The other *buntus

Guilty as charged, I generalize too much. I sling mud at Ubuntu without taking care not to splatter the other Ubuntus in the process. So I’m going to amend that over the next few days.

Some will call me a glutton for punishment for starting off with Kubuntu. Others will slap themselves soundly on the forehead and mutter, “Out of the frying pan, into the fire.”

And both will be right; straight Ubuntu might swallow a mighty chunk of memory just to get off the ground, but Kubuntu 11.04 gobbles it greedily.

But I won’t harass Kubuntu as much as Ubuntu for being a memory hog … even though my system is consuming as much as 290MB on a cold boot. 😯

(That’s on the Core Duo, of course. I dare not run it on anything else. 🙄 )

No, I’m not prejudiced, I just always seem to remember KDE eating up more. So I expected it. It’s not delightful, but it wasn’t a surprise either.

But I’m going to go back and really rub vanilla Ubuntu users’ noses in it, and repeat what I said years ago:

It’s already got all the effects, the gloss, the shimmer, the bling and the splash. It came prepackaged with everything you see above … and more.

So if you’re working hard to prettify Gnome — either as a developer or as an end user — why are you fighting it? What you really want is KDE.

Next stop is the purported lightweight of the family. 😈

Information, please

Some of the best console programs don’t do much except offer up information … in abundance. Technically that makes them tools and not so much applications, at least from my limited viewpoint.

sox bills itself as the “Swiss Army knife of sound processing programs,” and whether or not it lives up to that billing is for you to decide.

It does usually come with a nifty tool though — soxi, which will give you a rundown on the information available for an audio file.

It’s not terribly verbose, although it does show many of the key points, and will probably suffice for most purposes.
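Just to illustrate the kind of fields soxi reports: here’s a rough sketch using nothing but Python’s standard-library wave module, so it only understands plain WAV files — soxi itself covers far more formats, and this is illustration, not a substitute.

```python
import wave

def wav_info(path):
    """Pull a few soxi-style fields out of a WAV header."""
    with wave.open(path, "rb") as w:
        frames = w.getnframes()
        rate = w.getframerate()
        return {
            "channels": w.getnchannels(),
            "sample_rate": rate,                # in Hz
            "precision": w.getsampwidth() * 8,  # bits per sample
            "duration": frames / rate,          # in seconds
        }
```

Point it at any .wav file and you get back roughly the same channel count, sample rate, precision and duration that soxi prints.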

If you want more detail though, mediainfo might be a better solution.

Quite a bit more, I think you’ll agree. And if you look closely you can see the twofold (threefold? fourfold?) beauty of mediainfo — that it’s not limited to audio files. It’s smart enough to sense an image file too, and give what information it can about that.

Torrent files are terribly contorted, and if you want to find out what a torrent is doing or where it’s going, it’s a bit inconvenient. Or maybe I’m just too used to *nix-ish plain configuration files.

torrentinfo can help with that though, deftly carving through the knot and surrendering the important stuff.
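For the curious, the “contortion” torrentinfo untangles is called bencoding, and a minimal decoder fits in a screenful of Python. This is only a sketch — no error handling, and real .torrent files carry an info dictionary, piece hashes and more that torrentinfo knows how to present.

```python
def bdecode(data, i=0):
    """Decode one bencoded value starting at index i; returns (value, next_index)."""
    c = data[i:i+1]
    if c == b"i":                       # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if c == b"l":                       # list: l<items>e
        items, i = [], i + 1
        while data[i:i+1] != b"e":
            v, i = bdecode(data, i)
            items.append(v)
        return items, i + 1
    if c == b"d":                       # dictionary: d<key><value>...e
        d, i = {}, i + 1
        while data[i:i+1] != b"e":
            k, i = bdecode(data, i)
            v, i = bdecode(data, i)
            d[k] = v
        return d, i + 1
    colon = data.index(b":", i)         # string: <length>:<bytes>
    n = int(data[i:colon])
    return data[colon+1:colon+1+n], colon + 1 + n
```

Feed it the raw bytes of a .torrent file and the announce URL, file name and so on fall right out of the returned dictionary.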

And a bonus — color! 😉

The last one here is not so much a tool as a Perl script, although it does an admirable job. This is boxinfo.

Don’t be disappointed. boxinfo sends its best work to an HTML file which, when opened, looks something like this.

Nicely formatted, easy to read, with everything from hardware details to environment variables, arranged in clean tables.

So what’s the use in all these tools? After all, cuing each one from the command line any time you run into a mystery file … well, that’s not very convenient.

Ah, grasshopper. You must learn to think a little more creatively than that. Imagine what wonders you can achieve if you tie these simple information tools into your favorite file manager. Click on a file, see all the information available about it. … 😯
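As a sketch of that idea — and this is hypothetical glue of my own, not part of any of these tools — a few lines of Python can pick the right info command for a file by its extension, ready to be wired into a file manager keybinding:

```python
import os

# Map file extensions to the info tools discussed above.
# (An assumption of mine: this table, and the function below, are
# illustrative glue, not something any of these tools ship with.)
INFO_TOOLS = {
    (".wav", ".mp3", ".ogg", ".flac"): "soxi",
    (".jpg", ".png", ".mkv", ".avi"): "mediainfo",
    (".torrent",): "torrentinfo",
}

def info_command(path):
    """Return an argv-style command to inspect path, or None if nothing matches."""
    ext = os.path.splitext(path)[1].lower()
    for exts, tool in INFO_TOOLS.items():
        if ext in exts:
            return [tool, path]
    return None
```

A file manager binding would then hand the returned command to subprocess.call() and page the output — click on a file, see everything these tools know about it.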

P.S.: A big thank-you to persea, who did 90 percent of the legwork for this, and suggested a three-way partnership between ranger, atool and mediainfo. But persea can explain that to us. … :mrgreen:

Why Tron Legacy fails for me

I had the opportunity to see the recent Tron movie last night. Already you’re probably wondering what this has to do with Linux on old computers, but bear with me. I can make this work.

I never held out any hope that the new movie would eclipse the old for me. I put the DVD in the tray knowing full well that my fondness for the original, which I saw in theaters probably three times when I was a kid, had doomed the new version from the start.

And that was more or less the case. I don’t point the finger at any one person for its failure to enthrall me. The actors were fine. The effects were up to par. The music was great (and I dare not say any less than “great” for fear all of the Internet’s love for Daft Punk will come down on me like a ton of bricks 🙄 ).

But it didn’t have nearly the charm or charisma (or camp) of the original. I can encapsulate my explanation by saying simply that you had to be a kid in the early ’80s to really appreciate the original.

The new one had every advantage the old one didn’t — technology has finally caught up with what the imagination demands. But ironically, after all these years, we’ve seen just about every trick Hollywood has to offer.

Even bullet time CGI is a decade old now. Glossy light cycles and actors collapsing into marbles? It was only a matter of time.

Which means the only thing left to save it was the story. And that failed, tragically. Nothing in the story was the least bit innovative for me.

They would have been better off just remaking the original, and putting into place the effects we all just dreamed about, nigh-on 30 years ago.

Long on technology, short on creativity. That’s all it takes to be a big-budget Hollywood movie any more. It’s not about a good story, it’s about dazzling people with computerized glitter.

But I hardly blame Hollywood. Public conception of technology isn’t about function, it’s about flash. It’s not about quality, and what does the job, it’s about what impresses coworkers, costs the most, or sparkles when you hold it up in the sun.

Function and quality take a back seat to glitz and gloss, whether it’s a cellphone, a microwave oven, a laptop computer or the operating system you run on it.

So no, I don’t blame Disney for tearing off one more chunk from the corpse of Tron, spritzing it for the younger generation and offering it up for public mastication.

They’re just doing what the crowd wants. Maybe one day there will be a Tron movie that can engross us on the basis of its creativity and imagination, like the first one did.

But I don’t expect it. Our toys get simpler and shinier, and our movies get simpler and shinier. Our lives … well, let’s only hope they’re improving in different ways.

I’d hate to think there was less quality, and more superficiality, in life today.

Now if you’ll excuse me, I’m going to watch the original, one more time. On my 10-year-old laptop. 😈

How little I know

I do not know everything. Of course, knowing that I don’t know everything only means that I know how little I know, and that I know that I don’t know something when I see it.

Setting aside grammatical parlor tricks, this is an issue because the giant list of software that I found a few months ago is populated with a lot — and I mean a lot — of stuff that I just … don’t know.

Perhaps an example would be a better way to explain it. Arch has arpwatch in AUR, which comes bundled with arpsnmp — both of which are on that list.

But for the life of me, I can’t figure out how to use it, or why I would want to. Hence, no screenshots. 😦

I’m still pretty much a desktop user (even though my desktops are rather bizarre 😉 ), which means one of two things to me: Either I will never need arpwatch/arpsnmp, or I already use them and I am oblivious, because they’re buried deep under layers of other software that shroud them from view.

Both are possible, and both are fine with me. But what that means is that arping, arpoison and arpspoof are likewise mysteries to me. Maybe I will never in my life need them, and maybe I use them every day and just don’t know it.

And so I’m back to where I started, admitting exactly how little I know.

And a vim mystery

As if that wasn’t bad enough, I am having a hard time figuring out why it is necessary to include Mercurial, gtk2, gpm, libxt, libsm and desktop-file-utils in the construction of vim.

This all comes about because, in the process of stripping the guts out of ConnochaetOS and converting it to a framebuffer speed demon, I realized that vim choked without gtk2.

And libxt, and almost all the rest of that stuff above. And the dependencies, of course.

Mystifying, but I am no expert. I tried to rewrite (with considerable effort) the vim PKGBUILD to omit all that stuff, use the 7.3 tarball, and lose a little weight.
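A build sketch, for what it’s worth — and with the caveat that I haven’t proven this against the 7.3 tarball or the ConnochaetOS package itself, only against what vim’s configure script advertises:

```sh
# A sketch, not the actual PKGBUILD. vim's configure script accepts
# flags along these lines for a console-only build:
#   --disable-gui : skip the GTK2 front end
#   --without-x   : no X11 integration (should drop the libxt/libsm pull-in)
#   --disable-gpm : no console-mouse support
./configure --with-features=normal \
            --disable-gui --without-x --disable-gpm --disable-netbeans
make
```

In theory that cuts exactly the dependencies listed above; whether the packaging machinery cooperates is another matter.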

It’s not going very well though. Big, complex applications like this always seem to spin out of control and turn ugly when I write my own PKGBUILDs. I do better with simpler stuff.

I’ve searched AUR for something like a “vim-nogui” or Debian’s vim.tiny, but there are quite a lot to skim through. I’ll keep trying and keep searching. Maybe something has already been done. 😐