If thou beest he — but Oh how fallen! how changed. …

Day by day things get a little more busy, and the month of March will be a real grind for me. And so as small tragedies unfold and develop, I have to make quick mental notes and hope to tack them here as time allows.

For example, a week or two ago, this machine suffered the same fate as this one, with the simple act of opening the lid resulting in a thousand tiny shards of plastic bursting in a thousand directions.

And to add insult to injury, the torque of the splintering bent the LCD, and a fine prismatic web enveloped the lower right quadrant. So not only were the casing and chassis on the verge of complete collapse, but the display was suddenly worthless too. Fate — or the gods or whatever deity you prefer — had made its decision, and I was left with the pieces.

I blame the natural deterioration of plastic, which must slowly decay with age and give way to the laws of physics. It doesn’t dry my tears, but that is the way of all flesh, I suppose.

Ironically, not a day later, I’m handed a 700MHz Thinkpad T22 with most of its components in working condition. A little elbow grease, a few memory chips, and the EzBook is mostly forgotten. Or at least transplanted into a new host. :roll:

And call me crazy, but just about any Thinkpad from pre-2012 or so is now a definite keeper, and that’s the real point of this little speech today. The golden era of the Thinkpad came to a thundering halt only days ago, with the revelation that Lenovo had been preinstalling laptops with the Superfish adware and its compromised root certificate.

Any crudware so bad that the U.S. Department of Homeland Security tells you to uninstall it is an honest-to-god, plain-Jane screwup of colossal proportions. Not that the U.S. government has any real integrity on the subject of computer security, but you couldn’t sell me a new Lenovo now even to hold shut the screen door to my back yard. I don’t think it could handle that much responsibility.

Of course, the slow decline of Lenovo machines was something I had sensed earlier than this week. The newest Thinkpad I have now is a T410 that’s already pushing four or five years old, and it’s only so-so in my mind. The newer Lenovo-branded laptops I’ve seen were even more disappointing.

A friend’s Y560 had so much bounce in the keyboard plate that I thought I was typing on a trampoline. And when it was barely two years old, he’d already lost sound in one speaker, torqued the headphone jack so badly he couldn’t insert a plug, and the hinges had a lot of play.

I’ve seen a few Yogas and some high-end “Thinkpads” in the past year too, and nothing enthused me. Most were not far off from mid-grade department store machines.

Sad, really, for the line that once held the mighty T61p, which was a solid full-size performer with fantastic graphics at a fantastic price. Or the X60, which is still available in resale and is probably one of the best portable dual-core full-feature laptops ever made. I’ve had one of those and an X61 too, and I’d be willing to part with cash — actual cash — to find another one in decent condition.

And that from a person who gets most of their machines for free or as trade.

I’d even be willing to reach back to the X120e, or the X31 or even the i-Series for positive experiences. In fact, I can think of only one Thinkpad I’ve had in the past decade that didn’t leave me smiling, and that one dated back to the early days of Pentiums, and the rise of the brand. From my perspective, the arc of the Thinkpad started around the Pentium II, tapered off four or five years ago, and came to a crashing halt last week.

Of course that does mean any Thinkpad you’re holding that wasn’t built post-2012 is a winner, at least in my book. I really don’t mind if it’s a Pentium, a II, a III or a 4; what you have is a bona fide slice of computing history, made at a time when the Thinkpad was trustworthy, dependable, flexible and powerful. I’ve got more than my share right now, and I’ve had an even broader array in the past — and I’d still be inclined to invest in one that had potential.

I have to draw the line when the plastic shatters and tiny black splinters fly across the room, though. :| That is the way of all flesh, I suppose.

Reviewing the review, the reviewer and the reviewee

I had an interesting conversation by e-mail last week, around the role of software reviews, within the context of that last post when I said Linus was right. My correspondent wanted to know if my process or stance had changed since Gamergate, in the way I approach software, phrase my posts or communicate with developers.

I had to look it up. The only context I had for gamergate was left over from my primary school biology classes. :\ But after I knew what the hubbub was, the short answer was no.

The long answer went something like this: No, because I wouldn’t call myself a gamer — or at least not of any game that’s likely to still be played on a large scale — so the events of the last six months were a minor surprise. It’s hard for me to relate to triple-A titles on the market now when I still think a few hours of Deus Ex is a good day of gaming. :roll:

So whatever swirl of sewage has engulfed the gaming market on any given day is a bit distant for me. I have no immediate context for the latest clickbait drama.

And I wouldn’t call myself a reviewer either. Sometimes people ask me specifically to “review this program” or “review that application,” and I don’t split hairs over the use of the verb. But “reviewer” to me implies the ability to apply a standard of measurement, which in turn implies a comparable level of prowess, which in turn implies a quality of wisdom and the ability to evaluate through comparison. And when it comes to writing and designing software, I have … hmm, let’s see … uh, okay … none of those things. :|

So what do I bring to the party? Only a perspective built on practical use, learned the hard way over a decade with Linux … and over a decade more with day-to-day computer use in general. It’s true, I draw on experiences that reach back to the dark days of mainframes and dumb terminals, but I don’t count that as any special talent for “reviewing.”

I also have the luxury of working with free software, and removing the money element from the equation changes things drastically. The author of a poorly designed, poorly executed script posted as a freebie on GitHub is unlikely to take umbrage if I tell the world it’s poorly designed and poorly executed. Chances are it does whatever the author wanted, and everyone else (including me) can go get stuffed.

But it also means people aren’t usually reliant on their work as a source of income, and that deflates the situation. So again, if I point out that text editor X doesn’t actually edit text, no one has really lost financially … except for me, in the sense that time is money, and I lost time to find out that it didn’t work. :roll:

And in a smaller sense, I do this for free. It’s the last vestige of a hobby that evolved from a diary that kept track of an experience which turned out to be life-changing, and was never monetized, and never will be. So both parties are on equal footing, in terms of money flow.

That’s not the case with big-name, high-budget software — the developer is always looking for money, and the public is always trying to keep it. A negative review might wave off a few potential buyers and put a dent in the corporate balance sheet, or it might wave off a few potential buyers and prevent someone from wasting their hard-earned dough on a real thumbsucker. I don’t have to worry about that here.

There’s one more thing though, and this might be the most important: You don’t know me, and I don’t know you. It’s not personal. There has never been a name — a real name, mind you — attached to “K.Mandla,” and there never will be. If I say something crass about your magnum opus, it’s not directed at you. It’s directed at your work, your product, and its overall usability.

It may be that I don’t have a frame of reference for what you want to do, and I usually acknowledge those cases. It may be that our systems are incompatible or improperly configured, and I acknowledge that too, when I suspect it. Or it might just be that your program is crap, and it’s plainly obvious that it’s crap, and everyone can see that it’s crap. But it’s got nothing to do with you as a person.

I don’t care if you are a man or a woman, just like you probably don’t care if I am a man or a woman. I don’t care if you’re an ancient Linux guru, or a 14-year-old programming kernel modules out of your dad’s basement. I don’t care if your skin is black, white, brown or green with flecks of purple that sparkle in the moonlight. I don’t even care if you’re actually a dog, and nobody knows it.

None of that really matters if your software is crap. Me telling you, and whoever else among the thousand people who might wander past this page in the course of a day, is really only pointing out a deficiency that’s plain to see. If it damages our relationship … well, we never really had one to start with.

I do try to be understanding. If someone is obviously learning, I can make allowances for that. If a program is obviously incomplete, I make allowances for that too. People occasionally write back to me and say, “You know, that program was never really intended for wider distribution. I just posted it to keep it handy.” That’s fine too, and if I know that, I try to be fair.

But there are times — oh brother, are there ever times — when software is junk, and we all know it’s junk. And you’re just going to have to deal with that fact. Me saying it here, and perhaps wounding your ego in the process, is only drawing a line under a fact that we all see.

So no, Gamergate had no real impact on the way I do business here. I don’t soft-shoe my posts or sugar-coat bad news, and I don’t think I take responsibility for program failures more often than I should. If your program is good — and we’ve already seen a lot of those this year — I try to say so, and we all can applaud it. And if it’s crap, I try to say so, and you can just deal with it.

Once again, welcome to the real world.

Linus was right

I’ve written and rewritten this post about four times this week, trying to get my thoughts onto the screen in the same way they appear in my head. I’m not having much luck, so I’m just going to blurt it out: Linus was right.

About a week ago Linus told a crowd in Australia “I don’t care about you,” which was dutifully reported by Ars Technica as though it were an on-air vulgarity uttered by the pope. A lot of people held that comment up to the light alongside his record for (ahem) strong language on mailing lists, and wept a little for the future of open source.

I’m not one of those people. I don’t worry about the future of Linux after last week; in fact, if anything, I’m reassured by the event.

Let’s be serious: Who do you want at the helm of a linchpin open-source project like the kernel? A dainty hand-holder who says a contributor’s code is “special in its own way,” or a zealot so focused on the project and its viability that the slightest mistake is going to prompt a royal beating?

Do you really want someone to pat a developer on the head and say, “Let’s see if we can do better next time?” Or should faulty and substandard code be met with a red-hot tongue-lashing and public shaming? Should developers be special little snowflakes? or elite programmers who have to consistently meet standards to contribute? Who do you want writing the code that runs your laptop? or your router? or your server? :|

I think it’s easy to isolate that one clause — “I don’t care about you” — and paint it in a difficult light. The public likes that: It’s drama, and the press wins a few more clickthroughs as a result. We’ve talked about that.

What’s more important though, was the rest of Linus’ sentence: “I don’t care about you. I care about the technology and the kernel — that’s what’s important to me.”

And that’s the part that reassures me — not just as a humble desktop end-user, either. It’s satisfying to know that the person in charge does it out of passion, and brooks no incompetence. If that trickles down to his contributors as indifference or a lack of caring, and they somehow bristle at it, then they need to find new lines of work.

There’s another reason this little soap opera needs attention, and I’ll say it in brief because I’ve talked it over before: You can’t let the whim (or the whine) of the masses steer your project.

I mentioned this five long years ago under different circumstances but I stand by what I said then, and it applies even more now: You set the terms for your project, and if someone doesn’t like it, they can move on.

Personally I see that in a lot of the way Linus handles kernel development — “This is what we do. Contribute or fork, but don’t waste our time.” This is where the bus is going. Ride with us, or get off and walk. Our itinerary is not open to discussion. If you don’t like it, there’s the code, build your own. Don’t bother us with your mockups and wishlists. Get busy or get lost.

I can’t say it’s the best management style, but given the scope of the project — in both technical and geographic terms — I think it’s probably the right one. I’ve never managed an open-source project specifically, but there are definitely similar situations where an abrupt and off-putting management style is the only thing that works.

If you don’t like it, if it damages your sense of self-worth, or if it invalidates your image as a special and unique person capable of attaining your dreams in whatever form and fashion you desire … well, what can I say? Welcome to the real world.

The seven-year itch

I haven’t posted much here lately, mostly because of the holiday season but also because the flux of hardware through my house is either feast or famine — either I’m suddenly swamped with four or five new machines that depart equally quickly, or I sit and tap my fingers at the lack of something new to try.

But a week or two ago I thought through some small difficulties that arose with this machine, and made a change that is … rather unconventional.

If you know or remember much about the 8000 line of Inspirons, you’ll understand that the 8200 was really the capstone of the C-series. You could argue that the C840 and the M50 were its counterparts, and I wouldn’t disagree. But the home market (by my estimation) embraced the 8200 in a way that I feel outstripped the Precision or Latitude versions, even if they were compatible right down to the BIOS.

In that sense, the nVidia GeForce4 440 Go in its 64MB renditions was about the best video card you could implant into an 8000-line machine. I’m not counting the Quadro4 700 Go GL, mostly because I’ve never been able to find one — I saw one for sale on eBay early last year, and the price reached a level that could only be described as ludicrous. It’s too rare to compare, so to speak.

Point being, the nVidia card has always — even almost a decade ago, when I was rocking Crux Linux in a souped-up 1GHz 8000 machine — been the card of choice. And even more so now, since the cards themselves are under US$20 used, and the lesser cards are basically giveaways.

There was an analogue in the ATI Radeon Mobility 9000. It had 64MB of memory as well, but as we all probably remember from our bygone days as lowly Windows users, the mantra was: if you want good graphics, stick with nVidia. Or at least that was the rule a decade ago. Nowadays … I couldn’t tell you. It may or may not be the case. I’ve been out of the loop for a while, and I still think Neverwinter Nights is great fun in my spare time. :roll:

Back in October I think, I started having difficulties updating the kernel and keeping it in sync with the now dusty 96xx proprietary driver from nVidia. I’m not a newb when it comes to that driver; I’ve been building and installing it by hand pretty much since 2007, and while I can’t call myself an expert, I at least know when I’m up against a wall.

And that was the case for more than a few days, particularly when the switch to 3.17 came into Arch core (I think that was October, but I might be wrong). Nothing would build. Errors on 3.17 and anything after it. 3.16.4 would build fine for me in Arch, but I wasn’t having much luck past that point. Something had changed, but I couldn’t see what.

The Internet, despite its unimpeachable pedigree as a clean and honest repository of truth, information and justice >:( :roll: , wasn’t much help. It may be that I am the last surviving user of a 440 Go card that prefers Linux, and I’d be comfortable with that. nVidia of course wouldn’t be interested in my issues, and if there were others in the same boat, I couldn’t find them. Or a solution.

So I did what any competent Arch user would do when confronted with a seemingly insurmountable inconsistency between hardware and new kernel — I added linux and linux-headers to the list of ignored packages in /etc/pacman.conf, and went about my business with the older versions I knew would work.
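For anyone who hasn’t done it before, that amounts to a one-line change in pacman’s configuration. A minimal sketch, assuming the stock package names:

# /etc/pacman.conf
[options]
IgnorePkg = linux linux-headers

After that, pacman -Syu leaves both packages alone (with a gentle warning), and you can keep running the versions you know will work.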

I should add a caveat to that though: I say “work,” but I should say “work acceptably.” Even with the last happy combination of proprietary driver and kernel, I would sometimes see tearing in gradients, corrupted edges on images or other tiny graphic defects. The nouveau driver, in case you’re wondering, was always worse — there, I got corrupted icons (icons? why only icons? :???: ), horrible redraws and a host of other issues. The proprietary driver wasn’t perfect, but between the two, it was the better.

And add one other thing to that — something that I didn’t even notice until last week: System resources were almost always pegged. Fans at full bore except at the lowest of moments, system load hovering around 50 percent even when untaxed, and a lagging sensation at the best of times. It was never a dealbreaker, but the issue was there, and I hardly realized it because that’s the way it had always been.

I mentioned before that the underlying push for online-everything is what will eventually drive a stake into the heart of any contemporary PC. This was different. I can compare the machine’s performance with others from the same era, and see a drag or a burden that wasn’t present in its peers. Something just wasn’t working right.

Fast forward to about two weeks ago. I saw the jump to the 3.18 kernel and thought, maybe my time has come. Booted to a console environment, updated the kernel, tried to rebuild the video module and … was left with the same error messages that had been cropping up for months.

At that point, I had a little soul-searching to do. I like the machine very much. I’m comfortable using older kernels, but there must come a time when other issues begin to build because I’m stranded on a three- or four-month-old kernel. And the ancillary software (and here I admit I’m thinking about things like systemd) is growing at a breakneck pace, so at some point I’ll have to untangle other issues that come from hanging back at an earlier kernel. Maybe not today and maybe not tomorrow, but someday soon, and for the rest of its life.

I could switch distros. But other distros generally performed worse with that hardware combination. And in any case, I’d be stepping back in software unless I went with something totally wild. So I’d do just as well to … not update at all.

It was the nuclear option really, and not a terrible idea. Still, it seemed overblown to run a machine (a perfectly usable machine) deliberately on out-of-date software (perfectly usable software) because of a failing in the proprietary driver that runs the hardware.

And then the answer appeared to me, in a blinding spray of light, sort of like a Hollywood action movie. And I went to an online auction, and dropped about US$19 on the 440’s counterpart, the Mobility 9000.

It got here last week, and life changed immediately.

No more proprietary drivers. No more rebuilds at every kernel update. No more convoluted xorg.conf files. xf86-video-ati does everything I need, and with no more effort than typing out the name and hitting enter.

No more taxing the system resources. No more torn gradients or sluggish page draws. Yes, that Internet drag is still there, and it’s still annoying, but there’s nothing to be done about that. It’s like death and taxes — regardless of the machine, eventually bad code and deliberately obtuse web content will engulf and extinguish it.
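For the record, the software side of the swap amounted to something like this, as a rough sketch rather than a transcript, and with the legacy driver package names written from memory:

pacman -Rns nvidia-96xx nvidia-96xx-utils
pacman -S xf86-video-ati
mv /etc/X11/xorg.conf /etc/X11/xorg.conf.bak

The radeon driver picks up the card without a handwritten xorg.conf, which is half the appeal.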

But it’s amazing, really, and I wish I had changed the card out seven years ago. The logic is bulletproof: I don’t need the decade-old edge of nVidia-over-ATI any more, since I don’t play Windows games on this machine or desperately seek out that sliver of advantage in framerates. The 3D acceleration that might have tipped the scales toward the nVidia cards a dozen years ago is pointless and irrelevant to me now, in a different operating system and with different needs.

And so there you have it — the epiphany of my past decade. After trading out one-dollar network cards at the local recycling shop because they had Broadcom chipsets, or setting aside entire decade-old laptops because they required too many screws to open conveniently, or abandoning a machine to the wheels of fate over something as trivial as a noisy fan … I finally set aside that ingrained prejudice against the second-place finisher in the graphics card duels of more than a decade ago.

And the world is a better place for it. Who would’ve thought. ;)

P.S.: 60fps with glxgears, and I’m satisfied with that. :D

Remembering the twins

I waited this long to recap the two Dell Latitude LMs I had as guests last month because I couldn’t find the pictures I took while I was poking and prodding. It’s not like me to lose photos of something, but I did find one left over on my camera, taken the day they left.

(image: 2014-11-13-84kbn-dsl)

That at least proves it wasn’t a dream. :roll:

The picture, of course, shows the ironclad and eternally trustworthy DSL running in its most basic form on the prettier of the two machines, replete with a wireless network connection courtesy of an old WPC11v3 b-only card. That was not my most successful attempt — and I really wish I could find those other pictures :evil: — but DSL did at least tell me that the guts of the machine were working.

I don’t have a picture of DSL’s graphical desktop on that unit because I never got one. DSL isn’t picky when it comes to hardware, but I have seen more than one computer over the years that is less-than-visual. In this case, both the vesa and VGA attempts, in every variation, resulted in a scrambled video display.

Some of my other attempts were also less than successful, but a few bore fruit. I had better luck with early, early versions of Debian and Fedora, but some very bad experiences with … anything after 2002 or so (which thanks to this machine, did not come as a surprise). :\ And of course, I managed to get a blinking cursor on a Crux 2.4 installation, which I count as a flawless victory. :lol:

The biggest difficulty in working with these machines (I say “these,” but I did almost everything on the one you see in the picture) was twofold: First, these computers were not intended to boot from CD — only the primary hard drive or the floppy drive, of which I had none (and by that I mean the owner had none). Don’t even talk to me about a USB port. You know better than that. >:(

That’s a huge complication, but not something I haven’t had to work with before. I’m not above creating an entire system in an emulator, writing out the image file to a hard disk, and transplanting it physically into a target machine.
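The general shape of that workflow, as a sketch and not the exact commands I used, with the image name, sizes and target drive letter all made up for the example:

qemu-img create -f raw latitude.img 2G
qemu-system-i386 -m 64 -hda latitude.img -cdrom install.iso -boot d
dd if=latitude.img of=/dev/sdX bs=4M && sync

Build inside the emulator, shut it down cleanly, write the raw image to the drive sitting in its caddy, and swap the drive back into the target machine.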

In that sense, these are great designs for that task. I’ve run into machines that were a bit curmudgeonly in that respect, but the drives on these laptops pull out of the front left corner like a drawer, connect firmly in a dedicated tray, and are more or less exchangeable in seconds. What’s more, there’s plenty of space in the tray for an IDE-to-whatever converter, which in my case was an SDHC adapter.

I did run into an additional mystery though, which constitutes Biggest Difficulty Part Two: a hesitance to boot from some systems, and I’m not sure why.

It may have been some sort of partitioning inconsistency, between the BIOS and the installation. Occasionally a system wouldn’t boot that I had written out via dd, but other times preinstalled or original installations wouldn’t boot either.

I don’t suspect hardware issues; instead, I suspect either (a) the old BIOS drive dimension limit, cropping up again decades after its relevancy, causing problems again in its passive-aggressive way of suggesting you should get a new computer, or (b) some misalignment between the way GRUB or LILO worked two decades ago, and what the BIOS expected.

I’ve seen machines — in fact, this EzBook 800 has it — that have a BIOS-switchable option for Windows-esque drive arrangements, with the other option as … “other.” :\ I know of one or two machines in my past that couldn’t boot an “other” system if the BIOS was set to Windows-ish, or vice versa. This old warrior was one of those.

I don’t have any way to document that, and I don’t know how or why it happens, but that’s my underlying suspicion. Since the BIOS in these Latitudes doesn’t have an option to switch, it was a crapshoot to see what would boot and what wouldn’t.

Both of these issues, and their underlying problems, were magnified by the glacial pace of working at 133MHz, and by the added time of swapping drives and bouncing between drive caddies. Add to that the constant risk of snapping or bending a 20-year-old pin array, or the natural brittleness of aging plastic … I imagine even that caused a little hesitation on my part.

I can say with some honesty, if these were my personal machines, I’d probably be a little more aggressive in seeing what they were capable of. I tend to be a little antsy around other people’s computers though, for no other reason than general courtesy.

In any case, I gave them back a few weeks ago after giving each one a quick cleanup, and they returned to their home of record.

The irony of their departure is that the owner, when he came to pick them up, hinted that I might be able to keep them if I were inclined, and if my offer came within the range of what he thought they were worth.

I declined politely, partly in fear of bringing more wayward laptops into the house on a permanent basis, but also because I know he feels the pair together, with the power supply and a Dell-branded PS/2 ball mouse (woohoo!), are worth close to US$100. I put them around a quarter of that, maybe a little more. I doubt we could come to a compromise, even if he were a little more realistic.

But if I were to find one of these in the recycling dump, I wouldn’t pass it over. It would be almost impossible (by my cursory research) to find replacement parts now, and even if you did, you’d likely be paying incredibly inflated amounts for something worth a fraction of the price tag. So you’d have to find one complete, unadulterated and in pristine condition to really appreciate it.

There are better machines of this era to experiment with. But with hardware this old treading the 20-year mark, perhaps this exhibit machine and its “scavengee” comrade are a good investment. Maybe his offer isn’t far off the mark after all. :|

(And in a worst-case scenario, it’s reassuring to think that Dell actually still has Windows drivers for machines of this pedigree. That, in itself, is amazing, even if the nightmare of running Windows 95 on those machines is only partially massaged by the thought of rehashing a few sessions of Age of Empires.)

Who knows? Junk — ahem, I mean, vintage computing is nothing if not an unpredictable hobby. :mrgreen:

Tricks of the trade

I had to remind myself the other day that I’ve been using Arch Linux for more than eight years. I did my first trial installs early in 2006, and while I cut my teeth on Ubuntu, as time wore on, Arch eventually became more like home.

With the exception of strict i586 machines, I’m more likely to install Arch on any given computer, with Linux Mint coming in a close second. The logic there is that I can get it running faster with Arch, but there are some features that I use so rarely that I’d rather a distro like Mint take care of them — things like CD burning or encrypted file systems.

I can do those things in Arch, but it’s rare that I need them, and Mint usually sets them up quicker and better than I do.

Over the years I’ve jotted down a few minor notes on one-shot commands or quick-step loops to tackle oddball tasks. I keep them in a flat text file called “tricks,” and when I need to get to one of them, I just grep through it until the command I want appears. Adjust for minor differences or filenames, fire, and forget.
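The lookup itself is nothing fancier than this, with the search term being whatever half-remembered keyword comes to mind and the file location just my own habit:

grep -i -B1 -A2 'datestamp' ~/tricks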

For example, a while back I found a command to list all the packages that are installed on your system, in alphabetical groups. I modified it a little bit, to:

for i in {a..z} ; do echo -e $(pacman -Qq | grep ^${i}) >> packages.txt ; echo >> packages.txt ; done ; fmt -w $(tput cols) packages.txt

The original post (which I don’t have a link to any more :( ) split them out differently, and words broke at the end of the line, sometimes making them hard to read. I solved that with fmt, which is more than happy to wrap text like a word processor, but likes to run together lines. Hence the extra echo. Oh, and it doesn’t seem to like to have text piped into it, so the external file was necessary.

I think the original post caught some flak for 26 iterations of pacman, but I don’t have a problem with that. Might as well put the system to use for a little bit. If it annoys you, feel free to adjust it.

Some of the “tricks” were my own creation. Back in September I got the (stupid) idea that I would dump all the executables from bsd-games, util-linux, coreutils and even binutils into my list of text-based software, just to be sure I hadn’t missed any hidden gems. :roll:

It turned out to be a real hassle, and the results of it were one of my biggest lists of oddball commands in the history of that blog or this. In any case, this was the command that got me in trouble:

for j in {util-linux,binutils,coreutils,bsd-games} ; do LEN=$(echo ${j} | wc -c) ; for i in $(yaourt -Ql "${j}" | grep bin | cut -c$(($LEN+10))- ) ; do echo "${j}" >> temp/"${i}".wiki ; done ; done

Most of that was done to trim off the extra stuff that appears with grep’s output; since the length of the name of each package was different, I had to check where the title ended and the actual binary name began. wc takes care of that, with the -c flag. And to keep this from polluting my home directory, it dumps everything into temp/. The .wiki suffix is just for the benefit of vimwiki. ;)

Not everything is Arch-specific in that file. Here’s one that I use more often than I thought I would: taking a folder of files and renaming each one so the original name is prefixed with its date stamp:

for i in * ; do mv "${i}" "$(stat -c %y "${i}" | cut -d' ' -f1)-${i}" ; done

stat comes through this time, as a way of generating the timestamp of the file. cut that down to the first field only — the date — and voila, moved to a new name, listed by date. Suddenly your folder has organization.

I use yaourt on a daily basis, and with good reason. With such an abysmally slow Internet connection, I have a tendency to hoard software packages, which is not generally an advisable habit with Arch. Occasionally I make a mistake and wipe out my cache, or just slip after a failed upgrade, and need to pull in a fresh copy.

And so … download a new copy of everything that’s installed on your computer. This one took on a new meaning when I realized you could pipe yes into another program, if you wanted to automatically feed it a positive response:

yes | yaourt -Sw $(yaourt -Q | grep -v local | cut -d'/' -f2 | cut -d' ' -f1 | tr '\n' ' ')

Get a full list of packages from yaourt, which will be preceded by repo names. Filter out anything from “local,” since that’s built and not just downloaded. First cut off the repository name at the slash, then cut off the version at the space. Finally, tr substitutes every newline for a space, so we can give one giant command to the outermost yaourt, which will download the entire list. Better leave that one overnight. :|

If you don’t use yaourt, you’ll need to adjust that a little bit, since pacman -Q does not show repository names. It also means your “local” packages will be mixed in with your downloadables. Just so you know.
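A rough pacman-only equivalent might look like this, as a sketch; -Qqn limits the list to packages that actually came from the repositories, which sidesteps the “local” problem:

yes | pacman -Sw $(pacman -Qqn | tr '\n' ' ')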

Since we’re on the topic, yaourt lets you do some funky things with your cached packages, and also with the materials it downloads to build local software. If you look in /etc/yaourtrc, you’ll find:

EXPORT=2           # Export to 1: EXPORTDIR or PKGDEST
                   # 2: pacman cache (as root)
EXPORTSRC=1        # Need EXPORT>0 to be used
#EXPORTDIR=""      # If empty, use makepkg's configuration (see makepkg.conf)

I’ve adjusted those values to do a few things.

First, EXPORT=2 will send the built package to pacman’s cache, which is a wise move if you ask me. By default yaourt builds everything in /tmp, and it evaporates when you’re not looking. I’ve lost more than one hard-built package by not moving the final product out of that directory. :'(

EXPORTSRC=1 does the same thing for the source files and git trees that it downloads. This too can be a lifesaver if you lose the completed package. EXPORTSRC will send everything to EXPORTDIR, or in the absence of that, to the destinations listed in /etc/makepkg.conf. And what are those?

#-- Destination: specify a fixed directory where all packages will be placed
PKGDEST=/home/packages
#-- Source cache: specify a fixed directory where source files will be cached
SRCDEST=/home/sources
#-- Source packages: specify a fixed directory where all src packages will be placed
SRCPKGDEST=/home/srcpackages

By default, all of those lines are commented out, so makepkg’s configuration leaves everything in the same folder where it was made. Set the values as shown above, and it will shuffle packages and source files off to those destinations. It will also create a symlink to the package it just built, so you’re not wasting space.

Let yaourt use those directories, and it will follow those rules, and tar up its $srcdir as well, before sending it off to that destination. In that case, yaourt will tidy up its efforts and leave you all the pieces you need to do it all over again.

And yaourt is generally smart enough to check those directories for source files before re-downloading them. Or so it seems, in most cases. :\

Two more small non-secrets I should share:

yaourt -G pkgname

will create a folder named “pkgname,” download its PKGBUILD and all the patches or installation scripts for that package, ready for building. It’s another reason I use yaourt, to be honest. And:

pacman-optimize

It’s not often that your pacman database will need optimizing, but I can vouch for it on slower machines as a way to speed up searching and filtering. Again, just so you know. ;)

Clearing out the bookmarks … again

I did it again: I collected a mass of bookmarks that I figure I’ll need at some time in the future. Maybe I used them and will again … and maybe not. Either way, they may still prove useful. I do refer back to this site when I can’t remember a page or a topic, you see. :roll:

So here we go again: More links for future reference.

  • I sometimes keep links to pages that have instructions for lightweight systems for old distributions; here’s one for Debian Lenny and one for Crux 2.7 (in the i686 flavor, which doesn’t really matter). That might seem counterintuitive, but I will fall back on old distros when working with old hardware, before making the leap to current flavors of Linux. For an example, peek here.
  • Along those same lines, I found a fairly coherent tutorial on how to install Exherbo. I had a link to another one, but apparently the author took it down. :( I have been wanting to spend a little more time with Gentoo (and possibly Exherbo) but I’m always attracted to the way Crux handles things. That being said, Crux dropped i586 support years ago, and hasn’t had i686 ISOs (unless they’re hiding) for a year or two at least. :( Story of my life. …
  • I use dd a lot, not just to blank drives or scramble the contents of files, but for other things too. To that end, a speed comparison at different block sizes is actually very useful. Of course, I’ve seen some posts on StackExchange that might offer different solutions.
  • Along those same lines, this page gave me a little insight on how to mount a specific partition in a disk image. It saved me a little time with a copy of an old 10GB hard drive, since I didn’t have to write it back out to a drive to get at the files I wanted. On the downside, counting out all those offsets was a trick. I’m surprised Linux hasn’t thought up a more straightforward way to do that (there’s a small sketch of the offset trick after this list). …
  • I used to be real nit-picky about fonts, but these days I don’t really mind. I did find a good collection of font suggestions for Arch on Reddit, but I’m not the kind of person who installs two dozen font packages just to see a few extra characters in my terminal emulator. Now if we were talking about fonts for virtual consoles, I’d be much more interested. …
  • Since I’m in fix-it mode, here are a few pages about
    • installing python programs to different directories with pip, which is interesting because I’ve thought for a long time that there is no setup.py uninstall;
    • checking to see if directories exist with bash, which came in handy just a day or two ago;
    • how to install Arch from within an existing Linux installation, which I want to try sometime, just to see if it works; and
    • the difference between single brackets and double brackets to bash, which I never knew but explains why some of my long-ago scripts didn’t work as expected.
  • emacs fans would probably love to run just emacs on a Linux kernel with nothing else, and this post can tell you how. It reminds me of my long-ago attempt to trap Midnight Commander within a tty session, much like could be done a long time ago with rtorrent.
  • I should take the time to set up mutt with multiple GMail accounts, like this. I don’t dislike alpine, but I only keep it around because I’m too lazy to set things up. :\
  • From the Department of Ancient Awesomeness come three flashbacks that just made me nod my head: one on the best distros of the year 2000, another on the best window managers of the year 2000, and perhaps best of all … a complaint from 2008 about how Firefox is utter bloat. The more things change, the more they stay the same. …
  • I watch the Debian systemd soap opera with only a little interest. I’ve been using Arch for quite some time now, and I have no complaints about the newcomer. All the same, if you’re wondering where you’ll stand when the revolution comes, raymii’s chart from earlier this month might be helpful for you, as might this systemd vs. sysvinit cheatsheet. Neither page will convince you one is better than another, but might help you understand how they each handle the startup task. Knowledge is power. :twisted:
  • You won’t hurt my feelings if you find some Linux know-how somewhere else; even I found this list of tech podcasts rather interesting. I don’t really get into podcasts much, but from time to time I will grab one and spin it up.
  • Finally, from the Completely Unrelated to Anything Else Department, here’s an interesting project: An Android browser that displays web pages (believe it or not) by way of relaying the content through SMS messages. O_o Now I’ve seen everything.
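As promised above, here’s the general shape of that partition-offset trick, as a sketch with the image name and start sector made up for the example:

fdisk -l old-drive.img
mount -o loop,offset=$((63 * 512)) old-drive.img /mnt
losetup --find --show --partscan old-drive.img

The first command tells you the partition’s start sector; multiply that by the sector size (usually 512) and hand the result to mount as the offset. Newer versions of losetup skip the arithmetic entirely and expose each partition as /dev/loopXpN, which is about as straightforward as it gets.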

And now I’ve listed everything. If those are at all useful to you, please bookmark them in your own system. Hold on to them for about four months, and then yell “I gotta do something about these bookmarks!” and offload them to your own blog. It seems to work for me. … ;)