Two small additions

I apologize if your blood pressure is up; I managed to stir the pot quite neatly with that last post.

I don’t ever troll though: The questions were legitimate and quite clear to me when I wrote them yesterday.

And they still are. I feel no differently about the issue than I did a day ago, even with the swirl of opinions that came forth.

I do have a couple of ancillary points that I want to touch on, before I drop the topic for a while.

First, invariably someone mentions video editing as a task that requires beefy hardware and, consequently, beefy software.

I suppose in its current state, desktop video editing does require a certain measure of supporting software to get the job done. To the satisfaction of casual computer user Joe Public? Sure.

But remember that video editing was not invented in 2005. Steve Jobs and iWhatever didn’t build the niche from scratch. In fact, Steve and Co. are johnny-come-latelies, if anything.

Chew on this for a little bit: Way, way back in 1990 it was possible to do professional-grade video editing on a machine running at just over seven megahertz, with a cruel baseline of only 512 kilobytes of memory — expandable to a meager nine megabytes.

Pause for effect.

Of course, that machine was the almighty Amiga 2000 paired with Video Toaster and some external hardware. I don’t need to tell you the pure, distilled genius of that ensemble. Amiga fans can take over from here.

Yes, your desktop video editing system might do more now and at a better pace. But please don’t hold out that singular task as some sort of validation of desktop software bloat. I don’t buy it.

And invariably someone mentions Game X running on System Y as a convoluted justification for increased desktop ballast.

Linux users aren’t as noisy in that category, but I did get one or two e-mails saying, “Ja, you’re right, but I play Game X which needs System Y which won’t run on anything less than Hardware Z.”

The mind boggles. Let me introduce you to my little friend.

That’s Oolite, an open-source OpenGL rendition of Elite, a game that holds more No. 1 spots on best-of-all-time lists than you knew existed.

This is a game that incorporated wire-frame 3D graphics, a trading and economics system with fluctuating market prices, a series of eight galaxies, each with 256 planets, each having discrete and predetermined characteristics, spaceflight and physics models, radar and target-tracking systems, weapons arrays, ranking systems … you name it.

The original release date was 1984, and the game was initially written to fit into 14 kilobytes of machine code.
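How do you fit 2,048 distinct, predetermined planets into a footprint like that? The trick is seed-based procedural generation: each planet is derived deterministically from its coordinates the moment it’s needed, so no per-planet data is ever stored. Here is a minimal sketch of the idea in Python; the mixing constants and attribute tables are illustrative inventions, not Elite’s actual algorithm:

```python
# Sketch of seed-based procedural generation: every planet is derived
# on demand from a tiny seed, so nothing per-planet is stored.
# The mixer and the attribute tables below are made up for illustration;
# Elite used its own (much tighter) 8-bit scheme.

def planet(galaxy: int, index: int) -> dict:
    """Derive a planet's attributes deterministically from its coordinates."""
    # Mix galaxy and index into a 32-bit seed.
    seed = (galaxy * 256 + index) * 2654435761 & 0xFFFFFFFF

    def bits(n):
        # Step a simple linear congruential generator and draw a value in [0, n).
        nonlocal seed
        seed = (seed * 1103515245 + 12345) & 0xFFFFFFFF
        return (seed >> 16) % n

    economies = ["Rich Industrial", "Poor Industrial",
                 "Mainly Agricultural", "Anarchy"]
    return {
        "economy": economies[bits(len(economies))],
        "tech_level": bits(15) + 1,
        "population": bits(60) / 10.0,  # billions
    }

# The same coordinates always yield the same planet -- "discrete and
# predetermined" characteristics without a single byte of stored planet data.
assert planet(0, 7) == planet(0, 7)
```

Because the generator is deterministic, all eight galaxies of 256 planets exist only as arithmetic; the storage cost is the size of the code, not the size of the universe.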

You don’t need to convince me. I am sure and completely confident that Game X is so, so, so much better than Elite ever was.

My point is that there is a history to computer software that goes back decades, and each generation did the same if not better with less.

The fact that you need hardware strong enough to run a certain grade of software that lets you do that same sort of thing as a decade ago … well, that’s where the logic fails for me.

And that’s where I’ll put a cap on this for a while. I have a lot of other things that need attention, and I don’t care to spend too much time debating whether 1984 is an improvement over 2011. :roll:


20 Responses to “Two small additions”


  1. 1 ScannerDarkly 2011/04/23 at 9:49 AM

    You make some interesting points, and while I think they are valid, I also think they are unfair: not without knowledge, but without empathy. What it comes down to is we do because we can. Software production moves so fast because we can produce it so much quicker. If I need to draw a circle, I’ll use what someone else has written. We work in high-level languages because we want to concentrate on solving the problems at hand without worrying about trivial matters like memory management.

    We tend to computationally map out problems and code them in a high-level language. If it’s vital, I’ll fine-tune it. How much time is that going to take me, though, and who will benefit from it? What if it needs to be portable? Aren’t the advancements in the free software community consuming the efforts of many volunteers as it is? Essentially this all falls back to the UNIX philosophy of re-using code so we can make technological advancements. Surely this is the very essence of technology. If I can rationally implement a feature in my software, it’s because I believe users shouldn’t hunt down more trivial tools to do that for them. Tools are essentially what software is. If I can give you a tool to do 10 jobs, it saves you finding 10 separate tools. Strip your kernel to pieces, compile what you need? No thanks. This might suggest “maximalism” for your computer, but what about you? You are part of the system.

    I know your post was an innocent query rather than meaning to accuse and point the finger at us. Computer Science students, people in the IT industry and other professionals are going to get insulted if you’re telling them new features, support and opportunities they’re developing aren’t good enough. If we all used Amiga 2000s then our code would be a lot faster, but we’re developing for different processor architectures, platforms, and supporting thousands of hardware peripherals. It’s all about features. Gnome offers less than KDE – how do I tile in Gnome again?

    A lot of the ‘bling’ you refer to in your criticism of interfaces is unfair to strictly call bloated and unnecessary and implies the science behind the research in Human Computer Interaction is all a shambles. There are lots of fields that require thought such as behaviour models: Fitts’ Law, Buxton’s 3-State, Guiard’s; areas of perception and information processing e.g. GOMS. Openbox pretty much goes against all of those… :-)

    • 2 Sam 2011/04/23 at 5:28 PM

      I think you’ve hit the nail on the head there. The fact is that it takes a LOT longer to develop a given application using assembly, or even something like C than it would in a high level language like Python or Java.

      As a result, today’s software is much heavier and a bit slower, but it also takes a lot fewer man hours to develop a given feature set. This means more software, with more features.

      That’s not to say that software written using modern methods can’t be fast, as is demonstrated by a lot of the cool applications on this blog. However, for the average user, a slightly slower application with a simple, intuitive user interface that does anything you want it to do and more is more suitable.

  2. 3 Moose 2011/04/23 at 2:34 PM

    While I do agree that software should potentially be able to run as quickly as ancient software did, adding more features invariably results in a bigger file size, and almost always higher hardware requirements. You of course almost certainly realize this. On a different note, I’ll need to try Oolite. I love space games and older games, so this seems ideal.

  3. 4 imgx64 2011/04/23 at 3:45 PM

    On a brighter side, this blog really enlightened me. Before I started reading it, my philosophy was “buy a computer every 2-3 years”. Now it’s “never replace a computer unless it costs more to fix it than to buy another one”.

    Keep up the good work.

  4. 5 Linuxbakkie 2011/04/23 at 4:08 PM

    As an Amiga user I remember another video-editing machine: a Philips NMS 8280 MSX2 computer; 3.5 MHz Z80A, 128K RAM and 128K VRAM. As far as I know this machine had a video digitiser and genlock. Not too bad for a machine that’s 25 years old.

    But there is hope: I just downloaded Oolite and it was much faster than loading Elite from the datarecorder :P

  5. 6 koleoptero 2011/04/23 at 8:26 PM

    I agree with you on some points and disagree on others, but it’s a fact that today’s DEs and other programs must have a LOT of badly written code, when you see an OS without office suites or codecs or various stuff needs a DVD and 18GB of hard disk space to be installed. They just don’t care anymore.

    In the meantime you can also show the gamers this: http://www.theprodukkt.com/kkrieger
    Sure it has some nasty (for your way of thinking) requirements, but at least it will only take 1 sec to download :D

  6. 7 Hippytaff 2011/04/24 at 1:33 AM

    Right on

  7. 8 eckeroo 2011/04/24 at 5:35 AM

    You mentioned Elite so I would like to recall Mercenary 1, 2 and 3, written by Paul Woakes. Mercenary 1 was all wire-frame 3D and ran on old 8-bit machines. Mercenary 2 (Damocles) and 3 were released for 8MHz Motorola 68000 processors and had solid 3D graphics. The game depth was incredible: you could explore a solar system of planets, each with their own islands, cities, buildings, public transportation systems, etc. The programming must have been genius.

    However, machine code I gather isn’t easy and I can only get to grips with Python. Therefore I would need a computer that can handle Python. That for me would set my hardware requirements.

  8. 10 Luca 2011/04/24 at 6:21 AM

    Thanks for the history lesson on Video Toaster. I never even wondered how this was done in the past… interesting stuff!

  9. 12 YU1QRP 2011/04/24 at 9:31 AM

    Simply said, everyone needs a tool for the job.
    These days I am playing with my Android-based phone (500 MHz, 190 MB RAM). I have found a lot of useful software that does what I need every day, but to be honest with myself, I hate doing CSS and HTML coding, image cropping, etc. on a 320×240 screen with a more than tiny keyboard.

  10. 13 TME520 2011/04/24 at 9:42 PM

    I feel the same about today’s IT: bloated OS, bloated software, waste of time and money because of bad coding habits…

    Well, I usually keep this feeling to myself, since every time I speak my mind I feel like an old fart ranting about “how everything was better before”…

    The truth is small software is good software: Keep It Simple, Stupid!

    By the way, I like your blog, thanks for taking the time to write all those nice posts.

  11. 14 Felice 2011/04/25 at 3:11 AM

    It could also be that the overall computing experience is subject to the so-called square law of automotive transport.
    My old legs are still enough to push a bicycle at 50 km/h for a short time, but to reach 100 km/h something like 15-20 kW of power is needed; for 200 km/h we are around 80-100 kW, and for 400 km/h be prepared for immense power and cost and complexity. We have reached a sort of wall where nothing more is really worth it.
    Anyway, that would be only eight times my old legs’ performance, and for moving in an urban area I could easily be the winner if you consider cost and noise and parking.

  12. 15 Ray 2011/04/25 at 10:01 AM

    Exception: Starcraft. One would rarely find a fast-paced RTS that runs lighter.

  13. 16 demonicmaniac 2011/04/25 at 10:57 AM

    In the vein of kkrieger and 64k demos, I urge everyone to take a look at KolibriOS and MenuetOS: fitting onto a floppy, requiring 8 MB of RAM, with a 3D stack and a full desktop environment. That is efficiency.

  14. 17 Jens Ayton 2011/04/25 at 5:16 PM

    Interesting choice of example. But then, as lead developer, I would think so. :-)

    According to Ohloh, Oolite is almost half a million lines of code. That doesn’t include SpiderMonkey, GNUstep, SDL, Mesa or the other lesser dependencies, but does include a number of declarative data files and scripts, as well as a bunch of tests and support tools. Call it 250–300kLOC of actual game code.

    I don’t have a Linux build of Oolite handy (BerliOS is down… again), but the Mac “test release” binary weighs in at 19.6 MB (with three architectures in one file). This doesn’t include game resources; the data files, excluding textures and sounds, are another 680 kB. That’s a whole DOS box full of text!

    From time to time, I’ve wondered where all this comes from. After all, most of that code is at least used for something (not all of it, since it’s written in a dynamic-dispatch language that can’t be dead-code stripped). I have several thoughts:

    * Firstly, 8-bit machine code is inherently smaller than 32-bit or 64-bit machine code.

    * Diverse hardware (and software). Oolite is supported on three operating systems and three processor architectures, and has been built for others (for example, IRIX on MIPS). It supports OpenGL 1.1, but makes use of features from 1.3 and 2.0, including shader support. This of course complicates rendering code.

    * Graphics support. Drawing all those newfangled textures and shaders doesn’t just require texture files, it also needs loading, resource management, scaling and support for various hardware capabilities (see above); for instance, it can convert cube map textures for planets to equirectangular projection on the fly for systems without cube map support, which is very rarely useful but allows us to keep the ridiculously low hardware target.

    * Abstractions. (The points below are special cases of this.) A dynamic-dispatch OO language adds overhead for class objects, method tables and so forth. Many components are written to be reusable in various ways. Writing specialized procedural code for each particular case will generally produce smaller code, but makes it harder to maintain and add new features.

    * Parsing. Instead of hard-coding data, we use text-based representations which are parsed at runtime. In addition to the actual parsing (much of which is handled by GNUstep or Cocoa), there’s string constants, sanity checking and caching. In order to integrate with these data files, the procedural generation code has to generate abstract representations and merge them with data files rather than just return an array of numbers.
    On the other hand, this stuff is the foundation of Oolite’s expansion pack support. Without the hundreds of expansion packs and active mod community, the project would have died years ago.

    * Scripting. For historical reasons, there are three scripting engines in Oolite – one for AI state machines, an old deprecated system used for various things, and a newish, quite powerful JavaScript interface. Just the JavaScript interface glue is twenty thousand lines of code, not counting the interactive console support. The original purpose of scripting support was to express some game behaviours as high-level abstractions, but of course it’s also important for modding.

    * Debugging support. There are many testing facilities and a complex logging system, primarily to help expansion pack developers, although some of it is also useful in debugging the game itself.

    I guesstimate that with care and dedication, Oolite could be rewritten to be a third of the size while supporting the same functionality, and also be more efficient. (Especially if you raised the GPU requirements!) The important thing, though, is this: that hypothetical game would never be written. It would take man-years of work, and I’m not talking about the sort of weekend hobby work that built Oolite in the first place.

    When you write a game in pure, tight assembly, every design change requires large parts of the work to be thrown away. The ability to use high-level abstractions and organically grow a game is what makes it possible for a bunch of hobbyists to write Oolite in the first place. (Well, I say a bunch; Giles Williams wrote it more or less himself, then the rest of us refined it.)

    One thing you didn’t mention about Elite is that it’s one of the biggest non-modular, optimized assembly programs ever written. Much – not all – of the perceived bloat of modern software comes from this: abstractions make it possible to write more complex systems faster, and that’s the biggest benefit of more powerful computers.

  15. 18 Digit 2011/04/26 at 7:46 AM

    ffe3d though… oooooh. the russian remake. … oh golly thats tasty.

  16. 19 anarkii 2011/05/09 at 11:57 PM

    Hi there,
    just stumbled on your blog and wanted to point out Jevons Paradox in relation to this post (and previous post) on why all the added resources to computers don’t add up to actually faster / more efficient running systems.

    here’s the intro blurb on the Jevons Paradox from Wikipedia:

    In economics, the Jevons paradox, sometimes called the Jevons effect, is the proposition that technological progress that increases the efficiency with which a resource is used tends to increase (rather than decrease) the rate of consumption of that resource.

    basically what happens is that when resource efficiency increases it sets off a bunch of effects that increase the load on that resource (instead of decreasing the load) even to the point where the original load on the resource was MORE efficient.

    check out the wiki article, it’s very interesting.

    enjoying your blog!


  1. 1 Another gray area: Ascii Sector « Motho ke motho ka botho Trackback on 2011/05/22 at 7:43 AM





License

This work is licensed under the GNU Free Documentation License. Please see the About page for details.
