Actually, it’s not much of a secret at all — it’s more of a clarification of sorts.
Here it is: More memory won’t make your computer faster.
Now before you cry foul, or hiss, or jab at the comments box, let me amend that statement once or twice.
More memory won’t make your computer faster unless …
- it’s already short on memory when it starts up, or
- you try a faster grade of memory, or
- you’re overwhelming your computer and it can’t keep up with your demands.
I keep seeing posts on the forums where someone — not necessarily a newbie — complains that Ubuntu runs slow, or their machine is dragging, or things just don’t seem perky. And invariably, the first suggestion is to add more memory.
That’s not necessarily the right answer, and here’s why.
Let’s suppose you work in a post office. And every day people come by and drop letters through a slot. You catch them and sort them in your hand. Once you have them ordered and sorted, you put them into a bin to be delivered. You’re done with those letters, and you can pick up some more.
If you can only hold so many letters, you’re going to come to a point where you can’t sort any faster because you can’t hold as many letters as you need to. So you have to wait to pick up some more letters, until you have space in your hand to hold them.
Now suppose your supervisor comes along and sees you’re having trouble. She brings in a table and says you can use that space for your sorting.
Fantastic! Now you don’t have to hold the letters to sort them! You can set them down on the table and keep sorting without waiting to pick up more and arrange them. Sure, it’s a little bit slower than if they were all in your hand, but there’s only so much you can do with your hands, and there’s lots more space on the table.
So now let’s suppose there are a lot of letters — more than you can handle even with your table. Well, like before, the new letters dropping through the slot are going to have to wait until there’s space in your hands, or on the table.
And again, your supervisor sees you’re having difficulty. Now she says she has a room down the hall with banquet tables in it, and if you need to use that space, go ahead. There’s plenty.
Well, that’s useful, but that means any time you have too many letters for your hand, and too many for the table, you’ll have to run down the hall to the banquet tables, get the letters you need from there, and rush back to your office to get them sorted. That can take a while. It works, but it’s a real drag, and it slows things down considerably.
Now let’s tweak the analogy a little. Let’s suppose you have enough space on the table to keep sorting. In fact, it’s more than enough space to handle your load of letters. You don’t ever need to run down the hall.
If your supervisor brings in another table for you, will it speed up your work? No, of course not. That would only be useful to you if you had run out of space on the table you already have. If you had two tables and the first one was full, the second table might be useful. But so long as you have space on the first table, there’s no need to run down the hall. So a second table doesn’t make things faster. It just gives you more space to work with — and only if the need is there.
Do you see where I’m going with this? If not, I’ll clear it up quick. In this analogy, the letters you have in your hand represent your CPU’s cache. It’s small, but very fast. You can get those letters sorted quickly, because they’re in your hand — in the same way your CPU can do on-die functions super-fast: It doesn’t have to leave the chip to get it done.
The tables are your memory — the RAM installed in your computer. When your PC runs out of space in cache and needs more room to work, it puts things into memory.
Running down the hall to the banquet tables in the back room … is your virtual memory (your swap file or paging file, since the idea is the same in Windows or Linux) on your hard drive. It’s useful space, but it takes a lot longer to work at that distance, running back and forth and then continuing sorting. If you run out of memory, you have to resort to the virtual memory on your hard drive. And that’s when things start to get bogged down.
But remember: swap files and virtual memory only come into play when your physical memory is full, or nearly so (at least, that’s how it should work by default — you can tune your system to behave differently).
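On Linux you can actually see — and tune — how eager the kernel is to dip into swap before memory is truly full. A quick sketch, assuming a stock Ubuntu-style install where the `vm.swappiness` sysctl exists (the value 10 below is just an illustrative choice, not a recommendation):

```shell
# How aggressively will the kernel swap? Range is 0-100; Ubuntu's default is usually 60.
cat /proc/sys/vm/swappiness

# To make the kernel avoid swap until memory is nearly exhausted, you could run
# (as root): sysctl vm.swappiness=10
# and add "vm.swappiness=10" to /etc/sysctl.conf to make it stick across reboots.
```

A higher number means the kernel starts running down the hall to the banquet room sooner; a lower number keeps it working at the table as long as it can.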
And here’s the crux of my argument: Unless your PC is completely overloaded by all the demands you’re placing on it, adding more memory isn’t going to make it faster. Remember: Adding another table isn’t going to help you sort letters faster if your first table isn’t already full.
That’s why I set the conditions I listed above. If your machine isn’t already stressed for memory, adding more memory isn’t going to help. Most of the systems I work with rarely need more than 64MB to get to the desktop (although I admit I don’t use GNOME- or KDE-based systems, so there’s a caveat to my argument).
But even my full-scale Xubuntu builds need less than 128MB to get started, and unless I’m watching a DivX movie in MPlayer while running a BitTorrent client in a terminal window and playing Frozen Bubble as I burn a DVD with Graveman … I have yet to see one of my machines peak over 256MB. Can it use more than that? Of course. Does it usually? No, almost never.
So how can you tell if you really need more memory? Well, start up everything you use. Everything. All of it. Now switch between programs. Is there a lag? Is it stuttering? Can you tell it’s struggling? Is the hard drive access light flickering like a madman?
If so, you’ve run out of physical memory and you’re relying on the virtual memory on the hard drive to keep up. And again, like running down the hallway to grab more letters to sort, it takes longer to work at a distance, and the result is … lag. Stutter. The aforementioned loss of perkiness.
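If you’d rather not eyeball the drive light, the same check can be made from a terminal. A minimal sketch for a Linux box with the standard procps tools (`free` and `vmstat`) installed:

```shell
# Physical memory and swap in use right now, in megabytes.
# A large number in the Swap row's "used" column means you've been
# running down the hall to the banquet tables.
free -m

# Sample system activity once a second, three times. Watch the si/so columns
# (swap pages read in / written out per second): values consistently above
# zero mean the system is actively paging -- that's the lag you're feeling.
vmstat 1 3
```

If swap usage sits at or near zero even with everything you use open, more memory will only give you empty tables.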
Now, using faster memory — in other words, memory that actually is made to perform faster, whether through magic or higher-grade components — will make your computer run faster. But by how much? Well, that’s for you to decide. If you’re lucky enough to run a machine that will accommodate a variety of memory speeds, you could test it. I once put 512MB of PC133 into a machine that had been running PC100, and it seemed a little faster. A little.
So what’s the logic in adding more than 256MB? Or 512MB? Or 1GB? And for that matter, why does the public consensus for curing slow computers immediately assume you need more memory?
Well, I don’t have a for-sure answer to that. Market hype would be among my suggestions, and the Windows behemoth putting higher and higher demands on hardware would be another. But for now, I’m leaving it unanswered.
At least until I can come up with another analogy.
Edit: Here’s a good layman’s description, without a pointless analogy: http://computer.howstuffworks.com/question175.htm