Create large files of random information

I wanted to test that mischievous hard drive, to avoid problems when I move stuff onto it.

The plan was to fill it with dummy files; if there was an issue at the 40Gb point, I would know up front, which would have been better than finding out a week later.

To that end, the weak-sauce tip for the day is this one: creating a series of 2Gb files of random gibberish, just to take up space.

for i in {1..20} ; do time dd if=/dev/urandom of=test-$i.file bs=268435456 count=8 ; done

The results, after a considerable amount of time (/dev/zero is faster), will be twenty files all 2Gb in size, filled with gunk. Good gunk, that is.
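If you want to double-check the results afterward, something like this should do the trick (assuming GNU coreutils, and the filenames from the loop above):

ls -lh test-*.file   # each file should report 2.0G
df -h .              # and the free space should have dropped by about 40Gb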

A little tip there: The block size multiplied by the count gives you the size of the file. So what?

So simply setting the block size to one gigabyte (or gibibyte, since I seem to be drawing flak on the issue these days :) ) might cause memory errors on a machine with only 512Mb or less, since dd holds an entire block in memory at once. It did for me.
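To put numbers on it, here's the arithmetic from the command above, done in bash (these are just the bs and count values from the loop):

echo $((268435456 * 8))             # 2147483648 bytes, i.e. one 2Gb file
echo $((268435456 / 1024 / 1024))   # 256 -- the MiB dd buffers per block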

The size of the file in my case wasn't really important, I guess, but I did get a quick primer on performing this stunt on low-memory machines: reduce the block size, magnify the count, get the same results.
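For example, here's a gentler variant of the same loop, which I haven't benchmarked but which should produce identical 2Gb files while only buffering 4Mb at a time (it assumes GNU dd's size suffixes):

# 4Mb blocks x 512 = the same 2Gb file, with a much smaller buffer
for i in {1..20} ; do time dd if=/dev/urandom of=test-$i.file bs=4M count=512 ; done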

Oh, and the hard drive? It’s fine. I filled it all the way to the brim, and Arch didn’t complain. Good to know.

5 Responses to “Create large files of random information”


  1. jimmy on 2011/04/01 at 6:23 AM

    Is there any particular reason you use random gibberish? I wanted to fill a disk the other day and simply used dd to create multiple files from /dev/zero.

    • K.Mandla on 2011/04/01 at 8:02 AM

      No real reason, I guess. At the time my fear was that only zeroes wouldn’t trigger an error, if there was some sort of limit to the drive. Now that seems silly though. :oops:

    • iss on 2011/04/11 at 2:13 AM

      Doesn’t filling files with zeros make them sparse? It would be pointless to test a disk with sparse files.

  2. Sassan on 2011/04/14 at 12:05 AM

    It would probably be more appropriate to use a smaller block size and a larger count, since all the larger block size achieves is wasting huge amounts of RAM during the process.


  1. Trackback: A bash loop, for pacman « Motho ke motho ka botho, on 2011/04/01 at 9:47 PM
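A footnote on the sparse-file question above: as far as I know, dd copying from /dev/zero writes real, allocated blocks, so those files aren't sparse by default; a file only ends up sparse if you seek past the end instead of writing data. A minimal sketch to see the difference on most Linux filesystems (the filenames are just placeholders):

# dense: actually writes 100Mb of zeros to the disk
dd if=/dev/zero of=dense.file bs=1M count=100
# sparse: skip 99Mb in, then write only the final 1Mb block
dd if=/dev/zero of=sparse.file bs=1M count=1 seek=99
ls -lh dense.file sparse.file   # both claim about 100M apparent size
du -h dense.file sparse.file    # but the sparse one occupies only about 1M on disk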
