The History of a Dream: How the Ultimate PC Has Evolved In 15 Years

Alex Castle

As we worked on this year's 15th Dream Machine, we couldn't help but think about how far we've come. From the original 200MHz, 8MB-of-RAM 1996 Dream Machine up to this year's 12-core, 24-thread, 24GB-of-RAM version, the ultimate computer has grown exponentially more powerful. But that's not much of a shocker (we've all heard about Moore's Law, and all), so we decided to delve deeper into the history of the Dream Machine. We collected data about the vital statistics of each year's machine and made a bunch of graphs showing how they've grown. (You can also see our 2004 predictions for this year's Dream Machine here!) Some of what we found out surprised even us.

Keep reading for all the charts, as well as our thoughts about why they look the way they do. And since it wouldn't be any fun if we couldn't gawk at the old beige-box beasts, we've included a gallery of every year's Dream Machine cover at the end.

Processor

Since almost every Dream Machine has featured the fastest CPU available, you can see the progression of clock speeds over the last 14 years. From the Pentium through the Pentium II and Pentium III, clock speeds grew pretty slowly. In 1999, for our basic build-it machine, we used a simple 500MHz Pentium III when a faster 650MHz chip was available. But fast-forward to 2001 and you see the influence of the Pentium 4 and the NetBurst microarchitecture. Bred by Intel to climb to higher clock speeds rather than pack in more instructions per clock, Dream Machines with NetBurst took off like a rocket through 2001, 2002, and 2003, finally peaking in 2004. That year, we took a stock 3.6GHz Pentium 4 560 and clocked it up to a stable 3.97GHz. Yeah, those Prescotts really clocked up something fierce, eh? Not.

That 90nm Prescott signaled the end of NetBurst. Intel's 3.8GHz Pentium 4 570 would be the end of the line for the clock speed race. The next year saw clock speeds plummet back to earth, yet we still saw higher performance. In 2005, we built a dual-Opteron machine with clock speeds of 2.2GHz, and clock speeds have pretty much stabilized since then.

Here we take into account the overclocked speeds of each Dream Machine, and it pretty much reflects the same pattern we saw with the stock CPU clocks. From the Pentium to the Pentium III, clock speeds gradually increased, and then they took off like a rocket with the Pentium 4. What's really interesting is our flatlining at 4GHz from 2008 to 2010. Much of that is due to the platforms we chose. In 2008, we featured another dual-processor machine using a pair of Xeon, ahem, Core 2 QX9775 chips. In 2009, in a world being swept away by financial disaster, we were happy just to have electricity to power the Dream Machine. And we were happy to take our 3.33GHz Core i7-975 Extreme Edition to 4GHz – the point where it would pass our stress tests.

Look at this chart and then look at the one that shows the climb of clock speed. It's no coincidence that in 2004, when the Pentium 4 hit the wall at 1,000MPH, we started to see the push for more cores. In 2005, we used a dual-processor, dual-core machine. The year after, a dual-core Core 2 Extreme X6800. From there, we've been trying to get as many cores in the Dream Machine as possible. This year's is truly triumphant, though, with 12 cores and, well, another 12 virtual cores thrown in for good measure.

RAM

It may look flat, but the chart is deceiving. The amount of system RAM has increased exponentially several times. Hell, 8MB in 1996! Seriously? We do admit, there were some long stretches where system RAM did not increase. It's one of the things that led Intel to push so hard for Rambus in the late 1990s. You see, since main memory capacity wasn't going to explode, users were going to need the super-duper-fast Direct RDRAM, which offered incredible bandwidth. Yes, you can be a hater on Rambus and Direct RDRAM (we were once the same), but RDRAM was actually ahead of its time, and it's a shame that politics and legal shenanigans muddied it up. It took DDR several generations and years to surpass the first iteration of PC800 RDRAM in performance. The Dream Machine actually featured RDRAM in 2000 and 2001, and the most insane implementation of it in 2002. That's the year we used a crazy 512MB RIMM4200/PC1066 module. In essence, the RIMM4200 module combined two RIMMs into one to give you dual-channel performance in a single slot. Ultimately, the idea was to run two dual-channel RIMM4200 modules in a PC, giving you quad-channel memory. Alas, we know how that ended. By 2003, the Dream Machine moved on to DDR and never looked back. Our write-up in 2002 even acknowledged that the days of RDRAM in the PC were done once DDR emerged. DDR, DDR2, and DDR3 are the lingua franca of today's PC.
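(A quick aside for the spec nerds: those module names map to theoretical peak bandwidth, which is just transfer rate times bus width times channels. Here's a rough back-of-the-envelope sketch, assuming the standard 16-bit RDRAM channel and 64-bit DDR bus; the numbers are ours, not pulled from the old spec sheets.)

```python
# Rough peak-bandwidth math (illustrative assumptions, not original article figures):
# peak bandwidth = transfers/sec x bus width (bytes) x channels.

def peak_bandwidth_gbps(megatransfers_per_sec, bus_width_bits, channels=1):
    """Theoretical peak bandwidth in GB/s."""
    return megatransfers_per_sec * 1e6 * (bus_width_bits / 8) * channels / 1e9

# PC800 RDRAM: one 16-bit channel at 800MT/s -> ~1.6GB/s
print(peak_bandwidth_gbps(800, 16))               # ~1.6
# RIMM4200 (PC1066, two 16-bit channels on one module) -> ~4.2GB/s
print(peak_bandwidth_gbps(1066, 16, channels=2))  # ~4.3
# Early DDR PC2100 (DDR-266 on a 64-bit bus) -> ~2.1GB/s
print(peak_bandwidth_gbps(266, 64))               # ~2.1
```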

You should also note that main memory is tied to the OS. That spike you see in 2005 came from the use of a dual-core, dual-Opteron machine (duals usually demand more RAM than single-processor machines) and our dual-booting of Windows XP Pro and 64-bit Windows XP Pro. Of course, 64-bit became more of a reality with the introduction of 64-bit versions of Vista (Windows XP 64-bit was nothing more than a science experiment, and even we recognized that) in 2006. But the resistance to Windows Vista was so high that we bypassed 64-bit Vista in favor of Windows XP Professional. Things were still so bad for Windows Vista in 2007 that Dream Machine 2007 dual-booted, with Windows XP as the primary OS. This kept the system RAM down at 2GB, albeit high-clocked. We weren't comfortable with 64-bit Vista until 2008, when, with most of the early bugs squashed by SP1, it finally became the primary OS in a Dream Machine that featured 8GB of RAM for its dual-processor configuration.


Video Card

Old-timers will wistfully recall the late 1990s, when pathetically small graphics card frame buffers led to such ideas as Intel's direct memory execute, which was implemented in many AGP cards. Since, you know, videocard frame buffers would always be so pitifully small, DME allowed textures to be stored in main memory and accessed directly by the graphics core. This would let games grab large textures across the awesome 266MB/s AGP port without the need for huge (and, at the time, prohibitively expensive) frame buffers. Obviously, frame buffer sizes have shot up. The big uptick was in 2006, when Nvidia and ATI began a war to see who had the largest frame buffer. From there, frame buffers have advanced at an incredible pace. What's that 2GB frame buffer tick? That's ATI's Radeon HD 4870 X2, which features two GPUs with separate 1GB frame buffers. In reality, the frame buffer for that generation of card was 1GB. Of course, cards with freakishly large frame buffers have been available for many years, but for the most part the frame buffer sizes didn't match the GPU's performance.

Look at this chart and you'd think Dream Machines have been running SLI or CrossFire since the late 1990s. That's not actually true, though. The 1997 and 1998 machines had dual cards, but not in the traditional SLI/CrossFire manner. Instead, both of those early rigs had 2D cards (from Matrox and ATI) combined with 3D cards using Voodoo and Voodoo II graphics. With 3dfx a goner by the turn of the millennium, the world had decided that cards that were fast in 3D and also had 2D functionality were the rage. It wasn't until 2005 that dual cards made a comeback to the Dream Machine, with GeForce 7800 GTX cards in SLI. From there, dual cards and more have been a standard checklist item for any power-hungry machine. Here's a trick question, though: Which machine had the most GPUs? Not 2010. The correct answer is 2008, with its dual ATI Radeon HD 4870 X2 cards. Each card featured two GPUs, each with 1GB of frame buffer.

Power Supply

OK. It may not be the internal combustion car that's causing global warming. Instead, maybe it's our incessant need to have ever-faster computers. From 1996 to 2010, we've gone from 300 watts in the most pimped-out PC to 1,650 watts. Who do we blame? The GPU. You can overlay this chart with the frame buffer size and number of GPUs, and you'll see that as GPUs went from single card to dual card and tri-card, the power requirements seriously jumped. It wasn't always so. The 1997 and 1998 machines featured multiple graphics cards, too. But by 2005, multicard configurations were a must-have in powerful computers. The CPU doesn't get a total pass, though. The spikes in PSU sizes in 2000, 2005, 2008, and 2010 also coincide with our dual-processor builds. Realistically, if this year's machine had been a single-processor box, we could have gotten by with a 1,200-watt PSU or, on the high end, a 1,500-watt PSU. Still, that's no salve if you're looking at a 1,650-watt or 1,500-watt requirement and you have a wall socket rated at 15 amps. Even those with more modern homes and 20-amp circuits are going to wonder what happens when you have a Dream Machine cranked up on a summer day and someone on the same circuit decides to microwave a Hot Pocket. Poof! We can't make predictions, but will this year finally be the end of the insane power consumption by the graphics card?
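(If you want to check our wall-socket worry, here's a rough sketch of the math. It assumes a 120-volt North American circuit, the usual 80 percent continuous-load rule of thumb, and a ballpark 85 percent PSU efficiency; all of those are our assumptions, not measured numbers.)

```python
# Rough wall-socket math for a big PSU (illustrative assumptions, not measurements).

CIRCUIT_VOLTS = 120     # typical North American outlet
PSU_EFFICIENCY = 0.85   # ballpark efficiency under load

def wall_draw_watts(dc_load_watts, efficiency=PSU_EFFICIENCY):
    """Watts pulled from the outlet to deliver a given DC load to the PC."""
    return dc_load_watts / efficiency

def circuit_capacity_watts(amps, volts=CIRCUIT_VOLTS, derate=0.8):
    """Continuous-load capacity of a household circuit (80% derating rule of thumb)."""
    return amps * volts * derate

for amps in (15, 20):
    print(f"{amps}A circuit: ~{circuit_capacity_watts(amps):.0f}W continuous")
# 15A: ~1440W, 20A: ~1920W

# A machine actually pulling 1,200W of DC power draws roughly this much at the wall:
print(f"~{wall_draw_watts(1200):.0f}W")  # ~1412W -- uncomfortably close on a 15A circuit
```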

Storage

Oh, how far we've come, eh? In 1996, 2.1GB of storage cost $729; the two 2TB drives used in this year's machine cost $400. Looking at the chart, you can see that densities really took off in 2005. That year, four 500GB drives gave us our 2TB RAID array. By 2007, those four drives became four 1TB drives. In 2008, we actually used five lower-density drives to get to 3.6TB: two were VelociRaptors, with three 1TB drives as backup. The financial meltdown of 2009 is apparent in our chart, where capacity dropped all the way down to a mere 1.7TB of storage, using a 1.5TB Seagate and a 256GB Corsair P256 drive. That Corsair drive, however, was the first appearance of the SSD. We actually think this year's machine would have featured 6TB or 9TB of storage, but none of the hard drive vendors are yet willing to ship internal 3TB hard drives due to booting issues with current motherboards.

Price

The pursuit of wanton performance. The obsession with computing power. The incredible amount of money you can blow on a computer is easily seen by looking at the ever-increasing prices of the Dream Machine. We actually looked at the price of each system adjusted for inflation, and even then some of the prices are hard to explain. So what explains some of the blips? The 1999 machine flipped the standard Dream Machine story on its head and showed readers how to build a powerful (but realistic) PC in step-by-step fashion. The 2000 machine shattered previous records, though, with its $12,000 price tag. The big-ticket items were the pair of 1GHz Pentium III processors ($2,200) and the 512MB of PC800 Direct RDRAM ($1,980!). The three hard drives also drove the price up, with $2,115 for the pair of 15K Barracuda drives and $615 for the 75GB Deskstar (yes, the ill-fated 75GXP). Another big-ticket item: the Sony F500 CRT, for a cool $1,900. Makes you feel pretty good about how much a 30-inch LCD costs today, doesn't it? The price of the Dream Machine actually settled down from there. The most expensive Dream Machine ever, however, was 2008's. The most expensive component was the custom nickel-plating job, at $5,000. That's not even to mention convincing HP to essentially sell us the case from its Blackbird 002. It's no surprise that the record-breaking 2008 machine was followed by a financial collapse that had us wondering if we weren't going to be running a Pentium Pro in the 2009 rig.
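(For the curious, the inflation adjustment is just a consumer price index ratio. Here's a minimal sketch using rounded, approximate annual CPI-U averages; the index figures are our approximations, not anything from the original write-ups.)

```python
# Minimal inflation-adjustment sketch using rounded, approximate annual CPI-U averages.
CPI = {1996: 157, 2000: 172, 2008: 215, 2010: 218}

def in_2010_dollars(price, year):
    """Convert a historical price to approximate 2010 dollars via a CPI ratio."""
    return price * CPI[2010] / CPI[year]

# The $12,000 Dream Machine of 2000 works out to roughly $15,000 in 2010 dollars.
print(round(in_2010_dollars(12000, 2000)))
```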

Finally, the gallery:

1996

1997


1998

1999

2000

2001

2002

2003

2004

2005

2006

2007

2008

2009

2010
