The personal computer has a storied history, stretching all the way back to the days of the Commodore 64 and IBM PC. But for us, the most interesting PC hardware developments really started about 15 years ago. Along with the imminent arrival of Windows 95, this was when Moore's Law would really kick into high gear and bring us amazingly fast PC components like Intel's front side bus-multiplying Pentium, AMD's gigahertz-breaking Athlon, and yes, the wonderful world of 3D graphics accelerators.
We take an in-depth look back at the 50 most important pieces of PC hardware in the modern computing era. From CPUs to videocards and even monitors, these components were the envy of every PC enthusiast, whether you could afford them or not. They might not have been the fastest parts at the time, but they sure were the most notable. And before you ask, many of these entries were used in our Dream Machines. Join us as we journey with the ghost of PC past, and share your own favorite PC parts in the comments section!
Intel's Pentium processor brought the x86 architecture to new heights, and brought along a new naming scheme as well. Unable to trademark numbers, Intel avoided dubbing its newest chip the 586. The Pentium introduced several improvements designed to address the performance bottlenecks of previous processors. Chief among them were a 64-bit-wide data bus, two execution units, a much-improved floating point unit (FPU), and faster clockspeeds. Intel's Pentium processor launched at 60MHz, but it didn't take long for faster chips to follow before the line eventually topped out at 233MHz. The 90MHz version was the first Intel CPU to use an FSB multiplier – the FSB was clocked at 60MHz, multiplied by 1.5 to achieve the 90MHz clock speed. From this point forward, Intel virtually dominated the CPU market until AMD's Athlon debuted five years later.
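For the curious, the bus-multiplier scheme is simple enough to sketch in a few lines. This is purely our own illustration (the function name and the 66.6MHz figure for the 233MHz part are our assumptions), not anything Intel shipped:

```python
# Hypothetical sketch of the FSB-multiplier scheme the Pentium introduced:
# the core clock is simply the front-side bus clock times a fixed multiplier.
def core_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

# Pentium 90: a 60MHz bus with a 1.5x multiplier
print(core_clock_mhz(60, 1.5))         # 90.0
# Pentium 233: a 66.6MHz bus with a 3.5x multiplier
print(round(core_clock_mhz(66.6, 3.5)))  # 233
```

The same formula explains every multiplier-locked overclock of the era: leave the multiplier alone, raise the bus, and the core clock scales right along with it.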
The last clone in the true Clone Wars, AMD's Am486 arrived almost a full four years after Intel's 486 came out, and one month after the Pentium. To compete with the existing 486 chip, AMD undercut the competition by selling its version for less, while clocking it higher than Intel's 486. The DX4-100 cost less than Intel’s 486DX2-66, but its 8K write-back cache provided a speed advantage of up to 50%.
The Quantum Fireball ST3.2A was the first good drive to use the UltraATA/33 interface, theoretically capable of reaching transfer speeds of a whopping 33 MB/s. It was available in capacities up to 6.4 GB, which—since it was significantly higher than the 2 GB ceiling for FAT16 partitions—ushered in the superior FAT32, which is of course one of the file systems still with us today.
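The 2 GB FAT16 ceiling the Fireball helped bury falls out of simple arithmetic. A back-of-the-envelope sketch (the constant names are ours; figures assume 16-bit cluster addresses and the 32KB maximum cluster size DOS and Windows 95 allowed):

```python
# Why FAT16 topped out at 2GB: cluster addresses in the file allocation
# table are 16 bits wide, and clusters could be at most 32KB.
MAX_CLUSTERS = 2 ** 16          # 16-bit FAT entries
MAX_CLUSTER_BYTES = 32 * 1024   # 32KB maximum cluster size

max_partition_bytes = MAX_CLUSTERS * MAX_CLUSTER_BYTES
print(max_partition_bytes // 2 ** 30)  # 2 (GB)
```

FAT32's 32-bit cluster addresses blew that limit wide open, which is why a 6.4 GB drive made the switch all but mandatory.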
Circa: July 1997
The Diamond Monster Sound was the first card to fully support the then-burgeoning DirectSound 3D API, offering convincing directional sound effects through a pair of headphones or a 2.1 speaker setup. Even though it was fantastic for playing games that made use of the new 3D sound technology, it was a hard sell at the time, because of its poor performance with older, DOS-based games. In fact, this weakness was enough that we (and this is back while we were still Boot) originally gave the first Diamond Monster Sound card a review score of 7, saying “Assuming that game support for DirectSound 3D materializes, the Monster should become a coveted part of the ultimate gaming system.” It did, and it was.
Until the Voodoo Graphics, gamers were trapped in a 2D world. Sure, there were a handful of so-called 3D accelerators from S3 and ATI that were nothing more than old 2D videocards equipped with hardware to accelerate texture filtering. The original Voodoo Graphics added much more horsepower, which wasn't fully tapped until GLQuake.
When paired with GLQuake, the OpenGL-accelerated version of Quake, the first-person shooter came alive. The difference in graphics was astounding: instead of fighting to get 15fps, a Voodoo-equipped system could hammer out a solid 30fps – at a higher resolution, no less. While many vendors sold Voodoo Graphics cards, the Canopus Pure3D was the Cadillac of the bunch. With double the texture memory of other Voodoo cards, the Pure3D let you crank the texture settings in all your games, for maximum visual quality.
The Deschutes version of the high-performing Pentium II marked Intel's big jump to a triple-digit front-side bus. Klamath (the original PII) topped out at 300MHz with a 66MHz FSB, while the PII 400 (with a 100MHz FSB) was the sweet spot for high-end system builders – it performed almost twice as fast as older Klamath parts.
Circa: February 1998
While the original 3DFX Voodoo card was the first consumer-level 3D accelerator, its successor, the Voodoo 2, showed the first hint of the potential for overkill lurking within the nascent market. You see, the Voodoo 2 allowed users to slave two cards together using SLI (Scan Line Interleave) to nearly double performance. The Obsidian X24 packed two complete Voodoo 2 chipsets onto a single board and paired them with a then-massive 24MB framebuffer. This was the only Voodoo 2 board that supported resolutions up to 1024x768, and it was actually used in many 3D arcade cabinets (Cruis'n USA, anyone?).
Circa: April 1998
When old farts talk about the “good old days” of chipsets, they’re talking about Intel’s 440BX. With its 2x AGP that actually worked and a massive 1GB(!) of SDRAM support, the 440BX’s reign lasted literally for years. Even better, plebeians could buy 233MHz or 266MHz Pentium IIs running on the 66MHz front side bus and overclock them to 100MHz or higher. The 440BX was so successful that it eclipsed its intended replacement from Intel: the ill-fated RDRAM-only 820 chipset.
Circa: May 1998
Continuing the success of the K6, AMD's K6-2, released in 1998, brought another MMX unit to the table, as well as a new SIMD instruction set famously known as 3DNow! This gave AMD a slight head start in tearing through 3D applications before Intel fired back with its SSE instruction set. The K6-2 held appeal as a cost-conscious upgrade for Super Socket 7 motherboard owners. Later on, AMD followed up with the K6-2+, which added 128KB of L2 cache and a smaller manufacturing process (180nm versus 250nm).
Circa: May 1998
Built on Intel’s 440BX chipset, the legendary Asus P2B board helped build the company’s reputation as a performance motherboard maker. Boasting three ISA slots, four 32-bit PCI slots, and an AGP 2x slot, this Slot 1 Pentium II board had such long legs that some variants could run Slot 1 Pentium III CPUs too. Sure, it didn’t have the sexy soft FSB of the also-popular Abit BX6, but its jumper configuration actually supported a wider frequency range than the BX6. The P2B also had the advantage of being a rock-solid board, with many likely still seeing duty today – more than 10 years after it was introduced.
While it lacked the raw 3D power of the 3DFX Voodoo 2, ATI's All-in-Wonder had a different trick up its sleeve – it came equipped with a TV tuner. The Rage Pro was a middling 3D accelerator, and in lieu of the PVR software that we all expect with our TV tuner cards today, it included ATI's Digital VCR software, which let you record TV shows on your PC. It even worked with WebTV for Windows 98. Pretty fancy, eh?
Circa: August 1998
The difference of just one letter can sometimes be amazing. Intel’s Celeron 300A was a 300MHz part that became famous for its overclockability. Based on the 2nd-generation Mendocino design, the 300A featured 128K of on-die cache running at full clock rate. Even at the same clock speed as the Covington Celeron 300, the 300A performed twice as fast as its cache-less counterpart, and that was without any overclocking. Coupled with a decent motherboard, you could bump the 300A to 450MHz by tweaking the FSB from 66MHz to 100MHz, putting it in league with the fastest x86 processors available.
Circa: August 1998
The Sound Blaster Live! was the card that first brought Creative Labs’ EAX API to bear against (and eventually doomed) the dominant A3D API. What gave the Sound Blaster and EAX an edge was the ability to apply real-time effects like reverb and echo to music and in-game sound effects. Though the effects were sometimes not as subtle as they could have been, they added a whole new layer of realism to sound in 3D games, capturing the dramatic acoustic differences between—for instance—a basketball court and a dank cave.
Circa: June 1999
Arguably the most significant series in AMD's CPU history, and certainly the most important in the company's recent history, AMD's Athlon line hit Intel square between the eyes and was such a success that even the Intel faithful found themselves building an AMD system for the first time. Dirk Meyer, who would later rise to become AMD's CEO, led the design team that developed the Athlon, at first a cartridge-based processor with 512KB of L2 cache. Debuting at 500MHz, AMD beat Intel to the 1GHz mark with its Athlon processor, an important (and much anticipated) milestone at the time.
Circa: August 1999
Ah, the 56k modem. Like the little engine that could, the 56k modem got us onto the internet slowly but surely. And hey, at the time, 56k seemed pretty zippy, really, even if 56k modems could never actually reach speeds of 56 kilobits per second (the FCC mandated that no dial-up modem could achieve speeds of greater than 53.3 Kbps). It was also a relief when the v.90 and later v.92 standards ended the standards war between K56flex and X2. Vendor competition is usually a good thing for consumers, but having to buy a new modem when you switched ISPs is something we don’t miss a bit.
Circa: September 1999
Sure, overclockers may think they’re outlaws by thumbing their noses at Intel’s authority, but that was like shoplifting from the candy store. Running ABIT’s BP-6 was like pulling an armed robbery and then leading the LAPD on a high-speed chase for three hours. The BP-6 was, after all, the first board to allow you to run two Socket 370 Celerons in SMP mode. At the time, Intel had made it strictly verboten to run Celerons in dual-processor mode, but the company obviously didn’t do enough with the Celery to limit its 2P functionality. That made anyone who ran the BP-6 the ultimate bad ass, because he or she not only flipped the bird at Intel, but also had to run a “real” OS like Windows NT, 2K, or Linux to get dual-processor support.
In 1999, anti-buffer underrun technologies like Sanyo’s BurnProof weren’t yet available, so SCSI drives like the Plextor PlexWriter 8/20 – with their immense 4MB cache allotments – were de rigueur for coaster avoidance. In fact, in our May 1999 issue, we called the 8/20 “the flat-out best CD recorder that’s ever passed through our lab.” And we followed up that declaration with the qualifier that the drive created a 650MB data disc in an astonishing 11 minutes and 14 seconds. Of course, today’s best burners churn out full 8GB dual-layer discs in under that time. But we still have a soft spot in our hearts for the Sexy Plexy.
Here was a keyboard that we would’ve happily taken with us to the grave. We adored the Microsoft Natural Keyboard Pro so much that we used it until our handprints were firmly imprinted onto its once-white surface – it was really that comfortable. The Natural Keyboard Pro was Microsoft’s second mass-market ergonomically-split keyboard, but it one-upped the Elite model with its two USB ports, programmable shortcut keys, and familiar “inverted T” arrow key configuration (the Elite used a non-standard cross-like arrow key arrangement). We can do without modern “swiss-army” keyboards with full-color LCD screens and dozens of programmable macro keys – this carpal tunnel buster is all we need.
For us, the 3Com 3c905 Network Interface Card is the piece of hardware that symbolizes the dawn of the broadband era. Sure, it wasn’t the first NIC to connect your computer to a high-speed network, but it was the first card that was fast and reliable, and worked with nearly any OS you could throw at it. Combined with early DSL or cable internet, this was the card that, for many people, opened up the door to the internet.
Circa: January 2000
Plagued by voltage issues and general bugginess, FIC’s SD11 earns a mention here not for being stable, well designed, or even great. But it was the first four-layer board that supported AMD’s new Athlon CPU. And FIC was, arguably, the first vendor to actually push an Athlon motherboard. Most board vendors, we were told, feared the wrath of Intel for supporting the K7. The support of FIC (and others such as MSI and Gigabyte) helped legitimize the award-winning Athlon chip. The fear of Intel may have been all in our imaginations, but we still believe that without the support of the SD11 and other early Slot A boards, the Athlon and its offspring would not be here.
Circa: Jan 2000
The release of the Klipsch v.2-400 was a turning point for computer speakers. Before that, “multimedia speaker” was a sort of euphemism for “crappy little plastic box,” and anyone looking for high fidelity sound from their computer was left without any options beyond plugging it into their stereo. With the Klipsch v.2-400, all that changed. Suddenly, for $250, you could get a setup that combined high-quality satellites with a top-notch sub, providing sound quality that was simply unheard of. It’s no surprise that the Klipsch v.2-400 graced our Dream Machine in 2000, and its successors, the Klipsch ProMedia 5.1 and 2.1, powered all three of our Dream Machines in 2001.
Remember the days of ugly plastic PC cases that came in an off-white eggshell color? We do, but not fondly. Cooler Master changed the PC chassis industry with its all-aluminum case that paid as much attention to looks as to functionality, and started the trend in fashionable enclosures with this aluminum beauty. Countless imitators and successors have improved on the original ATC-100 design, but we laud Cooler Master as the first to prove that cases need not be boring. Would people really shell out more than $200 for a box that merely stores their PC’s innards? Two ATC-series-equipped Dream Machines prove that the answer is yes.
We wouldn’t be caught dead using a CRT monitor today, but back in 2000, this 21” Sony Trinitron was one of the most coveted pieces of hardware in our lab. By the early 2000s, CRT development had basically peaked; the F520 actually sat at the top of the CRT ladder for four years, making its way into four different Dream Machine configurations. Here’s why: the F520’s .22mm grille-pitch spec had yet to be challenged by the NEC-Mitsubishi competition. The upshot is that the F520 offered the finest image detail around – games and images looked better on this than any other display.
Circa: December 2000
It’s amazing to think that USB thumb drives have been with us for less than a decade now. In that time they’ve become a ubiquitous nerd commodity, made floppy disks obsolete, and taken most of the wind out of the sails of writable optical media. And why wouldn’t they? They’re available with capacities that trump any competing media, miniscule form factors, and highly rugged construction.
But it all started with the humble IBM DiskOnKey, manufactured by Israeli company M-Systems. Released with just 8MB of storage, the DiskOnKey’s capacity left something to be desired, but it nevertheless heralded the advent of truly portable data.
As the first card to support DirectX 8, the GeForce 3 marked the opening move of 3D accelerator technology away from the old fixed-function pixel processing pipe toward the more general, programmable hardware we use today. A later update to the original GeForce 3 series introduced the tiered pricing model to the GPU market, using the same core in multiple models. Cheaper cards featured slower or less capable GPUs as well as less memory.
Circa: July 2002
It’s been said that Nvidia chipsets are like Star Trek movies – only the even ones are worth a damn.
And we heartily agree, as that would make nForce 2 the Star Trek II: The Wrath of Khan of chipsets (without the great Ricardo Montalban). Equipped with a high-performance dual-channel memory controller, dual Ethernet ports, and hardware-based real-time Dolby Digital encoding, the nForce 2 shook up the chipset world by shoving VIA to the side and cementing AMD’s Athlon XP as the chip to have in performance computing.
Circa: December 2002
As the first DirectX 9 3D accelerator, the Radeon 9700 Pro introduced programmability into parts of the 3D pipeline that had previously offered only a limited number of fixed functions. The series of GPUs that followed were all variants of the original R300 design, and managed to hold the graphics performance crown through an entire generation of Nvidia graphics cards – from the launch of the Radeon 9700 in August 2002 until the GeForce 6800 series launched in April 2004. R300-based designs dominated the GeForce 5000-series GPUs in legacy DirectX 8 apps and the demanding new DirectX 9 games.
The Radeon 9700 series is also notable because it delivered sufficient memory bandwidth that gamers could run most games with antialiasing and anisotropic filtering enabled without dropping below playable framerates.
Circa: April 2003
You may not know it now, but at one time Intel was firmly against using DDR memory, and instead tried to push the entire PC industry to adopt Direct RDRAM, an incredibly fast, serialized RAM technology. Unfortunately, it also came with a hefty price tag for RAM makers, who were already struggling to keep their factories running without losing even more money. The battle raged for years, and only after the RAM makers rallied behind AMD and its DDR-using Athlon did Intel relent. What does this have to do with the 875P? As Intel’s first performance chipset of the post-RDRAM days, it helped allay fears that Intel would intentionally sandbag DDR to make RDRAM seem like it had been the right direction all along. Instead, the 875P was a winning chipset. With AGP 8x, dual-channel DDR400, and a dedicated port in the northbridge for Gigabit Ethernet communications, the 875P went a long way toward patching things up with enthusiasts.
Circa: May 2003
In 2003 Western Digital released the 360GD Raptor, the first SATA drive to operate at 10,000rpm platter speeds. Sure, it cost as much as drives with five times the capacity, but it was still the first real chance for power users to get enterprise-class hard drives in their desktop machines, and they ate it up. Capacity was low (only 36GB on the first Raptor), but if you had one in your computer, you were pretty much guaranteed to be the first player on any server to load a map, and that felt good.
While it certainly wasn’t the first optical mouse to hit the market – not by a long shot – Microsoft’s IntelliMouse Explorer 3.0 was the critter that put the nail in the ball mouse’s coffin. The third iteration of the IntelliMouse Explorer brand upgraded its sensor to capture images at 6000 times per second, curing the legendary skipping problem that first-person shooter gamers suffered from with most other optical mice. Perfect button placement, an ergonomic "hump" design, and blissful responsiveness made this mouse superior to a ball mouse in every way. Gamers revolted when Microsoft pulled the 3.0 from shelves to push other models, and the 2006 comeback just didn’t have the same impact.
Sony was first to market with a dual-layer DVD-burning solution in its DRU-700A burner, but that model was marred by well-publicized compatibility problems stemming from the budding nature of the DVD+R DL format. It wasn’t until its successor, the DRU-710A, was released late in 2004 that enthusiasts could take advantage of practical dual-layer burning. The drive also boasted the fastest 16X burning speeds at the time, completing 4.5GB burns in less time than it takes for some of us to run a mile. Later firmware updates added support for more media formats, keeping the 710A relevant and our favorite DVD drive for quite some time.
Circa: June 2004
When AMD held the CPU crown with the FX-53, it seemingly had no place to go: the 130nm process had hit its limit. It was a surprise, then, when the company released an Athlon 64 4000+ with pretty much the same specifications as the 2.4GHz FX-53, complete with 1MB of cache. The FX line needed a new flagship chip, which came in the form of the FX-55. What made this chip special was AMD’s “strained silicon” process, applied to the parts of the die that limited frequency growth, allowing the FX-55 to achieve 2.6GHz. Performance-wise, this top AMD part bested its rival, the Pentium 4 Extreme Edition, in most benchmarks, making it the preferred chip for anyone with a grand to spend on a CPU.
Circa: January 2005
Oh, enthusiast computing. It’s a little wonky sometimes, but it’s never boring. That can be said of Asus’ AMD Athlon 64 Socket 939 A8N-SLI Deluxe board. Built on Nvidia’s nForce 4 SLI chipset, the A8N-SLI Deluxe was perhaps the first board we saw that supported SLI (that is, if you forget about the Intel Xeon-based board that Nvidia developed SLI on and then promptly pretended never existed). At the time, naysayers (some of us included) never thought multi-GPUism would succeed. And why not? To even get it to work optimally, you had to use a whacky-ass card to reconfigure the PCI-E 1.0 slots from x16 and x1 to dual x8s. Surprisingly, enthusiasts were willing to do it, and multi-GPU functionality is a must-have on any enthusiast motherboard today.
The age of mainstream dual-core computing was cemented by AMD’s Athlon 64 X2, led by the 4800+ chip. AMD's Athlon 64 X2 series consisted of two CPU cores on a single die, sharing a crossbar that connected them to the integrated memory controller. These internal data links paid huge performance dividends compared to Intel's dual-core configuration, which had each core pushing communication through a shared frontside bus. SSE3 instructions were added to the X2 series, but most importantly, AMD managed to keep the new chip on Socket 939. While not all boards were compatible, many 939 mobos could handle an X2 upgrade with nothing more than a BIOS update, which meant AMD could tap into an existing install base with its new processors. True multi-tasking (meaning playing Half-Life 2 while running background apps) was finally possible!
Cooler Master’s self-contained watercooling system was a cut above the crowd with its integrated waterblock and pump. The Aquagate Mini R120 was the first water-cooling system we saw targeted at entry-level users who wanted the advantages of liquid cooling without the mess and hassle that usually accompanied those configurations. The radiator, pump, block, and reservoir were all integrated into one pre-assembled two-piece unit – even the coolant came pre-filled! All you had to do was latch the block to the CPU and attach the radiator to the side of your case. No tangling with long tubes or messy distilled water was necessary. The Aquagate wasn’t necessarily the best performer when compared to other cooling solutions, but its all-in-one design was undeniably innovative.
The introduction of the PCI-Express bus in 2004 removed the one videocard per system limitation imposed by AGP. It didn't take long for Nvidia to resurrect the SLI acronym--although this time it stood for Scalable Link Interface. By pairing two GeForce 6800-series cards, such as this PNY Verto GeForce 6800 GS, you could nearly double your performance in most games. Having the fastest videocard in your rig was no longer enough, now you needed a pair of the fastest videocards. Shortly thereafter, ATI announced Crossfire and the videocard arms race between ATI and Nvidia continued.
We were perfectly content with computing on 20” 1600x1200 CRTs until this game-changer came along. Dell’s reasonably priced 2405FPW brought 1920x1200 gaming to the masses and made widescreen a must for enthusiasts. Not only did this panel deliver excellent image quality (with minimal ghosting and color banding), the 24” 2405FPW sported an array of alternate video inputs so you could plug a PC, consoles, and a DVD player into one monitor. Not to mention that it was several hundred dollars cheaper than the [slightly smaller] Apple Cinema HD display, and included USB ports and memory card slots to boot.
Waking from its NetBurst slumber, Intel took the CPU world by storm with its Core 2 architecture. Instead of remaining fixated on higher clockspeeds, Intel refocused its attention on a more efficient pipeline. This meant a return to lower clockspeeds; however, it also meant a return to prominence as the performance king. After Prescott failed to live up to its hype, the media remained cautiously optimistic that Core 2 could live up to Intel's promised performance gains, but much to the chagrin of AMD, Core 2 lived up to its billing, and then some.
The first Core 2 Conroes burst out of the gates with 167 million transistors, a 65nm manufacturing process, 2MB of L2 cache, and a 1,066MHz frontside bus. Despite debuting at just 1.86GHz and 2.13GHz (E6300 and E6400, respectively), Core 2's performance made it instantly attractive, and Intel's aggressive pricing sealed the deal.
Circa: November 2006
Once it solidified its lead over ATI in graphics cards, Nvidia made its move on Intel’s performance chipsets – and for the most part succeeded. With its emphasis on overclocking, advanced southbridge features and its SLI support, the 680i SLI was the chipset to have if you wanted to build an enthusiast PC.
Sure, there were teething pains, but for most enthusiasts it was worth the sacrifice. Besides, what was the alternative? Running two Radeon X1950 cards? Feh. The 680i SLI continued to be popular until a lack of compatibility with Intel’s new 45nm quad-cores and lack of PCI-E 2.0 pushed it aside.
Featured in our 2006 Dream Machine, the shockingly fast Core 2 Extreme X6800 CPU marked Intel’s return to the “brainiac” design that emphasized performance per clock rather than insanely high clock speeds (as characterized by the Pentium 4). In a nutshell, the X6800 was wider, faster, and cooler. It was wider because its microarchitecture was designed to process four instructions per cycle. Faster describes the Core CPU’s ability to process a 128-bit SSE instruction in a single cycle instead of the two cycles its contemporaries required. And it was designed to run cooler than Intel’s smoking-hot previous processors. Pitted against the Athlon 64 FX-62, the X6800 took every CPU-intensive benchmark by a huge margin.
Circa: January 2007
Billed as one of the first DirectX 10-capable 3D accelerators, the GeForce 8800 series of GPUs is nearly as memorable for representing a quantum leap in performance for DirectX 9 games. Graphics cards like this overclocked ASUS EN8800 GTX first went on sale in early 2007, but there weren't any DirectX 10 titles to play. There was, however, a glut of incredibly system-intensive DirectX 9 games. Games like Oblivion and Company of Heroes crushed the mightiest of DirectX 9-era 3D accelerators, but the 8800 GTX provided enough GPU juice to run every DirectX 9 title at the highest resolutions. And later that year, when the first true DirectX 10 titles shipped, that same GPU also gave us our first taste of fully programmable graphics, although likely at a pretty low resolution.
Circa: May 2007
The Hitachi Deskstar 7K1000 was the first SATA hard drive to break the 1 TB barrier. Sure, a terabyte isn’t really that much bigger than 750 gigabytes, but there’s definitely something to be said for the psychological impact of moving up into a whole new unit of measurement. Also, the 1 TB Hitachi Deskstar 7K1000 was the drive that really demonstrated the capabilities of the new perpendicular magnetic recording technique, the technology that allowed for the higher-density platters needed for a 1TB drive, and birthed a series of delightfully trippy flash cartoons.
When consumer-targeted 30-inch desktop LCD monitors emerged in 2007, we were supremely disappointed that these monstrous widescreens lacked an internal scaler. Conventional monitor-scaling technology wasn’t powerful enough to drive the 30” panels’ 2560x1600 resolution, so they were all restricted to dual-link DVI interfaces with no on-screen display options. Gateway surprised us by being the first company to release a 30” panel with a built-in scaler, in this case a Silicon Optix Realta HQV processing chip. In layman’s terms, that meant this chip allowed the XHD3000 to support numerous interface options, onscreen calibration controls, and even picture-in-picture functionality. HDCP support let us play high-def video at its intended resolution, and gaming on this monster was a pleasure. And at launch, the XHD3000 was actually cheaper than alternatives from Apple and Dell.
This blooming, copper-finned successor to the already awesome CNPS 9500 added more than enough performance to justify its 200-point model number change. Not only was this air cooler easy to install (by 2007 standards) and slick looking, its adjustable 2,800rpm 110mm fan provided plenty of cooling to let us overclock our Athlon 64 FX-60 testbed. Under a full CPU load, the 9700 ran 14 degrees cooler than a stock cooler, earning it a coveted 10-Kick Ass score in our February 2007 issue. Zalman’s current flagship cooler, the 9900NT, inherits most of the features of the 9700, and remains one of our favorite coolers.
Circa: February 2008
“Who you tryin’ to get crazy with ese? Don’t you know I’m loco?” Well, actually, we didn’t. We never actually thought that when Intel said it was loco, it really was loco. But the company showed just how insane in the membrane it could be with its enthusiast Skulltrail platform. The D5400XS mobo remains possibly one of the most over-the-top boards ever made. Featuring two LGA771 sockets for Xeon chips (rebranded as Core 2 Extreme QX9775), this Extended ATX board had four x16 PCI-E slots, supported up to 16GB of FB-DIMM RAM, and even let you overclock those Xeons, err, Core 2 Extremes. The ultimate insane stunt Intel pulled, though, was swallowing its pride and integrating not one but two nForce 100 bridge chips on the board. That made Skulltrail the only retail board capable of running both CrossFire and SLI until the arrival of the X58.
Circa: September 2008
Like a lot of the items on this list, Intel’s X-25M SSD is notable for being the first piece of hardware to really deliver on the promise of a new technology. There had been a lot of buzz about SSDs right from the get-go—transfer rates that would blow your mind, they said—but when the X-25M was released, there still hadn’t been a single piece of affordable hardware that lived up to the hype. Intel’s SSD made believers out of us, though, posting benchmark numbers that blew our go-to performance drive, the WD VelociRaptor, out of the water.
Released: October 2008
Putting two GPUs on a single PCB, in a single PCI-Express slot wasn't anything new when ATI released the Radeon 4870 X2. The problems we'd had with similar boards from both ATI and Nvidia left us wary. But, after careful examination, we found that the X2 board came with no gotchas. It worked with multiple monitors, had moderate power requirements, made an acceptable amount of noise, and was fast as hell. Best of all, you could drop a pair of them in most any motherboard and get four GPUs worth of performance for the price of two.
Circa: November 2008
Paired as the launch chipset with Intel’s rocket-fast Core i7 CPUs, the X58 is, in many ways, far less of a chipset than previous Intel chipsets. With the memory controller moved into the CPU, you wouldn’t think the X58 would make the history books, but it marked a major sea change for performance computing: the reunification of multi-GPU graphics. With no Core i7 chipset of its own, Nvidia decided to license SLI to motherboard makers, rather than watch everyone build new enthusiast boxes around ATI’s reenergized Radeon HD lineup.
Circa: November 2008
Why is this CPU more significant than the 8088, Pentium, or Pentium M? As the second new chip produced after a series of embarrassing losses to archrival AMD, the Core i7 will answer for the world whether Intel is prepared to ride the momentum of its Core 2 launch with another winning chip or if it’s content to rest on its laurels, as it did with the Pentium 4. Core i7 also represents a major new direction for Intel, which has stubbornly clung to the ancient front-side-bus architecture and discrete memory controller for years. Indeed, with its triple-channel integrated DDR3 memory controller and chip-to-chip interconnect, the block map of a Core i7 looks more like an Athlon 64 than a Core 2 chip.
Why is the Atom processor important? Because despite a global economic downturn, worldwide PC sales have remained on an uptick thanks in large part to the explosive growth of netbooks, the vast majority of which sport an Intel Atom processor inside. On the hardware front, these low-power chips boast only 47 million transistors, 512KB of L2 cache, and a top clockspeed of 1.86GHz. A dual-core variant exists for the desktop, but so far not for mobile PCs. Even though most power users would consider the Atom N270 underpowered for many computing tasks, the 15 million Atom-based netbooks that shipped in 2008 alone count for something.
Think we missed something? Comment below or send us a tweet!