It’s been a long time since the zero-point system and benchmarks we use to test PCs and other components have been updated, and it shows. The enthusiast world has switched from AMD to Intel and a new OS is upon us.
To select our new hardware and benchmarks, a committee of editors sat around a box of doughnuts and debated the direction of performance computing. We discussed the typical tasks power users perform and how we could make our benchmarks pertain to those needs. Then, we discussed what PC configuration to use to test all new hardware in the coming year. Our zero-point rigs represent the basic level of hardware we expect a power user to have six months from now. These machines serve not only as a reference point for readers of our system reviews but also as test beds for almost all the hardware and software we review.
Generally, we update our zero-point config and all our benchmarks every 12 to 18 months, but this time, we’re breaking from convention. We’re sticking with old gaming benchmarks for the time being. Why? With high-profile titles like Crysis on the cusp of release, we decided to continue running Quake 4 and FEAR benchmarks until newer, more graphically intensive DirectX 10 titles are available. At that time, we’ll fold those tests into our benchmark suite.
Although considered high end by most, our zero-point system really stacks up as a midrange machine.
When we spec’d our new test machines, we decided quad core was a must-have feature. We would have considered both AMD and Intel, but as you know, AMD is a no-show in the consumer quad-core game. We normally reach for the top-tier CPU, but this year, we selected a CPU that most enthusiasts on a budget would buy, not what we all want. Intel’s fast, new QX9650 was out of our price range, so the company’s Core 2 Quad Q6700 got the job. At $500, it’s pricey but not a wallet breaker. For our zero-point, we’ll run it at its stock 2.66GHz clock even though we know it’ll run at 2.93GHz all day without breaking a sweat.
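The relationship between the stock and overclocked speeds is just the front-side bus multiplied by the CPU multiplier. A quick sketch (the 266MHz FSB and 10x/11x multipliers are assumptions inferred from the Q6700’s 2.66GHz stock clock, not figures from our Lab notes):

```python
# Core clock = FSB frequency x CPU multiplier.
# The FSB and multiplier values below are assumptions inferred from the
# Q6700's published 2.66GHz stock speed, not measured Lab data.
def core_clock_ghz(fsb_mhz: float, multiplier: float) -> float:
    """Return the effective core clock in GHz, rounded to two decimals."""
    return round(fsb_mhz * multiplier / 1000, 2)

stock = core_clock_ghz(266, 10)      # 2.66GHz at stock settings
overclock = core_clock_ghz(266, 11)  # 2.93GHz with a one-step multiplier bump
```

Bumping the FSB instead of the multiplier reaches the same clocks, but a multiplier change leaves memory and chipset speeds untouched.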
|EVGA’s nForce 680i SLI board marks the fourth Nvidia-based chipset that we’ve adopted for our zero-point systems.|
EVGA’s 680i SLI will soon be supplanted by Nvidia’s follow-up to the chipset, but we’ve chosen it for its affordability and proven ability as a workhorse. Plus, BIOS updates from Nvidia have been timely and the chipset supports SLI. Maximum PC historians will note that the 680i SLI is the fourth generation of nForce chipsets we’ve adopted since we stopped being an Intel-only chipset shop.
|A pair of EVGA 8800 GTX boards gives our machine potent gaming capability at all resolutions.|
The best-performing card right now is EVGA’s GeForce 8800 Ultra, which sells for more than $700, making SLI cost prohibitive. That’s why our machine uses a pair of EVGA GeForce 8800 GTX warhorses. The duo gives us smooth performance at just about any resolution you’d play games at.
8GB? 4GB? Nah. Within the limitations of a 32-bit OS, the sweet spot for system RAM is sadly still 2GB. To meet our zero-point system’s needs, we reached for 2GB of Corsair DDR2/800 Dominator 8500C5D modules. The modules are rated to run at 1,066MHz, which will be useful when an individual machine has to be overclocked to test cooling gear.
Because we constantly wipe our test beds with a clean hard disk image, we’re eschewing a RAID setup (disk imagers work inconsistently with RAID). We didn’t want to totally give up on performance, though, so our main boot drive is a single 10,000rpm 150GB Western Digital Raptor drive. A supplemental 7,200rpm 500GB WD Caviar pulls bulk-storage duties for holding drivers, benchmarks, and image files.
Burning Blu-ray and HD DVD discs isn’t critical for every editor, but viewing high-resolution movies is an important part of testing many products. With that in mind, we reached for LG’s GGC-H20L drive, which reads both Blu-ray and HD DVD discs and gives us DVD and CD burning capability. The drive has a SATA interface and will likely mark the end of PATA in our Lab.
As good as the EVGA 680i SLI boards are, they still use Realtek’s onboard audio, with its fake-ass EAX support. To fill the void, Creative Labs’s X-Fi XtremeGamer gives us hardware audio support in XP (and the Vista drivers almost work too!).
We’ve long used PC Power and Cooling’s PSUs in our zero-point machines. In almost 10 years of testing, we’ve had only one supply ever fail, and that was due to impact damage that no editor ever owned up to (Josh!). In a shocking move, we’re stepping back from our previous test bed’s insanely high wattage in favor of a quieter Silencer 750 Quad supply.
Our benchmarks continue to be 100 percent synthetic-free tests. If a machine gets faster scores in our benchmarks, it’s because it’s faster, not because of an esoteric driver hack.
|We use the same HDV content we previously used, but now we’re outputting it to a Blu-ray-friendly MPEG-2 format instead of WMV.|
We decided to reuse much of our project from the previous Premiere Pro 2.0 benchmark suite, but we’ve upgraded to Premiere Pro CS3. Additionally, we’ve tweaked our output options. Instead of outputting the file to WMV9, we take our HDV-res video and spit it out to a 1080i Blu-ray-compatible file in MPEG-2 format. The project continues to use multiple CPU- and GPU-accelerated effects and multiple video overlays. The benchmark really highlights the improved multithreading support in CS3. The test favors fast CPUs and scales well with clock speed, but not as much when you move beyond four cores.
The only major change to our Photoshop test is the jump from CS2 to CS3. For this benchmark, we take a RAW file (shot with a 12MP Canon EOS 5D) and apply a ton of filters to it with multiple reverts along the way. Our Photoshop script tends to be CPU intensive, but disk I/O and the amount of system RAM also influence the result. Multicore support in Photoshop is better than in previous versions, but for the most part, this benchmark prefers clock speed over the number of cores.
|ProShow is one of the top choices for professionals who want to make video slideshows from their still images.|
New to our benchmark retinue is Photodex’s popular ProShow Producer application. The application is a popular slideshow program among professionals and advanced amateurs. We like it because it not only represents real-world workloads but is also extremely multithreaded and will even load up a dual quad-core machine. In our benchmark, we build a slideshow using 130 12MP images shot with an EOS 5D at ISO 3200. We apply a random selection of transitions and effects to the images and use two MP3 files for background music. The entire show is then rendered as a 1080p MPEG-2 file. The benchmark likes clock speed and gets a good bump from quad-core CPUs, but our tests show that Intel’s eight-way Xeon platform doesn’t scale as well as we’d expect.
|MainConcept is a popular multithreaded codec maker that’s embedded in many consumer and commercial applications.|
Also new to our benchmark suite is MainConcept’s Reference. You might not be familiar with the MainConcept name, but you probably use one of its products. Corel, Adobe, and AverMedia all use MainConcept’s codecs. We use MainConcept’s freely available Reference demo to transcode the 1080p MPEG-2 file created in our ProShow Producer benchmark to the AVC/H.264 codec at 1920x1080 resolution. The Reference demo uses the same codec as the fully licensed version but includes a watermark in a corner. The benchmark gets a healthy bump from quad cores and scales well with clock speed. Interestingly, this is one of the few benchmarks that run significantly faster under Windows Vista than XP.
FEAR: First Encounter Assault Recon was a punishing game and benchmark when it shipped two years ago, but it’s no match for today’s hardware. It is still very much a GPU benchmark at higher resolutions, but at 1600x1200, a combination of GPU and CPU influences the score. As a compromise, we run FEAR’s demo with soft shadows enabled and 16x anisotropic filtering. Hardware audio support, if available, is enabled for the benchmark runs as well. We’ll be replacing FEAR with a more current game within the next three issues.
This venerable Doom 3–engine game is OpenGL-based and generally exposes poor OpenGL drivers. We run our custom timedemo at 1600x1200 with 4x AA and 4x anisotropic filtering. The game is one of the first to support dual cores, and its scores scale well with CPU power. We’ll also be replacing this benchmark within the next three issues.
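For readers who want to replicate a run like ours, a timedemo of this sort is driven from the in-game console. The cvar names below follow Doom 3–engine conventions, and the demo name is a placeholder, so treat the whole fragment as an illustrative assumption rather than our exact Lab script:

```
// Entered at the Quake 4 console (or via an autoexec.cfg).
// Cvar names follow Doom 3-engine conventions and are assumptions;
// "demo001" is a placeholder for a custom recorded demo.
seta r_mode "-1"            // -1 enables a custom resolution
seta r_customWidth "1600"
seta r_customHeight "1200"
seta r_multiSamples "4"     // 4x antialiasing
seta image_anisotropy "4"   // 4x anisotropic filtering
timeDemo demo001            // plays the demo and reports average fps
```

Running the demo two or three times and keeping the later passes avoids penalizing the first run for level-load hitching.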
Windows XP isn’t going away, so our new benchmark suite supports both OSes, but the speed differences are surprising.
|Our current desktop test bed is a Windows Vista Ultimate machine using a quad-core 2.66GHz Intel Core 2 Quad Q6700, 2GB of Corsair DDR2/800 RAM on an EVGA 680i SLI motherboard, two EVGA GeForce 8800 GTX videocards in SLI mode, Western Digital 150GB Raptor and 500GB Caviar hard drives, an LG GGC-H20L optical drive, a Sound Blaster X-Fi soundcard, and a PC Power and Cooling Silencer 750 Quad PSU.|
|Our current desktop test bed is a Windows XP Professional machine using a quad-core 2.66GHz Intel Core 2 Quad Q6700, 2GB of Corsair DDR2/800 RAM on an EVGA 680i SLI motherboard, two EVGA GeForce 8800 GTX videocards in SLI mode, Western Digital 150GB Raptor and 500GB Caviar hard drives, an LG GGC-H20L optical drive, a Sound Blaster X-Fi soundcard, and a PC Power and Cooling Silencer 750 Quad PSU.|
We selected all of our benchmarks because they run on both Windows Vista and Windows XP Professional. As performance hounds, we lean toward Windows XP Professional, so we considered running our benchmarks in XP and simply scoring the Vista-only machines we receive on the same scale. After lengthy debate, we decided that would be unfair, so our zero-point is a dual-boot system with Windows XP Professional SP2 and Windows Vista. We ran the benchmarks on each OS independently.
Wonder why enthusiasts are skipping Vista? Look at our benchmark chart. Vista performance generally lagged behind XP except in two tests: FEAR and MainConcept. We were particularly surprised by FEAR. Nvidia’s Vista drivers have been horrible since launch, but apparently the company has finally turned a performance corner.
There’s no such speed increase elsewhere, though. ProShow Producer showed a 14 percent performance decrease in Vista, and Photoshop was about 8 percent slower. OpenGL performance was atrocious in Vista, as well, with Quake 4 scores about 18 percent slower than XP’s. Ouch.
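The deltas we quote are plain relative comparisons between each OS’s score. A minimal sketch of the math (the run times below are hypothetical placeholders, not our actual chart numbers):

```python
# Percent slowdown of Vista relative to XP for a time-based benchmark,
# where a lower time means a faster result. The example inputs are
# hypothetical placeholders, not Maximum PC's measured scores.
def percent_slower(xp_seconds: float, vista_seconds: float) -> float:
    """Return how much slower (percent) the Vista run is versus XP."""
    return round((vista_seconds - xp_seconds) / xp_seconds * 100, 1)

print(percent_slower(100.0, 114.0))  # 14.0 -- a ProShow-style 14 percent drop
```

For frame-rate benchmarks like Quake 4, the comparison flips: a lower Vista fps divided by the XP fps gives the shortfall.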
How does the new zero-point stack up against a high-end machine? You can read December’s system review for details, but a faster CPU, RAID 0, and faster graphics cards amounted to as much as a 50 percent increase in performance.
Our zero-point machine is not intended to best the machines we review but to provide a frame of reference for readers who wonder just how fast a 4GHz Penryn is compared to what’s in their own rigs.