ATI Radeon HD 2900 XT Benchmarks

Michael Brown

When you’re the lead dog, the only thing the rest of the pack sees is a bull’s-eye. And the folks behind AMD’s ATI brand have been staring at that target for more than two years.

Having failed to defeat its graphics rival at the high end, AMD has lowered its sights to focus on the middle range—the price point where it figures most people actually buy new videocards. And so today the company has introduced the top end of its new ATI Radeon HD 2000 series: the 2900 XT. This GPU can’t compete with Nvidia’s top-end GeForce 8800 Ultra or even the slightly less-powerful GeForce 8800 GTX. No, the $400 Radeon HD 2900 XT is aimed squarely at Nvidia’s GeForce 8800 GTS.

It took a while for that message to sink in when AMD briefed us several weeks back, especially when Rick Bergman, senior vice president of the AMD Graphics Group, opened his remarks with “Yes, we’ve lost market share over the past couple quarters, but you’re going to see AMD come out swinging.”

Uh, Rick, seems your bat is a wee bit small.

But when you study the HD 2900 XT, you realize there’s some pretty solid wood in it. And in terms of benchmarks, it does outrun Nvidia’s 640MB 8800 GTS. But here’s the key question: Can AMD take market share away from Nvidia without even trying to compete at the high end? After all, people tend to perceive the winner at the high end to be the winner across the board. More importantly, this quarter’s high-end product is the next quarter’s mid-range—and this quarter’s mid-range is the next quarter’s bargain. What’s AMD going to do when the much-faster 8800 GTX falls into mid-range territory?

For now, let’s see what AMD has to offer today, looking at the new chip’s architecture first and its benchmark performance second. It should surprise no one that the entire HD 2000 series features a unified shader architecture. DirectX 10 and Shader Model 4.0 all but demand it, after all, and ATI has previous experience developing this type of graphics processor (for Microsoft’s Xbox 360). This means all the chip’s stream processors (there are 320 inside the 2900 XT) can perform vertex, pixel, or geometry shading as the need arises.

The Radeon HD 2900 XT is only a trifle longer than a GeForce 8800 GTS.

These components, however, are limited to operating at the same speed as the core (up to 740MHz for the 2900 XT). The 8800 GTX, in comparison, has just 128 stream processors, but they run at more than twice the speed of the core: 1.35GHz versus the core’s clock speed of 575MHz. But as we’ll see when we look at benchmark numbers, this speed trick doesn’t help the 96 stream processors in Nvidia’s 8800 GTS out-gun AMD’s 2900 XT.
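The tradeoff between shader count and shader clock can be sanity-checked with back-of-the-envelope math: multiply stream-processor count by shader clock to get raw per-second throughput. A quick sketch follows; note that the 8800 GTS’s roughly 1.2GHz shader clock is our assumption (it isn’t stated above), and this ignores per-ALU architectural differences, so treat it as a rough comparison only.

```python
# Back-of-the-envelope shader throughput: stream processors x shader clock.
# This deliberately ignores what each stream processor does per clock, which
# differs between the AMD and Nvidia architectures.
def shader_ops_per_second(stream_processors: int, shader_clock_mhz: float) -> float:
    """Raw shader operations per second, assuming one op per SP per clock."""
    return stream_processors * shader_clock_mhz * 1e6

hd_2900_xt = shader_ops_per_second(320, 740)    # shaders run at the core clock
gf_8800_gtx = shader_ops_per_second(128, 1350)  # shaders clocked above the core
gf_8800_gts = shader_ops_per_second(96, 1200)   # ~1.2GHz is our assumption

print(f"HD 2900 XT: {hd_2900_xt / 1e9:.1f} billion ops/s")
print(f"8800 GTX:   {gf_8800_gtx / 1e9:.1f} billion ops/s")
print(f"8800 GTS:   {gf_8800_gts / 1e9:.1f} billion ops/s")
```

On this crude measure the 2900 XT’s 320 slower shaders outrank the GTS’s 96 faster ones, which tracks with the benchmark results further down.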

AMD’s engineers have doubled the width of the ring-bus memory architecture they introduced with the Radeon X1000 series to a full 512 bits. The theory behind the ring bus is that you can speed up memory transfers by decentralizing the chip’s memory access. In AMD’s design, four “ring stops” surround the GPU; and each ring stop has two 64-bit memory channels (2x64x4=512) over which memory reads and writes can occur.
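To put that 512-bit figure in context, theoretical peak memory bandwidth is simply bus width times effective data rate. A minimal sketch, using the 828MHz GDDR3 clock of our evaluation sample; the 8800 GTX’s 900MHz memory clock is our assumption for comparison, not a figure from this review:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x effective data rate.
# GDDR3 is double data rate, so the effective rate is twice the memory clock.
def memory_bandwidth_gbps(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for a double-data-rate memory interface."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = mem_clock_mhz * 1e6 * 2  # two transfers per clock
    return bytes_per_transfer * transfers_per_second / 1e9

# Radeon HD 2900 XT: 512-bit ring bus, 828MHz GDDR3 (our eval sample's clock)
print(f"HD 2900 XT: {memory_bandwidth_gbps(512, 828):.1f} GB/s")
# GeForce 8800 GTX: 384-bit bus; 900MHz memory clock assumed for illustration
print(f"8800 GTX:   {memory_bandwidth_gbps(384, 900):.1f} GB/s")
```

The wider bus is how the 2900 XT can keep pace on bandwidth despite running commodity GDDR3.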

But the 2900 XT has only 512MB of memory, compared to 640MB on the higher-end 8800 GTS and 768MB on the 8800 GTX and 8800 Ultra. Is this smaller frame buffer a disadvantage? Not according to AMD. They claim that the vast majority of game developers have settled on designing for a maximum footprint of 512MB, and that the only reason Nvidia’s cards have larger frame buffers is because their GPUs have “odd-sized” memory interfaces (of 320- and 384-bits, respectively).

The HD 2000 series will of course support CrossFire—AMD’s trademark for operating two videocards in a single PC. As with the latest spins of the X1000 series, the company has eliminated the master/slave concept. All HD 2000 GPUs for the desktop market will have a compositing chip baked right into the silicon. We didn’t have time to test a CrossFire configuration, but we wouldn’t be surprised if such a rig were able to outrun a single 8800 Ultra—and cost less, too. But when you throw a second 8800 Ultra into an Nvidia box, AMD doesn’t have an answer.

The HD 2900 XT will require two power connections: One 6-pin and one 8-pin.

One thing Nvidia doesn’t have an answer for—at least not at the high end—is next-gen video decoding. All of ATI’s new GPUs will be capable of offloading all HD DVD and Blu-ray video decoding from the host CPU. Boards based on these chips will also feature HDCP decryption over dual-link DVI, enabling them to display the new video formats at their native resolutions. These are both feats that only Nvidia’s new—and otherwise low-end—GeForce 8500 and 8600 can match. The new and otherwise supremely powerful 8800 Ultra is stuck with Nvidia’s first-generation PureVideo HD engine, which relies on the CPU for some decoding chores and offers HDCP decryption over only single-link DVI.

The Radeon HD 2000-series also features an integrated audio controller, adhering to Microsoft’s DRM edict that forbids splitting the audio output in order to provide HDMI out. AMD’s cards will come with a DVI-to-HDMI adapter that carries both digital video and digital audio signals on one connection.

Another unique feature common to the entire Radeon HD 2000-series is a programmable tessellation unit; again, based on technology developed for the Xbox 360. A tessellation unit takes the polygons in a crude model and subdivides them to create additional, smaller polygons in order to add detail without requiring a huge boost in GPU or CPU horsepower.
Film animators—a la Shrek and Finding Nemo—have been using tessellation technology for years, but the hardware has been lacking in PC graphics. ATI hopes to change that, but this tessellation unit is unlikely to have much of an impact until tessellation is exposed in DirectX. Chas. Boyd, a DirectX 10 graphics architect at Microsoft, has indicated that such an event lies in DirectX’s future, but it’s unclear whether that will come with an update to DirectX 10 or if the industry will have to wait for DX11.

AMD's new top-end GPU has 320 stream processors on tap.

Okay, enough jawbonin’ about specs and features. Let’s get to the question on everyone’s mind: Just how fast is the ATI Radeon HD 2900 XT? The core on the evaluation sample we were provided was clocked at 743MHz, while its 512MB of GDDR3 memory was set to run at 828MHz. We tested it in an Intel 975XBX2 motherboard with a Core 2 Extreme X6800 running at 2.93GHz and 2GB of DDR2 RAM.

That and the 2900 XT’s 320 stream processors delivered Quake 4 at a healthy 76.1 frames per second in Ultra Quality mode—nearly three frames per second faster than a stock-clocked 640MB 8800 GTS could manage running in the less-demanding High Quality mode, and well ahead of the 65.4fps the GTS delivered in Ultra Quality. And when we dropped the 2900 XT to High Quality, its numbers catapulted to 84.7fps. Our FEAR benchmark results were also notable, with the card pumping out 63fps compared to the 8800 GTS’s measly 52fps.

AMD is hoping to get a lot of mileage out of the fact that cards based on their best GPU will not only retail for just $400 (we found boards from PowerColor and Sapphire selling for $410 each today), but that each card will also come bundled with three brand-new Valve games: Half-Life 2: Episode 2, Team Fortress 2, and Portal. Bundling brand-new games is a good move on AMD’s part, but it doesn’t change the fact that cards based on the GeForce 8800 GTS with 640MB frame buffers were selling for an average of $332 (not including mail-in rebates ranging from $20 to $30) on the day of AMD’s announcement.

It also doesn’t change the fact that the GeForce 8800 GTX leaves AMD’s part in the dust in terms of gaming performance—and at an average price premium of just $111. And then there’s the 8800 Ultra and the whole matter of CrossFire to consider. Intel’s 975XBX2 boards support CrossFire today; but now that AMD owns ATI, who knows how much longer Intel will want to support its primary competitor’s technology.

The 2900 XT looks a lot better when you eliminate all consideration of dual videocards in your rig. It’s certainly a better solution for delivering both high-def video and solid gaming performance than anything Nvidia currently has to offer. But when you consider how little penetration the next-gen optical-drive formats have managed to achieve, I don’t know that it will be enough to generate the sales volume AMD needs.