Graphics cards have gotten faster and added more features. So we have to ask the question: is it really worth adding a second GPU to your system? Will you get enough of a performance boost to justify the extra power draw and added cost? The answer is more complex than a simple yes or no. It all depends on what games you’re running, how much you dial up features like anti-aliasing, whether you’ve dived into the world of stereoscopic 3D and what monitor you’re running.
Perhaps the most important factor in the decision is display resolution. If you’re running a 1680x1050, 22-inch display, a single midrange or high end card will get the job done. Adding a second GPU is overkill. If you’ve got a 30-inch, 2560x1600 display and want to crank up the AA and postprocessing features, then that second GPU can be a big help.
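The gap between those two displays is easy to quantify. Here’s a back-of-the-envelope sketch using the resolutions above (the ratio is the only derived figure):

```python
# Pixel counts for the two displays discussed above.
small = 1680 * 1050    # 22-inch panel: 1,764,000 pixels
large = 2560 * 1600    # 30-inch panel: 4,096,000 pixels
ratio = large / small  # roughly 2.3x the pixels to shade every frame
print(f"The 30-inch panel pushes {ratio:.1f}x the pixels of the 22-inch panel")
```

Over twice the fill and shading work per frame is exactly the kind of load where a second GPU starts earning its keep.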
Stereoscopic 3D, like Nvidia’s 3D Vision, demands more performance as well, since you’re effectively doubling the frame rate requirements of a game. Most 3D displays currently available max out at 1920x1080, however, so the performance demands aren’t overly onerous.
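The doubling works out simply. A quick sketch, assuming a 60fps-per-eye target on a 1920x1080 3D display (the 60fps target is illustrative, not a measured figure from our testing):

```python
# Stereoscopic 3D renders one distinct view per eye, so the GPU must
# produce two frames for every frame the player perceives.
per_eye_fps = 60                 # illustrative target, not measured
effective_fps = per_eye_fps * 2  # frames the GPU must actually render
pixels_per_sec = effective_fps * 1920 * 1080  # fill work at the panel's max resolution
print(f"{effective_fps} rendered frames/sec, ~{pixels_per_sec/1e6:.0f}M pixels/sec")
```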
Games themselves are evolving and adding more features. You can see that in a title like Just Cause 2. Though it’s only a DirectX 10 title (DX10 is its minimum requirement), Just Cause 2 adds a host of postprocessing effects that demand a lot from your graphics card. Toss in DX11 titles, like Aliens versus Predator or STALKER: Call of Pripyat, and frame rates can plummet as you add features. So that second GPU can indeed contribute to the overall experience.
With these thoughts in mind, let’s take a look at our SLI and CrossFire X candidates.
We looked at seven different DirectX 11 capable cards: four based on Nvidia GPUs and three using AMD graphics chips. We also tossed in the fastest single GPU card from the previous generation, Nvidia’s 285 GTX, in the form of a pair of eVGA GeForce 285 GTX SSC cards. Note that not all cards were identical: the GTX 460 768MB, Radeon HD 5870, GeForce GTX 480 and previous generation 285 GTX cards were factory overclocked by a few percentage points, while the rest ran at reference speeds.
Here’s the lineup; note there were two of each, and we call out the models where appropriate.
While the core and memory speeds of some of these cards may be higher than stock reference cards, it’s been our experience that actual game performance increases only slightly. So we can still make judgment calls on performance, even though not all of these are stock cards.
Our test bed is a 3.33GHz Core i7-975 Extreme Edition in an Asus P6X58D Premium motherboard with 6GB of DDR3/1333 and a Corsair TX850W PSU. The OS is 64-bit Windows Ultimate. We ran every test at both 1920x1200 with 4x AA and at 2560x1600 with and without AA. Since 1920x1200 with 4x AA and all the eye candy cranked up proved demanding enough, those are the results we’re reporting here.
We installed the latest release drivers -- 258.96 WHQL for Nvidia and Catalyst 10.7 for AMD. We also made sure each card’s dual GPU feature was enabled. AMD will enable CrossFire X by default if it detects two cards, while you’ll have to manually turn on SLI with Nvidia cards.
Make sure you manually enable SLI in the Nvidia control panel
Let's dive right in.
First, let’s take a look at games and benchmarks that run on DirectX 10. This allows us to see how much we’ve gained since the last generation. We’re using the fastest single GPU card from that generation, a (once) speedy eVGA GeForce 285 GTX SSC, factory overclocked to a 702MHz core (versus the stock 648MHz) and 2646MHz memory (the stock memory clock is 2484MHz). Let’s see how the new kids on the block handle the now geriatric 285 GTX:
First up is the hoary old 3DMark Vantage, running the “extreme” test. The older 285 GTX just wins out over the newer Radeon HD 5830, but falls short of everything else. Note how even two GTX 460 768MB cards, with their relatively low memory bandwidth, still spank the older card.
Crysis is still a demanding test, even after three years. Just when you begin to believe it’s CPU bound, tossing in a second, new generation card pumps up the frame rate. A pair of GTX 480s is the overall winner, just falling short of the magical 60fps. It’s impressive how the $230 cards – GTX 460s and Radeon HD 5830s – generally keep up with the once mighty 285 GTX.
Next up is a pair of Far Cry 2 benchmarks.
The longer Ranch benchmark was once a demanding GPU test, but it’s become pretty easy by today’s standards. Once again, AMD’s Radeon HD 5830 is the odd duck, but note how even the low cost GTX 460s are essentially at performance parity with the pricier Radeons in this graphics intensive test.
Far Cry 2’s action benchmark is more indicative of actual gameplay, so the test is CPU limited on higher end cards – in this case, anything that’s not an older card (285 GTX) or memory limited (GTX 460 768MB). Note how tight the grouping is here – that’s because the benchmark throws a ton of physics around, plus numerous AI characters running around trying to kill you. We wouldn’t make any graphics card decisions based on this benchmark, but it’s worth noting that GPUs aren’t the entire ball of wax in PC gaming.
Tom Clancy’s HAWX is an action flight sim; we used the DirectX 10 version in our testing. Once again, AMD’s Radeon HD 5830 is the weak sauce, illustrating what an odd duck AMD’s lower end midrange card really is. While the HD 5870 wins out over the two GTX 460 cards, it’s no match for the paired GTX 470s or 480s.
Just Cause 2 is one of the few titles that requires DirectX 10 as a minimum, as well as running only on Windows Vista or Windows 7. The Concrete Jungle test throws a lot of postprocessing effects at the card, and can hammer frame rate – and it’s not particularly CPU bound. Note that we disabled the Nvidia-specific Bokeh and water effects to keep the playing field even.
AMD stays in the game with Just Cause 2, at least with paired Radeon HD 5870s. The Radeon HD 5830 and 5850s don’t fare quite as well. Once again, the 5830 is the weak sister of the bunch.
DirectX 11 adds demanding new features, including hardware tessellation and new lighting effects. We turned up everything we could in our DX11 benchmarks. Note that the 285 GTX is now out of the picture, since it doesn’t support DirectX 11’s new features.
We’ll begin with a synthetic test, Heaven, based on the Unigine game engine. Note that we’ve cranked up tessellation to “Extreme”. It’s interesting to see how the 768MB GTX 460 falls off a cliff, probably due to that card’s smaller frame buffer, exacerbated by the lower bandwidth available with the 192-bit wide memory interface.
More importantly, AMD is roundly trounced in this benchmark. Developers we’ve spoken with note that AMD does pretty well with shader-heavy scenes, but Nvidia cards are – and we’ll quote an unnamed source here – “tessellation monsters.” It certainly shows here.
Okay, so Nvidia can pull out its self-described “can of whoop-ass” on AMD in a synthetic, tessellation heavy test. Let’s see how it does in actual game benchmarks.
There’s not a lot of hardware tessellation in BattleForge, but there are plenty of lighting effects – particularly SSAO-style effects. You’d think an RTS like BattleForge would be CPU bound, but it scales pretty well with GPUs. AMD just can’t keep up here; the HD 5830 falls behind everyone else, while the HD 5850s and 5870s are beaten down by the higher end GTX 470 and 480 cards in SLI mode.
It’s another Nvidia sweep. While the HD 5870 comes close to the GTX 470, it still falls just short. The HD 5850 is in a dead heat with the less expensive GTX 460 1GB cards. Note how the HD 5830 once again brings up the rear.
The Aliens versus Predator benchmark uses DX11 tessellation. It’s worth noting that AMD can stay pretty close to Nvidia if you leave AA dialed down, but once you add anti-aliasing, the big green machine turns up the nitro and leaves the boys in red far, far behind.
New game, same old story: Nvidia wins. Note that it appears that the GTX 470 beats the GTX 480, but it’s really a dead heat – what we’re seeing here is the GPU waiting for the CPU in a classic CPU-bound situation.
So it’s apparent that Nvidia’s high end cards rule the roost in performance. Why not just slap a pair of GTX 480s in your system and be done with it? Well, first there’s the price. Two GTX 480s will set you back a solid kilobuck or more. A pair of Radeon HD 5870s will set you back less than $800, if you shop for standard clocked cards.
Then there’s power. This chart suggests why you might not want to just drop in two GTX 480 cards.
It’s worth noting that two GTX 480s seem to use slightly less power than a pair of GTX 470 cards when the system is idling, though it’s not a large difference. But look at those numbers for the system under load. Those power numbers were captured using a Watts Up Pro meter connected via USB to a PC to collect power data in real time, while running the Unigine Heaven benchmark at 2560x1600 with 4x anti-aliasing.
If a single GTX 480 is one power hungry card, paired factory overclocked GTX 480s are power hungry monsters. Note that our 850W Corsair power supply had no problems delivering the 665W of system power needed under full load. A pair of Radeon HD 5870s consumed fully 230W less than the GTX 480s. At idle, two HD 5870s ran nearly 40W cooler.
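Using the figures above, a quick sanity check on PSU headroom (a sketch; note that the 665W is our whole-system draw measured at the wall, which includes PSU conversion losses, so the actual DC load on the supply’s rails is somewhat lower):

```python
# Whole-system draw under load, measured at the wall (figures from our testing).
psu_rating_w = 850            # Corsair TX850W rated output
gtx480_sli_load_w = 665       # paired, factory overclocked GTX 480s
hd5870_cf_load_w = 665 - 230  # the Radeon HD 5870 pair drew 230W less

headroom_w = psu_rating_w - gtx480_sli_load_w
print(f"GTX 480 SLI leaves {headroom_w}W of headroom "
      f"({gtx480_sli_load_w / psu_rating_w:.0%} of the PSU rating, at the wall)")
```

Even before accounting for PSU efficiency, the GTX 480 pair uses most of an 850W supply’s rating, while the Radeon pair leaves far more breathing room.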
It’s worth noting that neither the GTX 480 nor GTX 470 running in SLI mode ever had any problem with any game test. However, they did get extremely loud – enough so that you’d want to either use a headset to hear your game audio or really pump up your speaker volume. You’ll want a solid case with good airflow, which will mitigate both heat and noise somewhat.
AMD has invested substantial time and resources improving CrossFire X, but it’s clear that Nvidia’s SLI is currently the superior dual GPU solution. Nvidia’s current generation of DX11 cards scales very well when you add the second card, and SLI still works with more titles than CrossFire X, though that gap is narrowing.
One thing that’s quite evident from all our testing: the Radeon HD 5830 is very much an odd duck. It’s been our experience that HD 5830s, whether running singly or in CrossFire X mode, just don’t have the horsepower to justify their current price.
On the other hand, there are times you might want to consider paired Radeon HD 5850s or 5870s, particularly if power, heat and noise are concerns. However, both the GTX 470 and 480 in SLI seriously spank AMD’s best in sheer performance.
In many ways, the GTX 460 impresses us more. Running in SLI mode, these cards scale well, never seem to get particularly loud or hot, and stay close in performance to AMD’s high end cards; they certainly outperform the HD 5850 in most cases. And if you’re looking at Nvidia specific features, like PhysX or 3D Vision, GTX 460s are affordable, even if you buy a pair. We’d strongly suggest stepping up to the 1GB version, however.
So are dual GPUs worth the cost and hassle? If you’re a serious gamer with 1080p displays or better, the answer is definitely “yes.” But take a look at what you plan on running, your resolution and other factors before you drop in that second card.