We rely mostly on objective benchmarks here at Maximum PC, especially when it comes to evaluating videocards. It’s just easier to defend a verdict that’s based on frames per second because—assuming you’re using a good benchmark and the same parameters—you’ll get pretty much the exact same number with every run.
Of course, frame rate isn’t everything, especially if your PC’s primary mission is something other than gaming. If you use your computer for editing video, watching movies, or manipulating digital photographs, you’re much more interested in visual quality. Judging image quality, however, is much more difficult because it’s necessarily a subjective task.
But we’ve been hearing whispers from sources (who wish to remain anonymous, although we can tell you they represent neither AMD nor Nvidia) that ATI GPUs deliver better image quality than Nvidia’s. ATI product managers made a similar claim while rolling out their AVIVO technology initiative, but neither AMD nor Nvidia has had much to say on the topic for quite some time.
Never ones to let sleeping dogs lie, we decided it was time to settle this issue Maximum PC style: We gathered a bunch of our game-playing, movie-watching, photo-editing colleagues and challenged them to a blind taste test. Would the consensus opinion favor AMD or Nvidia, or would anyone be able to discern any differences at all?
A lot of thought went into developing our test methodology. Here are the details regarding the hardware and software we used, along with our rationale for making these choices.
The fact that we awarded HP’s VoodooPC-designed Blackbird 002 a 7 verdict in our Holiday 2007 issue didn’t dissuade us from using the innovative rig for this challenge. Although we panned the particular eval unit we received because it included Radeon HD 2900 XT cards in CrossFire, instead of the much faster GeForce 8800 GTX or Ultra cards, we lavished praise on its innovative industrial design, supremely quiet cooling apparatus, and—most significantly—its ability to run either two Nvidia cards in SLI or two Radeon cards in CrossFire on an SLI motherboard.
HP's Blackbird 002 enabled us to compare SLI and CrossFire videocard performance in otherwise identical rigs.
That unprecedented flexibility prompted us to request a matched set of Blackbirds from HP, each equipped with an Intel Core 2 Extreme QX6850 quad-core CPU (3GHz, overclocked to 3.33GHz), 2GB of Corsair Dominator XMS2 DDR2 RAM, and three Western Digital 150GB Raptor hard drives (in RAID 0). All of this was plugged into an Asus Striker Extreme Nvidia nForce 680i SLI motherboard.
We asked HP to configure one rig with two ATI Radeon HD 3870 cards in CrossFire and one with two Nvidia GeForce 8800 Ultra cards in SLI. We also asked HP to provide us with two additional cards from each camp (more on that later). We chose the Radeon HD 3870 cards because they’re based on the best GPU that AMD currently has to offer.
We soon realized we’d made a mistake in configuring the machine with 8800 Ultra cards, however, because those cards don’t support HDCP on both links in their dual-link DVI connectors. Without that, you can’t view encrypted HD video content in high definition on a 30-inch LCD (our screen choice for this challenge). The other problem was that the Ultras were too fast for our purposes: We couldn’t come close to synchronizing frame rates in our gaming tests on the ATI and Nvidia machines.
So we moved down to Nvidia’s 8800 GT. It supports HDCP on both links, the frame buffers on the cards we selected are the same size (512MB) as those on the 3870s, and the ATI and Nvidia cards would run our game benchmark at approximately the same speed (our objective being image-quality comparison, not frame-rate measurement).
We paired the Blackbirds with identical HP LP3065 30-inch LCD monitors. We set the brightness controls to the same values, and then calibrated the two monitors using a Pantone HueyPro calibration kit.
We chose the cinematic built-in benchmark from World in Conflict to test DirectX 10 gaming performance (we remain unimpressed with Vista, but these GPUs were ostensibly designed for DirectX 10 performance). In order to achieve the smoothest frame rate, we reduced the game’s resolution to 1280x800, set most of its values to medium, and turned off the water-reflection settings. This enabled both cards to run the demo at about 40 frames per second.
We had our test subjects view a sequence from the HD DVD disc Blue Planet to evaluate high-definition video quality. This IMAX film features spectacular clips filmed from above and around the planet, which made for a much more diverse viewing experience than a Hollywood movie would have provided.
For our final test, we asked our test subjects to examine a very high definition (2592x3888 pixels) portrait of a female model, shot with a Canon EOS-1D Mark III (we obtained the photo from Canon’s website). See representative samples from each of our tests on the next page.
We recruited our 21 evaluators from the ranks of the Future US staff, including editors and art directors from other print and online publications. We chose these individuals for their expertise in evaluating image quality across all three of our test criteria.
We set up the two Blackbirds in the Maximum PC Lab, with the monitors placed side by side, at the same height and at the same angle to the viewer. We sat each test subject on a rolling stool, so he or she could easily roll back and forth between the two monitors in order to avoid visual distortion caused by off-axis viewing angles.
The test administrator told each subject only that we were evaluating image quality; the subjects were not informed that we were evaluating videocards or any other hardware. Neither of the test rigs was outfitted with speakers. The test administrator asked each subject to express a preference for the image displayed on monitor A or monitor B, or to express no preference for either. Subjects were expressly told that “no preference” was a perfectly valid opinion, but if they did choose A or B, they were asked to explain their rationale for that decision.
To control for positional bias, we conducted nine tests with our CrossFire system labeled as A and our SLI system labeled as B, then reversed the labels for our next six tests. We also established two control groups of three tests each, in which both A and B were CrossFire and then both A and B were SLI.
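The counterbalancing scheme described above can be sketched in a few lines of Python. This is purely an illustration of the test design; the `build_schedule` function is ours, not a tool we actually used:

```python
import random

def build_schedule():
    """Build the 21-trial schedule: nine trials with CrossFire as A
    and SLI as B, six with the labels reversed, and two control
    blocks of three trials each in which A and B are identical."""
    trials = (
        [("CrossFire", "SLI")] * 9          # CrossFire labeled A
        + [("SLI", "CrossFire")] * 6        # labels reversed
        + [("CrossFire", "CrossFire")] * 3  # control: both rigs CrossFire
        + [("SLI", "SLI")] * 3              # control: both rigs SLI
    )
    random.shuffle(trials)  # subjects see the conditions in random order
    return trials
```

Shuffling the trial order keeps any one subject from inferring a pattern from earlier sessions.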
A closer look at our benchmarks - and why we selected them.
High-Resolution Digital Photo: Eustace
We selected this portrait because it was shot by a professional using a very high-resolution digital SLR camera (Canon’s 10MP EOS-1D Mark III). We anticipated that our test subjects might discern differences in skin tone, hair color, black levels, and similar details.
DirectX 10 Game: World in Conflict
The shipping version of Massive Entertainment’s sumptuous RTS World in Conflict has more DX10 eye candy than the beta version we’ve used in the past. We selected this game because it has a built-in benchmark that uses the game’s engine to render an action-packed animation sequence. We thought our test subjects might see differences in color rendering, antialiasing, and lighting. We expressly told them not to evaluate frame rate or animation quality.
High Definition Video: Blue Planet
We chose this disc for several reasons: The film was originally shot in IMAX format, and the digital transfer is excellent. We wanted video clips with diverse content, and this movie provides an abundance of it, ranging from sequences shot from the International Space Station to farmers setting fires in the Amazon rain forest to clear land for farming. We expected our test subjects might see differences in color rendering or spot decoding artifacts.
A breakdown of our test subjects’ preferences when comparing content on ATI CrossFire with Nvidia SLI (control group results not included)
The first chart shows the subjects’ responses for the DX10 game, the high-definition video clip, and the high-res digital photo. You can see whether they consistently picked one vendor over the other, or if they preferred different GPUs for different applications.
The second chart sums up the total number of responses for each videocard and the total number of no-preference responses in each category. A quick glance shows a slight overall preference for CrossFire, but read on for a more detailed analysis of the results.
| Subject | DirectX 10 Game | HD Video | Digital Photo |
|---|---|---|---|
| Subject 2 | CrossFire | No Preference | CrossFire |
| Subject 3 | SLI | SLI | No Preference |
| Subject 4 | SLI | No Preference | CrossFire |
| Subject 7 | No Preference | CrossFire | SLI |
| Subject 8 | CrossFire | No Preference | No Preference |
| Subject 9 | CrossFire | CrossFire | No Preference |
| Subject 13 | SLI | SLI | No Preference |
| Subject 15 | CrossFire | CrossFire | No Preference |
| Category | CrossFire | SLI | No Preference |
|---|---|---|---|
| DirectX 10 Game Preferences | 8 | 6 | 1 |
| HD Video Preferences | 7 | 5 | 3 |
| Digital Photo Preferences | 6 | 4 | 5 |
Unvarnished opinions from our test subjects.
We went to great lengths to avoid influencing our test subjects one way or the other. We gave them minimal instructions, and we made it clear they shouldn’t feel pressured to choose A over B or vice versa.
Our first subject, a female art director, immediately pointed to monitor A, which happened to be the CrossFire rig (we randomly reversed the test setup between subjects) and said, “That one looks sharper. Monitor B looks a little fuzzy, and I think monitor A has better color quality.” Moving on to the HD DVD test, this same subject pointed to monitor B, the SLI rig, and said “The video on monitor B seems more saturated, but the color in monitor A looks more accurate.” This same subject voted for CrossFire in the portrait test, making a clean sweep for AMD.
Many of our other subjects had a more difficult time choosing a favorite. A male editor at one of our gaming magazines preferred Nvidia’s gaming visuals, saying, “The tank looked as though it had more detail, but the difference is very small.” He picked AMD, however, when it came to evaluating the digital photo: “The color temperature is very subtly higher,” he said, “and the highlights in the model’s hair look more golden on monitor B (the SLI rig).” When it came to the HD video test, he expressed no preference at all.
Although none of our subjects picked the SLI rig as their preference across the board (three did so for the CrossFire machine), that didn’t stop some individuals from expressing strong preferences for Nvidia in each of the three categories. “The colors on monitor B look richer,” said another male editor, referring to the digital photo displayed by the SLI rig, “and I feel like I’m seeing more texture because of that.”
While watching a segment on slash-and-burn farming practices on the HD DVD, many of our observers noticed that the flames on the SLI rig were deep red, while the CrossFire machine rendered the fire more orange. The majority of these subjects expressed a preference for the CrossFire rig, explaining that the orange fire looked more natural.
Despite all our assurances that expressing “no preference” was a valid opinion, nearly everyone in our control group insisted they could see differences in image quality, even though three of them were unknowingly comparing SLI to SLI and the other three were comparing CrossFire to CrossFire.
Beauty is in the eye of the beholder, so everyone's a winner!
We pride ourselves on making binary recommendations, but that’s impossible in this scenario unless we also take speed into account. While it’s true that our test subjects leaned slightly toward AMD’s image quality, awarding AMD 21 wins to Nvidia’s 15, the margin of victory in each category is just two votes. That gives AMD a slight edge at the $250 price point, but it leaves Nvidia unchallenged at every higher segment.
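As a back-of-the-envelope sanity check (our own, not part of the original test protocol), a simple two-sided sign test shows that a 21-to-15 split among 36 decisive votes is well within what fair coin flips would produce:

```python
from math import comb

def sign_test_p(wins, losses):
    """Two-sided sign test: probability of a split at least this
    lopsided if each decisive vote were a fair coin flip."""
    n = wins + losses
    k = max(wins, losses)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Pooled decisive votes across all three tests: 21 for AMD, 15 for Nvidia
print(round(sign_test_p(21, 15), 3))
```

The resulting p-value is nowhere near conventional significance, which is consistent with calling AMD’s edge “slight.”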
Our subjects had the strongest opinions when it came to gaming image quality, with eight expressing a preference for ATI CrossFire, six preferring Nvidia SLI, and only one citing no preference for either solution.
But does it really matter? The Radeon HD 3870 is the best GPU that AMD has to offer today, and Nvidia has three pricier SKUs capable of beating the 3870 to a bloody pulp: the revamped 8800 GTS, the 8800 GTX, and the 8800 Ultra. If you’re a hardcore gamer, would you be willing to take a major performance hit in order to render your game experience just a wee bit more shiny and colorful? We didn’t think so.
Seven of our experts preferred AMD’s video performance, compared with five who fancied Nvidia’s; three expressed no preference for either solution. As odd as it sounds, many people had strong opinions about the color of the fire in the video sequence we showed them. But if you decide to go with one of Nvidia’s faster GPUs, be aware that the 8800 GTX and 8800 Ultra are incapable of offloading all the HD video-decoding chores from the host CPU. More importantly, neither of these cards is equipped with dual-link DVI connectors that support HDCP on both links (the Radeon 3850, Radeon 3870, GeForce 8800 GT, and G92-based 8800 GTS all do), ruling out protected HD DVD and Blu-ray playback on a 30-inch display.
By coincidence, the same two-vote margin separates those who preferred AMD’s digital-photo rendering (six) from those who preferred Nvidia’s (four). We find it more interesting that five people expressed no preference in this category, more than in the other two categories combined. We had predicted that having the subjects stare at a static image would result in nearly everyone judging one solution or the other to be superior.
Only three of our 15 evaluators gave AMD a win across the board (none did so for Nvidia). Of the 12 with more mixed opinions, five leaned toward Nvidia (giving the 8800 GT the nod in two of three categories), four leaned toward AMD (all of whom favored its game-image quality), and three were truly mixed, expressing a preference for CrossFire in one category, SLI in another, and no preference in a third.
We were surprised that only three of the six people in our control group expressed no preference between display A and display B, and each in only one category. Since they were unknowingly comparing identical rigs, we expected nearly everyone to admit there was no difference between the two displays. Because most of our subjects are professional critics, we suspect they felt an inherent obligation to discern some difference between the displays they were staring at, despite our assurances to the contrary.
The control group did help eliminate the display itself as a variable: If the monitors had colored our evaluators’ opinions, the votes would have been lopsided in favor of one or the other. Of the control group’s 18 opinions, nine favored monitor A, six favored monitor B, and three expressed no preference.
The good news for anyone shopping for a new videocard is that you don’t need to sacrifice image quality for performance. Based on our blind tests, the GPUs from both AMD and Nvidia deliver similar visual quality with games, high-definition video, and digital photos.
That’s good news for AMD, too, because now the company need only worry about catching up on one performance metric: frame rate. Unfortunately, we don’t think CrossFireX is going to be a panacea in the interim. Running four moderately powerful videocards in one box will never be as cost effective as building a rig with one super-powerful GPU—especially if the CPU in that box is an Intel quad core. Sorry, Phenom.
That leaves Nvidia in the catbird seat—again. But it won’t have the perfect solution either until it replaces the 8800 GTX and 8800 Ultra with parts that support HDCP on both links of their dual-link connectors and that are capable of offloading all HD video decoding from the host CPU.