AMD's Hawaii-based flagship GPU has finally arrived to take on Nvidia's super-sized GK110. This is a GPU grudge match that fans of hardcore PC performance have been waiting for, as both companies have been ratcheting up the tension ahead of today's announcement for the past few weeks.
The bottom line is this - at just $549, the R9 290X represents a very serious threat to Nvidia's single-card GPU dominance. Read on to see how it fares against Nvidia's top-shelf silicon, and what it all means for PC gamers.
The Radeon R9 290X is the first GPU from AMD based on its new silicon, dubbed Hawaii. This new chip is a tweener, bigger than the previous flagship die named Tahiti but not quite as big as Nvidia's Big Kepler core, GK110. The Hawaii die measures 438mm² compared to GK110's 551mm², and packs 6.2 billion transistors compared to GK110's 7.1 billion. AMD recently argued in an interview with Forbes magazine that it didn't need a die as big as GK110 to deliver the same performance, and as it turns out, it was right.
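AMD's density argument checks out on paper. Here's a quick back-of-the-envelope calculation using only the die sizes and transistor counts quoted above (illustrative arithmetic, nothing more):

```python
# Back-of-the-envelope transistor density from the quoted figures.
chips = {
    "Hawaii": {"area_mm2": 438, "transistors": 6.2e9},
    "GK110": {"area_mm2": 551, "transistors": 7.1e9},
}

for name, c in chips.items():
    density_m = c["transistors"] / c["area_mm2"] / 1e6  # million transistors per mm^2
    print(f"{name}: {density_m:.1f}M transistors/mm^2")
# Hawaii: 14.2M transistors/mm^2
# GK110: 12.9M transistors/mm^2
```

Hawaii's roughly 10 percent density advantage helps explain how AMD extracts GK110-class performance from a noticeably smaller die.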
AMD's reference cooler design isn't quite as fancy as Nvidia's flagship design, but it still has a menacing look to it.
The R9 290X will sit at the top of AMD's product stack for the near future as its flagship 28nm GPU built on the GCN architecture. It cuts an imposing figure at 11 inches long, but requires only one six-pin and one eight-pin power connector instead of the dual eight-pins we've seen on some of the Tahiti boards. AMD isn't saying what the card's TDP is, though it did mention to us in an offhand manner that it's in the neighborhood of 250W; based on our testing, it certainly seems a bit higher than that. Hawaii will be offered in two SKUs - the R9 290X you see here today, and the R9 290 model that is still to come, so hopefully we'll have more info on that board in the coming weeks. For now, let's examine the specs of the card along with its competitors.
As you can see, on paper this is one hell of a competitive offering from AMD, and the surprisingly low price of $549 is the clearest sign yet that AMD is dead set on hitting Nvidia where it hurts: price-to-performance. Nvidia has been fortunate to remain unchallenged in the upper stratosphere of pricing for all of 2013, dual-Tahiti cards excluded, but it appears those days are now over. Nvidia will certainly have to respond with price cuts for its GTX 780, GTX Titan, or both. We'll update this post when and if that happens.
When AMD first shared the specs of the R9 290X with us, the clock speed section read "up to 1GHz," and we figured AMD was simply still finalizing the details ahead of the card's release. We've seen clock speeds change at the very last minute in the past, from both AMD and Nvidia, so this is nothing new. What we now understand after testing the card is that the clock speed really is "up to 1GHz," thanks to the board's new power-management scheme, known as PowerTune. Basically, the card will run at the highest clock speed it can given its operating temperature, all the way up to 1,000MHz. If it hits its thermal ceiling, which happens to be exactly 94C, it will lower its clock speed dynamically in order to maintain that temperature. It's very similar to what Nvidia is doing with GPU Boost 2.0, and PowerTune lets you control variables including clock speed, maximum temperature, and maximum fan speed. You can tweak all of these in a semi-confusing matrix, shown in the photo below.
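AMD hasn't published PowerTune's actual control algorithm, so the following is only a sketch of the behavior we observed: a thermostat-style loop that nudges the clock up toward 1,000MHz when thermals allow and backs it off at the 94C target. The function name, step size, and clock floor are our own invention, not AMD's implementation:

```python
MAX_CLOCK_MHZ = 1000   # the advertised "up to 1GHz" ceiling
TEMP_TARGET_C = 94     # the throttle point we observed in testing
MIN_CLOCK_MHZ = 700    # hypothetical floor, purely for this sketch
STEP_MHZ = 10          # hypothetical adjustment granularity

def next_clock(clock_mhz: int, temp_c: float) -> int:
    """Return the next clock target: back off at the thermal
    ceiling, otherwise climb back toward the 1GHz maximum."""
    if temp_c >= TEMP_TARGET_C:
        return max(MIN_CLOCK_MHZ, clock_mhz - STEP_MHZ)
    return min(MAX_CLOCK_MHZ, clock_mhz + STEP_MHZ)
```

A loop like this explains what we saw under sustained load: the clock oscillating in the mid-900MHz range while the temperature sits pinned at the target, rather than holding a fixed frequency.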
This is the interface for AMD's PowerTune utility. You'll also be able to use third-party overclocking tools.
The R9 290X ships with a teeny, tiny switch atop its PCB that lets you toggle the card between Uber and Quiet modes. In Quiet mode, which is the default and what we recommend, the fan will never spin faster than 40 percent, which keeps the card relatively quiet. It's not "whisper quiet," but it's certainly quieter than the HD 7970 GHz Edition we have on hand, that is for damn sure. AMD has made tremendous strides in keeping the R9 290X quiet, and to quote Gordon during testing, "It's like it's not even an AMD card." In Uber mode the fan spins up to 55 percent, and you can definitely hear it. The primary benefit to this mode is that the extra cooling afforded by the additional fan speed lets the GPU operate at 1,000MHz all the time. However, in our testing we found almost no difference in performance between the two modes, and even in the AMD-supplied benchmarks there was very little difference. Therefore, our recommendation is to just run the card in Quiet mode at all times. Your ears will thank you, and you won't notice a significant performance loss. One annoyance: in order to switch modes you have to shut the system down, flip the switch while it's off, and then hit the "default" button in the Catalyst Control Center to make sure the change has been applied. It's too clunky to really be useful, and seems like a feature AMD tacked on at the last minute. Calling it "Uber" is just silly, too, since it provides a barely noticeable performance increase.
The Dual Bios switch on the R9 290X toggles between Uber and Quiet modes.
That about covers the basics, so hit page two to see how it performs!
As you can imagine, we were pretty stoked to benchmark this card. After all the buildup, we could not wait to see if it was actually going to be competitive with, or possibly dethrone, Nvidia's GK110 GPUs. We strapped the R9 290X to our Core i7 test machine, installed Catalyst 13.11 Beta 5 drivers, and began our testing at our standard settings: 2560x1600 with all settings at maximum and 4XAA.
As you can see from the chart, any question we had as to whether this card could compete with the GTX 780 evaporated as soon as we started testing. It trades blows quite well with the GTX 780, and was faster in 3DMark, Crysis 3, Tomb Raider, and Battlefield 3. In several other tests it was just a handful of frames slower, and it came extremely close to topping the almost twice-as-expensive GTX Titan in a few tests as well, which is impressive. It doesn't hand the GTX 780 a crushing defeat, but it lands several hard punches and stays very close in performance, and when you consider it costs $100 less, it becomes an obvious win for AMD. It is clearly the fastest card at the $550 price point, no question.
Both AMD and Nvidia have really been stressing how 4K is the future and that their new GPUs are more than ready for this kind of action. We concur: it is the next level for hardware junkies, and offers an amazing level of detail that is simply not possible on the 2560x1600 displays currently in our lab. To test this card and its Nvidia equivalents we used the new 32-inch 4K panel from Sharp (PN-K321), which runs at 3840x2160 and is functionally the exact same display we used for Dream Machine 2013. In the interest of full disclosure, AMD sent us the panel for testing, but we examined it thoroughly to make sure it wasn't a "cheater" panel, as if that is even possible. Also, astute readers will notice we swapped motherboards for the 4K tests; this was due to our original test platform experiencing a malfunction on one of its PCIe slots. Without further ado, here are the benchmarks.
Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition run on an Asus Rampage IV Extreme board. We used 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games were run at 3840x2160 with AA disabled.
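For a sense of how much heavier this workload is than our standard test settings, here's the raw pixel math (simple arithmetic, nothing vendor-specific):

```python
# Pixels per frame at 4K vs. our usual 2560x1600 test resolution.
uhd_pixels = 3840 * 2160   # 8,294,400 pixels
lab_pixels = 2560 * 1600   # 4,096,000 pixels
print(f"4K pushes {uhd_pixels / lab_pixels:.2f}x the pixels per frame")
```

That's roughly twice the pixels to shade every frame, which is also why AA stays disabled for these runs.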
Having a look at the benchmark chart, it's easy to see why AMD is stressing 4K performance with this card: it beat the GTX 780 in Unigine Valley, Crysis 3, Far Cry 3, Tomb Raider, and Battlefield 3. Of course, the R9 290X still loses to the GTX Titan in most tests, but it matched it in Crysis 3 and Tomb Raider, and even beat it by two frames per second in Battlefield 3. We know those are all AMD-badged titles, so perhaps this isn't a surprise, but again, when you examine the price disparity between the cards it is clear AMD has a real winner on its hands with the R9 290X. It is just as fast as, and in a lot of games faster than, its competitors from the Green Team, so well done, AMD.
It's all been quite positive up until now, so here's a little bit of bad news. This card runs hot, as in it sat at 94C the entire time we were testing at default settings. As hardware testers used to seeing GPUs hover around 80C, this was quite a surprise; we've never seen a card run this hot before. Even the Nvidia cards, with their fans set to around 20 percent, typically never got hotter than 85C or so. To AMD's credit, even though the R9 290X ran hot it was 100 percent stable throughout testing, and never rebooted, shut down, showed on-screen artifacts, or exhibited any sign of overheating whatsoever. We even left it running Heaven 4.0 over a three-day period, only to come back and find it purring right along, pegged at 94C, with the GPU's clock speed fluctuating around 925MHz. The air around the card was also quite cool, so the reference cooler AMD has built does a very good job of exhausting air out the back of the card.
The R9 290X ran at 94C during testing, but was totally stable and didn't overheat at all (surprisingly).
We also asked AMD about these temps, since we were initially concerned we had missed a setting somewhere, and it assured us the card was designed to run at 95C until the cows come home, and was in no danger of overheating or exhibiting any weird behavior. As crazy as it sounds, AMD is right; this card is happy to run at those temps all day long. There's just one problem with this situation: since the card was already running at its maximum temperature in stock trim, we couldn't overclock it at all. To do that we'd need some headroom on the temperature side of things, and that just wasn't happening. We could have turned up the fan speed, but the card gets loud fast; going from 40 percent to 47 percent takes it from semi-quiet to clearly audible. We talked to AMD about this, and it told us that water blocks, once available, will offer at least 20C or so of headroom, which will grant the user some overclocking ability. We also expect add-in-board manufacturers to come to market with custom coolers, which could accomplish this goal as well; however, AMD told us the card will only be available with the reference cooler initially.
Based on everything we heard about this card before its release, we were fairly certain it would be competitive with the GTX 780 and possibly the GTX Titan, but we weren't sure just how competitive. Now that we have the numbers, we can say it's extremely competitive with both of those cards, and with its support for AMD's new Mantle API as well as TrueAudio, it's a contender for sure, with exclusive features not available from Nvidia at this time. Of course, the same can be said for Nvidia too, with its new G-Sync technology, PhysX, generally excellent drivers, GeForce Experience and ShadowPlay, TXAA, and so forth. However, what Nvidia can't provide (at this time, at least) is a price similar to the R9 290X's. Its GTX 780 is $100 more expensive, and the GTX Titan is $450 more expensive, which makes both of those cards seem overpriced in comparison. Nvidia recently announced the GTX 780 Ti, but has been tight-lipped about specs and pricing so far, so it will be very interesting to see how it prices this card. If it puts the GTX 780 Ti at $550 to match the R9 290X, it would have to go even lower with the GTX 780, which would be unprecedented. AMD also still has the R9 290 waiting in the wings, which could do some serious damage to the GTX 770 and prompt a total shakeup in Nvidia's lineup.
All in all, AMD has a winner on its hands with the R9 290X. It's faster than Nvidia's GTX 780 in a lot of our tests, and costs $100 less, making it by far the fastest GPU available at its $550 price point, and the best price-to-performance GPU available on the upper end of the market as well. It's been a long time since we've seen AMD deliver such a decisive victory, and we're certainly glad to see competition ramping up. It'll be interesting to see how Nvidia responds, that much is certain.