AMD Will Discuss Next-gen GPUs ‘Later This Quarter’ http://www.maximumpc.com/amd_will_discuss_next-gen_gpus_%E2%80%98later_quarter%E2%80%99300 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/radeon_graphics_logo_0.jpg" alt="Radeon R9 300 Series" title="Radeon R9 300 Series" width="228" height="193" style="float: right;" />Nvidia’s latest and greatest GPUs remain unchallenged</h3> <p>Advanced Micro Devices will talk about its much-awaited (and long overdue) next-gen graphics cards later in this quarter, the company’s CEO Lisa Su said in an earnings call earlier this week. This is significant because <strong>if there was ever a need for a graphics card refresh, it’s right now.</strong></p> <p>Until recently, many people were wondering if the launch would happen in time for GTA V’s PC release, but AMD remained noncommittal, offering nothing more than a vague, albeit tantalizing, response. “We’re giving finishing touches,” is all it said. GTA V is already here, of course.</p> <p>To be fair to AMD, it’s not as if the company promised something in a particular timeframe and failed to deliver on time. Even during the previous quarterly earnings call, all Su would say was that the company had some “very good” graphics cards lined up for the second quarter of 2015. But the company is not doing itself any favors by letting <a href="http://www.forbes.com/sites/jasonevangelho/2015/03/17/with-titan-x-nvidia-has-introduced-4-unanswered-gpus/">Nvidia’s latest and greatest GPUs go virtually unchallenged</a> for so long (over 9 months and counting).</p> <p>“So as we go into the second half of the year, we would like to see some regain of share in both the desktop and the notebook business,” Su said, responding to a question from Credit Suisse’s John Pitzer.</p> <p>“I've talked about Carrizo being a strong product for us, I talked about some of our graphics launches that we'll talk about later this quarter. So from our standpoint, I would say the first half of the year, we had some, let's call it, some of our issues that we were correcting in terms of the channel, and then a weaker than expected market environment.”</p> <p>The <a href="http://www.anandtech.com/show/9172/amd-posts-q1-2015-results-180-million-net-loss">first quarter wasn’t a particularly good one for the company</a> as it slumped to a $180 million net loss. Although a significant improvement over the previous quarter, in which it reported a net loss of $364 million, this is several times worse than the $20 million loss it recorded during the same period last year.</p> <p>Hopefully, things will improve once the new graphics cards are finally available. 
Kitguru <a href="http://www.kitguru.net/components/graphic-cards/anton-shilov/amd-reiterates-plans-to-introduce-new-radeon-r9-300-series-gpus-in-june/" target="_blank">expects them to be unveiled at Computex in June.</a></p> <p>In the meantime, we will have to make do with unverified reports and rumored specs.<br />Coalescing relevant bits from the various R9 300 rumors we’ve heard so far, one can paint a <a href="http://www.maximumpc.com/xfx_radeon_r9_370_core_edition_leaks_web_higher_end_r300_series_cards_follow">somewhat detailed picture of the company’s next-gen GPU lineup</a>:</p> <ul> <li>AMD Radeon R9 390X: 28nm Fiji XT GPU, 3,584 cores, 224 TMUs, 64 ROPs, 4GB memory, $599</li> <li>AMD Radeon R9 390: 28nm Fiji Pro GPU, 3,328 cores, 208 TMUs, 64 ROPs, 4GB GDDR5, $399</li> <li>AMD Radeon R9 380X: 28nm Hawaii XTX GPU, 2,816 cores, 176 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 380: 28nm Hawaii Pro GPU, 2,560 cores, 160 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 375X: Tonga XT GPU, 2,048 cores, 128 TMUs, 32 ROPs, 2GB GDDR5, 384-bit, price unknown</li> <li>AMD Radeon R9 375: Tonga Pro GPU, 1,792 cores, 112 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370X: Trinidad XT GPU, 1,280 cores, 80 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370: 28nm Trinidad Pro GPU, 1,024 cores, 64 TMUs, 24 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R7 360X: Bermuda GPU, 896 cores, 128-bit GDDR5, price unknown</li> <li>AMD Radeon R7 350X/340X: Oland GPU, 320 cores, DDR3 and GDDR5 memory, 128-bit</li> <li>AMD Radeon R5 300: Caicos GPU, 160 cores, DDR3 memory, 64-bit</li> </ul> <p><em>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></em></p> http://www.maximumpc.com/amd_will_discuss_next-gen_gpus_%E2%80%98later_quarter%E2%80%99300#comments amd fiji xt gpu graphics card Hardware nvidia radeon r9 300 News Mon, 20 Apr 2015 06:13:05 +0000 Pulkit Chandna 29751 at http://www.maximumpc.com Choosing the Best AMD Graphics Card http://www.maximumpc.com/choosing_best_amd_video_card_2015 <!--paging_filter--><h3>Prefer the Red Team over the Green Team? We’ve got your back</h3> <p>There was a time when a dozen different companies were selling video cards and vying for your hard-earned cash. But, at least when it comes to gaming, the field has narrowed to just two: Nvidia and AMD. If you’re just doing spreadsheets, surfing the web, and playing the occasional Flash game, you’ll be fine with integrated graphics. But if you spend a lot of time shooting, racing, and flying, a dedicated graphics card is the way to go.</p> <p><a href="http://www.maximumpc.com/choosing_best_nvidia_graphics_card_2015" target="_blank">We covered Nvidia a couple of days ago</a>, and now <strong>we’re turning our crosshairs to AMD. What follows is a streamlined buying guide.</strong> No benchmark charts, diagrams, or spec sheets. We’ll link to places where you can get that stuff if you want, but here is where we condense the product line into a few pages of advice. Dig in!</p> <p><strong>Options #1, #2, and #3: Radeon R7 260, 260X, and 265</strong></p> <p>First, we’re looking at the cards in the $100–$130 range. Overall, AMD has a denser collection of options than Nvidia. This creates some overlap, so we’re combining these cards into one tier that’s roughly equivalent to the Nvidia GeForce GTX 750 and 750 Ti tier. 
Like that pair, these AMD cards are fine for playing at 1080p most of the time.</p> <p><img src="/files/u160416/r7_265.jpg" alt="Sapphire Radeon R7 265" title="Sapphire Radeon R7 265" width="620" height="473" style="vertical-align: middle;" /></p> <p><a title="Sapphire Radeon R7 265 review" href="http://www.maximumpc.com/sapphire_radeon_r7_265_dual-x_review_2014">We’d recommend the R7 265 with two gigabytes of VRAM to get the best performance.</a> Naturally, you’ll find those at the top of the price range, but we think it’s worth the extra bucks. These three Radeon cards also need only one 6-pin PCI Express cable, so you should be fine with a power supply in the 400–500 watt range. If you have around $200 to spend, though, there are better options from both AMD and Nvidia.</p> <p><strong>Options #4 and #5: Radeon R9 270 and 270X</strong></p> <p>Performance-wise, these cards are roughly equivalent to the Radeon HD 7850 and 7870, and the Nvidia GeForce GTX 670 and 760. So their price range is about $130–$180, if you include mail-in rebates. Like the other R7 cards, we recommend 2GB of VRAM for the best gaming experience at 1080p. These GPUs also will let you use higher visual settings than the R7 cards mentioned earlier. However, the 270 and 270X need two PCI Express power cables, and that’s rare to find with power supply units that are rated for less than 500 watts. So, you may need to upgrade your PSU or factor a more expensive one into your budget. Overall, we’d go with the 270X for a little extra oomph, unless you can afford something even speedier.</p> <p><strong>Options #6, #7, and #8: Radeon R9 280, 285, and 280X</strong></p> <p>The 280 and 280X are basically respun versions of the Radeon HD 7950 with Boost, and the HD 7970 GHz Edition. Which are also roughly comparable to a GTX 680 and 770. The 280 and 280X have 3GB of VRAM, which is a lot for 1080p, but not unwelcome. In fact, it’s enough to handle 1440p fairly smoothly, though you might want a second card in a Crossfire configuration to keep up at that resolution. The 285, however, has 2GB of VRAM and uses a newer, more power-efficient GPU core. Its performance falls in between the 280 and 280X, but because it generates a lot less heat, you can find it in sizes designed for a mini-ITX case.</p> <p><img src="/files/u160416/strix_285.jpg" alt="Asus Strix Radeon R9 285" title="Asus Strix Radeon R9 285" width="620" height="451" style="vertical-align: middle;" /></p> <p>It can also be much quieter, with some versions not even spinning up their fans until the GPU hits a certain temperature. It uses one PCI Express cable, while the 280 and 280X need two. Overall, <a href="http://www.amazon.com/b/ref=s9_acss_bw_hsb_PCGDEC14_s1_s?_encoding=UTF8&amp;ie=UTF8&amp;node=10510329011&amp;pf_rd_m=ATVPDKIKX0DER&amp;pf_rd_s=merchandised-search-2&amp;pf_rd_r=010AS36QFWGFNJAZTZ3T&amp;pf_rd_t=101&amp;pf_rd_p=2005577422&amp;pf_rd_i=8588809011">our favorite at this tier is the 285</a>, despite having less VRAM, because it can run cooler, quieter, and in a wider variety of cases.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/choosing_best_amd_video_card_2015?page=0,1">Click here to see the rest of your options, and the overall winner</a></h4> <hr /> <p><strong>Options #9 and #10: Radeon R9 290 and 290X</strong></p> <p>These are AMD’s top-performing single-GPU cards, and their performance will be within spitting distance of a GeForce GTX 970 and 980. 
The GTX 980 is a consistently faster card overall, but some gamers still opt for the 290X because it’s about $200 cheaper. However, the 290 and 290X need a lot of watts. We’d recommend 600 or more watts for one of these cards, and 850 watts or more for 2-way Crossfire.</p> <p><img src="/files/u160416/tri-x_290x_0.jpg" alt="Sapphire Tri-X Radeon R9 290X" title="Sapphire Tri-X Radeon R9 290X" width="620" height="284" style="vertical-align: middle;" /></p> <p>We’d also definitely avoid the “reference” cards, because they run quite hot and noisy. <a title="Sapphire Tri-X Radeon R9 290X review" href="http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review">Cooling designs like Sapphire’s Tri-X</a> or Vapor-X, ASUS’s DirectCU II, Gigabyte’s Windforce 3X, MSI’s TwinFrozr, and XFX’s Double Dissipation are highly recommended to keep these cards running quietly and relatively cool. If you’re okay with those stipulations, the 290 and 290X will give you a lot of bang for your buck—about $250 for the 290 and about $350 for the 290X.</p> <p><strong>Option #11: Radeon R9 295X2</strong></p> <p>This card is basically two R9 290X GPUs on a single card. Since these GPUs need a respectable amount of cooling, it should come as no surprise that the 295X2 has a closed-loop liquid cooler (CLC) built into it. This uses a 120mm radiator (bundled with a fan) that you must install on a fan mount somewhere in your case. The card is also 12 inches long. <a title="Radeon R9 295X2 review" href="http://www.maximumpc.com/amd_unleashes_dual-gpu_radeon_r9_295x2">And it's got the break-neck performance to justify all this.</a></p> <p><span style="color: #888888;"><img src="/files/u160416/295x2.jpg" alt="Radeon R9 295X2" title="Radeon R9 295X2" width="620" height="304" /></span></p> <p>So while it will get you more performance than a single GTX Titan X, for a few hundred dollars less, it needs a lot of real estate and a lot of power. As we mentioned earlier, an 850-watt power supply (or more) is highly recommended when dealing with multiple 290 or 290X GPUs. That said, the card runs pretty quietly, thanks to the CLC, and it takes up half as many motherboard slots as a 2-way Crossfire config. Like the Titan X, there are no third-party cooling designs, but the “reference” version here is quite good.</p> <p>You may sometimes see this card listed as having “8GB,” but Crossfire, like SLI, mirrors your VRAM, instead of letting you add the two card’s VRAM together. In practice, you will have a capacity of 4GB, just like the R9 290 and 290X, and the GTX 970 and 980. The GTX Titan X has a whopping 12GB, but we haven’t encountered a game or screen resolution where that felt like a necessity. 4GB is fine even with a 4K display (though you’d still want multiple GPUs to smoothly game at that point).</p> <p><strong>And the Winner Is...</strong></p> <p>Like Nvidia, AMD has a wide range of options that make picking a single winner difficult. The R7 265 is our pick at the entry level, and the R9 295X2 packs a ton of performance into about $700, enough to get decent frame rates at 4K and definitely plenty for 1440p. 
If there were a happy medium here, <strong>we’d go with the R9 290X as the best overall AMD GPU</strong>, provided that you get one with a large heatsink and multiple fans, and you have a sufficient power supply unit.</p> <p><img src="/files/u160416/r9_290.jpg" alt="MSI Radeon R9 290X" title="MSI Radeon R9 290X" width="620" height="452" style="vertical-align: middle;" /></p> <p>As far as PSUs go, we’ve had good experiences with EVGA, Corsair, Antec, Enermax, Silverstone, and SeaSonic. That’s not an all-inclusive list, just the brands that come to mind most often when we need a reliable PSU. The Rosewill Hive and Capstone have good reps too, but we haven’t had as much direct experience with those. You should expect to need two 8-pin PCI Express cables, and possibly an additional 6-pin connection if you want top-end cards like MSI’s “Lightning” edition. But if you can manage that, we think it’s worth it.</p> http://www.maximumpc.com/choosing_best_amd_video_card_2015#comments amd ati best hardware buying guide gpu graphics card radeon shopping guide Features Thu, 09 Apr 2015 20:23:38 +0000 Tom McNamara 29715 at http://www.maximumpc.com Choosing the Best Nvidia Graphics Card http://www.maximumpc.com/choosing_best_nvidia_graphics_card_2015 <!--paging_filter--><h3>Doing a little GPU shopping? We know what to put on your list</h3> <p>When you’re trying to figure out the next PC upgrade you should buy, there are at least two ways to go about it. Some people like going through lots of pages of benchmarks, analysis, galleries of the component in various states of disassembly, forum debate, and pictures of fluffy kittens. And that’s great, when you have the time. But not everyone does. For people who want a quicker breakdown of choices like <strong>which Nvidia video card you should buy</strong>, we can condense that into just a couple of pages. We’ll give you a quick tour through the various choices that you have at different price points, and what the pros and cons are at each stage. Then we’ll select an overall winner.</p> <p>For simplicity’s sake, we’ll be sticking to the current "Maxwell” generation of Nvidia’s cards. It has some features not available in the older Kepler generation, like Multi-Frame Sample Anti-Aliasing (MFAA), which is a highly efficient way of smoothing out jagged edges on 3D objects, and Voxel Global Illumination (VXGI), which creates shadows with a degree of realism that we hadn’t seen occurring in real-time before. So that means that our breakdown will be sticking to the GeForce GTX 750, 750 Ti, 960, 970, 980, and the recently released Titan X.</p> <p>If you’re wondering why we’re not doing a breakdown of AMD cards, don’t worry—that’s coming soon.</p> <p class="MsoNormal"><strong>Choices #1 and #2: GTX 750 and GTX 750 Ti</strong></p> <p>We’re combining these two cards because of their overall similarity. These are the entry-level enthusiast cards; the 750 comes in at about $100, and the Ti flavor starts at about $125. These are positioned as the next step up from integrated graphics. You are assisted by the fact that the regular GTX 750 does not even require a PCI Express power cable. It gets all the power it needs from the slot it’s plugged into, which provides up to 75 watts. Although that’s just average for an incandescent light bulb, it’s plenty to get some respectable gaming performance at medium settings. 
A few versions of the 750 Ti require a PCIe cable, but you still shouldn’t need serious power; a 400-watt power supply will be just fine.<img src="/files/u160416/evga960.jpg" width="620" height="499" /></p> <p>&nbsp;</p> <p>If this is the kind of card that you can afford, we recommend going for the 750 Ti, since it will give you some extra oomph needed to hit that magic mark of 60 frames per second in your games. And we definitely recommend the versions with 2GB of VRAM instead of 1GB, since current 3D games will happily take advantage of the additional capacity. Since these are entry-level cards, we can’t declare them as the "best," but <a title="GeForce GTX 750 Ti benchmarks" href="http://www.maximumpc.com/GTX_750ti_Benchmarks_2014">they’re fine for 1080p gaming most of the time</a>. These two cards are roughly comparable to the AMD Radeon R7 265 or 270.</p> <p><strong>Choice #3: The GTX 960</strong></p> <p>While the GTX 750 and 750 Ti are technically Maxwell cards, they don’t have the full feature set, so they don’t get MFAA and VXGI. So we sometimes refer to the cards above them as "Maxwell 2.0.” The GTX 960 is the least expensive version, setting you back around $200. It comes in 2GB and 4GB versions, with the latter costing around $240. This card’s performance is roughly comparable to AMD’s R9 Radeon 280 or 285. If you want some benchmarks for reference, <a href="http://www.maximumpc.com/nvidia_geforce_gtx_960_video_card_review?page=0,1" target="_blank">we have them in the GTX 960 review here</a>.Like the 750 and 750 Ti, the GTX 960 does not draw a lot of power. You can find versions that use the same 6-pin PCIe connection that some 750 Ti cards do. But some versions need two such connections, in which case you need at least a 500-watt power supply unit—that’s the threshold where PSUs start having multiple PCIe cables. Since they’re power-efficient, they don’t generate much heat, either, so the card can be more compact than before.</p> <p><img src="/files/u160416/asus-strix-gtx960-dc2oc-2gd5_3d-100563779-orig.jpg" width="620" height="412" /></p> <p>Overall, though, it’s difficult to recommend the GTX 960, because AMD offers comparable performance and power consumption for substantially less money, at least for the 2GB version. The 4GB version can get you somewhat higher framerates, but at $240, it’s not much less than a Radeon R9 290, which has much better performance than both cards. There’s a twist, though: The 290 also needs a lot of juice. We’d recommend a 600-watt power supply for one of those, and 850 watts if you wanted to add a second one to your system for Crossfire.</p> <p><strong>Choices #4 and #5: GTX 970 and 980</strong></p> <p>These are meaningfully different cards, but we’re grouping them together because they came out at the same time. <a title="GeForce GTX 980 review" href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014">When the GTX 980 arrived, it was Nvidia’s flagship card</a>—with a flagship price of $550, which hasn’t gone down much since its release in September last year. It’s faster than AMD’s beefiest offering, the Radeon R9 290X, while needing less power and less real estate inside your PC for its cooling system. It comes with 4GB of VRAM and happily makes use of all of that.</p> <p>The GTX 970 is slower across the board and has a funky VRAM management system where the first 3.5GB runs normally and the last 500MB is a bit hobbled, but it also costs about $200 less, and you won’t often encounter scenarios where that 500MB chunk slows things down. 
Unfortunately, the nature of the 970’s memory system wasn’t clearly communicated to the public, and there’s been some drama.</p> <p>.<img src="/files/u160416/7g0a0209_620_0.jpg" width="620" height="349" style="vertical-align: middle;" /></p> <p>The GTX 970’s biggest enemies, though, are arguably AMD’s R9 290 and 290X. The 290 is just a little bit slower and costs about $80 less, which is getting to be the price for a 250GB SSD. The 290X is a bit faster most of the time, costs about the same, and uses conventional memory management. In fact, at 4K resolution, the 290X is a respectable contender to the GTX 980, (though you’d want two of each to get good framerates at that point). But if you have or plan to have a 500- to 550-watt power supply, the GTX 970 still comes out ahead of AMD’s comparable cards.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/choosing_best_nvidia_graphics_card_2015?page=0,1">Click here to see the rest of our choices, and the winner</a></h4> <hr /> <p><strong>Choice #6: The GTX Titan X</strong></p> <p class="MsoNormal">The Titan series is basically Nvidia’s 800-pound gorilla, an exclamation point at the end of its lineup. <a title="GeForce GTX Titan X review" href="http://www.maximumpc.com/nvidia_titan_x_review_2015?page=0,2">It will get you the best gaming performance that money can buy</a> —but you’d better have enough, because the asking price is a cool $1,000. That’s nearly twice the cost of a GTX 980, and about three times the cost of an R9 290X. But the Titan cards also have huge amounts of VRAM—a staggering 12GB in the case of the Titan X. They draw more power than other Nvidia cards, too, with a TDP of about 250 watts. That’s a measure of its draw when firing on all cylinders, without its clock speeds being manipulated above factory settings.</p> <p class="MsoNormal"><img src="/files/u160416/titan_x.jpg" width="620" height="372" /></p> <p>Despite that, the Titan X still fits inside Nvidia’s reference cooling system, with a 10.5 inch card getting air from a single turbine-like intake that pushes all its heat out the back of the case. This design is called a "full shroud.” A partial shroud uses fans to blow air onto a heatsink, but the frame holding the fans is not fully enclosed, so that heat circulates around the case. However, the fans and the heatsink can be much bigger and more effective when not restricted to Nvidia’s reference specs.</p> <p>When you’re in a 250-watt range, that extra flexibility comes in handy, as we’ve seen from AMD’s R9 290 and 290X. The reference versions of these cards were hot and noisy, <a title="Sapphire Tri-X Radeon R9 290X review" href="http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review">but you can get cards like the Sapphire Tri-X that run cool and quiet</a> (if you have enough room for a 12-inch card, that is). We mention this because Nvidia does not allow third parties to put alternative coolers on its Titan cards. The only exception it makes is for EVGA’s "Hydro Copper” series. These have pre-installed heatsinks that are designed to hook into custom water-cooling loops. If you want cooling other than what Nvidia has approved, you have to do it yourself, which can be a little stressful, considering the expense of the card itself.</p> <p><strong>And the winner is...</strong></p> <p>It’s tough to pick a single overall winner from such a wide spread of choices, ranging literally from $100 to $1,000. Do you want the best overall performance? Then the Titan X is your guy. 
But maybe you’re not comfortable with spending that much money, so we go down to the GTX 980. Is it really more than $200 better than a GTX 970, though? We’re not convinced of that. If we had to pick an overall winner that balanced price, performance, and features,&nbsp;<strong>our choice is the GeForce GTX 970</strong>, despite the way that its VRAM has been segmented.</p> <p><img src="/files/u160416/gigabyte_970.jpg" width="620" height="529" /></p> <p>In our experience, the segmentation just hasn’t produced a subjectively noticeable drop in performance, or even an objectively consistent one—even when scaling up to 4K, where VRAM demands are high. Since it’s also more than $200 cheaper than the GTX 980, you could add a second GTX 970 to your system and spend only about $150 more. Just one will be plenty at 1080p, though, and SLI can be buggy and sometimes doesn’t work at all. You also need the proper amount of power and PCI Express connectors (about 750 watts, four connectors). But it’s a nice option to have.</p> <p>And like the GTX 980, you can get cards that have a variety of cooling options. If you’re into mini-ITX PCs and you don’t have a lot of space to work with, you can also get shrunk-down versions of the 970 from Asus, Gigabyte, and Zotac. They’re much shorter, but they don’t sacrifice any performance. That’s an option that you can’t get from the Radeon R9 290 or 290X, or even the GTX 980. Given the flexibility, performance, and price of the GTX 970, it’s hard to argue for other cards from the Green Team, unless your budget is either extremely tight or extremely loose.</p> http://www.maximumpc.com/choosing_best_nvidia_graphics_card_2015#comments best gpu graphics graphics card nvidia PC hardware video News Features Tue, 07 Apr 2015 18:58:48 +0000 Tom McNamara 29705 at http://www.maximumpc.com Asus Unveils Limited Edition Gold Edition GeForce GTX 980 Graphics Card http://www.maximumpc.com/asus_unveils_limited_edition_gold_edition_geforce_gtx_980_graphics_card_2015 <!--paging_filter--><h3><img src="/files/u69/asus_gold_gtx_980.jpg" alt="Asus Gold GTX 980" title="Asus Gold GTX 980" width="228" height="197" style="float: right;" />Celebrating 20 years of graphics card production</h3> <p>It was in 1995 that Asus introduced its first graphics card, and to celebrate 20 years of graphics card production, <strong>Asus is releasing a limited edition 20th Anniversary Golden Edition GTX 980</strong> that it claims is the fastest of its kind. How fast? Asus cranked the GPU to 1,431MHz, up from Nvidia's reference specification of 1,126MHz. It's even higher than Nvidia's 1,216MHz reference <em>boost</em> clockspeed.</p> <p>The 4GB of GDDR5 memory remains untouched at 7,010MHz on a 256-bit bus, though the clockspeed and gold-colored cooling solution aren't the only standouts. The card features high-quality components that should stand up better to overclocking, and the cooling apparatus boasts 0dB fan technology that cools 15 percent better than reference and runs three times quieter, according to Asus. There's also a memory defroster, in case you plan on going nuts with liquid nitrogen.</p> <p>On top of the card is a 20th Anniversary designation. 
More than just eye candy, it changes color depending on load -- blue is a light load, orange is a medium load, red is a heavy load, and green means you're back in safe mode, or default clocks (you can restore clocks to default with a tap of the Clear VBIOS button).</p> <p>Asus didn't say when the card will be available or for how much, though in the meantime, you can take a trip down memory lane and see what innovations came out of Asus over the years. For example, did you know that Asus was the first to plop a cooling fan on a graphics card? Or that it was the first to offer a video card with a premium black PCB? These and other fun facts have been assembled <a href="http://www.asus.com/event/VGA/20THANNIVERSARY/" target="_blank">onto a timeline</a>, a neat pit stop before making your way over to the <a href="http://www.asus.com/Graphics_Cards/GOLD20THGTX980P4GD5/" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/asus_unveils_limited_edition_gold_edition_geforce_gtx_980_graphics_card_2015#comments asus Build a PC Gaming geforce gtx 980 gold gpu graphics card Hardware News Thu, 02 Apr 2015 19:49:22 +0000 Paul Lilly 29682 at http://www.maximumpc.com Async Shaders Will Allow GPUs to Live Up to Their Full Potential, Says AMD http://www.maximumpc.com/async_shaders_will_allow_gpus_live_their_full_potential_says_amd_2015 <!--paging_filter--><h3>Improvements to be enabled via DX12 and Vulkan APIs</h3> <p>Graphics cards are currently “not living up to their full potential,” says AMD, and the company is adamant that the forthcoming DX12 and Vulkan APIs will change that. Specifically, the red team says that these APIs will be able to take advantage of AMD’s asynchronous compute engines (ACEs), which are inherent to AMD’s GCN architecture. These asynchronous compute engines will allow future games that support them to accomplish more tasks simultaneously. AMD suggests that this is tantamount to hyperthreading for GPUs.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/dx11.png" alt="DX11 amd" title="DX11 amd" width="620" height="332" /></p> <p style="text-align: center;"><strong>This is the traditional GPU pipeline with DX11.&nbsp;</strong></p> <p>The logic here is that having multiple GPU queues allows tasks to be completed much faster, and that users could see better performance out of their GCN graphics cards. On a call with AMD, the company claimed that the traditional GPU pipeline is currently very linear with DirectX 11, and that all work must be done in a single queue that is scheduled in a pre-determined order. With DX12, however, tasks like physics, lighting, and post-processing can be divided into different queues and can be scheduled independently. This not only amounts to higher FPS in applications that support asynchronous shaders, but lower latency as well, which is key to having good VR experiences. To illustrate the comparison, AMD likened DX11’s current inefficiency to inner-city roads with traffic lights, and DX12’s more asynchronous model to a freeway system. 
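<p>Before we stretch that freeway analogy any further, it may help to see what "multiple queues" looks like in actual API terms. What follows is a minimal, purely illustrative C++ sketch (our own, not AMD's, and not tied to any particular game engine) of how a Direct3D 12 application can create a graphics queue and a separate compute queue. Work submitted to the compute queue can be scheduled independently of the graphics queue, which is exactly the behavior AMD's asynchronous compute engines are built to exploit; whether the two streams of work actually overlap on the GPU is up to the hardware and driver.</p>
<pre><code>
// Minimal D3D12 multi-queue sketch. Link with d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a D3D12 device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // A "direct" queue accepts graphics, compute, and copy work.
    // On its own, this is the single, in-order pipe shown in the DX11 diagram above.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> graphicsQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A second, compute-only queue. Command lists submitted here are
    // scheduled independently of the graphics queue, which is what lets
    // async-compute hardware slot work into gaps in the graphics workload.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // A real renderer would now record command lists, call
    // ExecuteCommandLists() on each queue, and use fences to synchronize
    // only where the graphics and compute work genuinely depend on each other.
    return 0;
}
</code></pre>
<p>The point of the second queue is not that this code runs any faster by itself; it simply gives the driver and the GPU's scheduler permission to overlap the two workloads whenever there are idle shader cycles to fill.</p>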
In the asynchronous/freeway model, tasks can merge in to fill gaps and aren’t bogged down by red lights, or bottlenecks, in this example.</p> <p style="text-align: center;"><img src="/files/u154082/dx12.png" alt="dx12 amd async" title="dx12 amd async" width="620" height="324" /></p> <p style="text-align: center;"><strong>AMD says DX12 will be able to leverage its GPU's asynchronous compute engines to perform more efficiently.</strong></p> <p>According to AMD, using asynchronous shaders can provide post-processing effects with minimal impact on performance. The company cited its LiquidVR SDK demo, which rendered a scene that had an average FPS at 245 with async shaders and post-processing turned off. With the post-processing effect on, however, the FPS took a huge hit and dropped to 158. With async shaders turned on, the average FPS jumped back up to 230, which is just ever slightly more taxing than the scene with post-processing turned off. According to AMD, async shaders have the potential to save performance by over 40%.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/asynch_shaders_perf.jpg" alt="async shaders perf" title="async shaders perf" width="620" height="326" /></p> <p style="text-align: center;"><strong>AMD is touting that its async shaders will be very non-taxing on post-processing.&nbsp;</strong></p> <p>AMD says that async shaders and asynchronous compute engines are a part of the GCN DNA, so developers will be able to take advantage of them with the next generation APIs. With AMD pouring its Mantle learnings into Vulkan, the next iteration of OpenGL, the open-source API will also be able to take advantage of AMD’s asynchronous shaders. In addition, AMD tells us that all the major game engines, like Epic’s Unreal Engine and Crytek’s CryEngine, will be able to take advantage of AMD’s asynchronous shaders.&nbsp;</p> <p>According to AMD, the PlayStation 4, which uses AMD hardware, is already uses asynchronous shaders in games like InFamous Second Son and The Tomorrow Children to get more performance efficiencies and the company believes these learnings will work their way over to the PC with the next-generation APIs. AMD also says the philosophy behind its asynchronous shaders will also apply to the company’s GCN-based APUs.&nbsp;</p> http://www.maximumpc.com/async_shaders_will_allow_gpus_live_their_full_potential_says_amd_2015#comments ace amd async shaders asynchronous gcn gpu graphics card multithread nvidia video News Tue, 31 Mar 2015 17:45:04 +0000 Jimmy Thang 29661 at http://www.maximumpc.com Nvidia GeForce GTX Titan X SLI Benchmarks [UPDATED] http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_sli_benchmarks_2015 <!--paging_filter--><h3>Performance that will make you see double</h3> <p>So you might have heard that Nvidia <a href="http://www.maximumpc.com/nvidia_titan_x_review_2015">released the GeForce GTX Titan X video card yesterday</a>. It's the fastest single-GPU card on the planet (though not the fastest single card, because of the dual GPUs in the Titan Z and <a href="http://www.maximumpc.com/amd_unleashes_dual-gpu_radeon_r9_295x2">the Radeon R9 295X2</a>). Maybe most people would be satisfied with the benchmarks of a single Titan X, but we're not most people. So we called a guy who knows a guy, and we acquired a second Titan X. The things we do for you people!</p> <p><strong>UPDATE: </strong>We located a <strong>third</strong> Titan X, and we discovered that we need to upgrade our CPU! 
This is fun.</p> <p>To recap, this is the system that we've been using to test our video cards:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Corsair AX1200</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 64-bit</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p>It's an aging system, but it has plenty of juice to drive up to four GPUs. We used six titles to benchmark the Titan X and similar cards:&nbsp; Metro: Last Light, Hitman: Absolution, Tomb Raider, Batman: Arkham Origins, Unigine Heaven, and Middle-earth: Shadow of Mordor. We use these games because they have an even balance of Nvidia friendliness and AMD friendliness, they'll push your hardware when you max-out the settings, and they have built-in benchmarks, so you can reproduce our results yourself.</p> <p style="text-align: center;"><img src="/files/u160416/this_is_happening.jpg" alt="Titan X SLI" title="Titan X SLI" width="620" height="465" /></p> <p>The Nvidia cards were benchmarked with the GeForce 347.84 sent to Titan X reviewers, which are apparently nearly identical to the 347.88 drivers released to the public yesterday. Our MSI Radeon R9 290X Lightning Edition card used <a href="http://www.maximumpc.com/amds_year_end_gift_gamers_catalyst_omega_special_edition_driver_2014">AMD's Omega drivers released in December</a>. The other cards in the mix are the Asus GTX970-DCMOC-4GD5; and the Asus&nbsp;STRIX-GTX780-OC-6GD5. The GTX 780 Ti in this roundup is the reference model. All clock speeds in the chart below are of the actual cards we tested, rather than the default clock speeds of the baseline models, except when a baseline model was actually used.</p> <p>Since we were not blessed with a second MSI GTX 980 Gaming 4G, the SLI benchmark is of two reference 980s in our possession. 
The difference will be small, but it is there.</p> <div class="spec-table orange" style="font-size: 12px; font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>Titan X</td> <td>Titan&nbsp;</td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 780 Ti</td> <td>GTX 780</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>&nbsp;GM200</td> <td>&nbsp;GK110</td> <td>&nbsp;GM204</td> <td>&nbsp;GM204&nbsp;</td> <td>&nbsp;GK110&nbsp;</td> <td class="item-dark">&nbsp;GK104</td> <td>Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1,000</td> <td>&nbsp;837</td> <td>&nbsp;1,216</td> <td>&nbsp;1,088</td> <td>&nbsp;876</td> <td>&nbsp;889</td> <td>"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1,075</td> <td>&nbsp;876</td> <td>&nbsp;1,317</td> <td>&nbsp;1,228</td> <td>&nbsp;928</td> <td class="item-dark">&nbsp;941</td> <td>N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7,010</td> <td>&nbsp;6,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;6,000</td> <td>5,000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;12GB</td> <td>&nbsp;6GB</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;3GB</td> <td>&nbsp;6GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;96</td> <td>&nbsp;48</td> <td>&nbsp;64</td> <td>&nbsp;56</td> <td>&nbsp;48</td> <td>&nbsp;48</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;192</td> <td>&nbsp;224</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;240</td> <td>&nbsp;192</td> <td>176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;3,072</td> <td>&nbsp;2,688</td> <td>&nbsp;2,048</td> <td>&nbsp;1,664</td> <td>&nbsp;2,880</td> <td>&nbsp;2,304</td> <td>2,816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;24</td> <td>&nbsp;15</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;15</td> <td>&nbsp;12</td> <td>N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>290</td> </tr> <tr> <td>Launch Date</td> <td>March 2015</td> <td>March 2013</td> <td>Sept 2014</td> <td>Sept 2014</td> <td>Nov 2013</td> <td>May 2013</td> <td>Oct 2013</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$999</td> <td>&nbsp;$999</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$649</td> <td>&nbsp;$699</td> <td>$549</td> </tr> </tbody> </table> </div> <p>You can refer to our Titan X review for more information on what these specs mean. We don't want to flap our gums here any more than necessary. 
Now that we've explained the context of the benchmarks, here they are:</p> <h3>3840x2160 Bechmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider*</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td>Titan X</td> <td>&nbsp;35</td> <td>&nbsp;53</td> <td>&nbsp;33</td> <td>&nbsp;44</td> <td>&nbsp;44/60</td> <td>&nbsp;26</td> </tr> <tr> <td>Titan X SLI</td> <td>&nbsp;54</td> <td>&nbsp;94</td> <td>&nbsp;58</td> <td>&nbsp;75</td> <td>&nbsp;83/112</td> <td>&nbsp;49</td> </tr> <tr> <td><strong>SLI Scaling</strong></td> <td><strong>&nbsp;54%</strong></td> <td><strong>&nbsp;77%</strong></td> <td><strong>&nbsp;76%</strong></td> <td><strong>&nbsp;70%</strong></td> <td><strong>&nbsp;87%/87%</strong></td> <td><strong>&nbsp;88%</strong></td> </tr> <tr> <td>3-Way SLI</td> <td>&nbsp;53</td> <td>&nbsp;110</td> <td>&nbsp;78</td> <td>&nbsp;89</td> <td>&nbsp;119/161</td> <td>&nbsp;70</td> </tr> <tr> <td><strong>3-Way Scaling</strong></td> <td><strong>&nbsp;N/A</strong></td> <td><strong>&nbsp;17%</strong></td> <td><strong>&nbsp;34%</strong></td> <td><strong>&nbsp;19%</strong></td> <td><strong>&nbsp;43% /43%</strong></td> <td><strong>&nbsp;43%</strong></td> </tr> <tr> <td>Titan</td> <td>&nbsp;24</td> <td>&nbsp;34</td> <td>&nbsp;22</td> <td>&nbsp;25</td> <td>&nbsp;26/37</td> <td>&nbsp;18</td> </tr> <tr> <td>980</td> <td>&nbsp;32</td> <td>&nbsp;41</td> <td>&nbsp;24</td> <td>&nbsp;37</td> <td>&nbsp;36/48</td> <td>&nbsp;20</td> </tr> <tr> <td>980 SLI</td> <td>&nbsp;46</td> <td>&nbsp;74</td> <td>&nbsp;44</td> <td>&nbsp;59</td> <td>&nbsp;64/84</td> <td>&nbsp;35</td> </tr> <tr> <td><strong>SLI Scaling</strong></td> <td><strong>&nbsp;44%</strong></td> <td><strong>&nbsp;80%</strong></td> <td><strong>&nbsp;83%</strong></td> <td><strong>&nbsp;60%</strong></td> <td><strong>&nbsp;77%/75%</strong></td> <td><strong>&nbsp;75%</strong></td> </tr> <tr> <td>970</td> <td>&nbsp;24</td> <td>&nbsp;32</td> <td>&nbsp;19</td> <td>&nbsp;28</td> <td>&nbsp;27/37</td> <td>&nbsp;15</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;27</td> <td>&nbsp;38</td> <td>&nbsp;23</td> <td>&nbsp;32</td> <td>&nbsp;29/40</td> <td>&nbsp;19</td> </tr> <tr> <td>780</td> <td>&nbsp;26</td> <td>&nbsp;35</td> <td>&nbsp;23</td> <td>&nbsp;30</td> <td>&nbsp;27/38</td> <td>&nbsp;18</td> </tr> <tr> <td>290X</td> <td>&nbsp;28</td> <td>&nbsp;41</td> <td>&nbsp;29</td> <td>&nbsp;37</td> <td>&nbsp;31/43</td> <td>&nbsp;17</td> </tr> </tbody> </table> </div> </h4> <p style="text-align: left;"><span style="font-weight: normal;">*<em>TressFX on/TressFX off</em></span></p> <p style="text-align: left;"><span style="font-weight: normal;">We're benchmarking these games on their highest presets with 4x multi-sample anti-aliasing (or in Tomb Raider's case, 2x super-sample anti-aliasing, since it has no MSAA option), so you're not going to see ideal performance here. We push these cards by design, rather than aiming for playable framerates. At the prices you're paying for these cards, you shouldn't have to make many compromises. Even with a second Titan X in the mix, though, we still can't hit 60fps across the board. 
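<p>In case it's not obvious how the scaling rows above are calculated, they're simply the percentage increase in average framerate over the next-smallest configuration. Here's the arithmetic as a tiny, illustrative C++ snippet (our own, using the Batman: Arkham Origins numbers from the table: 53fps for one Titan X, 94fps for two, and 110fps for three):</p>
<pre><code>
#include <cstdio>

// Percentage gained over the next-smallest configuration:
// (faster / slower - 1) * 100.
static double scaling(double fasterFps, double slowerFps) {
    return (fasterFps / slowerFps - 1.0) * 100.0;
}

int main() {
    const double oneCard = 53.0, twoCards = 94.0, threeCards = 110.0;
    std::printf("2-way SLI scaling: %.0f%%\n", scaling(twoCards, oneCard));    // ~77%
    std::printf("3rd-card scaling:  %.0f%%\n", scaling(threeCards, twoCards)); // ~17%
    return 0;
}
</code></pre>
<p>Perfect scaling would be 100 percent for each card you add, so a 17 percent gain from the third GPU is a strong hint that something other than the graphics cards is becoming the bottleneck.</p>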
Granted, at 4K, you probably don't need 4xMSAA, but it is interesting to see just how much this resolution affects performance. What's also interesting is how much the SLI scaling varies from game to game. The Titan X is a lot more consistent, but both it and the GTX 980 struggle with Metro: Last Light (which, it should be said, is an AMD-friendly game, as is Hitman: Absolution).</span></p> <p style="text-align: left;"><span style="font-weight: normal;">When we add the third Titan X (I think they're multiplying when we're not looking), we get a smaller performance bump, but this is to be expected. What we didn't see coming were the particularly modest gains in Batman and Shadow of Mordor, indicating that our CPU is hitting a wall (at least, at its stock clock speed). So this addition to our benchmarks has been educational for us as well. Metro: Last Light also didn't even recognize the third GPU, so we're considering dropping that game from our benchmark suite, because this is not the first time it's happened. And upgrading our testing rig to a 5960X is now a high priority. We'll also experiment with overclocking the 3960X that's currently installed.</span></p> <p style="text-align: left;"><span style="font-weight: normal;"><img src="/files/u160416/this_is_also_happening.jpg" alt="Nvidia Titan X 3-Way SLI" title="Nvidia Titan X 3-Way SLI" width="620" height="465" /><br /></span></p> <p style="text-align: left;">In the coming days, we plan to get you some more multi-GPU benches to compare against the Titan X. In the meantime, we hope you found these new numbers both delicious and nutritious.</p> http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_sli_benchmarks_2015#comments benchmarks geforce gpu nvidia sli Titan X Video Card News Sat, 21 Mar 2015 01:26:19 +0000 Tom McNamara 29611 at http://www.maximumpc.com Nvidia Titan X Review http://www.maximumpc.com/nvidia_titan_x_review_2015 <!--paging_filter--><h3>A new hero descends from the heights of Mount GeForce</h3> <p>In ancient Greek mythology, the Titans are the immediate descendants of the primordial gods. So it is with the Nvidia GeForce GTX Titan, descended from the company's top-shelf professional workstation GPUs. <a title="Nvidia GeForce GTX Titan review" href="http://www.maximumpc.com/evga_geforce_gtx_titan_review" target="_blank">First debuting in March 2013</a>, the original Titan was nearly the most powerful video card that the company could offer. Nvidia sealed off a couple of items that would be of little interest to gamers, which also prevented professionals from using these much less expensive gamer variants for workstation duties.</p> <p>In the two years since, the company has iterated on this design, adding more shader processors (or "CUDA cores," as Nvidia likes to call them), and even adding a second GPU core on the same card. Now the time has come for it to deliver the Maxwell generation of super-premium GPUs, this time dubbed the <strong>GTX Titan X</strong>. And it's a beast. Despite being stuck on the 28nm process node for several years now, the company continues to extract more and more performance from its silicon. Interestingly, the card goes up for sale today, but only at Nvidia's own online storefront. There is currently a limit of two per order. The company tells us that you'll be able to buy it from other stores and in pre-built systems "over the next few weeks." 
First-world problems, right?</p> <p><img src="/files/u99720/nvidia_titan_5159.png" alt="Titan X" title="Titan X" width="620" height="401" style="text-align: center;" /></p> <p>These days, you can use the number of shader cores as a rough estimate of performance. We say "rough" because the Maxwell cores in this Titan X are, according to Nvidia, 40 percent faster than the Kepler cores in the earlier Titans. So when you see that the Titan X has "only" 3072 of them, this is actually a huge boost. It's about 50 percent more than the 2,048 in the GTX 980, which is already a barnstormer. For reference, the difference in shader count between <a href="http://www.maximumpc.com/asus_rog_poseidon_gtx_780_review" target="_blank" title="Nvidia GeForce GTX 780 review">the GTX 780</a> and the original Titan was about 16 percent. The Titan X also has an almost ridiculous 12GB of GDDR5 VRAM. We say "almost" because Nvidia has some ambitious goals for the resolution that it expects you to be able to play at with this card.</p> <p>At the Game Developers Conference two weeks ago, Nvidia's reps pitched the Titan X to us as the first GPU that could handle 4K gaming solo, at high settings. They demoed Middle-earth: Shadow of Mordor, which wasn't a solid 60fps, as they readily acknowledged. But we did see all the graphics settings cranked up, and gameplay was smooth at about 45fps <a title="G-Sync introduction video" href="http://www.maximumpc.com/acer_4k_g-sync_monitor_tested_gtx_980_video" target="_blank">when paired with a G-Sync monitor</a>. As its name implies, G-Sync synchronizes your monitor's refresh rate to the frame rate being delivered by your video card, which vastly reduces tearing. They also enabled motion blur, which can help mask frame rate drops.</p> <p><img src="/files/u160416/titanx3.jpg" width="620" height="349" /></p> <p>For our review, we used seven high-end cards that have come out in the same two-year time frame as the original Titan. Some of these are no longer sold in stores, but they still provide an important frame of reference, and their owners may want to know if upgrading is going to be worth it.</p> <p style="font-weight: normal;">Note that the clock speeds in the charts on the next page are not all for the reference versions. These are for the particular models that we used for this review. The GTX 980 is the MSI Gaming 4G model; the GTX 970 is the Asus GTX970-DCMOC-4GD5; the GTX 780 is the Asus&nbsp;STRIX-GTX780-OC-6GD5 (and the reference model also has 3GB of VRAM instead of 6GB); and the Radeon R9 290X is the MSI Lightning edition. 
We used the prices for the reference versions, however.</p> <h3 style="text-align: right;"><a title="GeForce Titan X Review Page 2" href="http://www.maximumpc.com/nvidia_titan_x_review_2015?page=0,1" target="_self">Click here to turn to page 2 for the specs!</a></h3> <hr /> <p>Let's take a look at their specs:</p> <div class="spec-table orange" style="font-size: 12px; font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>Titan X</td> <td>Titan&nbsp;</td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 780 Ti</td> <td>GTX 780</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>&nbsp;GM200</td> <td>&nbsp;GK110</td> <td>&nbsp;GM204</td> <td>&nbsp;GM204&nbsp;</td> <td>&nbsp;GK110&nbsp;</td> <td class="item-dark">&nbsp;GK104</td> <td>Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1,000</td> <td>&nbsp;837</td> <td>&nbsp;1,216</td> <td>&nbsp;1,088</td> <td>&nbsp;876</td> <td>&nbsp;889</td> <td>"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1,075</td> <td>&nbsp;876</td> <td>&nbsp;1,317</td> <td>&nbsp;1,228</td> <td>&nbsp;928</td> <td class="item-dark">&nbsp;941</td> <td>N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7,010</td> <td>&nbsp;6,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;6,000</td> <td>5,000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;12GB</td> <td>&nbsp;6GB</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;3GB</td> <td>&nbsp;6GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;96</td> <td>&nbsp;48</td> <td>&nbsp;64</td> <td>&nbsp;56</td> <td>&nbsp;48</td> <td>&nbsp;48</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;192</td> <td>&nbsp;224</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;240</td> <td>&nbsp;192</td> <td>176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;3,072</td> <td>&nbsp;2,688</td> <td>&nbsp;2,048</td> <td>&nbsp;1,664</td> <td>&nbsp;2,880</td> <td>&nbsp;2,304</td> <td>2,816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;24</td> <td>&nbsp;15</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;15</td> <td>&nbsp;12</td> <td>N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>290</td> </tr> <tr> <td>Launch Date</td> <td>March 2015</td> <td>March 2013</td> <td>Sept 2014</td> <td>Sept 2014</td> <td>Nov 2013</td> <td>May 2013</td> <td>Oct 2013</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$999</td> <td>&nbsp;$999</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$649</td> <td>&nbsp;$699</td> <td>$549</td> </tr> </tbody> </table> </div> <p>You probably noticed that the Titan X has a whopping 96 ROPs. These render output units are responsible for the quality and performance of your anti-aliasing (AA), among other things. AA at 4K resolutions can kill your framerate, so when Nvidia pitches the Titan X as a 4K card, the number of ROPs here is one of the reasons why. They've also made a return to a high number of texture mapping units. TMUs take a 3D object and apply a texture to it, after calculating angles and perspectives. The higher your resolution, the more pixels you're dealing with, so this is another change that serves 4K performance well.</p> <p>"SM" stands for "streaming multi-processor." 
Stream processing allows a GPU to divide its workload across many processing units working at the same time. In Nvidia's architecture, each one of these SMs contains a set of CUDA cores (128 per SM in the Maxwell generation, so the Titan X's 24 SMs account for its 3,072 cores) and a small amount of dedicated cache memory (apart from the gigabytes of VRAM listed on the box). Having 50 percent more SMs than your next-fastest card should give you an impressive jump in performance. The result won't be linear, though, because the Titan X has lower clock speeds—all those extra transistors on the Titan X generate additional heat, so lowering clocks is the main way of dealing with that. Its siblings, the GTX 980 and 970, have "only" 5.2 billion transistors each, so they can set their clocks much higher.</p> <p><img src="/files/u160416/titanx2.jpg" width="620" height="390" /></p> <p>Despite all the silicon crammed into the Titan X, it still uses Nvidia's reference dimensions; it's only about 10.5 inches long, and it's not taller or wider than the slot bracket. If not for its darker coloring, you could easily confuse it for any baseline Nvidia card released in the past couple of years. Its fan is noticeably quieter than the Titans that have come before, but it won't disappear into the background like we've seen (heard) when Nvidia's partners install their own cooling systems. If you want reliable quietude, you'll have to wait for EVGA's Hydro Copper version, which attaches to a custom water-cooling loop, or try your hand at <a title="Accelero Hybrid GTX 680 Review" href="http://www.maximumpc.com/arctic_cooling_accelero_hybrid_gtx_680_review" target="_blank">something like Arctic Cooling's Accelero Hybrid.</a></p> <p>One card arguably missing from our lineup is the Titan Black. However, <a title="Nvidia GeForce GTX 780 Ti review" href="http://www.maximumpc.com/gigabyte_gtx_780_ti_oc_review" target="_blank">the GTX 780 Ti</a> is basically the same thing, but with a 3GB frame buffer instead of a 6GB frame buffer, and slightly lower clock speeds.</p> <p><a title="AMD Radeon R9 290X review" href="http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review" target="_blank">The Radeon R9 290X</a> is the fastest GPU that AMD currently has available, so we thought it would make for a good comparison, despite being about a year and a half old; and the MSI Lightning edition is arguably the beefiest version of it.</p> <p>Before we show you the benchmarks, here's the system that we used to test these cards:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Corsair AX1200</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 64-bit</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p>Our Sandy Bridge-E system is getting a little long in the tooth, but the Intel Core i7-3960X is still quite a beefy chip and fine for benchmarking video cards. 
We'll probably be moving to the Haswell-E platform soon.</p> <p>We test with every game set to its highest graphical preset and 4x multi-sampled anti-aliasing (MSAA). Sometimes individual settings can be increased even further, but we leave these alone for more normalized results. That's because these settings are usually optimized for a specific brand of cards, which can end up skewing results. For example, we leave PhysX disabled. We did make one exception, to show you how much of an impact certain niche settings can have: At 3840x2160, we tested Tomb Raider with TressFX on, and TressFX off. Since this hair-rendering tech is an open spec, both Nvidia and AMD can optimize for it.</p> <p>MSAA is not an available setting in Tomb Raider, so we use 2x super-sample antialiasing (SSAA) instead. This form of AA generates a higher resolution frame than what the monitor is set at, and squishes the frame down to fit.</p> <p>All Nvidia cards in this roundup were tested with the 347.84 drivers, which were given to us ahead of release and are scheduled to be available for everyone to download on March 17th. The Titan X is also scheduled to hit retail on this day. We tested the R9 290X with <a href="http://www.maximumpc.com/amds_year_end_gift_gamers_catalyst_omega_special_edition_driver_2014" target="_blank">AMD's Omega drivers released in December</a>.</p> <h3 style="text-align: right;"><a title="GeForce Titan X Review Page 3" href="http://www.maximumpc.com/nvidia_titan_x_review_2015?page=0,2" target="_self">Click here to see the benchmarks and analysis!</a></h3> <hr /> <p>We test with a mix of AMD-friendly and Nvidia-friendly titles (it seems like you're either one or the other, these days); Metro: Last Light, Hitman: Absolution, and Tomb Raider usually favor AMD; Batman: Arkham Origins, Middle-earth: Shadow of Mordor, and Unigine Heaven favor Nvidia. 
In all cases, we use their built-in bechmarks to minimize variance.</p> <h3>1920x1080 Bechmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;93</td> <td>&nbsp;127</td> <td>&nbsp;84</td> <td class="item-dark">&nbsp;106</td> <td>&nbsp;205</td> <td>&nbsp;97</td> </tr> <tr> <td>Titan</td> <td>&nbsp;63</td> <td>&nbsp;80</td> <td>&nbsp;63</td> <td>&nbsp;67</td> <td>&nbsp;129</td> <td>&nbsp;57</td> </tr> <tr> <td>980</td> <td>&nbsp;86</td> <td>&nbsp;99</td> <td>&nbsp;70</td> <td>&nbsp;93</td> <td>&nbsp;164</td> <td>&nbsp;79</td> </tr> <tr> <td>970</td> <td>&nbsp;71</td> <td>&nbsp;81</td> <td>&nbsp;59</td> <td>&nbsp;72</td> <td>&nbsp;132</td> <td>&nbsp;61</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;72</td> <td>&nbsp;84</td> <td>&nbsp;70</td> <td>&nbsp;77</td> <td>&nbsp;142</td> <td>&nbsp;69</td> </tr> <tr> <td>780</td> <td>&nbsp;67</td> <td>&nbsp;77</td> <td>&nbsp;65</td> <td>&nbsp;71</td> <td>&nbsp;122</td> <td>&nbsp;62</td> </tr> <tr> <td>290X</td> <td>&nbsp;82</td> <td>&nbsp;111</td> <td>&nbsp;64</td> <td>&nbsp;84</td> <td>&nbsp;143</td> <td>&nbsp;65</td> </tr> </tbody> </table> </div> </h4> <p>You probably noticed that the GTX 780 trades blows with the original GTX Titan, despite the Titan having better specs. The 780 benefits from a higher clock speed and an enhanced cooler designed by Asus. Historically, Nvidia has not allowed its partners to use vendor-specific coolers on the Titan cards, so the other cards with slightly lower specs and better cooling could catch up with some overclocking. However, Nvidia says that the Titan X was highly overclockable despite using a reference cooler, so we'll be exploring that soon.</p> <p>The 780 Ti handily beats the original Titan despite also using reference clock speeds, because the Ti variant is basically a Titan Black, which is the sequel to the original Titan and came out about a year later. (And the Titan X is a physically black card, while the Titan Black is not. It can get a little confusing.)</p> <p>Meanwhile, the R9 290X beats all the Kepler generation cards, except in Hitman: Absolution, which is usually a bastion for AMD's GPUs. It looks like Nvidia has figured out some driver optimizations here.</p> <p>In general, the Titan X says to the other cards, "Get on my level." 
It's clearly operating on a different tier of performance.&nbsp;<a title="Nvidia GeForce GTX 980 Review" href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">The GTX 980</a> also stays generally ahead of the 290X by a comfortable margin.</p> <h3>2560x1440 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;64</td> <td>&nbsp;90</td> <td>&nbsp;60</td> <td class="item-dark">&nbsp;77</td> <td>&nbsp;129</td> <td>&nbsp;61</td> </tr> <tr> <td>Titan</td> <td>&nbsp;44</td> <td>&nbsp;58</td> <td>&nbsp;43</td> <td>&nbsp;49</td> <td>&nbsp;77</td> <td>&nbsp;38</td> </tr> <tr> <td>980</td> <td>&nbsp;59</td> <td>&nbsp;71</td> <td>&nbsp;46</td> <td>&nbsp;67</td> <td>&nbsp;105</td> <td>&nbsp;48</td> </tr> <tr> <td>970</td> <td>&nbsp;47</td> <td>&nbsp;59</td> <td>&nbsp;39</td> <td>&nbsp;51</td> <td>&nbsp;81</td> <td>&nbsp;36</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;51</td> <td>&nbsp;62</td> <td>&nbsp;48</td> <td>&nbsp;56</td> <td>&nbsp;86</td> <td>&nbsp;42</td> </tr> <tr> <td>780</td> <td>&nbsp;47</td> <td>&nbsp;59</td> <td>&nbsp;44</td> <td>&nbsp;52</td> <td>&nbsp;80</td> <td>&nbsp;40</td> </tr> <tr> <td>290X</td> <td>&nbsp;54</td> <td>&nbsp;83</td> <td>&nbsp;54</td> <td>&nbsp;63</td> <td>&nbsp;91</td> <td>&nbsp;40</td> </tr> </tbody> </table> </div> </h4> <p>As we ratchet up the resolution (while keeping all other graphical settings the same), we see the performance separation begin. While everyone comfortably sustained 60-plus fps at 1080p, older GPUs struggle to maintain that threshold at 2560x1440, as does the GTX 970. We're pushing 77 percent more pixels onto the screen, and the original Titan's relatively low number of ROPs, low clock speeds, and Kepler-generation CUDA cores combine to make an obstacle that the other cards don't have to deal with. The new Titan X is producing well over 50 percent more frames in some of these tests, despite generating less noise, about the same amount of heat, and costing about the same. Wringing these kinds of gains from the same 28nm process node is pretty impressive. It comfortably beats AMD's best card in every test. Tomb Raider and <a title="Batman: Arkham Origins review" href="http://www.maximumpc.com/batman_arkham_origins_review_2014" target="_blank">Batman: Arkham Origins</a> distinguish themselves as two particularly well-optimized games.&nbsp;</p> <p>The R9 290X remains ahead of Nvidia's Kepler cards and pulls away in Hitman. AMD's 512-bit bus provides a wide pipe for memory bandwidth, and that advantage emerges once you move past 1080p. It's not until we encounter newer premium cards like the GTX 980 and Titan X that we find a competitive alternative from Nvidia. And when the Titan X arrives, it makes a statement, decisively maintaining 60-plus fps no matter what we threw at it. We'd want nothing less from a card that costs nearly three times as much as the 290X. 
The GTX 980 gets more mixed results here, but it still looks like a great card for playing at this resolution.</p> <h3>3840x2160 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider*</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;35</td> <td>&nbsp;53</td> <td>&nbsp;33</td> <td class="item-dark">&nbsp;44</td> <td>&nbsp;44/60</td> <td>&nbsp;26</td> </tr> <tr> <td>Titan</td> <td>&nbsp;24</td> <td>&nbsp;34</td> <td>&nbsp;22</td> <td>&nbsp;25</td> <td>&nbsp;26/37</td> <td>&nbsp;18</td> </tr> <tr> <td>980</td> <td>&nbsp;32</td> <td>&nbsp;41</td> <td>&nbsp;24</td> <td>&nbsp;37</td> <td>&nbsp;36/48</td> <td>&nbsp;20</td> </tr> <tr> <td>970</td> <td>&nbsp;24</td> <td>&nbsp;32</td> <td>&nbsp;19</td> <td>&nbsp;28</td> <td>&nbsp;27/37</td> <td>&nbsp;15</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;27</td> <td>&nbsp;38</td> <td>&nbsp;23</td> <td>&nbsp;32</td> <td>&nbsp;29/40</td> <td>&nbsp;19</td> </tr> <tr> <td>780</td> <td>&nbsp;26</td> <td>&nbsp;35</td> <td>&nbsp;23</td> <td>&nbsp;30</td> <td>&nbsp;27/38</td> <td>&nbsp;18</td> </tr> <tr> <td>290X</td> <td>&nbsp;28</td> <td>&nbsp;41</td> <td>&nbsp;29</td> <td>&nbsp;37</td> <td>&nbsp;31/43</td> <td>&nbsp;17</td> </tr> </tbody> </table> </div> </h4> <p style="text-align: left;"><span style="font-weight: normal;">*<em>TressFX on/TressFX off</em></span></p> <p><span style="font-weight: normal;">When you look at these results, it's important to keep in mind that our review process does not aim for playable framerates. We want to see how these cards perform when pushed to the limit. Despite this demanding environment, the Titan X remains a viable solo card to have at 4K, though it's still not ideal (putting aside for the moment <a title="4K Monitors: Everything You Need to Know" href="http://www.maximumpc.com/4k_monitor_2014" target="_blank">the technical resolution difference between DCI 4K and Ultra HD 4K</a>). The good news is that 4xMSAA is arguably not needed at a resolution this high, unless you're gaming on a big 4K HDTV that's less than a couple of feet from your eyes.</span></p> <p><span style="font-weight: normal;">Those with screens that are 32 inches or smaller will probably be fine with 2xMSAA, or some version of SMAA (</span><span style="font-weight: normal; font-size: 1em;">Enhanced Subpixel Morphological Antialiasing), which is known to be quite efficient while producing minimal blurriness and shimmering. Nvidia's TXAA (Temporal Anti-Aliasing) can be a good option when you have one of the company's cards and are playing a game that supports the feature. And with the Maxwell generation of cards (the Titan X, GTX 980, and GTX 970), you also have MFAA, or&nbsp;Multi-Frame Sample Anti-Aliasing. The company claims that this gets you 4xMSAA visual quality at the performance cost of 2xMSAA.</span></p> <p><span style="font-weight: normal; font-size: 1em;">The GTX 780 nearly catches up with the 780 Ti at this resolution, again demonstrating the importance of clock speeds, although the difference is pretty modest in this scenario. At 4K, this GTX 780's additional 3GB of VRAM also comes into play. 
The 6GB card spends less processing power on memory management. However, the 780 does not support 4-way SLI, if that's your thing. It's limited to 3-way SLI. The GTX 970 and 980 have the same split in their SLI support. The GTX 960 is limited to only 2-way SLI. This is one of the methods that Nvidia uses to encourage the purchase of its more expensive cards. All Titans support 4-way SLI.</span></p> <p><span style="font-weight: normal; font-size: 1em;">The R9 290X maintains its lead over Kepler, though it shrinks inside the margin of error at times. It's weakest in Unigine Heaven, because this benchmark makes heavy use of tessellation (dynamically increasing surface complexity by subdividing triangles in real time), and that's something that Kepler and Maxwell do much better. In general, it's a very respectable performer, especially for the price, which has fallen to roughly that of a GTX 970. Since the 290X is meaningfully faster in every single benchmark that we used, and it bumps up against the GTX 980 when we get to 4K, it makes for a pretty good spoiler until the Titan X arrives and leapfrogs everyone in the contest.</span></p> <p><span style="font-weight: normal; font-size: 1em;"><img src="/files/u160416/titanx1.jpg" width="620" height="393" /></span></p> <h3><span style="font-weight: normal; font-size: 1em;">Conclusion</span></h3> <p><span style="font-weight: normal; font-size: 1em;">Overall, things are looking pretty rosy for the Titan X. Since it's packed with a huge amount of ROPs, SMs, shader processors, and VRAM, it's able to overcome the limitation of the aging 28nm process. The Maxwell-generation CUDA cores are also about 40 percent faster than the older Kepler version (by Nvidia's estimation, at least), and the company improved color compression for additional performance gains. It's not the Chosen One if you want to game with a single GPU at 4K, but you can get pretty close if you're willing to tweak a few graphical settings.</span></p> <p><span style="font-weight: normal; font-size: 1em;">Also keep in mind that it was about one year ago when Nvidia debuted the GTX Titan Z, which has two Titan Black GPUs on a single card. So they may plan to drop a dual Titan X sometime soon, as well. And there's room in the lineup for a "980 Ti," since there's quite a spec gap (and price gap) right now between the GTX 980 and the GTX Titan X. If that's not enough, <a title="AMD Radeon R9 370 Core Edition Leaked" href="http://www.maximumpc.com/xfx_radeon_r9_370_core_edition_leaks_web_higher_end_r300_series_cards_follow" target="_blank">rumors around AMD's next generation of video cards are reaching a boiling point</a>. There's always something new around the corner, isn't there? 
But if you're comfortable with this price tag, and you don't care about what AMD's got cooking, the Titan X is the fastest thing you'll find for gaming beyond 1080p.</span></p> http://www.maximumpc.com/nvidia_titan_x_review_2015#comments Gaming gpu Hardware Nvidia Titan X sli Video Card Reviews Tue, 17 Mar 2015 19:00:13 +0000 Tom McNamara 29579 at http://www.maximumpc.com Possible Look at Specifications and Performance for AMD's Radeon R9 390X http://www.maximumpc.com/possible_look_specifications_and_performance_amds_radeon_r9_390x_2015 <!--paging_filter--><h3><img src="/files/u69/amd_radeon_1.jpg" alt="AMD Radeon R9 290X" title="AMD Radeon R9 290X" width="228" height="170" style="float: right;" />A potentially beastly card in the making</h3> <p>Go ahead and apply the standard disclaimer about leaked specs not being verified or official, because that's certainly the case here. Disclaimer aside, we hope that <strong>unconfirmed specifications of AMD's forthcoming Radeon R9 390X graphics card</strong> turn out to be accurate, because if they are, it's going to be a potent part that's up to 60 percent faster than AMD's Radeon R9 290X.</p> <p>The folks at <a href="http://videocardz.com/55146/amd-radeon-r9-390x-possible-specifications-and-performance-leaked" target="_blank"><em>Videocardz</em></a> asked their source if he could share additional information about AMD's new flagship graphics card, and to the site's surprise, he responded in kind with a few more goodies to digest. One of those goodies is that AMD scrapped plans to run with 4GB of High Bandwidth Memory (HBM) Gen1 (1GB per stack) after Nvidia unveiled its Titan X graphics card. Now the plan is to release the Radeon R9 390X with 8GB, but Gen2 (2GB per stack), on a 4,096-bit bus (1,024-bit per stack). That should give the card around 1.25TB/s of memory bandwidth.</p> <p>The GPU is said to be a 28nm Fiji XT part with 4,096 unified cores and 256 Texture Mapping Units (TMUs). There's no mention of ROPs or core clockspeed, though the boost clockspeed is reportedly 1,050MHz. Other specs include a 1,250MHz memory clock, 8.6TFLOPS of compute performance, and either a 6+8 pin or dual 8-pin PCI-E configuration.</p> <p>There's also a performance slide that was leaked, and if it's accurate, performance will be up to around 1.65 times that of the Radeon R9 290X in 4K gaming.</p> <p>Reports from elsewhere on the web have the card debuting at around $700, which is also unconfirmed.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/possible_look_specifications_and_performance_amds_radeon_r9_390x_2015#comments amd Build a PC Fiji Gaming gpu graphics card Hardware Radeon R9 390X Video Card News Mon, 16 Mar 2015 15:41:48 +0000 Paul Lilly 29588 at http://www.maximumpc.com GeForce GTX 960M and 950M Chips Round Out Nvidia’s Mobile GPU Lineup http://www.maximumpc.com/geforce_gtx_960m_and_950m_chips_round_out_nvidia%E2%80%99s_mobile_gpu_lineup_2015 <!--paging_filter--><h3><img src="/files/u69/geforce_gtx_960m.jpg" alt="GeForce GTX 960M" title="GeForce GTX 960M" width="228" height="149" style="float: right;" />Two new GPUs for laptops</h3> <p><strong>Nvidia today rolled out its GeForce GTX 960M and 950M GPUs, the latest additions to its GTX 900M Series</strong>. 
The new GPUs bring up the rear of Nvidia's latest generation of laptop graphics, slipping underneath the GeForce GTX 965M, 970M, and flagship 980M. You'll mostly find the new parts in thin and light gaming laptops where Nvidia promises they'll deliver "never-before-seen levels of gaming performance" for the category.</p> <p>"Today’s launch is with immediate availability too, so gamers today have more thin and light gaming notebooks to choose from than ever before. In fact, many of them have now joined the 'size does matter' club which means their notebooks are less than one-inch thick. So, if you’ve been itching to take your gaming on the go with a portable notebook design, now’s your chance to play with the big boys!" Nvidia says.</p> <p>Nvidia hasn't updated its GPU page with specs for the new parts yet, though it did reveal some supported features, such as BatteryBoost -- as its name suggests, this is supposed to give gamers longer battery life when unplugged from the wall.</p> <p>The GeForce GTX 960M and 950M also include support for ShadowPlay (record gaming moments and share them on YouTube or stream live to Twitch), Optimus (optimizes notebooks for best graphics performance or best battery life, depending on the application), and of course DirectX 12.</p> <p>Nvidia's newest GPUs are shipping today in a number of upgraded systems from hardware partners such as Asus, Alienware, Razer, HP, Lenovo, Acer, Clevo, Gigabyte, and others.</p> <h3>Update</h3> <p>Nvidia added product pages for both GPUs right as this article was going live. The specs for the <a href="http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-960m/specifications" target="_blank">GeForce GTX 960M</a> include 640 CUDA cores, 1,096MHz base clock, unknown boost clockspeed, and GDDR5 memory clocked at 2,500MHz on a 128-bit bus (80GB/s bandwidth).</p> <p>The <a href="http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-950m/specifications" target="_blank">GeForce GTX 950M</a> is spec'd similarly, but with a 914MHz base clockspeed, 1,000MHz (DDR3) or 2,500MHz (GDDR5) memory clockspeed, and 32GB/s or 80GB/s of memory bandwidth.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/geforce_gtx_960m_and_950m_chips_round_out_nvidia%E2%80%99s_mobile_gpu_lineup_2015#comments GeForce GTX 960M GeForce GTX 950M gpu Hardware mobile nvidia News Thu, 12 Mar 2015 15:40:55 +0000 Paul Lilly 29575 at http://www.maximumpc.com Nvidia GeForce GTX Titan X: We Touched It http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_we_touched_it_2015 <!--paging_filter--><h3>A quick peek into the future</h3> <p>In the land of video cards, Nvidia's GTX Titan is generally considered the king. The original gangster came out in February 2013, followed by the Titan Black a year later, each sporting an unprecedented 6GB of RAM, 7 billion transistors, and more shader processors than you could shake a stick at (eventually tipping the scales at 2880). Nvidia capped it off in March 2014 with the Titan Z, which put two Titan Black GPUs on one card. And now it's been nearly a year since we've seen activity from them on the super-premium end. But the company hasn't been idle. Today we got up close and personal with this obsidian brick of magic, the GTX Titan X.</p> <p>How close? 
This close:</p> <p><img src="/files/u160416/titanx_620.jpg" alt="Nvidia GeForce GTX Titan X video card" title="Nvidia GeForce GTX Titan X video card" width="620" height="465" /></p> <p>Unfortunately, we were forced to double-pinky swear that we wouldn't give you any specifics about the card just yet, other than the fact that it's got 12GB RAM, eight billion transistors, and is probably the fastest video card on Earth. But we can confirm that it was running several live demos on the show floor of the Game Developers Conference this week, conducted by Epic, Valve and Crytek. This is obviously not going to be a paper launch -- the card is already here. The Titan X is just waiting in the wings until it can get a proper introduction at Nvidia's GPU Technology Conference, which starts on March 17th. In the meantime, we took some nifty photos for you. Hope you brought a bib for the drool!</p> http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_we_touched_it_2015#comments geforce gpu GTC nvida Titan X Video Card News Fri, 06 Mar 2015 02:00:47 +0000 Tom McNamara 29548 at http://www.maximumpc.com Nvidia Slapped with Lawsuit Over GTX 970 Performance and Specifications http://www.maximumpc.com/nvidia_slapped_lawsuit_over_misleading_gtx_970_performance_claims243 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u69/nvidia_geforce_gtx_970.jpg" alt="Nvidia Geforce GTX 970" title="Nvidia Geforce GTX 970" width="228" height="184" style="float: right;" />Gigabyte also tagged in proposed class-action lawsuit</h3> <p>The furor over GTX 970’s specs refuses to die down. What was until recently a <strong>public relations debacle is now threatening to snowball into a costly lawsuit</strong>, with a class-action complaint being filed Thursday by Cass County, Michigan, resident Andrew Ostrowski against Nvidia and Gigabyte for engaging “in a scheme to mislead consumers nationwide about the characteristics, qualities and benefits of the GTX 970.”</p> <p>Before we go any further, here’s a quick recap: In late January, many people began complaining about performance issues in games once VRAM usage hit the 3.5GB mark. This <a href="http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015" target="_blank">prompted Nvidia to clarify</a> that the 970’s total memory is divided into a 3.5GB segment and a 0.5GB segment, with the comparatively slower second partition being only used when the game needs more than 3.5GB of memory. To make matters worse, the company also disclosed that the card actually has fewer ROPs (raster operations pipeline) and a smaller L2 cache than advertised, a gaffe it attributes to internal miscommunication that led to an error in the reviewer’s guide. 
It bears mentioning, however, that the impact on real-world performance appears to be minimal — at least for now.</p> <p>The <a href="http://www.scribd.com/doc/256406451/Nvidia-lawsuit-over-GTX-970" target="_blank">proposed lawsuit</a> alleges that Nvidia deliberately avoided disclosing the discrepancy, lest it have an adverse impact on sales and ruin what eventually turned out to be an annus mirabilis of sorts for the company.</p> <p>“In other words, Nvidia’s record profits were driven in part by the sale of the company’s flagship GTX 970 GPUs, which is likely why it did not want to disclose the material limitations at issue herein until after it had made millions of dollars in sales of such products,” reads the complaint, adding that the two defendants still persist in making some of the misleading claims in their advertising and marketing literature.</p> <p>First and foremost, Ostrowski wants the court to issue an order granting the case official class-action status, and to appoint him and his counsel to represent the class. Once that is out of the way, he would like to see the court award, among other things, disgorgement, restitution and “an order that defendants engage in a corrective advertising or full refund campaign.”</p> <p><em>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></em></p> http://www.maximumpc.com/nvidia_slapped_lawsuit_over_misleading_gtx_970_performance_claims243#comments class action gpu graphics card GTX 970 lawsuit legal nvidia News Mon, 23 Feb 2015 11:46:46 +0000 Pulkit Chandna 29457 at http://www.maximumpc.com Nvidia Driver Purposely Disables Overclocking GTX 900M Series GPUs http://www.maximumpc.com/nvidia_driver_purposely_disables_overclocking_gtx_900m_series_gpus_2015 <!--paging_filter--><h3><img src="/files/u69/gtx_980m.jpg" alt="Nvidia GeForce GTX 980M" title="Nvidia GeForce GTX 980M" width="228" height="126" style="float: right;" />Nvidia confirms it doesn't want you overclocking its GTX 900M GPUs</h3> <p>To overclock or not to overclock -- it's a question every enthusiast wonders at some point or another. The primary advantage to overclocking is a free performance boost, provided you don't fry anything in the process. And of course the downsides are the various risks, from instability to cooking your components. It's those downsides that prompted <strong>Nvidia to take away the ability to overclock (or underclock) GeForce GTX 900M Series GPUs</strong> through a recent driver update.</p> <p>The revelation came when users discovered they could no longer overclock their GTX 900M GPUs after installing Nvidia's GeForce R347.29 driver release. Miffed and perplexed, they sought an explanation on Nvidia's forums, only to have their worst fears confirmed.</p> <p>A few pages in, a customer care agent for Nvidia had this to say:</p> <p style="padding-left: 30px;">"Unfortunately GeForce notebooks were not designed to support overclocking. Overclocking is by no means a trivial feature, and depends on thoughtful design of thermal, electrical, and other considerations," Nvidia said. "By overclocking a notebook, a user risks serious damage to the system that could result in non-functional systems, reduced notebook life, or many other effects.</p> <p style="padding-left: 30px;">There was a bug introduced into our drivers which enabled some systems to overclock. This was fixed in a recent update. 
Our intent was not to remove features from GeForce notebooks, but rather to safeguard systems from operating outside design limits."</p> <p>This is actually a restriction that's been present in the last three driver updates, though this is the first time Nvidia has confirmed that it's intentional.</p> <p>It will be interesting to see if Nvidia holds firm on this stance. On one hand, the company is right: there are risks to overclocking, and it's especially tricky in laptops with limited cooling potential. But on the other hand, Nvidia is treating its performance-oriented customers as being reckless and not particularly savvy. And maybe some are, but the consensus among those who've replied to Nvidia's post is that they should be allowed to weigh the risks and choose for themselves whether or not to overclock.</p> <p>"This is just outrageous behavior. Not even a word that this was 'by design' following months of complaints. I have just ordered an Sli 980m enthusiast machine. Now I find you blocked overclocking deliberately. I am so mad I cannot even comment any more without resorting to bad language," a user responded.</p> <p>Another person chimed in that they own a notebook with a desktop processor and when overclocking, the system scores 9,503 in Fire Strike. At stock, it scores 8,470.</p> <p>"That's a pretty decent difference. My GPU temps don't go past 80C and my processor doesn't go past 60C. There should at least be an exception to hybrids like mine. Otherwise, I guess this is the last driver I'm going to be updating to. I didn't pay all that money just to be restricted. Ya'll wanna gimp, then how about lowering that price," the user added.</p> <p>You can <a href="https://forums.geforce.com/default/topic/805791/geforce-drivers/gtx-900m-overclocking-with-347-09-347-25/post/4458903/#4458903" target="_blank">check out the thread</a> for more angry responses, and then let us know what you think about all this in the comments section below.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_driver_purposely_disables_overclocking_gtx_900m_series_gpus_2015#comments geforce gtx 900m gpu graphics mobile nvidia overclocking News Fri, 13 Feb 2015 17:33:23 +0000 Paul Lilly 29422 at http://www.maximumpc.com AMD Radeon 300 Series GPUs Nearing Release http://www.maximumpc.com/amd_radeon_300_series_gpus_nearing_release390 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/radeon_graphics_logo_0.jpg" alt="Radeon R9 300 Series GPUs Incoming" title="Radeon R9 300 Series GPUs Incoming" width="228" height="193" style="float: right;" />Company is ‘putting finishing touches’</h3> <p>Ever since the <a href="http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015" target="_blank">furor over GTX 970’s specs</a> first erupted last month, AMD has been having plenty of fun rubbing salt into Nvidia’s self-inflicted wounds, reminding GTX 970 owners how Nvidia lied to them and asking those interested in getting a “real video card at a decent price” to go for one of its products instead. 
Seemingly convinced by the sales pitch, a <a href="http://www.pcgamer.com/amd-putting-the-finishing-touches-on-radeon-300-series/" target="_blank">former GTX 970 owner from Down Under recently took to AMD’s Facebook page</a> to ask about the odds of either the <strong><a href="http://www.maximumpc.com/benchmarks_what_could_be_amd_r9_390x_allegedly_leaked300" target="_blank">R9 390X</a> or R9 380X</strong> making it to the market in time for GTA V’s PC release.</p> <p>Here’s what AMD had to say: “Hey mate, we don't have an official date to share just yet but the second we know, we will definitely announce it on Facebook. We're still putting the finishing touches on the 300 series to make sure they live up to expectation. Can't wait to reveal them though. We're pretty excited."</p> <p>During the chipmaker’s latest earnings call, CEO Lisa Su promised to launch some "very good" graphics cards during the second quarter of 2015. It also expects to begin shipping its new “Carrizo” accelerated processing units (APUs) during the same period.</p> <p><em>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></em></p> <p>&nbsp;</p> http://www.maximumpc.com/amd_radeon_300_series_gpus_nearing_release390#comments AMD radeon 300 gpu graphics card GTX 970 nvidia r9 300 News Mon, 09 Feb 2015 11:44:10 +0000 Pulkit Chandna 29399 at http://www.maximumpc.com How to Overclock Your Graphics Card http://www.maximumpc.com/how_overclock_your_graphics_card_2015 <!--paging_filter--><h3><span style="font-weight: normal;"><img src="/files/u162579/314023-nvidia-geforce-gtx-titan-angle.jpg" alt="Titan" title="Titan" width="250" height="245" style="float: right;" />Learn how to wring every last bit of performance out of your video card</span></h3> <p>Overclocking a graphics card used to be more trouble than it was worth, but things have changed. EVGA Precision X and MSI Afterburner are just two of the most popular choices for software overclocking. AMD even bundles its own overclocking solution—AMD OverDrive—with its Catalyst drivers. Wringing more performance out of your graphics card is now as simple as moving a few sliders and testing for stability with a benchmark.&nbsp;</p> <p>That’s not to say that the best overclocking practices are obvious. We’re here to help with a guide on how to overclock your graphics card. Be forewarned—even the most basic overclocks can end in tragedy. Although we’re willing to walk you through the steps, we can’t be responsible for any damaged hardware or problems arising during the overclocking process. If you’re willing to take the risk, read on to learn how to overclock your graphics card. Keep in mind that the procedure for each video card can be slightly different. If any part of the guide doesn’t make sense, ask for help in the comments or spend some time on Google.&nbsp;</p> <h3><span style="font-weight: normal;">1. Gearing Up</span></h3> <p style="text-align: center;"><img src="/files/u162579/afterburner.png" alt="MSI Afterburner" title="MSI Afterburner" width="500" height="338" /></p> <p style="text-align: center;"><strong>MSI Afterburner is capable overclocking software that works with most AMD and Nvidia cards.</strong></p> <p>Our favorite overclocking software is <a href="http://event.msi.com/vga/afterburner/download.htm" target="_blank">MSI Afterburner</a>. 
Your other options include <a href="http://www.evga.com/precision/" target="_blank">EVGA Precision X</a> for Nvidia cards, and for AMD cards, AMD OverDrive, but to keep things simple we’ll be working solely with MSI Afterburner.&nbsp;</p> <p>You’ll also need a benchmark like <a href="http://store.steampowered.com/app/223850/" target="_blank">3DMark</a>—download the demo—or <a href="http://unigine.com/products/heaven/" target="_blank">Unigine’s Heaven Benchmark</a> to make sure your overclocks are stable enough for daily use. They’re also useful for quantifying just how much more performance you’re getting out of your hardware.&nbsp;</p> <p><a href="http://www.techpowerup.com/gpuz/" target="_blank">GPU-Z</a> is the final piece of the puzzle and although you don’t technically need it, it’s super helpful for checking your GPU and memory clock speeds.&nbsp;</p> <h3><span style="font-weight: normal;">2. Getting in the Know</span></h3> <p>Before you even start overclocking, it helps to know what sort of overclocks you can expect from your hardware. <a href="http://hwbot.org/" target="_blank">HWBOT</a> is the easiest way to look up what overclocks other users are achieving. Our test bench included the <a href="http://hwbot.org/hardware/videocard/geforce_gtx_650_ti/" target="_blank">GTX 650 Ti</a> and <a href="http://hwbot.org/hardware/videocard/radeon_hd_7850/" target="_blank">7850</a>, which have average overclocks listed on the site.&nbsp;</p> <p>It also helps to know how much real-world performance you’ll be getting out of your overclocks. Although you probably don’t need to run through an entire suite of benchmarks, having a baseline to refer to is useful. Run through 3DMark or Heaven Benchmark once to get your base scores.&nbsp;</p> <h3><span style="font-weight: normal;">3. Core Speed Overclocks</span></h3> <p style="text-align: center;"><img src="/files/u162579/heaven2.jpg" alt="Unigine Heaven" title="Unigine Heaven" width="600" height="338" /></p> <p style="text-align: center;"><strong>Unigine’s Heaven benchmark looks good and is packed with features.</strong></p> <p>Once you’ve got some averages in hand—for the 650 Ti: 1,179MHz GPU and 1,687MHz memory—you’re ready to start overclocking. Start by maxing out the Power Limit slider—this isn’t the same as overvolting; the power limit is simply how much power your card can draw. Then grab the Core Clock slider and move it forward in 20MHz increments. After applying your changes, crank up the settings on Heaven Benchmark—quality at ultra, tessellation to extreme, anti-aliasing to 8x, and resolution at system—and run through it at least once by pressing F9 or clicking the “Benchmark” button. &nbsp;Keep an eye out for weird graphical artifacts—visual glitches that range from colorful lines of light to random off-color pixels across the screen—and for crashes. If the benchmark crashes to the desktop, seems to slow down dramatically, or gives you a lower frame rate or score upon completion, drop the clock speed by 10MHz until you can run through the benchmark without any problems.</p> <h3><span style="font-weight: normal;">4. Memory Speed Overclocks</span></h3> <p>When you’ve found the highest stable clock speed for your card, repeat step three with the memory clock slider. Your memory clock speed generally won’t affect your frame rate or benchmark scores as much as the core clock speed, but it’ll help, especially if you’re running at a higher resolution.&nbsp;</p> <h3><span style="font-weight: normal;">5. 
Stability Check</span></h3> <p>Lock in both of your increased clock speeds, run through Heaven a final time, and you should be seeing higher frame rates and a higher score. Go wild and test out your overclocked card in your favorite games to make sure that it’s stable enough for daily use—if it isn’t, step down your GPU and memory clock speeds until it is. To be extra safe, you can leave Heaven running for a few hours to make sure you won’t run into any problems during an extended gaming session.</p> <p><em>Read on for information on overvolting, special situations, and the results of our overclocks.</em></p> <hr /> <h3><span style="font-weight: normal;">Overvolting</span></h3> <p>If you’re not satisfied with your card’s overclocking performance at standard voltages, some cards let you crank up the voltage to squeeze even more performance out of your hardware. Before you do anything, spend a few minutes on Google to look up what other users are reporting as safe voltages for your specific graphics card.&nbsp;</p> <p style="text-align: center;"><img src="/files/u162579/afterburner_voltage_control_settings.png" alt="MSI Afterburner Properties" title="MSI Afterburner Properties" width="350" height="628" /></p> <p style="text-align: center;"><strong>If you're feeling frisky, unlock voltage control and monitoring.</strong></p> <p>You have to dig into Afterburner's settings to gain access to your card’s voltage. Increase your voltage by 10mV at a time until your overclock is stable, your temperatures exceed 70 degrees Celsius, or you reach your card’s maximum safe voltage.&nbsp;</p> <p>Even if you’re operating within the maximum safe voltage, overvolting a card can have severe consequences, including general instability, decreased part lifespan, and unsafe temperatures. It’s usually a good idea to stick to stock voltages unless you really need every last bit of performance from your card.&nbsp;</p> <h3><span style="font-weight: normal;">Special Situations</span></h3> <p>Each and every video card overclocks differently. These differences aren’t limited to just how much you can push the card. Some cards like the GTX 670 and 680 utilize GPU boost to ramp up graphics performance when you need it. Those cards unlock special sliders in Precision X to manage when the boost is active. If you’re working with a card that has GPU boost, you’ll want to play around with the Power Target slider, which determines when the boost is applied. Pump up the boost and your card won’t downclock as often—unless your temperatures are getting too high.</p> <h3><span style="font-weight: normal;">The Results</span></h3> <p style="text-align: center;"><img src="/files/u162579/overclocked_650ti.gif" alt="Nvidia GTX 650 Ti Overclock" title="Nvidia GTX 650 Ti Overclock" width="393" height="485" /></p> <p style="text-align: center;"><strong>We haven’t won any records, but we do have a respectable overclock.</strong></p> <p>In our Nvidia test system with an i5-3570k running at 3.4GHz and a GTX 650 Ti, we managed to overclock the graphics card to 1,161/1,600MHz from a stock 941/1,350MHz. 
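If you want to double-check the math on your own card, the percentages are just the gain measured against the stock clocks. Here's a quick sketch (plain Python; the numbers below are our GTX 650 Ti figures, so swap in your own):
<pre>
# Percentage gain of an overclock relative to the stock clocks.
stock_gpu, stock_mem = 941, 1350   # GTX 650 Ti defaults, in MHz
oc_gpu, oc_mem = 1161, 1600        # the stable overclock we landed on, in MHz

gpu_gain = (oc_gpu - stock_gpu) / stock_gpu * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100
print(f"GPU +{gpu_gain:.0f}%, memory +{mem_gain:.0f}%")   # GPU +23%, memory +19%
</pre>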
That's a 23% increase in GPU clock speed and a 19% increase in memory clock speed.&nbsp;</p> <p style="text-align: center;"><img src="/files/u162579/overclocked_7850.png" alt="AMD Radeon HD 7850 Overclock" title="AMD Radeon HD 7850 Overclock" width="393" height="485" /></p> <p style="text-align: center;"><strong>This 7850 didn’t play nice with memory overclocks, but a 190MHz increase in core clock speed isn’t bad at all.</strong></p> <p>Our AMD test system with an i5-3570k running at 3.8GHz and a 7850 generated comparable results, with a default 860/1,200MHz pushed to 1,050/1,225MHz. That’s a 22% increase in GPU clock speed and a less impressive 2% bump in memory clock speed.</p> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px;" border="0"> <thead></thead> <tbody> <tr> <td>&nbsp;</td> <td>Stock GTX 650 Ti</td> <td>Overclocked GTX 650 Ti</td> <td>Stock 7850</td> <td>Overclocked 7850</td> </tr> <tr> <td class="item">3DMark Fire Strike</td> <td>2,990</td> <td>3,574</td> <td>4,119</td> <td>4,706</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td>15.6</td> <td>18.7</td> <td>20.5</td> <td>23.8</td> </tr> <tr> <td class="item">BioShock Infinite (fps)</td> <td>36.6</td> <td>42.1</td> <td>42.4</td> <td>48.44</td> </tr> <tr> <td class="item">Tomb Raider (fps)</td> <td>25.2</td> <td>31.5</td> <td>31.3</td> <td>33.2</td> </tr> <tr> <td class="item">Core/Memory Clock (MHz)</td> <td>941/1,350</td> <td>1,161/1,600</td> <td>860/1,200</td> <td>1,050/1,225</td> </tr> </tbody> </table> </div> http://www.maximumpc.com/how_overclock_your_graphics_card_2015#comments amd. 
graphics card gpu how to overclock nvidia overclocking performance Video Card Features How-Tos Fri, 06 Feb 2015 23:28:34 +0000 Ben Kim 27083 at http://www.maximumpc.com Ask the Doctor: IDing Your GPU, Surge Protection, and RAM Configs http://www.maximumpc.com/ask_doctor_iding_your_gpu_surge_protection_and_ram_configs_2014 <!--paging_filter--><h3>Welcome to the Club</h3> <p>I am new to "higher-end" PC gaming and I have a couple of questions. How do I tell what graphics card I have? I have thrown the box away and the only thing I can see on the card is "Direct CUII." The DirectX diagnostic tool, Windows system devices, and AMD Catalyst screens all just say, "AMD Radeon R9 200 series" and not the actual model I have. I don't know if it's an R9 270 or a 290. Second, I currently game on a PS3 on a 42-inch HDTV. 
What PC games can you recommend that will really showcase the fact that the PC has better graphics than the PS3 and PS4? -- Dave</p> <p>&nbsp;</p> <p class="MsoNormal"><strong>The Doctor responds:</strong> To find out which video card you have, you can download GPU-Z (www.techpowerup.com/gpuz) and it will tell you. You can also right-click your “PC” icon, aka “My Computer,” select Properties, then open Device Manager and expand the tree under Display Adapters; the card should be listed there. You should also be able to tell the difference between the R9 270 and 290 by size, as the R9 290 is roughly 11 inches long, whereas the R9 270 is only about nine inches. Or you can look for the GPU core code names in those utilities: the R9 290 will be referred to as Hawaii, while the R9 270 is called Pitcairn. As far as which games showcase the PC’s power, any of the latest titles will do, such as Battlefield 4, Tomb Raider, Watch Dogs, or Titanfall. However, the biggest advantage the PC has over consoles is the ability to run at high resolution instead of being limited to whatever the console can do, which is 720p for your PS3 and 1080p for the next-gen consoles. A rig with an R9 270 will probably be stuck at 1080p, but you should be able to turn up some details to make it look better. If you have an R9 290, though, you should be investing in a larger monitor, preferably one that runs at 2560x1440 or so. Once you see a PC game on that huge LCD with all settings maxed, you’ll never go back to a console.</p> <p class="MsoNormal">&nbsp;</p> <p style="text-align: center;"><img src="/files/u187432/mpc102.qs_doctor.gpuz_.jpg" width="400" height="494" /></p> <p style="text-align: center;"><em>GPU-Z is a free utility that will show you more information than you ever wanted to know about your GPU.</em></p> 
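<p class="MsoNormal">If you’re comfortable with a little scripting, one more route is to pull the adapter name and its PCI hardware ID straight from Windows. The short Python sketch below is one way to do it -- it assumes a Windows box with Python installed and leans on the wmic tool built into Windows, and the helper name is ours, purely for illustration. The DEV_ portion of the hardware ID pins down the exact GPU, and any online PCI ID database can translate it -- the same information GPU-Z surfaces for you.</p> <pre>
# Minimal sketch (not the Doctor's method): list display adapters with their
# PCI hardware IDs so a generic "R9 200 Series" name can be pinned down.
# Assumes Windows with Python installed; wmic is built into Windows.
import subprocess

def video_controllers():
    # Each display adapter is one Win32_VideoController instance.
    output = subprocess.check_output(
        ["wmic", "path", "Win32_VideoController", "get",
         "Name,PNPDeviceID", "/format:list"],
        universal_newlines=True,
    )
    info = {}
    for line in output.splitlines():
        line = line.strip()
        if not line:            # blank lines separate adapters
            if info:
                yield info
                info = {}
            continue
        key, _, value = line.partition("=")
        info[key] = value
    if info:
        yield info

if __name__ == "__main__":
    for gpu in video_controllers():
        # The DEV_ code inside PNPDeviceID identifies the exact GPU; look it
        # up in a PCI ID database (VEN_1002 is AMD).
        print(gpu.get("Name", "?"))
        print("  " + gpu.get("PNPDeviceID", "?"))
</pre> <p class="MsoNormal">Save it as, say, listgpu.py and run it from a command prompt; if the device ID maps to Hawaii you have an R9 290, and if it maps to Pitcairn you have an R9 270.</p>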
<h3 class="MsoNormal">How to Stop My Surging</h3> <p class="MsoNormal">Back in 2007 I came into some money, so I bought the main parts of the 2007 Dream Machine. The only part I’ve upgraded over the years is the video card. As good as this setup was, it was beginning to show its age, so I decided to build a new machine this year. I purchased your recommended (at the time) best mobo, the Asus X79 Deluxe, along with a Core i7-4820K processor, Cooler Master Hyper 212, WD 3TB HD, Crucial M500 480GB SSD, 16GB of Ballistix DDR3/1600 RAM, and a Radeon R9 270 Gaming graphics card. I kept my LG Blu-ray player, Ultra 1,000-watt ATX power supply, and Antec Nine Hundred Black Steel ATX case. I am in the military, so my gaming time is limited (usually to only one game at a time). Currently, I am playing Elder Scrolls Online. Every time I boot up ESO with the X79’s Anti-Surge feature enabled (a UEFI BIOS advanced feature), the computer shuts down. If I turn off Anti-Surge, I can play the game fine. There have been a few times during normal computing that the Anti-Surge feature has been triggered, as well. So my question is, is the X79 Anti-Surge feature just too sensitive, or does my Ultra power supply need to be replaced? If you think my power supply should be replaced, could I have damaged my mobo by playing with the Anti-Surge feature off? -- Thomas Eddy</p> <p class="MsoNormal"><strong>The Doctor responds:</strong> The Asus Anti-Surge feature is supposed to shut down the system if it detects abnormal voltage coming to the motherboard. There are three possible explanations; the first is easy. Asus says that if you’re running additional voltage-monitoring tools (software or direct monitoring via probes on the motherboard), they can trigger false positives in the Anti-Surge circuitry and shut down the machine. The second possibility is that Anti-Surge is known to be a bit finicky, and a PSU that is just a little out of spec or just starting to give up the ghost can trip it. That brings us to the third and most likely explanation: your PSU is the problem. You say this only occurs when you fire up ESO, the only game you play. That also means it’s the only time your PSU is under its heaviest load, since the graphics card is the biggest power draw in the system. This in turn triggers Anti-Surge, and you go dark. One solution is to buy a PSU from a store that allows returns without restocking fees and install it to see if that resolves the issue. (It will likely solve your problem, but if it’s not a PSU issue, the Doc thinks it’s kosher to return the new one.) The second solution is to simply turn off Anti-Surge. This option is probably the easiest and cheapest, since hardware is surprisingly sturdy in the Doctor’s experience and failing PSUs rarely wreck hardware. Rarely doesn’t mean never, though, so if you are risk averse, a new PSU from a reputable brand may be a good move. Just be advised that you probably don’t need anything as beefy as a 1,000W PSU today. You will be fine with a far more moderate and modern unit, and if you want to save some cash, one in the 650–750W range should work just fine.</p> <h3 class="MsoNormal">Will It Fk up My dsk?</h3> <p class="MsoNormal">I work on computers quite often, and Chkdsk is a utility I make frequent use of. However, I recently read that it can actually cause more damage to a corrupt drive. I’ve never had any problems with it (worst-case scenario: it didn’t help), but I’m a cautious person, especially when I’m working on another person’s computer. Because so much of the Internet is unreliable (that was my source), I’m coming to you for advice. Is it ever dangerous to use Chkdsk? -- Justin</p> <p class="MsoNormal"><strong>The Doctor responds:</strong> For those who don’t know what Chkdsk is, it’s a free utility that comes with every recent version of Windows and it is, according to Microsoft, “...a tool that checks volumes on your hard disk drive for problems. The tool then tries to repair any problems that it finds. For example, Chkdsk can repair problems related to bad sectors, lost clusters, cross-linked files, and directory errors.” So, to summarize, Chkdsk does two things. First, it examines a drive to make sure all the files that are referenced are actually there and can be accessed, and it notes any sector that can’t be accessed. Then, once the drive has been examined, the tool can try to “repair” any errors it found, but it will only do this if you check that option (or add the /f switch at the command line). Therefore, running Chkdsk by itself is normally not dangerous. If a drive is teetering on the brink of disaster, however, then yes, any attempt to read its bad sectors reduces the chance of recovering data from them. And since Chkdsk can take a while to run, especially on a failing drive, it may in fact “kill” the drive by continually accessing it, in which case it can cause data to become unrecoverable. Chkdsk is not a data-recovery tool; it repairs errors relating to a drive’s file structure, but it won’t recover data from a failing drive. To actually recover inaccessible data, you need special software, and it almost always costs money.</p> <p class="MsoNormal">One more thing -- failing drives need to be treated with the utmost caution, so the first order of business when a drive starts giving you access errors is to make an image or a sector-by-sector copy of it onto a healthy drive. You can then perform your data recovery on the healthy drive. And just in case it isn’t clear, you should always, always, always have a backup of important files, so that when a drive begins acting funky, you can run repair tools like Chkdsk without worrying about losing them.</p> <p class="MsoNormal">&nbsp;</p> <p style="text-align: center;"><img src="/files/u187432/mpc102.qs_doctor.chkdsk.png" width="669" height="422" /></p> <p style="text-align: center;"><em>Chkdsk can “repair” file structures but is not a data-recovery tool.</em></p> 
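<p class="MsoNormal">To make the read-only-versus-repair distinction concrete, here is a small Python sketch that wraps the Chkdsk command line -- the wrapper, the default drive letter, and the function name are our own illustration, not part of Windows, but /f and /r are Chkdsk’s real repair switches. Run it from an elevated (administrator) prompt, and expect a repair pass on the system drive to be scheduled for the next reboot.</p> <pre>
# A sketch of the distinction described above: chkdsk with no switches only
# reports problems, while /f (fix file-system errors) and /r (also scan for
# bad sectors and recover readable data; implies /f) actually write to the
# drive. Assumes Windows with Python installed; run from an elevated prompt.
import subprocess
import sys

def run_chkdsk(drive="C:", repair=False, surface_scan=False):
    cmd = ["chkdsk", drive]
    if surface_scan:
        cmd.append("/r")   # locate bad sectors and recover readable data
    elif repair:
        cmd.append("/f")   # fix file-system errors only
    print("Running: " + " ".join(cmd))
    return subprocess.call(cmd)

if __name__ == "__main__":
    # Start with a read-only pass; only escalate to /f or /r after you have
    # an image or backup of the drive, as the Doctor advises.
    sys.exit(run_chkdsk("C:"))
</pre> <p class="MsoNormal">On a drive you suspect is failing, skip even the read-only pass until you’ve imaged it -- every extra read is a roll of the dice.</p>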
<h3 class="MsoNormal">Are Four DIMMs Better than Two?</h3> <p class="MsoNormal">I am in the process of buying a new desktop. It will be running Windows 7 Professional 64-bit, an Intel Core i7-4770 processor, and 16GB of DDR3/1600. I have the option of using either four sticks of 4GB or two sticks of 8GB. The vendor offers either option at the same price. My initial thought process says go with the 2x8GB option, leaving two memory slots open for a possible upgrade to 32GB sometime in the future (if ever needed). But then I began wondering if there was any reason the 4x4GB option might have advantages. Could it be faster? More reliable? Have better longevity? Generate less heat? The obvious downside is that I would have to throw away one or more 4GB sticks if I want to increase my memory in the future. I’m not a gamer, but I do want a system that will be fast and reliable for a number of years (my present desktop is 10 years old!). What’s your recommendation on the memory choice? -- Peter Anderson</p> <p class="MsoNormal"><strong>The Doctor responds:</strong> With today’s CPUs, there’s no perceptible advantage to using four RAM modules versus two. Typically, though, higher-clocked kits reach the market in lower-capacity modules first. This isn’t always true, but generally it is. More modules will also theoretically use a bit more power and increase the chance of failure, simply because you have four physical modules rather than two. With some memory controllers, filling every slot can also result in lower top-end clocks. In the old Athlon 64 days, for example, running four modules would result in lower RAM clock speeds than if you used just two.</p> <p class="MsoNormal">Since you probably aren’t going to be trying to overclock your RAM to DDR3/4290 (the current record), your best course of action is to use two 8GB modules, which also gives you the cheapest upgrade path down the road.</p> http://www.maximumpc.com/ask_doctor_iding_your_gpu_surge_protection_and_ram_configs_2014#comments ask the doctor gpu ram surge protection From the Magazine Thu, 05 Feb 2015 22:30:53 +0000 Max PC 29379 at http://www.maximumpc.com