AMD Will Discuss Next-gen GPUs ‘Later This Quarter’ <!--paging_filter--><h3><img src="" alt="Radeon R9 300 Series" title="Radeon R9 300 Series" width="228" height="193" style="float: right;" />Nvidia’s latest and greatest GPUs remain unchallenged</h3> <p>Advanced Micro Devices will talk about its much-awaited (and long overdue) next-gen graphics cards later this quarter, the company’s CEO Lisa Su said in an earnings call earlier this week. This is significant because <strong>if there was ever a need for a graphics card refresh, it’s right now.</strong></p> <p>Until recently, many people were wondering if the launch would happen in time for GTA V’s PC release, but AMD remained noncommittal, offering nothing more than a vague, albeit tantalizing, response. “We’re giving finishing touches,” was all it said. GTA V is already here, of course.</p> <p>To be fair to AMD, it’s not as if the company promised something in a particular timeframe and failed to deliver on time. Even during the previous quarterly earnings call, all Su would say was that the company had some “very good” graphics cards lined up for the second quarter of 2015. But the company is not doing itself any favors by letting <a href="">Nvidia’s latest and greatest GPUs go virtually unchallenged</a> for so long (over nine months and counting).</p> <p>“So as we go into the second half of the year, we would like to see some regain of share in both the desktop and the notebook business,” Su said, responding to a question from Credit Suisse’s John Pitzer.</p> <p>“I've talked about Carrizo being a strong product for us, I talked about some of our graphics launches that we'll talk about later this quarter. 
So from our standpoint, I would say the first half of the year, we had some, let's call it, some of our issues that we were correcting in terms of the channel, and then a weaker than expected market environment.”</p> <p>The <a href="">first quarter wasn’t a particularly good one for the company</a> as it slumped to a $180 million net loss. Although that’s a significant improvement over the previous quarter, in which it reported a net loss of $364 million, it’s still several times worse than the $20 million loss it recorded during the same period last year.</p> <p>Hopefully, things will improve once the new graphics cards are finally available. KitGuru <a href="" target="_blank">expects them to be unveiled at Computex in June.</a></p> <p>In the meantime, we will have to make do with unverified reports and rumored specs.<br />Coalescing relevant bits from the various R9 300 rumors we’ve heard so far, one can paint a <a href="">somewhat detailed picture of the company’s next-gen GPU lineup</a>:</p> <ul> <li>AMD Radeon R9 390X: 28nm Fiji XT GPU, 3,584 cores, 224 TMUs, 64 ROPs, 4GB memory, $599</li> <li>AMD Radeon R9 390: 28nm Fiji Pro GPU, 3,328 cores, 208 TMUs, 64 ROPs, 4GB GDDR5, $399</li> <li>AMD Radeon R9 380X: 28nm Hawaii XTX GPU, 2,816 cores, 176 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 380: 28nm Hawaii Pro GPU, 2,560 cores, 160 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 375X: Tonga XT GPU, 2,048 cores, 128 TMUs, 32 ROPs, 2GB GDDR5, 384-bit, price unknown</li> <li>AMD Radeon R9 375: Tonga Pro GPU, 1,792 cores, 112 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370X: Trinidad XT GPU, 1,280 cores, 80 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370: 28nm Trinidad Pro GPU, 1,024 cores, 64 TMUs, 24 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R7 360X: Bermuda GPU, 896 cores, 128-bit GDDR5, price unknown</li> <li>AMD Radeon R7 350X/340X: Oland GPU, 320 
cores, DDR3 and GDDR5 memory, 128-bit</li> <li>AMD Radeon R5 300: Caicos GPU, 160 cores, DDR3 memory, 64-bit</li> </ul> <p><em>Follow Pulkit on <a href="" target="_blank">Google+</a></em></p> News Mon, 20 Apr 2015 06:13:05 +0000 Pulkit Chandna Choosing the Best AMD Graphics Card <!--paging_filter--><h3>Prefer the Red Team over the Green Team? We’ve got your back</h3> <p>There was a time when a dozen different companies were selling video cards and vying for your hard-earned cash. But, at least when it comes to gaming, the field has narrowed to just two: Nvidia and AMD. If you’re just doing spreadsheets, surfing the web, and playing the occasional Flash game, you’ll be fine with integrated graphics. But if you spend a lot of time shooting, racing, and flying, a dedicated graphics card is the way to go.</p> <p><a href="" target="_blank">We covered Nvidia a couple of days ago</a>, and now <strong>we’re turning our crosshairs to AMD. What follows is a streamlined buying guide.</strong> No benchmark charts, diagrams, or spec sheets. We’ll link to places where you can get that stuff if you want, but here is where we condense the product line into a few pages of advice. Dig in!</p> <p><strong>Options #1, #2, and #3: Radeon R7 260, 260X, and 265</strong></p> <p>First, we’re looking at the cards in the $100–$130 range. Overall, AMD has a denser collection of options than Nvidia. This creates some overlap, so we’re combining these cards into one tier that’s roughly equivalent to the Nvidia GeForce GTX 750 and 750 Ti tier. 
Like that pair, these AMD cards are fine for playing at 1080p most of the time.</p> <p><img src="/files/u160416/r7_265.jpg" alt="Sapphire Radeon R7 265" title="Sapphire Radeon R7 265" width="620" height="473" style="vertical-align: middle;" /></p> <p><a title="Sapphire Radeon R7 265 review" href="">We’d recommend the R7 265 with two gigabytes of VRAM to get the best performance.</a> Naturally, you’ll find those at the top of the price range, but we think it’s worth the extra bucks. These three Radeon cards also need only one 6-pin PCI Express cable, so you should be fine with a power supply in the 400–500 watt range. If you have around $200 to spend, though, there are better options from both AMD and Nvidia.</p> <p><strong>Options #4 and #5: Radeon R9 270 and 270X</strong></p> <p>Performance-wise, these cards are roughly equivalent to the Radeon HD 7850 and 7870, and the Nvidia GeForce GTX 670 and 760. So their price range is about $130–$180, if you include mail-in rebates. Like the other R7 cards, we recommend 2GB of VRAM for the best gaming experience at 1080p. These GPUs will also let you use higher visual settings than the R7 cards mentioned earlier. However, the 270 and 270X need two PCI Express power cables, which you’ll rarely find on power supply units rated below 500 watts. So, you may need to upgrade your PSU or factor a more expensive one into your budget. Overall, we’d go with the 270X for a little extra oomph, unless you can afford something even speedier.</p> <p><strong>Options #6, #7, and #8: Radeon R9 280, 285, and 280X</strong></p> <p>The 280 and 280X are basically respun versions of the Radeon HD 7950 with Boost, and the HD 7970 GHz Edition, which are also roughly comparable to a GTX 680 and 770. The 280 and 280X have 3GB of VRAM, which is a lot for 1080p, but not unwelcome. In fact, it’s enough to handle 1440p fairly smoothly, though you might want a second card in a Crossfire configuration to keep up at that resolution. 
The 285, however, has 2GB of VRAM and uses a newer, more power-efficient GPU core. Its performance falls in between the 280 and 280X, but because it generates a lot less heat, you can find it in sizes designed for a mini-ITX case.</p> <p><img src="/files/u160416/strix_285.jpg" alt="Asus Strix Radeon R9 285" title="Asus Strix Radeon R9 285" width="620" height="451" style="vertical-align: middle;" /></p> <p>It can also be much quieter, with some versions not even spinning up their fans until the GPU hits a certain temperature. It uses one PCI Express cable, while the 280 and 280X need two. Overall, <a href=";ie=UTF8&amp;node=10510329011&amp;pf_rd_m=ATVPDKIKX0DER&amp;pf_rd_s=merchandised-search-2&amp;pf_rd_r=010AS36QFWGFNJAZTZ3T&amp;pf_rd_t=101&amp;pf_rd_p=2005577422&amp;pf_rd_i=8588809011">our favorite at this tier is the 285</a>, despite having less VRAM, because it can run cooler, quieter, and in a wider variety of cases.</p> <h4 style="text-align: right;"><a href=",1">Click here to see the rest of your options, and the overall winner</a></h4> <hr /> <p><strong>Options #9 and #10: Radeon R9 290 and 290X</strong></p> <p>These are AMD’s top-performing single-GPU cards, and their performance will be within spitting distance of a GeForce GTX 970 and 980. The GTX 980 is a consistently faster card overall, but some gamers still opt for the 290X because it’s about $200 cheaper. However, the 290 and 290X need a lot of watts. We’d recommend 600 or more watts for one of these cards, and 850 watts or more for 2-way Crossfire.</p> <p><img src="/files/u160416/tri-x_290x_0.jpg" alt="Sapphire Tri-X Radeon R9 290X" title="Sapphire Tri-X Radeon R9 290X" width="620" height="284" style="vertical-align: middle;" /></p> <p>We’d also definitely avoid the “reference” cards, because they run quite hot and noisy. 
<a title="Sapphire Tri-X Radeon R9 290X review" href="">Cooling designs like Sapphire’s Tri-X</a> or Vapor-X, Asus’s DirectCU II, Gigabyte’s Windforce 3X, MSI’s Twin Frozr, and XFX’s Double Dissipation are highly recommended to keep these cards running quietly and relatively cool. If you’re okay with those stipulations, the 290 and 290X will give you a lot of bang for your buck—about $250 for the 290 and about $350 for the 290X.</p> <p><strong>Option #11: Radeon R9 295X2</strong></p> <p>This card is basically two R9 290X GPUs on a single card. Since these GPUs need a respectable amount of cooling, it should come as no surprise that the 295X2 has a closed-loop liquid cooler (CLC) built into it. This uses a 120mm radiator (bundled with a fan) that you must install on a fan mount somewhere in your case. The card is also 12 inches long. <a title="Radeon R9 295X2 review" href="">And it's got the breakneck performance to justify all this.</a></p> <p><span style="color: #888888;"><img src="/files/u160416/295x2.jpg" alt="Radeon R9 295X2" title="Radeon R9 295X2" width="620" height="304" /></span></p> <p>So while it will get you more performance than a single GTX Titan X, for a few hundred dollars less, it needs a lot of real estate and a lot of power. As we mentioned earlier, an 850-watt power supply (or more) is highly recommended when dealing with multiple 290 or 290X GPUs. That said, the card runs pretty quietly, thanks to the CLC, and it takes up half as many motherboard slots as a 2-way Crossfire config. Like the Titan X, there are no third-party cooling designs, but the “reference” version here is quite good.</p> <p>You may sometimes see this card listed as having “8GB,” but Crossfire, like SLI, mirrors your VRAM, instead of letting you add the two cards’ VRAM together. In practice, you will have a capacity of 4GB, just like the R9 290 and 290X, and the GTX 970 and 980. 
The GTX Titan X has a whopping 12GB, but we haven’t encountered a game or screen resolution where that felt like a necessity. 4GB is fine even with a 4K display (though you’d still want multiple GPUs to smoothly game at that point).</p> <p><strong>And the Winner Is...</strong></p> <p>Like Nvidia, AMD has a wide range of options that make picking a single winner difficult. The R7 265 is our pick at the entry level, and the R9 295X2 packs a ton of performance into about $700, enough to get decent frame rates at 4K and definitely plenty for 1440p. If there were a happy medium here, <strong>we’d go with the R9 290X as the best overall AMD GPU</strong>, provided that you get one with a large heatsink and multiple fans, and you have a sufficient power supply unit.</p> <p><img src="/files/u160416/r9_290.jpg" alt="MSI Radeon R9 290X" title="MSI Radeon R9 290X" width="620" height="452" style="vertical-align: middle;" /></p> <p>As far as PSUs go, we’ve had good experiences with EVGA, Corsair, Antec, Enermax, Silverstone, and SeaSonic. That’s not an all-inclusive list, just the brands that come to mind most often when we need a reliable PSU. The Rosewill Hive and Capstone have good reps too, but we haven’t had as much direct experience with those. You should expect to need two 8-pin PCI Express cables, and possibly an additional 6-pin connection if you want top-end cards like MSI’s “Lightning” edition. But if you can manage that, we think it’s worth it.</p> Features Thu, 09 Apr 2015 20:23:38 +0000 Tom McNamara Choosing the Best Nvidia Graphics Card <!--paging_filter--><h3>Doing a little GPU shopping? We know what to put on your list</h3> <p>When you’re trying to figure out the next PC upgrade you should buy, there are at least two ways to go about it. 
Some people like going through lots of pages of benchmarks, analysis, galleries of the component in various states of disassembly, forum debate, and pictures of fluffy kittens. And that’s great, when you have the time. But not everyone does. For people who want a quicker breakdown of choices like <strong>which Nvidia video card you should buy</strong>, we can condense that into just a couple of pages. We’ll give you a quick tour through the various choices that you have at different price points, and what the pros and cons are at each stage. Then we’ll select an overall winner.</p> <p>For simplicity’s sake, we’ll be sticking to the current “Maxwell” generation of Nvidia’s cards. It has some features not available in the older Kepler generation, like Multi-Frame Sample Anti-Aliasing (MFAA), which is a highly efficient way of smoothing out jagged edges on 3D objects, and Voxel Global Illumination (VXGI), which creates shadows with a degree of realism that we hadn’t seen occurring in real-time before. So that means that our breakdown will be sticking to the GeForce GTX 750, 750 Ti, 960, 970, 980, and the recently released Titan X.</p> <p>If you’re wondering why we’re not doing a breakdown of AMD cards, don’t worry—that’s coming soon.</p> <p class="MsoNormal"><strong>Choices #1 and #2: GTX 750 and GTX 750 Ti</strong></p> <p>We’re combining these two cards because of their overall similarity. These are the entry-level enthusiast cards; the 750 comes in at about $100, and the Ti flavor starts at about $125. These are positioned as the next step up from integrated graphics. It helps that the regular GTX 750 does not even require a PCI Express power cable. It gets all the power it needs from the slot it’s plugged into, which provides up to 75 watts. Although that’s just average for an incandescent light bulb, it’s plenty to get some respectable gaming performance at medium settings. 
A few versions of the 750 Ti require a PCIe cable, but you still shouldn’t need serious power; a 400-watt power supply will be just fine.<img src="/files/u160416/evga960.jpg" width="620" height="499" /></p> <p>&nbsp;</p> <p>If this is the kind of card that you can afford, we recommend going for the 750 Ti, since it will give you some extra oomph needed to hit that magic mark of 60 frames per second in your games. And we definitely recommend the versions with 2GB of VRAM instead of 1GB, since current 3D games will happily take advantage of the additional capacity. Since these are entry-level cards, we can’t declare them the “best,” but <a title="GeForce GTX 750 Ti benchmarks" href="">they’re fine for 1080p gaming most of the time</a>. These two cards are roughly comparable to the AMD Radeon R7 265 or 270.</p> <p><strong>Choice #3: The GTX 960</strong></p> <p>While the GTX 750 and 750 Ti are technically Maxwell cards, they don’t have the full feature set, so they don’t get MFAA and VXGI. So we sometimes refer to the cards above them as “Maxwell 2.0.” The GTX 960 is the least expensive version, setting you back around $200. It comes in 2GB and 4GB versions, with the latter costing around $240. This card’s performance is roughly comparable to AMD’s Radeon R9 280 or 285. If you want some benchmarks for reference, <a href=",1" target="_blank">we have them in the GTX 960 review here</a>. Like the 750 and 750 Ti, the GTX 960 does not draw a lot of power. You can find versions that use the same 6-pin PCIe connection that some 750 Ti cards do. But some versions need two such connections, in which case you need at least a 500-watt power supply unit—that’s the threshold where PSUs start having multiple PCIe cables. 
Since they’re power-efficient, they don’t generate much heat, either, so the card can be more compact than before.</p> <p><img src="/files/u160416/asus-strix-gtx960-dc2oc-2gd5_3d-100563779-orig.jpg" width="620" height="412" /></p> <p>Overall, though, it’s difficult to recommend the GTX 960, because AMD offers comparable performance and power consumption for substantially less money, at least for the 2GB version. The 4GB version can get you somewhat higher framerates, but at $240, it’s not much less than a Radeon R9 290, which has much better performance than either card. There’s a twist, though: The 290 also needs a lot of juice. We’d recommend a 600-watt power supply for one of those, and 850 watts if you wanted to add a second one to your system for Crossfire.</p> <p><strong>Choices #4 and #5: GTX 970 and 980</strong></p> <p>These are meaningfully different cards, but we’re grouping them together because they came out at the same time. <a title="GeForce GTX 980 review" href="">When the GTX 980 arrived, it was Nvidia’s flagship card</a>—with a flagship price of $550, which hasn’t gone down much since its release in September last year. It’s faster than AMD’s beefiest offering, the Radeon R9 290X, while needing less power and less real estate inside your PC for its cooling system. It comes with 4GB of VRAM and happily makes use of all of that.</p> <p>The GTX 970 is slower across the board and has a funky VRAM management system where the first 3.5GB runs normally and the last 500MB is a bit hobbled, but it also costs about $200 less, and you won’t often encounter scenarios where that 500MB chunk slows things down. Unfortunately, the nature of the 970’s memory system wasn’t clearly communicated to the public, and there’s been some drama.</p> <p><img src="/files/u160416/7g0a0209_620_0.jpg" width="620" height="349" style="vertical-align: middle;" /></p> <p>The GTX 970’s biggest enemies, though, are arguably AMD’s R9 290 and 290X. 
The 290 is just a little bit slower and costs about $80 less, which is getting to be the price for a 250GB SSD. The 290X is a bit faster most of the time, costs about the same, and uses conventional memory management. In fact, at 4K resolution, the 290X is a respectable contender to the GTX 980 (though you’d want two of each to get good framerates at that point). But if you have or plan to have a 500- to 550-watt power supply, the GTX 970 still comes out ahead of AMD’s comparable cards.</p> <h4 style="text-align: right;"><a href=",1">Click here to see the rest of our choices, and the winner</a></h4> <hr /> <p><strong>Choice #6: The GTX Titan X</strong></p> <p class="MsoNormal">The Titan series is basically Nvidia’s 800-pound gorilla, an exclamation point at the end of its lineup. <a title="GeForce GTX Titan X review" href=",2">It will get you the best gaming performance that money can buy</a>—but you’d better have enough, because the asking price is a cool $1,000. That’s nearly twice the cost of a GTX 980, and about three times the cost of an R9 290X. But the Titan cards also have huge amounts of VRAM—a staggering 12GB in the case of the Titan X. They draw more power than other Nvidia cards, too, with a TDP of about 250 watts. That’s a measure of its draw when firing on all cylinders, without its clock speeds being manipulated above factory settings.</p> <p class="MsoNormal"><img src="/files/u160416/titan_x.jpg" width="620" height="372" /></p> <p>Despite that, the Titan X still fits inside Nvidia’s reference cooling system, with a 10.5 inch card getting air from a single turbine-like intake that pushes all its heat out the back of the case. This design is called a “full shroud.” A partial shroud uses fans to blow air onto a heatsink, but the frame holding the fans is not fully enclosed, so that heat circulates around the case. 
However, the fans and the heatsink can be much bigger and more effective when not restricted to Nvidia’s reference specs.</p> <p>When you’re in a 250-watt range, that extra flexibility comes in handy, as we’ve seen from AMD’s R9 290 and 290X. The reference versions of these cards were hot and noisy, <a title="Sapphire Tri-X Radeon R9 290X review" href="">but you can get cards like the Sapphire Tri-X that run cool and quiet</a> (if you have enough room for a 12-inch card, that is). We mention this because Nvidia does not allow third parties to put alternative coolers on its Titan cards. The only exception it makes is for EVGA’s "Hydro Copper” series. These have pre-installed heatsinks that are designed to hook into custom water-cooling loops. If you want cooling other than what Nvidia has approved, you have to do it yourself, which can be a little stressful, considering the expense of the card itself.</p> <p><strong>And the winner is...</strong></p> <p>It’s tough to pick a single overall winner from such a wide spread of choices, ranging literally from $100 to $1,000. Do you want the best overall performance? Then the Titan X is your guy. But maybe you’re not comfortable with spending that much money, so we go down to the GTX 980. Is it really more than $200 better than a GTX 970, though? We’re not convinced of that. If we had to pick an overall winner that balanced price, performance, and features,&nbsp;<strong>our choice is the GeForce GTX 970</strong>, despite the way that its VRAM has been segmented.</p> <p><img src="/files/u160416/gigabyte_970.jpg" width="620" height="529" /></p> <p>In our experience, the segmentation just hasn’t produced a subjectively noticeable drop in performance, or even an objectively consistent one—even when scaling up to 4K, where VRAM demands are high. Since it’s also more than $200 cheaper than the GTX 980, you could add a second GTX 970 to your system and spend only about $150 more. Just one will be plenty at 1080p, though. 
Keep in mind that SLI can be buggy and sometimes doesn’t work at all, and you need the proper amount of power and PCI Express connectors (about 750 watts, four connectors). But it’s a nice option to have.</p> <p>And like the GTX 980, you can get cards that have a variety of cooling options. If you’re into mini-ITX PCs and you don’t have a lot of space to work with, you can also get shrunk-down versions of the 970 from Asus, Gigabyte, and Zotac. They’re much shorter, but they don’t sacrifice any performance. That’s an option that you can’t get from the Radeon R9 290 or 290X, or even the GTX 980. Given the flexibility, performance, and price of the GTX 970, it’s hard to argue for other cards from the Green Team, unless your budget is either extremely tight or extremely loose.</p> News Features Tue, 07 Apr 2015 18:58:48 +0000 Tom McNamara Asus Unveils Limited Edition Gold GeForce GTX 980 Graphics Card <!--paging_filter--><h3><img src="/files/u69/asus_gold_gtx_980.jpg" alt="Asus Gold GTX 980" title="Asus Gold GTX 980" width="228" height="197" style="float: right;" />Celebrating 20 years of graphics card production</h3> <p>It was in 1995 that Asus introduced its first graphics card, and to celebrate 20 years of graphics card production, <strong>Asus is releasing a limited edition 20th Anniversary Golden Edition GTX 980</strong> that it claims is the fastest of its kind. How fast? Asus cranked the GPU to 1,431MHz, up from Nvidia's reference specification of 1,126MHz. It's even higher than Nvidia's 1,216MHz reference <em>boost</em> clockspeed.</p> <p>The 4GB of GDDR5 memory remains untouched at 7,010MHz on a 256-bit bus, though the clockspeed and gold colored cooling solution aren't the only standouts. 
The card boasts high quality components that should stand up better to overclocking, and the cooling apparatus features 0dB fan technology that cools 15 percent better than reference and runs three times quieter, according to Asus. There's also a memory defroster, in case you plan on going nuts with liquid nitrogen.</p> <p>On top of the card is a 20th Anniversary designation. More than just eye candy, it changes color depending on load -- blue is a light load, orange is a medium load, red is a heavy load, and green means you're back in safe mode, or default clocks (you can restore clocks to default with a tap of the Clear VBIOS button).</p> <p>Asus didn't say when the card will be available or for how much, though in the meantime, you can take a trip down memory lane and see what innovations came out of Asus over the years. For example, did you know that Asus was the first to plop a cooling fan on a graphics card? Or that it was the first to offer a video card with a premium black PCB? These and other fun facts have been assembled <a href="" target="_blank">onto a timeline</a>, a neat pit stop before making your way over to the <a href="" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> News Thu, 02 Apr 2015 19:49:22 +0000 Paul Lilly Async Shaders Will Allow GPUs to Live Up to Their Full Potential, Says AMD <!--paging_filter--><h3>Improvements to be enabled via DX12 and Vulkan APIs</h3> <p>Graphics cards are currently “not living up to their full potential,” says AMD, and the company is adamant that the forthcoming DX12 and Vulkan APIs will change that. Specifically, the red team says that these APIs will be able to take advantage of AMD’s asynchronous compute engines (ACE), which are inherent to AMD’s GCN architecture. 
These asynchronous compute engines will allow future games that support them to accomplish more tasks simultaneously. AMD suggests this is tantamount to hyperthreading for GPUs.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/dx11.png" alt="DX11 amd" title="DX11 amd" width="620" height="332" /></p> <p style="text-align: center;"><strong>This is the traditional GPU pipeline with DX11.&nbsp;</strong></p> <p>The logic here is that having multiple GPU queues allows for tasks to be completed much faster, and that users could see better performance out of their GCN graphics cards. On a call with AMD, the company claimed that the traditional GPU pipeline behavior is currently very linear with DirectX11 and that all work must be done in a single queue that is scheduled in a pre-determined order. With DX12, however, tasks like physics, lighting, and post-processing can be divided into different queues and can be scheduled independently. This not only amounts to higher FPS in applications that support asynchronous shaders, but lower latency as well, which is key to having good VR experiences. By way of analogy, AMD equated DX11’s current inefficiency to inner-city roads with traffic lights, and DX12’s more asynchronous model to a freeway system. In the freeway model, tasks can merge in to fill gaps and aren’t bogged down by red lights, or bottlenecks.</p> <p style="text-align: center;"><img src="/files/u154082/dx12.png" alt="dx12 amd async" title="dx12 amd async" width="620" height="324" /></p> <p style="text-align: center;"><strong>AMD says DX12 will be able to leverage its GPU's asynchronous compute engines to perform more efficiently.</strong></p> <p>According to AMD, using asynchronous shaders can provide post-processing effects with minimal impact on performance. 
The company cited its LiquidVR SDK demo, which rendered a scene at an average of 245 FPS with async shaders and post-processing both turned off. With the post-processing effect on, however, the FPS took a huge hit and dropped to 158. With async shaders turned on, the average FPS jumped back up to 230, which is just ever so slightly more taxing than the scene with post-processing turned off. According to AMD, async shaders have the potential to boost performance by over 40 percent.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/asynch_shaders_perf.jpg" alt="async shaders perf" title="async shaders perf" width="620" height="326" /></p> <p style="text-align: center;"><strong>AMD touts that async shaders make post-processing effects far less taxing.&nbsp;</strong></p> <p>AMD says that async shaders and asynchronous compute engines are a part of the GCN DNA, so developers will be able to take advantage of them with the next generation APIs. With AMD pouring its Mantle learnings into Vulkan, the successor to OpenGL, the open-source API will also be able to take advantage of AMD’s asynchronous shaders. In addition, AMD tells us that all the major game engines, like Epic’s Unreal Engine and Crytek’s CryEngine, will be able to take advantage of AMD’s asynchronous shaders.&nbsp;</p> <p>According to AMD, the PlayStation 4, which uses AMD hardware, already uses asynchronous shaders in games like InFamous Second Son and The Tomorrow Children for greater performance efficiency, and the company believes these learnings will work their way over to the PC with the next-generation APIs. 
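</p>
<p>Those quoted figures check out on the back of an envelope (a quick illustrative sketch; the FPS numbers are the ones AMD cited, while the helper function is ours):</p>

```python
# Sanity-checking AMD's quoted LiquidVR demo numbers (FPS figures from
# the article above; the pct_change helper is illustrative only).

def pct_change(before: float, after: float) -> float:
    """Percentage change going from `before` FPS to `after` FPS."""
    return (after - before) / before * 100.0

fps_no_post    = 245.0  # post-processing off
fps_post_sync  = 158.0  # post-processing on, traditional serial queue
fps_post_async = 230.0  # post-processing on, async shaders enabled

# Post-processing costs roughly a third of the frame rate on the serial pipeline...
print(f"{pct_change(fps_no_post, fps_post_sync):+.1f}%")    # -35.5%

# ...while async shaders recover a ~45.6% gain over the serial case,
# consistent with AMD's "over 40 percent" claim.
print(f"{pct_change(fps_post_sync, fps_post_async):+.1f}%")  # +45.6%
```

<p>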
AMD also says the philosophy behind its asynchronous shaders will apply to the company’s GCN-based APUs.&nbsp;</p> News Tue, 31 Mar 2015 17:45:04 +0000 Jimmy Thang EVGA GeForce GTX 980 Hybrid Gets Wet and Wild with Maxwell <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gtx_980_hybrid.jpg" alt="EVGA GeForce GTX 980 Hybrid" title="EVGA GeForce GTX 980 Hybrid" width="228" height="219" style="float: right;" />When air cooling isn't enough</h3> <p>Have you ever tried liquid cooling a graphics card? It's not the most difficult thing in the world, though between the water cooling loop and delicately removing the card's stock cooling solution, it can be a little intimidating. And then there's <strong>EVGA's new GeForce GTX 980 Hybrid with an all-in-one water cooler already installed</strong>. All you need to do is plug the card into your mobo, feed it power, and mount the single-fan 120mm radiator.</p> <p>There's no filling required, no custom tubing to mess with, and no maintenance. Your reward for giving the Maxwell-based GPU a bath is significantly lower temperatures compared to Nvidia's reference air cooler. According to EVGA's benchmark chart, a card running at 70 degrees Celsius using a reference cooler would be under 45 degrees Celsius with the Hybrid.</p> <p>The card itself comes factory overclocked. Instead of a base clockspeed of 1,126MHz and boost clock of 1,216MHz, the Hybrid runs at 1,291MHz and 1,393MHz, respectively. The 4GB of GDDR5 memory stays at stock speeds -- 7,010MHz on a 256-bit bus, resulting in memory bandwidth of 224.3GB/s.</p> <p>Of course, cooler temps invite overclocking, and EVGA has a couple of software tools to help with that. One is EVGA Precision X, which allows you to adjust the GPU and memory frequencies, monitor temps, and more. 
You can also use EVGA's OC Scanner X to stress test and benchmark your overclocked card.</p> <p>The GeForce GTX 980 Hybrid is <a href="" target="_blank">available now</a> direct from EVGA for $650. If you already own the card, you can purchase the Hybrid water cooler by itself for $100, which is also <a href="" target="_blank">available now</a>.</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> News Wed, 25 Mar 2015 19:03:14 +0000 Paul Lilly No BS Podcast #237: Titan X, Valve's VR Headset, and More <!--paging_filter--><h3><img src="" width="241" height="144" style="float: right;" />We also introduce Maximum PC's new Associate Editor Alex Campbell</h3> <p>There has been a lot of PC news since the No BS podcast last convened, and on <a href="" target="_blank"><strong>episode 237</strong></a>, the crew gets busy tackling it. Topics on this week’s show include Nvidia’s new $999 <a title="titan x" href=" " target="_blank">Titan X</a> GPU, Valve’s revolutionary <a href=" " target="_blank">Vive VR</a> system, <a title="gtc" href="" target="_blank">GTC</a>,&nbsp;<a href=" " target="_blank">GDC</a>, and more. We also introduce you to Maximum PC’s newest cast member, <a title="alex campbell" href="" target="_blank">Associate Editor Alex Campbell</a>. 
And finally, we tackle some of your reader questions!</p> <p><iframe src="" width="560" height="315" frameborder="0"></iframe></p> <p><a title="Download Maximum PC Podcast #236 MP3" href="" target="_blank"><img src="/files/u160416/rss-audiomp3.png" width="80" height="15" /></a>&nbsp;<a title="Maximum PC Podcast RSS Feed" href="" target="_blank"><img src="/files/u160416/chicklet_rss-2_0.png" width="80" height="15" /></a>&nbsp;<a href=""><img src="/files/u160416/chicklet_itunes.gif" alt="Subscribe to Maximum PC Podcast on iTunes" title="Subscribe to Maximum PC Podcast on iTunes" width="80" height="15" /></a></p> <div> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Subscribe to the magazine for only 99 cents an issue:</h4> <h5><a title="Subscribe to Maximum PC Magazine" href="" target="_blank">In print</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Zinio" href="" target="_blank">On Zinio</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Google Play" href=";hl=en" target="_blank">On Google Play</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on iTunes" href="" target="_blank">On iTunes</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Amazon Kindle" href=";qid=1406326197">On the Amazon Kindle Store</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Your Nook" href="" target="_blank">On the Barnes &amp; Noble Nook Store</a></h5> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Stalk us in a number of ways:</h4> <p>Become a fan&nbsp;<a title="Maximum PC Facebook page" href="" target="_blank">on Facebook</a></p> <p>Follow us&nbsp;<a href="" target="_blank">on Twitter</a></p> <p>Subscribe to us&nbsp;<a 
title="Maximum PC Youtube page" href="" target="_blank">on YouTube</a></p> <p>Subscribe&nbsp;<a title="Maximum PC RSS Feed" href="">to our RSS feed</a></p> <p>Subscribe&nbsp;<a href="" target="_blank">to the podcast on iTunes</a></p> <p>Email us at:&nbsp;<a href="">maximumpcpodcast AT gmail DOT com</a></p> <p>Leave us a voicemail at 877-404-1337 x1337</p> </div> alex campbell digits devbox geforce graphics card GTC htc nvidia self driving cars Titan X Valve vive vr News No BS Podcast Thu, 19 Mar 2015 20:45:02 +0000 The Maximum PC Staff 29615 at Digital Storm Now Offers Titan X in Aventum, Bolt, and Velox Systems <!--paging_filter--><h3><img src="/files/u69/digital_storm_4_way.jpg" alt="Digital Storm Titan X" title="Digital Storm Titan X" width="228" height="196" style="float: right;" />Tackling a Titan X</h3> <p>Nvidia finally made official a new flagship graphics card today, the mighty <a href="">GeForce Titan X</a>, and right on cue comes the barrage of announcements from system builders flaunting the availability of the successor to Titan Z. That includes boutique builder <strong>Digital Storm, which is now (or soon) offering the Titan X in various configurations</strong> inside its Aventum, Bolt, and Velox desktop product lines.</p> <p>The Bolt is Digital Storm's version of a Steam Machine and is a logical fit for the Titan X if you're already rocking or planning to upgrade to a 4K Ultra HD television. For even more power, there's the Velox, which is Digital Storm's standard desktop for enthusiasts, and the Aventum, the boutique builder's top shelf gaming system with room for up to four graphics cards.</p> <p>As we learned today, the Titan X features 3,072 Maxwell cores, 192 TMUs, 96 ROPs, 24 SMs, and 12GB of GDDR5 on a 384-bit bus.
The memory at reference is clocked at 7,010MHz and the GPU at 1,000MHz/1,075MHz (Core/Boost).</p> <p>"The GTX Titan X is the most advanced piece of hardware we've seen here at Digital Storm and we are all very excited to see what people can do with these cards in our machines," said Harjit Chana, Chief Brand Officer. "This card has the potential to be a game-changer and it deserves a machine that can keep up with it."</p> <p>At the time of this writing, Digital Storm still hadn't updated its website to reflect the availability of the new cards, though they should be available any time now. In the meantime, you can check out some 4K benchmarks Digital Storm ran of a three-way SLI Titan X setup <a href="" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Aventum bolt Digital Storm GeForce GTX Titan X graphics card Hardware nvidia OEM rigs velox Video Card News Wed, 18 Mar 2015 03:16:20 +0000 Paul Lilly 29606 at Possible Look at Specifications and Performance for AMD's Radeon R9 390X <!--paging_filter--><h3><img src="/files/u69/amd_radeon_1.jpg" alt="AMD Radeon R9 290X" title="AMD Radeon R9 290X" width="228" height="170" style="float: right;" />A potentially beastly card in the making</h3> <p>Go ahead and apply the standard disclaimer about leaked specs not being verified or official, because that's certainly the case here. 
Disclaimer aside, we hope that <strong>unconfirmed specifications of AMD's forthcoming Radeon R9 390X graphics card</strong> turn out to be accurate, because if they are, it's going to be a potent part that's up to 60 percent faster than AMD's Radeon R9 290X.</p> <p>The folks at <a href="" target="_blank"><em>Videocardz</em></a> asked their source if he could share additional information about AMD's new flagship graphics card, and to the site's surprise, he obliged with a few more goodies to digest. One of those goodies is that AMD scrapped plans to run with 4GB of High Bandwidth Memory (HBM) Gen1 (1GB per stack) after Nvidia unveiled its Titan X graphics card. Now the plan is to release the Radeon R9 390X with 8GB of HBM Gen2 (2GB per stack) on a 4,096-bit bus (1,024-bit per stack). That should give the card around 1.25TB/s of memory bandwidth.</p> <p>The GPU is said to be a 28nm Fiji XT part with 4,096 unified cores and 256 Texture Mapping Units (TMUs). There's no mention of ROPs or core clockspeed, though the boost clockspeed is reportedly 1,050MHz.
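The rumored bandwidth and compute figures can at least be sanity-checked against each other: HBM transfers twice per clock (double data rate), and peak FP32 throughput on a GCN part is cores × clock × 2 FLOPs (one fused multiply-add per cycle). A rough check, using only the rumored numbers (none of which are confirmed):

```python
# HBM: 4,096-bit bus at a 1,250MHz clock, double data rate.
# bytes per transfer * clock (Hz) * 2 transfers/clock
hbm_bw = (4096 / 8) * 1.25e9 * 2 / 1e12
print(f"{hbm_bw:.2f} TB/s")   # ~1.28 TB/s, consistent with "around 1.25TB/s"

# FP32 compute: 4,096 cores * 1,050MHz boost * 2 FLOPs (FMA) per cycle
tflops = 4096 * 1.05e9 * 2 / 1e12
print(f"{tflops:.1f} TFLOPS")  # ~8.6 TFLOPS
```

The numbers hang together, for whatever a leak is worth.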
Other specs include a 1,250MHz memory clock, 8.6TFLOPS of compute performance, and either a 6+8 pin or dual 8-pin PCI-E power configuration.</p> <p>There's also a leaked performance slide, and if it's accurate, performance will be up to around 1.65 times that of the Radeon R9 290X in 4K gaming.</p> <p>Reports from elsewhere on the web have the card debuting at around $700, which is also unconfirmed.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> amd Build a PC Fiji Gaming gpu graphics card Hardware Radeon R9 390X Video Card News Mon, 16 Mar 2015 15:41:48 +0000 Paul Lilly 29588 at XFX Radeon R9 370 Core Edition Leaks to Web, Higher End R300 Series Cards to Follow <!--paging_filter--><h3><img src="/files/u69/xfx_card.jpg" alt="XFX Card" title="XFX Card" width="228" height="143" style="float: right;" />AMD R300 Series is around the corner</h3> <p>We know that AMD is getting ready to refresh its graphics card lineup -- a refresh that's long overdue, as far as we're concerned -- though it looks like the first of the upcoming Radeon R9 300 Series won't be a flagship part. At least that won't be the case if, as rumored, <strong>XFX launches its Radeon R9 370 Core Edition video card</strong> powered by AMD's Trinidad Pro processor next month.</p> <p>The rumor <a href="" target="_blank">originates at <em>Videocardz</em></a>, which caught wind of the forthcoming card from a reader of the site who claims to work for XFX.
According to the supposed XFX employee, the first GPU of the R300 Series will be Trinidad Pro, and the site believes him to be telling the truth after a new leak from XFX seemed to corroborate his story.</p> <p>If true, the R9 370 Core Edition (R9-370A-ENF) will come in 2GB and 4GB GDDR5 versions, both with a 256-bit memory bus, single 6-pin PCI-E power connector, and two Dual-Link DVI ports flanked by HDMI and DisplayPort.</p> <h3>R300 Series</h3> <p>Based on the rumors so far, the R9 370 Core Edition will be a mid-range card. Here's a look at the full lineup:</p> <ul> <li>AMD Radeon R9 390X: 28nm Fiji XT GPU, 3,584 cores, 224 TMUs, 64 ROPs, 4GB memory, $599</li> <li>AMD Radeon R9 390: 28nm Fiji Pro GPU, 3,328 cores, 208 TMUs, 64 ROPs, 4GB GDDR5, $399</li> <li>AMD Radeon R9 380X: 28nm Hawaii XTX GPU, 2,816 cores, 176 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 380: 28nm Hawaii Pro GPU, 2,560 cores, 160 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 375X: Tonga XT GPU, 2,048 cores, 128 TMUs, 32 ROPs, 2GB GDDR5, 384-bit, price unknown</li> <li>AMD Radeon R9 375: Tonga Pro GPU, 1,792 cores, 112 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370X: Trinidad XT GPU, 1,280 cores, 80 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370: 28nm Trinidad Pro GPU, 1,024 cores, 64 TMUs, 24 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R7 360X: Bermuda GPU, 896 cores, 128-bit GDDR5, price unknown</li> <li>AMD Radeon R7 350X/340X: Oland GPU, 320 cores, DDR3 and GDDR5 memory, 128-bit</li> <li>AMD Radeon R5 300: Caicos GPU, 160 cores, DDR3 memory, 64-bit</li> </ul> <p>None of these are official or set in stone, and as you can see, more is 'known' (rumored) about the higher end GPUs than the lower end ones. 
So, take these specs with a block of salt.</p> <p>There are also a few benchmarks scattered around the web, though their legitimacy is a huge question mark, especially when putting up numbers <a href="" target="_blank">like this</a>.</p> <p>Regardless, it looks like we won't have to wait long to see what kind of performance AMD's R300 Series brings to the table.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> amd Build a PC graphics card Hardware R300 Series Radeon R9 370 Core Edition Video Card xfx News Thu, 12 Mar 2015 15:12:35 +0000 Paul Lilly 29574 at First Purported GeForce Titan X Benchmarks Appear Online <!--paging_filter--><h3><img src="/files/u69/titan_x_0.jpg" alt="Nvidia GeForce Titan X" title="Nvidia GeForce Titan X" width="228" height="168" style="float: right;" />Sneak peek at performance</h3> <p>When <a href="">Nvidia unveiled its GeForce Titan X</a> graphics card at the 2015 Game Developers Conference (GDC) last week, company CEO Jen-Hsun Huang revealed almost nothing about the part, other than to say it has 12GB of onboard memory and 8 billion transistors. There was no mention of other specs, let alone benchmarks, though information across the board has begun to leak on the web, including a <strong>first look at how the Titan X performs</strong>.</p> <p>Bearing in mind that none of this is official, the folks at <em></em> report that Titan X sports 3,072 CUDA cores, 192 TMUs, 96 ROPs, 1,002MHz core clockspeed (boost is unknown), 1,750MHz memory clock, and a 384-bit memory bus resulting in 336GB/s of bandwidth.</p> <p>The site also reports there are three mini DisplayPorts, a single DisplayPort, and an HDMI port, along with 6-pin and 8-pin (one each) PCI-E power connectors.</p> <p>As for the benchmarks, they show the Titan X scoring 22,903 in 3DMark 11 using the Performance setting and 26,444 when overclocked.
Both are lower scores than AMD's Radeon R9 295X2 (28,930), though they blow the Titan (13,814) and Titan Black (14,557) out of the water.</p> <p>There are also benchmarks for other 3DMark tests, along with 2-way, 3-way, and 4-way Titan X SLI scores. Check them out <a href="" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC GeForce Titan X graphics card Hardware nvidia Video Card News Wed, 11 Mar 2015 17:25:16 +0000 Paul Lilly 29572 at Nvidia's Giving Away Witcher 3 Codes with Select GeForce GTX 900 Series Graphics Cards <!--paging_filter--><h3><img src="/files/u69/witcher_3_0.jpg" alt="The Witcher 3: Wild Hunt" title="The Witcher 3: Wild Hunt" width="228" height="203" style="float: right;" />A little gaming bribery never hurt anyone</h3> <p>After the fiasco with Nvidia's GeForce GTX 970 graphics card and the way it handles the last .5GB of its onboard 4GB of memory, Nvidia could use a bit of positive press. One of the best ways to do that is to dangle something shiny in front of the public, like an anticipated game. So, available now for a limited time, <strong>customers who buy a select GeForce GTX 980, 970, or 960 graphics card, or a GTX 970M or above notebook, will receive a code for The Witcher 3: Wild Hunt</strong>, Nvidia announced today.</p> <p>"Over my 10-plus years at Nvidia, I’ve seen, worked with, and played countless games. Few stand out to me as deserving of the term epic. The Witcher: Wild Hunt is one of those titles," Nvidia's Leslie Pirritano stated in a blog post. "Developer CD Projekt Red has provided gamers with an epic story, an epic adventure, and epic graphics. The untamed world of this action-adventure game is a graphics showcase, with stunning vistas and detailed characters.
So, it’s exciting to me that we’re offering it to GeForce gamers as part of our new 'Undeniably Epic' bundle."</p> <p>Nvidia was also quick to point out that the upcoming title supports technologies like Nvidia HairWorks and PhysX, the first of which will add a level of realism to the fur and hair of more than 50 monsters and characters in the game.</p> <p>The Witcher 3: Wild Hunt is currently scheduled to release May 19, 2015. To grab a qualifying card, be sure to start your <a href="" target="_blank">search here</a>, which has links to participating vendors.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC games geforce graphics card Hardware nvidia Software the witcher 3: wild hunt Video Card News Tue, 10 Mar 2015 15:57:54 +0000 Paul Lilly 29567 at Nvidia Unveils Titan X Graphics Card at GDC <!--paging_filter--><h3><img src="/files/u69/titan_x.jpg" alt="Titan X" title="Titan X" width="231" height="177" style="float: right;" />A new top-end GPU</h3> <p>It was speculated that Nvidia might announce a new Titan graphics card during GDC, and that's what the company did, in somewhat dramatic fashion. It happened at the tail end of an Unreal Engine panel. As Epic founder Tim Sweeney wrapped up his discussion on the state of Unreal, <strong>Nvidia CEO Jen-Hsun Huang surprised attendees by emerging on stage to unveil the company's Titan X</strong>.</p> <p>He called it the "world's most advanced GPU," though he was short on details. What he <em>was</em> willing to divulge about the card is that it has 12GB of onboard memory and 8 billion transistors.
For the sake of comparison, Titan Black has 7.1 billion transistors and 6GB of GDDR5 memory.</p> <p>"It’s the most advanced GPU the world has ever seen," Jen-Hsun said.</p> <p>He then presented the company's first production unit to Sweeney, though not before autographing the box it came in.</p> <p>Nvidia will release more details about the card during the upcoming GTC event that runs from March 17–20.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC Gaming GDC 2015 graphics card Hardware nvidia Titan X Video Card News Wed, 04 Mar 2015 19:16:25 +0000 Paul Lilly 29540 at EVGA Announces GeForce GTX 960 SuperSC with 4GB of Onboard Memory <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gtx_960_supersc.jpg" alt="EVGA GeForce GTX 960 SuperSC" title="EVGA GeForce GTX 960 SuperSC" width="228" height="148" style="float: right;" />Now with twice the GDDR5 memory</h3> <p>There were rumors earlier this year that 4GB versions of Nvidia's GeForce GTX 960 graphics card would show up in March, and it turns out they were right. <strong>EVGA has emerged as the first to cross into 4GB territory with its GeForce GTX 960 SuperSC graphics card announced today</strong>. Though it's a mid-range card, EVGA is promoting the benefit of higher texture qualities and better 4K resolution gaming performance with the added memory.</p> <p>To keep things cool and quiet, EVGA has also outfitted its newest graphics card with its ACX 2.0+ custom cooler.</p> <p>"The new EVGA ACX 2.0+ cooler brings new features to the award winning EVGA ACX 2.0 cooling technology. A Memory MOSFET Cooling Plate (MMCP) reduces MOSFET temperatures up to 11C, and optimized Straight Heat Pipes (SHP) reduce GPU temperature by 5C," EVGA says.
"ACX 2.0+ coolers also feature optimized Swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU."</p> <p>EVGA's GeForce GTX 960 SuperSC sports 1,279MHz base and 1,342MHz boost clockspeeds, which are overclocked from the reference design's 1,127MHz base and 1,178MHz boost specifications. The 4GB of GDDR5 memory stays at stock (7,010MHz) on a 128-bit bus.</p> <p>The card is also notable for its dual-BIOS design. Should something go wrong while tinkering, you can switch to a secondary BIOS with a quick flip of a switch.</p> <p>No word yet on when the 4GB card will be available or for how much. There is, however, a "Notify Me" button on the card's <a href="" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> 4GB Build a PC evga geforce gtx 960 graphics card Hardware nvidia Video Card News Tue, 03 Mar 2015 18:06:51 +0000 Paul Lilly 29527 at Nvidia CEO is Mocked for Explanation of GeForce GTX 970 Memory Issue <!--paging_filter--><h3><img src="/files/u69/nvidia_meme.jpg" alt="Nvidia Meme" title="Nvidia Meme" width="228" height="151" style="float: right;" />Here come the memes</h3> <p>Nividia ticked off a lot of people when it came to light that its GeForce GTX 970 graphics card was suffering from performance issues when games tried to access onboard memory above 3.5GB. Turns out it's the result of an architectural design, one that doesn't exist on the GTX 980, and one that wasn't communicated to Nvidia's internal marketing team or externally to reviewers. 
There's been a lot of negativity surrounding the issue ever since, and in an attempt to defuse the situation, <strong>Nvidia CEO Jen-Hsun Huang has offered up an explanation of the GTX 970 memory issue</strong>.</p> <p>Before we get into that, we suggest reading <a href="">this</a>, <a href="">this</a>, and <a href="">this</a> as primers to what's going on. If you're crunched for time, the Cliff Notes version is that the above scenario, along with the discovery that the GTX 970 has fewer ROPs and less L2 cache than advertised, has led to a class action lawsuit.</p> <p>Seeing that the backlash is growing, not shrinking, Huang tried explaining away the issue as a "feature" that should have been bragged about.</p> <p>"We invented a new memory architecture in Maxwell. This new capability was created so that reduced-configurations of Maxwell can have a larger framebuffer – i.e., so that GTX 970 is not limited to 3GB, and can have an additional 1GB," Huang stated in a blog post.</p> <p>"GTX 970 is a 4GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth. This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment," the CEO continued.</p> <p>According to Huang, the expectation was that users would be "excited" about this, but were ultimately "disappointed that we didn't better describe the segmented nature of the architecture." He also admitted that the "new feature of Maxwell should have been clearly detailed from the beginning."</p> <p>Perhaps so, though looking at the comments to his blog post makes me think this was a ticking time bomb no matter how you slice it.
For those holding Nvidia's feet to the fire over this, the bottom line here is that the GTX 970 is gimped compared to the GTX 980, which doesn't have an issue accessing all 4GB of VRAM, and that they were misled, both about the impact this would have and about the advertised specs.</p> <p>"Yes, a 'new feature,' a 'good design' not included on GTX 980 because [it] decreases performance," a reader commented. Another stated, "I will most likely never buy from Nvidia again, they care nothing about their customer. And blatantly lie to our faces."</p> <p>Others took to posting memes and doctored videos, like this one:</p> <p><iframe src="" width="620" height="465" frameborder="0"></iframe></p> <p>It's hard to watch the above clip without busting a gut, though for Nvidia, this is no laughing matter. To Nvidia's credit, the performance issue seems to crop up only when gaming at high resolutions and shouldn't bother folks gaming at 1080p. And based on the benchmarks when the performance issue doesn't crop up, the bang-for-buck here is pretty high.</p> <p>But in the end, Nvidia is finding out that none of that matters, as their fan base feels it's been lied to. It's going to take more than a blog post to win back their trust and/or make this go away.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC geforce gtx 970 graphics card Hardware jen-hsun huang nvidia Video Card News Wed, 25 Feb 2015 17:01:39 +0000 Paul Lilly 29469 at