Video Card http://www.maximumpc.com/taxonomy/term/384/ en EVGA GeForce GTX 980 Hybrid Gets Wet and Wild with Maxwell http://www.maximumpc.com/evga_geforce_gtx_980_hybrid_gets_wet_and_wild_maxwell_2015 <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gtx_980_hybrid.jpg" alt="EVGA GeForce GTX 980 Hybrid" title="EVGA GeForce GTX 980 Hybrid" width="228" height="219" style="float: right;" />When air cooling isn't enough</h3> <p>Have you ever tried liquid cooling a graphics card? It's not the most difficult thing in the world, though between the water cooling loop and delicately removing the card's stock cooling solution, it can be a little intimidating. And then there's <strong>EVGA's new GeForce GTX 980 Hybrid with an all-in-one water cooler already installed</strong>. All you need to do is plug the card into your mobo, feed it power, and mount the single-fan 120mm radiator.</p> <p>There's no filling required, no custom tubing to mess with, and no maintenance. Your reward for giving the Maxwell-based GPU a bath is significantly lower temperatures compared to Nvidia's reference air cooler. According to EVGA's benchmark chart, a card running at 70C on the reference cooler would be under 45C with the Hybrid.</p> <p>The card itself comes factory overclocked. Instead of a base clock speed of 1,126MHz and boost clock of 1,216MHz, the Hybrid runs at 1,291MHz and 1,393MHz, respectively. The 4GB of GDDR5 memory stays at stock speeds -- 7,010MHz on a 256-bit bus, resulting in memory bandwidth of 224.3GB/s.</p> <p>Of course, cooler temps invite overclocking, and EVGA has a couple of software tools to help with that. One is EVGA Precision X, which allows you to adjust the GPU and memory frequencies, monitor temps, and more.
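As an aside, the 224.3GB/s figure quoted above is just the standard GDDR5 bandwidth arithmetic: effective memory clock multiplied by the bus width in bytes. A quick sketch in Python, purely for illustration:

```python
# Peak GDDR5 bandwidth = effective memory clock (transfers/sec) x bus width (bytes).
# The numbers are the GTX 980 Hybrid specs quoted above.
effective_clock_hz = 7_010_000_000   # 7,010MHz effective data rate
bus_width_bytes = 256 // 8           # 256-bit bus = 32 bytes per transfer

bandwidth_gb_s = effective_clock_hz * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 224.3 GB/s, matching EVGA's spec sheet
```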
You can also use EVGA's OC Scanner X to stress test and benchmark your overclocked card.</p> <p>The GeForce GTX 980 Hybrid is <a href="http://www.evga.com/products/Product.aspx?pn=04G-P4-1989-KR" target="_blank">available now</a> direct from EVGA for $650. If you already own a GTX 980, you can purchase the Hybrid water cooler by itself for $100, which is also <a href="http://www.evga.com/products/Product.aspx?pn=400-HY-H980-B1" target="_blank">available now</a>.</p> <p><iframe src="https://www.youtube.com/embed/PFGhQQUcWhs" width="620" height="349" frameborder="0"></iframe></p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/evga_geforce_gtx_980_hybrid_gets_wet_and_wild_maxwell_2015#comments Build a PC evga Gaming GeForceGTX 980 Hybrid graphics card Hardware liquid cooling maxwell Video Card News Wed, 25 Mar 2015 19:03:14 +0000 Paul Lilly 29641 at http://www.maximumpc.com Nvidia GeForce GTX Titan X SLI Benchmarks [UPDATED] http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_sli_benchmarks_2015 <!--paging_filter--><h3>Performance that will make you see double</h3> <p>So you might have heard that Nvidia <a href="http://www.maximumpc.com/nvidia_titan_x_review_2015">released the GeForce GTX Titan X video card yesterday</a>. It's the fastest single-GPU card on the planet (though not the fastest single card, because of the dual GPUs in the Titan Z and <a href="http://www.maximumpc.com/amd_unleashes_dual-gpu_radeon_r9_295x2">the Radeon R9 295X2</a>). Maybe most people would be satisfied with the benchmarks of a single Titan X, but we're not most people. So we called a guy who knows a guy, and we acquired a second Titan X.
The things we do for you people!</p> <p><strong>UPDATE: </strong>We located a <strong>third</strong> Titan X, and we discovered that we need to upgrade our CPU! This is fun.</p> <p>To recap, this is the system that we've been using to test our video cards:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Corsair AX1200</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 64-bit</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p>It's an aging system, but it has plenty of juice to drive up to four GPUs. We used six titles to benchmark the Titan X and similar cards: Metro: Last Light, Hitman: Absolution, Tomb Raider, Batman: Arkham Origins, Unigine Heaven, and Middle-earth: Shadow of Mordor. We use these games because they have an even balance of Nvidia friendliness and AMD friendliness, they'll push your hardware when you max out the settings, and they have built-in benchmarks, so you can reproduce our results yourself.</p> <p style="text-align: center;"><img src="/files/u160416/this_is_happening.jpg" alt="Titan X SLI" title="Titan X SLI" width="620" height="465" /></p> <p>The Nvidia cards were benchmarked with the GeForce 347.84 drivers sent to Titan X reviewers, which are apparently nearly identical to the 347.88 drivers released to the public yesterday.
Our MSI Radeon R9 290X Lightning Edition card used <a href="http://www.maximumpc.com/amds_year_end_gift_gamers_catalyst_omega_special_edition_driver_2014">AMD's Omega drivers released in December</a>. The other cards in the mix are the Asus GTX970-DCMOC-4GD5 and the Asus&nbsp;STRIX-GTX780-OC-6GD5. The GTX 780 Ti in this roundup is the reference model. All clock speeds in the chart below are of the actual cards we tested, rather than the default clock speeds of the baseline models, except when a baseline model was actually used.</p> <p>Since we were not blessed with a second MSI GTX 980 Gaming 4G, the SLI benchmark is of two reference 980s in our possession. The difference will be small, but it is there.</p> <div class="spec-table orange" style="font-size: 12px; font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>Titan X</td> <td>Titan&nbsp;</td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 780 Ti</td> <td>GTX 780</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>&nbsp;GM200</td> <td>&nbsp;GK110</td> <td>&nbsp;GM204</td> <td>&nbsp;GM204&nbsp;</td> <td>&nbsp;GK110&nbsp;</td> <td class="item-dark">&nbsp;GK110</td> <td>Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1,000</td> <td>&nbsp;837</td> <td>&nbsp;1,216</td> <td>&nbsp;1,088</td> <td>&nbsp;876</td> <td>&nbsp;889</td> <td>"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1,075</td> <td>&nbsp;876</td> <td>&nbsp;1,317</td> <td>&nbsp;1,228</td> <td>&nbsp;928</td> <td class="item-dark">&nbsp;941</td> <td>N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7,010</td> <td>&nbsp;6,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;6,000</td> <td>5,000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;12GB</td> <td>&nbsp;6GB</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;3GB</td> <td>&nbsp;6GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;384-bit</td>
<td>&nbsp;384-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;96</td> <td>&nbsp;48</td> <td>&nbsp;64</td> <td>&nbsp;56</td> <td>&nbsp;48</td> <td>&nbsp;48</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;192</td> <td>&nbsp;224</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;240</td> <td>&nbsp;192</td> <td>176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;3,072</td> <td>&nbsp;2,688</td> <td>&nbsp;2,048</td> <td>&nbsp;1,664</td> <td>&nbsp;2,880</td> <td>&nbsp;2,304</td> <td>2,816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;24</td> <td>&nbsp;14</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;15</td> <td>&nbsp;12</td> <td>N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>290</td> </tr> <tr> <td>Launch Date</td> <td>March 2015</td> <td>March 2013</td> <td>Sept 2014</td> <td>Sept 2014</td> <td>Nov 2013</td> <td>May 2013</td> <td>Oct 2013</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$999</td> <td>&nbsp;$999</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$699</td> <td>&nbsp;$649</td> <td>$549</td> </tr> </tbody> </table> </div> <p>You can refer to our Titan X review for more information on what these specs mean. We don't want to flap our gums here any more than necessary.
Now that we've explained the context of the benchmarks, here they are:</p> <h3>3840x2160 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider*</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td>Titan X</td> <td>&nbsp;35</td> <td>&nbsp;53</td> <td>&nbsp;33</td> <td>&nbsp;44</td> <td>&nbsp;44/60</td> <td>&nbsp;26</td> </tr> <tr> <td>Titan X SLI</td> <td>&nbsp;54</td> <td>&nbsp;94</td> <td>&nbsp;58</td> <td>&nbsp;75</td> <td>&nbsp;83/112</td> <td>&nbsp;49</td> </tr> <tr> <td><strong>SLI Scaling</strong></td> <td><strong>&nbsp;54%</strong></td> <td><strong>&nbsp;77%</strong></td> <td><strong>&nbsp;76%</strong></td> <td><strong>&nbsp;70%</strong></td> <td><strong>&nbsp;87%/87%</strong></td> <td><strong>&nbsp;88%</strong></td> </tr> <tr> <td>3-Way SLI</td> <td>&nbsp;53</td> <td>&nbsp;110</td> <td>&nbsp;78</td> <td>&nbsp;89</td> <td>&nbsp;119/161</td> <td>&nbsp;70</td> </tr> <tr> <td><strong>3-Way Scaling</strong></td> <td><strong>&nbsp;N/A</strong></td> <td><strong>&nbsp;17%</strong></td> <td><strong>&nbsp;34%</strong></td> <td><strong>&nbsp;19%</strong></td> <td><strong>&nbsp;43%/43%</strong></td> <td><strong>&nbsp;43%</strong></td> </tr> <tr> <td>Titan</td> <td>&nbsp;24</td> <td>&nbsp;34</td> <td>&nbsp;22</td> <td>&nbsp;25</td> <td>&nbsp;26/37</td> <td>&nbsp;18</td> </tr> <tr> <td>980</td> <td>&nbsp;32</td> <td>&nbsp;41</td> <td>&nbsp;24</td> <td>&nbsp;37</td> <td>&nbsp;36/48</td> <td>&nbsp;20</td> </tr> <tr> <td>980 SLI</td> <td>&nbsp;46</td> <td>&nbsp;74</td> <td>&nbsp;44</td> <td>&nbsp;59</td> <td>&nbsp;64/84</td> <td>&nbsp;35</td> </tr> <tr> <td><strong>SLI
Scaling</strong></td> <td><strong>&nbsp;44%</strong></td> <td><strong>&nbsp;80%</strong></td> <td><strong>&nbsp;83%</strong></td> <td><strong>&nbsp;60%</strong></td> <td><strong>&nbsp;77%/75%</strong></td> <td><strong>&nbsp;75%</strong></td> </tr> <tr> <td>970</td> <td>&nbsp;24</td> <td>&nbsp;32</td> <td>&nbsp;19</td> <td>&nbsp;28</td> <td>&nbsp;27/37</td> <td>&nbsp;15</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;27</td> <td>&nbsp;38</td> <td>&nbsp;23</td> <td>&nbsp;32</td> <td>&nbsp;29/40</td> <td>&nbsp;19</td> </tr> <tr> <td>780</td> <td>&nbsp;26</td> <td>&nbsp;35</td> <td>&nbsp;23</td> <td>&nbsp;30</td> <td>&nbsp;27/38</td> <td>&nbsp;18</td> </tr> <tr> <td>290X</td> <td>&nbsp;28</td> <td>&nbsp;41</td> <td>&nbsp;29</td> <td>&nbsp;37</td> <td>&nbsp;31/43</td> <td>&nbsp;17</td> </tr> </tbody> </table> </div> </h4> <p style="text-align: left;"><span style="font-weight: normal;">*<em>TressFX on/TressFX off</em></span></p> <p style="text-align: left;"><span style="font-weight: normal;">We're benchmarking these games on their highest presets with 4x multi-sample anti-aliasing (or in Tomb Raider's case, 2x super-sample anti-aliasing, since it has no MSAA option), so you're not going to see ideal performance here. We push these cards by design, rather than aiming for playable framerates. At the prices you're paying for these cards, you shouldn't have to make many compromises. Even with a second Titan X in the mix, though, we still can't hit 60fps across the board. Granted, at 4K, you probably don't need 4xMSAA, but it is interesting to see just how much this resolution affects performance. What's also interesting is how much the SLI scaling varies from game to game. 
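The scaling rows in the table above are simple arithmetic: the percentage of extra frames each additional card delivers relative to the configuration with one fewer GPU. A minimal Python illustration, using the Unigine Heaven and Metro: Last Light numbers from the table:

```python
def sli_scaling(fps_after, fps_before):
    """Percent fps gained by adding one more GPU to the setup."""
    return round((fps_after / fps_before - 1) * 100)

# Unigine Heaven at 4K: 26fps single Titan X, 49fps two-way, 70fps three-way
print(sli_scaling(49, 26))  # 88 -- two-way SLI scaling
print(sli_scaling(70, 49))  # 43 -- three-way scaling, measured against two cards

# Metro: Last Light is the worst two-way scaler: 35fps single, 54fps SLI
print(sli_scaling(54, 35))  # 54
```

Note that the three-way figures compare against the two-card result, not the single card, which is why they look so modest.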
The Titan X is a lot more consistent, but both it and the GTX 980 struggle with Metro: Last Light (which, it should be said, is an AMD-friendly game, as is Hitman: Absolution).</span></p> <p style="text-align: left;"><span style="font-weight: normal;">When we add the third Titan X (I think they're multiplying when we're not looking), we get a smaller performance bump, but this is to be expected. What we didn't see coming were the particularly modest gains in Batman and Shadow of Mordor, indicating that our CPU is hitting a wall (at least, at its stock clock speed). So this addition to our benchmarks has been educational for us as well. Metro: Last Light also didn't even recognize the third GPU, so we're considering dropping that game from our benchmark suite, because this is not the first time it's happened. And upgrading our testing rig to a Core i7-5960X is now a high priority. We'll also experiment with overclocking the 3960X that's currently installed.</span></p> <p style="text-align: left;"><span style="font-weight: normal;"><img src="/files/u160416/this_is_also_happening.jpg" alt="Nvidia Titan X 3-Way SLI" title="Nvidia Titan X 3-Way SLI" width="620" height="465" /><br /></span></p> <p style="text-align: left;">In the coming days, we plan to get you some more multi-GPU benches to compare against the Titan X.
In the meantime, we hope you found these new numbers both delicious and nutritious.</p> http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_sli_benchmarks_2015#comments benchmarks geforce gpu nvidia sli Titan X Video Card News Sat, 21 Mar 2015 01:26:19 +0000 Tom McNamara 29611 at http://www.maximumpc.com Digital Storm Now Offers Titan X in Aventum, Bolt, and Velox Systems http://www.maximumpc.com/digital_storm_now_offers_titan_x_aventum_bolt_and_velox_systems_2015 <!--paging_filter--><h3><img src="/files/u69/digital_storm_4_way.jpg" alt="Digital Storm Titan X" title="Digital Storm Titan X" width="228" height="196" style="float: right;" />Tackling a Titan X</h3> <p>Nvidia finally made official a new flagship graphics card today, the mighty <a href="http://www.maximumpc.com/nvidia_titan_x_review_2015">GeForce Titan X</a>, and right on cue comes the barrage of announcements from system builders touting the availability of the successor to Titan Z. That includes boutique builder <strong>Digital Storm, which is now (or soon) offering the Titan X in various configurations</strong> inside its Aventum, Bolt, and Velox desktop product lines.</p> <p>The Bolt is Digital Storm's version of a Steam Machine and is a logical fit for the Titan X if you're already rocking or planning to upgrade to a 4K Ultra HD television. For even more power, there's the Velox, which is Digital Storm's standard desktop for enthusiasts, and the Aventum, the boutique builder's top-shelf gaming system with room for up to four graphics cards.</p> <p>As we learned today, the Titan X features 3,072 Maxwell cores, 192 TMUs, 96 ROPs, 24 SMs, and 12GB of GDDR5 on a 384-bit bus. The memory at reference is clocked at 7,010MHz and the GPU at 1,000MHz/1,075MHz (Core/Boost).</p> <p>"The GTX Titan X is the most advanced piece of hardware we've seen here at Digital Storm and we are all very excited to see what people can do with these cards in our machines," said Harjit Chana, Chief Brand Officer.
"This card has the potential to be a game-changer and it deserves a machine that can keep up with it."</p> <p>At the time of this writing, Digital Storm still hadn't updated its website to reflect the availability of the new cards, though they should be available any time now. In the meantime, you can check out some 4K benchmarks Digital Storm ran of a three-way SLI Titan X setup <a href="http://www.digitalstormonline.com/unlocked/nvidia-gtx-titan-x-3-way-sli-performance-review-4k-benchmarks-idnum343/" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/digital_storm_now_offers_titan_x_aventum_bolt_and_velox_systems_2015#comments Aventum bolt Digital Storm GeForce GTX Titan X graphics card Hardware nvidia OEM rigs velox Video Card News Wed, 18 Mar 2015 03:16:20 +0000 Paul Lilly 29606 at http://www.maximumpc.com Nvidia Titan X Review http://www.maximumpc.com/nvidia_titan_x_review_2015 <!--paging_filter--><h3>A new hero descends from the heights of Mount GeForce</h3> <p>In ancient Greek mythology, the Titans are the immediate descendants of the primordial gods. So it is with the Nvidia GeForce GTX Titan, descended from the company's top-shelf professional workstation GPUs. <a title="Nvidia GeForce GTX Titan review" href="http://www.maximumpc.com/evga_geforce_gtx_titan_review" target="_blank">First debuting in March 2013</a>, the original Titan was nearly the most powerful video card that the company could offer. 
Nvidia sealed off a couple of features that would be of little interest to gamers, which also prevented professionals from using these much less expensive gamer variants for workstation duties.</p> <p>In the two years since, the company has iterated on this design, adding more shader processors (or "CUDA cores," as Nvidia likes to call them), and even adding a second GPU core on the same card. Now the time has come for it to deliver the Maxwell generation of super-premium GPUs, this time dubbed the <strong>GTX Titan X</strong>. And it's a beast. Despite being stuck on the 28nm process node for several years now, the company continues to extract more and more performance from its silicon. Interestingly, the card goes up for sale today, but only at Nvidia's own online storefront. There is currently a limit of two per order. The company tells us that you'll be able to buy it from other stores and in pre-built systems "over the next few weeks." First-world problems, right?</p> <p><img src="/files/u99720/nvidia_titan_5159.png" alt="Titan X" title="Titan X" width="620" height="401" style="text-align: center;" /></p> <p>These days, you can use the number of shader cores as a rough estimate of performance. We say "rough" because the Maxwell cores in this Titan X are, according to Nvidia, 40 percent faster than the Kepler cores in the earlier Titans. So when you see that the Titan X has "only" 3072 of them, this is actually a huge boost. It's about 50 percent more than the GTX 980, which is already a barnstormer. For reference, the difference in shader count between <a title="Nvidia GeForce GTX 780 review" href="http://www.maximumpc.com/asus_rog_poseidon_gtx_780_review" target="_blank">the GTX 780</a> and the original Titan was about 16 percent. The Titan X also has an almost ridiculous 12GB of GDDR5 VRAM.
We say "almost" because Nvidia has some ambitious goals for the resolution that it expects you to be able to play at with this card.</p> <p>At the Game Developers Conference two weeks ago, its reps pitched the Titan X to us as the first GPU that could handle 4K gaming solo, at high settings. They demoed Middle-earth: Shadow of Mordor, which wasn't a solid 60fps, as they readily acknowledged. But we did see all the graphics settings cranked up, and gameplay was smooth at about 45fps <a title="G-Sync introduction video" href="http://www.maximumpc.com/acer_4k_g-sync_monitor_tested_gtx_980_video" target="_blank">when paired with a G-Sync monitor</a>. As its name implies, G-Sync synchronizes your monitor's refresh rate to the frame rate being delivered by your video card, which vastly reduces tearing. They also enabled motion blur, which can help mask frame rate drops.</p> <p><img src="/files/u160416/titanx3.jpg" width="620" height="349" /></p> <p>For our review, we used seven high-end cards that have come out in the same two-year time frame as the original Titan. Some of these are no longer sold in stores, but they still provide an important frame of reference, and their owners may want to know if upgrading is going to be worth it.</p> <p style="font-weight: normal;">Note that the clock speeds in the charts on the next page are not all for the reference versions. These are for the particular models that we used for this review. The GTX 980 is the MSI Gaming 4G model; the GTX 970 is the Asus GTX970-DCMOC-4GD5; the GTX 780 is the Asus&nbsp;STRIX-GTX780-OC-6GD5 (and the reference model also has 3GB of VRAM instead of 6GB); and the Radeon R9 290X is the MSI Lightning edition.
We used the prices for the reference versions, however.</p> <h3 style="text-align: right;"><a title="GeForce Titan X Review Page 2" href="http://www.maximumpc.com/nvidia_titan_x_review_2015?page=0,1" target="_self">Click here to turn to page 2 for the specs!</a></h3> <hr /> <p>Let's take a look at their specs:</p> <div class="spec-table orange" style="font-size: 12px; font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>Titan X</td> <td>Titan&nbsp;</td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 780 Ti</td> <td>GTX 780</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>&nbsp;GM200</td> <td>&nbsp;GK110</td> <td>&nbsp;GM204</td> <td>&nbsp;GM204&nbsp;</td> <td>&nbsp;GK110&nbsp;</td> <td class="item-dark">&nbsp;GK110</td> <td>Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1,000</td> <td>&nbsp;837</td> <td>&nbsp;1,216</td> <td>&nbsp;1,088</td> <td>&nbsp;876</td> <td>&nbsp;889</td> <td>"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1,075</td> <td>&nbsp;876</td> <td>&nbsp;1,317</td> <td>&nbsp;1,228</td> <td>&nbsp;928</td> <td class="item-dark">&nbsp;941</td> <td>N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7,010</td> <td>&nbsp;6,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;7,000</td> <td>&nbsp;6,000</td> <td>5,000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;12GB</td> <td>&nbsp;6GB</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;3GB</td> <td>&nbsp;6GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;96</td> <td>&nbsp;48</td> <td>&nbsp;64</td> <td>&nbsp;56</td> <td>&nbsp;48</td> <td>&nbsp;48</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;192</td> <td>&nbsp;224</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;240</td>
<td>&nbsp;192</td> <td>176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;3,072</td> <td>&nbsp;2,688</td> <td>&nbsp;2,048</td> <td>&nbsp;1,664</td> <td>&nbsp;2,880</td> <td>&nbsp;2,304</td> <td>2,816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;24</td> <td>&nbsp;14</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;15</td> <td>&nbsp;12</td> <td>N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>290</td> </tr> <tr> <td>Launch Date</td> <td>March 2015</td> <td>March 2013</td> <td>Sept 2014</td> <td>Sept 2014</td> <td>Nov 2013</td> <td>May 2013</td> <td>Oct 2013</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$999</td> <td>&nbsp;$999</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$699</td> <td>&nbsp;$649</td> <td>$549</td> </tr> </tbody> </table> </div> <p>You probably noticed that the Titan X has a whopping 96 ROPs. These render output units are responsible for the quality and performance of your anti-aliasing (AA), among other things. AA at 4K resolutions can kill your framerate, so when Nvidia pitches the Titan X as a 4K card, the number of ROPs here is one of the reasons why. They've also made a return to a high number of texture mapping units. TMUs take a 3D object and apply a texture to it, after calculating angles and perspectives. The higher your resolution, the more pixels you're dealing with, so this is another change that serves 4K performance well.</p> <p>"SM" stands for "streaming multi-processor." Stream processing allows a GPU to divide its workload across many of these units at the same time. In Nvidia's architecture, each one of these SMs contains a set of CUDA cores and a small amount of dedicated cache memory (apart from the gigabytes of VRAM listed on the box). Having 50 percent more SMs than your next-fastest card should give you an impressive jump in performance.
The result won't be linear, though, because the Titan X has lower clock speeds—those extra three billion or so transistors on the Titan X generate additional heat, so lowering clocks is the main way of dealing with that. Its siblings, the GTX 980 and 970, have "only" 5.2 billion transistors each, so they can set their clocks much higher.</p> <p><img src="/files/u160416/titanx2.jpg" width="620" height="390" /></p> <p>Despite all the silicon crammed into the Titan X, it still uses Nvidia's reference dimensions; it's only about 10.5 inches long, and it's not taller or wider than the slot bracket. If not for its darker coloring, you could easily mistake it for any baseline Nvidia card released in the past couple years. Its fan is noticeably quieter than the Titans that have come before, but it won't disappear into the background like we've seen (heard) when Nvidia's partners install their own cooling systems. If you want reliable quietude, you'll have to wait for EVGA's Hydro Copper version, which attaches to a custom water-cooling loop, or try your hand at <a title="Accelero Hybrid GTX 680 Review" href="http://www.maximumpc.com/arctic_cooling_accelero_hybrid_gtx_680_review" target="_blank">something like Arctic Cooling's Accelero Hybrid.</a></p> <p>One card arguably missing from our lineup is the Titan Black.
However, <a title="Nvidia GeForce GTX 780 Ti review" href="http://www.maximumpc.com/gigabyte_gtx_780_ti_oc_review" target="_blank">the GTX 780 Ti</a> is basically the same thing, but with a 3GB frame buffer instead of a 6GB frame buffer, and slightly lower clock speeds.</p> <p><a title="AMD Radeon R9 290X review" href="http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review" target="_blank">The Radeon R9 290X</a> is the fastest GPU that AMD currently has available, so we thought it would make for a good comparison, despite being about a year and a half old; and the MSI Lightning edition is arguably the beefiest version of it.</p> <p>Before we show you the benchmarks, here's the system that we used to test these cards:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Corsair AX1200</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 64-bit</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p>Our Sandy Bridge-E system is getting a little long in the tooth, but the Intel Core i7-3960X is still quite a beefy chip and fine for benchmarking video cards. We'll probably be moving to the Haswell-E platform soon.</p> <p>We test with every game set to its highest graphical preset and 4x multi-sampled anti-aliasing (MSAA). Sometimes individual settings can be increased even further, but we leave these alone for more normalized results. 
That's because these settings are usually optimized for a specific brand of cards, which can end up skewing results. For example, we leave PhysX disabled. We did make one exception, to show you how much of an impact certain niche settings can have: At 3840x2160, we tested Tomb Raider with TressFX on, and TressFX off. Since this hair-rendering tech is an open spec, both Nvidia and AMD can optimize for it.</p> <p>MSAA is not an available setting in Tomb Raider, so we use 2x super-sample antialiasing (SSAA) instead. This form of AA generates a higher resolution frame than what the monitor is set at, and squishes the frame down to fit.</p> <p>All Nvidia cards in this roundup were tested with the 347.84 drivers, which were given to us ahead of release and are scheduled to be available for everyone to download on March 17th. The Titan X is also scheduled to hit retail on this day. We tested the R9 290X with <a href="http://www.maximumpc.com/amds_year_end_gift_gamers_catalyst_omega_special_edition_driver_2014" target="_blank">AMD's Omega drivers released in December</a>.</p> <h3 style="text-align: right;"><a title="GeForce Titan X Review Page 3" href="http://www.maximumpc.com/nvidia_titan_x_review_2015?page=0,2" target="_self">Click here to see the benchmarks and analysis!</a></h3> <hr /> <p>We test with a mix of AMD-friendly and Nvidia-friendly titles (it seems like you're either one or the other, these days); Metro: Last Light, Hitman: Absolution, and Tomb Raider usually favor AMD; Batman: Arkham Origins, Middle-earth: Shadow of Mordor, and Unigine Heaven favor Nvidia. 
In all cases, we use their built-in benchmarks to minimize variance.</p> <h3>1920x1080 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;93</td> <td>&nbsp;127</td> <td>&nbsp;84</td> <td class="item-dark">&nbsp;106</td> <td>&nbsp;205</td> <td>&nbsp;97</td> </tr> <tr> <td>Titan</td> <td>&nbsp;63</td> <td>&nbsp;80</td> <td>&nbsp;63</td> <td>&nbsp;67</td> <td>&nbsp;129</td> <td>&nbsp;57</td> </tr> <tr> <td>980</td> <td>&nbsp;86</td> <td>&nbsp;99</td> <td>&nbsp;70</td> <td>&nbsp;93</td> <td>&nbsp;164</td> <td>&nbsp;79</td> </tr> <tr> <td>970</td> <td>&nbsp;71</td> <td>&nbsp;81</td> <td>&nbsp;59</td> <td>&nbsp;72</td> <td>&nbsp;132</td> <td>&nbsp;61</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;72</td> <td>&nbsp;84</td> <td>&nbsp;70</td> <td>&nbsp;77</td> <td>&nbsp;142</td> <td>&nbsp;69</td> </tr> <tr> <td>780</td> <td>&nbsp;67</td> <td>&nbsp;77</td> <td>&nbsp;65</td> <td>&nbsp;71</td> <td>&nbsp;122</td> <td>&nbsp;62</td> </tr> <tr> <td>290X</td> <td>&nbsp;82</td> <td>&nbsp;111</td> <td>&nbsp;64</td> <td>&nbsp;84</td> <td>&nbsp;143</td> <td>&nbsp;65</td> </tr> </tbody> </table> </div> </h4> <p>You probably noticed that the GTX 780 trades blows with the original GTX Titan, despite the Titan having better specs. The 780 benefits from a higher clock speed and an enhanced cooler designed by Asus.
Historically, Nvidia has not allowed its partners to use vendor-specific coolers on the Titan cards, so the other cards with slightly lower specs and better cooling could catch up with some overclocking. However, Nvidia says that the Titan X is highly overclockable despite using a reference cooler, so we'll be exploring that soon.</p> <p>The 780 Ti handily beats the original Titan despite also using reference clock speeds, because the Ti variant is basically a Titan Black, which is the sequel to the original Titan and came out about a year later. (And the Titan X is a physically black card, while the Titan Black is not. It can get a little confusing.)</p> <p>Meanwhile, the R9 290X beats all the Kepler-generation cards, except in Hitman: Absolution, which is usually a bastion for AMD's GPUs. It looks like Nvidia has figured out some driver optimizations here.</p> <p>In general, the Titan X says to the other cards, "Get on my level." It's clearly operating on a different tier of performance.&nbsp;<a title="Nvidia GeForce GTX 980 Review" href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">The GTX 980</a> also stays generally ahead of the 290X by a comfortable margin.</p> <h3>2560x1440 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;64</td> <td>&nbsp;90</td> <td>&nbsp;60</td> <td class="item-dark">&nbsp;77</td> <td>&nbsp;129</td> <td>&nbsp;61</td> </tr> <tr> <td>Titan</td> <td>&nbsp;44</td> <td>&nbsp;58</td> <td>&nbsp;43</td> <td>&nbsp;49</td>
<td>&nbsp;77</td> <td>&nbsp;38</td> </tr> <tr> <td>980</td> <td>&nbsp;59</td> <td>&nbsp;71</td> <td>&nbsp;46</td> <td>&nbsp;67</td> <td>&nbsp;105</td> <td>&nbsp;48</td> </tr> <tr> <td>970</td> <td>&nbsp;47</td> <td>&nbsp;59</td> <td>&nbsp;39</td> <td>&nbsp;51</td> <td>&nbsp;81</td> <td>&nbsp;36</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;51</td> <td>&nbsp;62</td> <td>&nbsp;48</td> <td>&nbsp;56</td> <td>&nbsp;86</td> <td>&nbsp;42</td> </tr> <tr> <td>780</td> <td>&nbsp;47</td> <td>&nbsp;59</td> <td>&nbsp;44</td> <td>&nbsp;52</td> <td>&nbsp;80</td> <td>&nbsp;40</td> </tr> <tr> <td>290X</td> <td>&nbsp;54</td> <td>&nbsp;83</td> <td>&nbsp;54</td> <td>&nbsp;63</td> <td>&nbsp;91</td> <td>&nbsp;40</td> </tr> </tbody> </table> </div> </h4> <p>As we ratchet up the resolution (while keeping all other graphical settings the same), we see the performance separation begin. While nearly every card comfortably sustained 60-plus fps at 1080p, older GPUs struggle to maintain that threshold at 2560x1440, as does the GTX 970. We're pushing 77 percent more pixels onto the screen, and the original Titan's relatively low number of ROPs, low clock speeds, and Kepler-generation CUDA cores combine to make an obstacle that the other cards don't have to deal with. The new Titan X is producing well over 50 percent more frames in some of these tests, despite generating less noise, putting out about the same amount of heat, and costing about the same. Wringing these kinds of gains from the same 28nm process node is pretty impressive. It comfortably beats AMD's best card in every test. Tomb Raider and <a title="Batman: Arkham Origins review" href="http://www.maximumpc.com/batman_arkham_origins_review_2014" target="_blank">Batman: Arkham Origins</a> distinguish themselves as two particularly well-optimized games.&nbsp;</p> <p>The R9 290X remains ahead of Nvidia's Kepler cards and pulls away in Hitman. AMD's 512-bit bus provides a wide pipe for memory bandwidth, and that advantage emerges once you move past 1080p.
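That bandwidth advantage is easy to put a number on: peak GDDR5 bandwidth is just the bus width in bytes times the effective memory clock. A quick sketch; the GTX 980's 256-bit bus and 7,010MHz effective clock come from the spec rundown earlier in this piece, while the 290X's 5,000MHz effective clock is our assumption (it's the stock spec, but this article doesn't list it):

```python
def gddr5_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# GTX 980: 256-bit bus at 7,010MHz effective.
print(round(gddr5_bandwidth_gbs(256, 7010), 1))  # 224.3
# R9 290X: 512-bit bus at 5,000MHz effective (stock spec, our assumption).
print(round(gddr5_bandwidth_gbs(512, 5000), 1))  # 320.0
```

So even with slower memory chips, the 290X's wide bus gives it roughly 43 percent more raw bandwidth than the 980, which is exactly where the extra pixels start to tell.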
It's not until we encounter newer premium cards like the GTX 980 and Titan X that we find a competitive alternative from Nvidia. And when the Titan X arrives, it makes a statement, decisively maintaining 60-plus fps no matter what we throw at it. We'd want nothing less from a card that costs nearly three times as much as the 290X. The GTX 980 gets more mixed results here, but it still looks like a great card for playing at this resolution.</p> <h3>3840x2160 Benchmark Results, Average Frames Per Second</h3> <h4 style="font-size: 12px;"> <div class="spec-table orange" style="font-weight: normal;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td> <p>&nbsp;</p> </td> <td> <p>Metro:</p> <p>Last Light</p> </td> <td> <p>Arkham</p> <p>Origins</p> </td> <td> <p>Hitman:</p> <p>Absolution</p> </td> <td> <p>Shadow of</p> <p>Mordor</p> </td> <td> <p>Tomb</p> <p>Raider*</p> </td> <td> <p>Unigine</p> <p>Heaven</p> </td> </tr> <tr> <td class="item">Titan X</td> <td>&nbsp;35</td> <td>&nbsp;53</td> <td>&nbsp;33</td> <td class="item-dark">&nbsp;44</td> <td>&nbsp;44/60</td> <td>&nbsp;26</td> </tr> <tr> <td>Titan</td> <td>&nbsp;24</td> <td>&nbsp;34</td> <td>&nbsp;22</td> <td>&nbsp;25</td> <td>&nbsp;26/37</td> <td>&nbsp;18</td> </tr> <tr> <td>980</td> <td>&nbsp;32</td> <td>&nbsp;41</td> <td>&nbsp;24</td> <td>&nbsp;37</td> <td>&nbsp;36/48</td> <td>&nbsp;20</td> </tr> <tr> <td>970</td> <td>&nbsp;24</td> <td>&nbsp;32</td> <td>&nbsp;19</td> <td>&nbsp;28</td> <td>&nbsp;27/37</td> <td>&nbsp;15</td> </tr> <tr> <td>780 Ti</td> <td>&nbsp;27</td> <td>&nbsp;38</td> <td>&nbsp;23</td> <td>&nbsp;32</td> <td>&nbsp;29/40</td> <td>&nbsp;19</td> </tr> <tr> <td>780</td> <td>&nbsp;26</td> <td>&nbsp;35</td> <td>&nbsp;23</td> <td>&nbsp;30</td> <td>&nbsp;27/38</td> <td>&nbsp;18</td> </tr> <tr> <td>290X</td> <td>&nbsp;28</td> <td>&nbsp;41</td> <td>&nbsp;29</td> <td>&nbsp;37</td> <td>&nbsp;31/43</td> <td>&nbsp;17</td> </tr> </tbody> </table> </div> </h4> <p
style="text-align: left;"><span style="font-weight: normal;">*<em>TressFX on/TressFX off</em></span></p> <p><span style="font-weight: normal;">When you look at these results, it's important to keep in mind that our review process does not aim for playable framerates. We want to see how these cards perform when pushed to the limit. Despite this demanding environment, the Titan X remains a viable solo card to have at 4K, though it's still not ideal (putting aside for the moment <a title="4K Monitors: Everything You Need to Know" href="http://www.maximumpc.com/4k_monitor_2014" target="_blank">the technical resolution difference between DCI 4K and Ultra HD 4K</a>). The good news is that 4xMSAA is arguably not needed at a resolution this high, unless you're gaming on a big 4K HDTV that's less than a couple of feet from your eyes.</span></p> <p><span style="font-weight: normal;">Those with screens that are 32 inches or smaller will probably be fine with 2xMSAA, or some version of SMAA (</span><span style="font-weight: normal; font-size: 1em;">Enhanced Subpixel Morphological Antialiasing), which is known to be quite efficient while producing minimal blurriness and shimmering. Nvidia's TXAA (Temporal Anti-Aliasing) can be a good option when you have one of the company's cards and are playing a game that supports the feature. And with the Maxwell generation of cards (the Titan X, GTX 980, and GTX 970), you also have MFAA, or&nbsp;Multi-Frame Sample Anti-Aliasing. The company claims that this gets you 4xMSAA visual quality at the performance cost of 2xMSAA.</span></p> <p><span style="font-weight: normal; font-size: 1em;">The GTX 780 nearly catches up with the 780 Ti at this resolution, again demonstrating the importance of clock speeds, although the difference is pretty modest in this scenario. At 4K, this GTX 780's additional 3GB of VRAM also comes into play. The 6GB card spends less processing power on memory management. 
However, the 780 does not support 4-way SLI, if that's your thing. It's limited to 3-way SLI. The GTX 970 and 980 have the same difference with their SLI support. The GTX 960 is limited to only 2-way SLI. This is one of the methods that Nvidia uses to encourage the purchase of its more expensive cards. All Titans support 4-way SLI.</span></p> <p><span style="font-weight: normal; font-size: 1em;">The R9 290X maintains its lead over Kepler, though it shrinks inside the margin of error at times. It's weakest in Unigine Heaven, because this benchmark makes heavy use of tessellation (dynamically increasing surface complexity by subdividing triangles in real time), and that's something that Kepler and Maxwell do much better. In general, it's a very respectable performer, especially for the price, which has fallen to roughly that of a GTX 970. Since the 290X is meaningfully faster than the GTX 970 in every single benchmark that we used, and it bumps up against the GTX 980 when we get to 4K, it makes for a pretty good spoiler until the Titan X arrives and leapfrogs everyone in the contest.</span></p> <p><span style="font-weight: normal; font-size: 1em;"><img src="/files/u160416/titanx1.jpg" width="620" height="393" /></span></p> <h3><span style="font-weight: normal; font-size: 1em;">Conclusion</span></h3> <p><span style="font-weight: normal; font-size: 1em;">Overall, things are looking pretty rosy for the Titan X. Since it's packed with a huge number of ROPs, SMs, and shader processors, plus 12GB of VRAM, it's able to overcome the limitations of the aging 28nm process. The Maxwell-generation CUDA cores are also about 40 percent faster than the older Kepler version (by Nvidia's estimation, at least), and the company improved color compression for additional performance gains.
It's not the Chosen One if you want to game with a single GPU at 4K, but you can get pretty close if you're willing to tweak a few graphical settings.</span></p> <p><span style="font-weight: normal; font-size: 1em;">Also keep in mind that it was about one year ago when Nvidia debuted the GTX Titan Z, which has two Titan Black GPUs on a single card. So they may plan to drop a dual Titan X sometime soon, as well. And there's room in the lineup for a "980 Ti," since there's quite a spec gap (and price gap) right now between the GTX 980 and the GTX Titan X. If that's not enough, <a title="AMD Radeon R9 370 Core Edition Leaked" href="http://www.maximumpc.com/xfx_radeon_r9_370_core_edition_leaks_web_higher_end_r300_series_cards_follow" target="_blank">rumors around AMD's next generation of video cards are reaching a boiling point</a>. There's always something new around the corner, isn't there? But if you're comfortable with this price tag, and you don't care about what AMD's got cooking, the Titan X is the fastest thing you'll find for gaming beyond 1080p.</span></p> http://www.maximumpc.com/nvidia_titan_x_review_2015#comments Gaming gpu Hardware Nvidia Titan X sli Video Card Reviews Tue, 17 Mar 2015 19:00:13 +0000 Tom McNamara 29579 at http://www.maximumpc.com Possible Look at Specifications and Performance for AMD's Radeon R9 390X http://www.maximumpc.com/possible_look_specifications_and_performance_amds_radeon_r9_390x_2015 <!--paging_filter--><h3><img src="/files/u69/amd_radeon_1.jpg" alt="AMD Radeon R9 290X" title="AMD Radeon R9 290X" width="228" height="170" style="float: right;" />A potentially beastly card in the making</h3> <p>Go ahead and apply the standard disclaimer about leaked specs not being verified or official, because that's certainly the case here. 
Disclaimer aside, we hope that <strong>unconfirmed specifications of AMD's forthcoming Radeon R9 390X graphics card</strong> turn out to be accurate, because if they are, it's going to be a potent part that's up to 60 percent faster than AMD's Radeon R9 290X.</p> <p>The folks at <a href="http://videocardz.com/55146/amd-radeon-r9-390x-possible-specifications-and-performance-leaked" target="_blank"><em>Videocardz</em></a> asked their source if he could share additional information about AMD's new flagship graphics card, and to the site's surprise, he obliged with a few more goodies to digest. One of those goodies is that AMD scrapped plans to run with 4GB of High Bandwidth Memory (HBM) Gen1 (1GB per stack) after Nvidia unveiled its Titan X graphics card. Now the plan is to release the Radeon R9 390X with 8GB of HBM Gen2 (2GB per stack) on a 4,096-bit bus (1,024-bit per stack). That should give the card around 1.25TB/s of memory bandwidth.</p> <p>The GPU is said to be a 28nm Fiji XT part with 4,096 unified cores and 256 Texture Mapping Units (TMUs). There's no mention of ROPs or core clockspeed, though the boost clockspeed is reportedly 1,050MHz.
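Those rumored numbers at least hang together arithmetically. A quick sanity check, assuming double-pumped transfers on the 4,096-bit HBM bus and two single-precision ops (one fused multiply-add) per core per clock, which is the usual way peak figures are quoted, reproduces both the ~1.25TB/s bandwidth estimate and the 8.6TFLOPS compute figure from the leak:

```python
# Rumored R9 390X figures: 4,096-bit bus, 1,250MHz memory clock,
# 4,096 unified cores at a 1,050MHz boost clock.
bus_bits, mem_mhz, cores, boost_mhz = 4096, 1250, 4096, 1050

# Double-pumped (DDR-style) HBM: two transfers per pin per clock.
bandwidth_tbs = (bus_bits / 8) * (mem_mhz * 1e6 * 2) / 1e12
# Two single-precision ops per core per clock (one FMA).
tflops = cores * 2 * boost_mhz * 1e6 / 1e12

print(f"~{bandwidth_tbs:.2f} TB/s, ~{tflops:.1f} TFLOPS")  # ~1.28 TB/s, ~8.6 TFLOPS
```

If any one of the leaked clocks or core counts were wrong, these two headline numbers wouldn't line up this neatly.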
Other specs include a 1,250MHz memory clock, 8.6TFLOPS of compute performance, and either a 6+8 pin or dual 8-pin PCI-E configuration.</p> <p>There's also a performance slide that was leaked, and if it's accurate, performance will be up to around 1.65 times that of the Radeon R9 290X in 4K gaming.</p> <p>Reports from elsewhere on the web have the card debuting at around $700, which is also unconfirmed.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/possible_look_specifications_and_performance_amds_radeon_r9_390x_2015#comments amd Build a PC Fiji Gaming gpu graphics card Hardware Radeon R9 390X Video Card News Mon, 16 Mar 2015 15:41:48 +0000 Paul Lilly 29588 at http://www.maximumpc.com XFX Radeon R9 370 Core Edition Leaks to Web, Higher End R300 Series Cards to Follow http://www.maximumpc.com/xfx_radeon_r9_370_core_edition_leaks_web_higher_end_r300_series_cards_follow <!--paging_filter--><h3><img src="/files/u69/xfx_card.jpg" alt="XFX Card" title="XFX Card" width="228" height="143" style="float: right;" />AMD R300 Series is around the corner</h3> <p>We know that AMD is getting ready to refresh its graphics card lineup -- a refresh that's long overdue, as far as we're concerned -- though it looks like the first of the upcoming Radeon R9 300 Series won't be a flagship part. At least that won't be the case if, as rumored, <strong>XFX launches its Radeon R9 370 Core Edition video card</strong> powered by AMD's Trinidad Pro processor next month.</p> <p>The rumor <a href="http://videocardz.com/55051/xfx-radeon-r9-370-core-edition-leaks-out-coming-early-april" target="_blank">originates at <em>Videocardz</em></a>, which caught wind of the forthcoming card by a reader of the site claiming to work for XFX. 
According to the supposed XFX employee, the first GPU of the R300 Series will be Trinidad Pro, and the site believes him to be telling the truth after a new leak from XFX seemed to corroborate his story.</p> <p>If true, the R9 370 Core Edition (R9-370A-ENF) will come in 2GB and 4GB GDDR5 versions, both with a 256-bit memory bus, single 6-pin PCI-E power connector, and two Dual-Link DVI ports flanked by HDMI and DisplayPort.</p> <h3>R300 Series</h3> <p>Based on the rumors so far, the R9 370 Core Edition will be a mid-range card. Here's a look at the full lineup:</p> <ul> <li>AMD Radeon R9 390X: 28nm Fiji XT GPU, 3,584 cores, 224 TMUs, 64 ROPs, 4GB memory, $599</li> <li>AMD Radeon R9 390: 28nm Fiji Pro GPU, 3,328 cores, 208 TMUs, 64 ROPs, 4GB GDDR5, $399</li> <li>AMD Radeon R9 380X: 28nm Hawaii XTX GPU, 2,816 cores, 176 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 380: 28nm Hawaii Pro GPU, 2,560 cores, 160 TMUs, 64 ROPs, 4GB GDDR5, 512-bit, price unknown</li> <li>AMD Radeon R9 375X: Tonga XT GPU, 2,048 cores, 128 TMUs, 32 ROPs, 2GB GDDR5, 384-bit, price unknown</li> <li>AMD Radeon R9 375: Tonga Pro GPU, 1,792 cores, 112 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370X: Trinidad XT GPU, 1,280 cores, 80 TMUs, 32 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R9 370: 28nm Trinidad Pro GPU, 1,024 cores, 64 TMUs, 24 ROPs, 2GB GDDR5, 256-bit, price unknown</li> <li>AMD Radeon R7 360X: Bermuda GPU, 896 cores, 128-bit GDDR5, price unknown</li> <li>AMD Radeon R7 350X/340X: Oland GPU, 320 cores, DDR3 and GDDR5 memory, 128-bit</li> <li>AMD Radeon R5 300: Caicos GPU, 160 cores, DDR3 memory, 64-bit</li> </ul> <p>None of these are official or set in stone, and as you can see, more is 'known' (rumored) about the higher end GPUs than the lower end ones. 
So, take these specs with a block of salt.</p> <p>There are also a few benchmarks scattered around the web, though their legitimacy is a huge question mark, especially when they put up numbers <a href="http://wccftech.com/amd-radeon-r9-300-gpu-alleged-3d-mark-benchmarks-leaked/" target="_blank">like this</a>.</p> <p>Regardless, it looks like we won't have to wait long to see what kind of performance AMD's R300 Series brings to the table.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/xfx_radeon_r9_370_core_edition_leaks_web_higher_end_r300_series_cards_follow#comments amd Build a PC graphics card Hardware R300 Series Radeon R9 370 Core Edition Video Card xfx News Thu, 12 Mar 2015 15:12:35 +0000 Paul Lilly 29574 at http://www.maximumpc.com First Purported GeForce Titan X Benchmarks Appear Online http://www.maximumpc.com/first_purported_geforce_titan_x_benchmarks_appear_online_2015 <!--paging_filter--><h3><img src="/files/u69/titan_x_0.jpg" alt="Nvidia GeForce Titan X" title="Nvidia GeForce Titan X" width="228" height="168" style="float: right;" />Sneak peek at performance</h3> <p>When <a href="http://www.maximumpc.com/nvidia_unveils_titan_x_graphics_card_gdc_2015">Nvidia unveiled its GeForce Titan X</a> graphics card at the 2015 Game Developers Conference (GDC) last week, company CEO Jen-Hsun Huang revealed almost nothing about the part, other than to say it has 12GB of onboard memory and 8 billion transistors.
There was no mention of other specs, let alone benchmarks, though information across the board has begun to leak on the web, including a <strong>first look at how the Titan X performs</strong>.</p> <p>Bearing in mind that none of this is official, the folks at <em>Videocardz.com</em> report that Titan X sports 3,072 CUDA cores, 192 TMUs, 96 ROPs, a 1,002MHz core clockspeed (boost is unknown), a 1,750MHz memory clock, and a 384-bit memory bus resulting in 336GB/s of bandwidth.</p> <p>The site also reports there are three mini DisplayPorts, a single DisplayPort, and an HDMI port, along with 6-pin and 8-pin (one each) PCI-E power connectors.</p> <p>As for the benchmarks, they show the Titan X scoring 22,903 in 3DMark 11 using the Performance setting and 26,444 when overclocked. Both are lower scores than AMD's Radeon R9 295X2 (28,930), though they blow the Titan (13,814) and Titan Black (14,557) out of the water.</p> <p>There are also benchmarks for other 3DMark tests, along with 2-way, 3-way, and 4-way Titan X SLI scores.
Check them out <a href="http://videocardz.com/55013/nvidia-geforce-gtx-titan-x-3dmark-performance" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/first_purported_geforce_titan_x_benchmarks_appear_online_2015#comments Build a PC GeForce Titan X graphics card Hardware nvidia Video Card News Wed, 11 Mar 2015 17:25:16 +0000 Paul Lilly 29572 at http://www.maximumpc.com Nvidia's Giving Away Witcher 3 Codes with Select GeForce GTX 900 Series Graphics Cards http://www.maximumpc.com/nvidias_giving_away_witcher_3_codes_select_geforce_gtx_900_series_graphics_cards_2015 <!--paging_filter--><h3><img src="/files/u69/witcher_3_0.jpg" alt="The Witcher 3: Wild Hunt" title="The Witcher 3: Wild Hunt" width="228" height="203" style="float: right;" />A little gaming bribery never hurt anyone</h3> <p>After the fiasco with Nvidia's GeForce GTX 970 graphics card and the way it handles the last .5GB of its onboard 4GB of memory, Nvidia could use a bit of positive press. One of the best ways to do that is to dangle something shiny in front of the public, like an anticipated game. So, available now for a limited time, <strong>customers who buy a select GeForce GTX 980, 970, or 960 graphics card, or a GTX 970M or above notebook, will receive a code for The Witcher 3: Wild Hunt</strong>, Nvidia announced today.</p> <p>"Over my 10-plus years at Nvidia, I’ve seen, worked with, and played countless games. Few stand out to me as deserving of the term epic. The Witcher: Wild Hunt is one of those titles," Nvidia's Leslie Pirritano stated in a blog post. "Developer CD Projekt Red has provided gamers with an epic story, an epic adventure, and epic graphics.
The untamed world of this action-adventure game is a graphics showcase, with stunning vistas and detailed characters. So, it’s exciting to me that we’re offering it to GeForce gamers as part of our new 'Undeniably Epic' bundle."</p> <p>Nvidia was also quick to point out that the upcoming title supports technologies like Nvidia HairWorks and PhysX, the first of which will add a level of realism to the fur and hair of more than 50 monsters and characters in the game.</p> <p>The Witcher 3: Wild Hunt is currently scheduled to release May 19, 2015. To grab a qualifying card, be sure to start your <a href="http://www.geforce.com/GetWitcher3" target="_blank">search here</a>, which has links to participating vendors.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidias_giving_away_witcher_3_codes_select_geforce_gtx_900_series_graphics_cards_2015#comments Build a PC games geforce graphics card Hardware nvidia Software the witcher 3: wild hunt Video Card News Tue, 10 Mar 2015 15:57:54 +0000 Paul Lilly 29567 at http://www.maximumpc.com Nvidia GeForce GTX Titan X: We Touched It http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_we_touched_it_2015 <!--paging_filter--><h3>A quick peek into the future</h3> <p>In the land of video cards, Nvidia's GTX Titan is generally considered the king. The original gangster came out in February 2013, followed by the Titan Black a year later, each sporting an unprecedented 6GB of RAM, 7 billion transistors, and more shader processors than you could shake a stick at (eventually tipping the scales at 2880). Nvidia capped it off in March 2014 with the Titan Z, which put two Titan Black GPUs on one card. And now it's been nearly a year since we've seen activity from them on the super-premium end.
But the company hasn't been idle. Today we got up close and personal with this obsidian brick of magic, the GTX Titan X.</p> <p>How close? This close:</p> <p><img src="/files/u160416/titanx_620.jpg" alt="Nvidia GeForce GTX Titan X video card" title="Nvidia GeForce GTX Titan X video card" width="620" height="465" /></p> <p>Unfortunately, we were forced to double-pinky swear that we wouldn't give you any specifics about the card just yet, other than the fact that it's got 12GB RAM, eight billion transistors, and is probably the fastest video card on Earth. But we can confirm that it was running several live demos on the show floor of the Game Developers Conference this week, conducted by Epic, Valve, and Crytek. This is obviously not going to be a paper launch -- the card is already here. The Titan X is just waiting in the wings until it can get a proper introduction at Nvidia's GPU Technology Conference, which starts on March 17th. In the meantime, we took some nifty photos for you. Hope you brought a bib for the drool!</p> http://www.maximumpc.com/nvidia_geforce_gtx_titan_x_we_touched_it_2015#comments geforce gpu GTC nvida Titan X Video Card News Fri, 06 Mar 2015 02:00:47 +0000 Tom McNamara 29548 at http://www.maximumpc.com Nvidia Unveils Titan X Graphics Card at GDC http://www.maximumpc.com/nvidia_unveils_titan_x_graphics_card_gdc_2015 <!--paging_filter--><h3><img src="/files/u69/titan_x.jpg" alt="Titan X" title="Titan X" width="231" height="177" style="float: right;" />A new top-end GPU</h3> <p>It was speculated that Nvidia might announce a new Titan graphics card during GDC, and that's what the company did—in a somewhat dramatic fashion. It happened at the tail end of an Unreal Engine panel. As Epic founder Tim Sweeney wrapped up his discussion on the state of Unreal, <strong>Nvidia CEO Jen-Hsun Huang surprised attendees by emerging on stage to unveil the company's Titan X</strong>.</p> <p>He called it the "world's most advanced GPU," though he was short on details.
What he <em>was</em> willing to divulge about the card is that it has 12GB of onboard memory and 8 billion transistors. For the sake of comparison, Titan Black has 7.1 billion transistors and 6GB of GDDR5 memory.</p> <p>"It’s the most advanced GPU the world has ever seen," Jen-Hsun said.</p> <p>He then presented the company's first production unit to Sweeney, though not before autographing the box it came in.</p> <p>Nvidia will release more details about the card during the upcoming GTC event that runs from March 17–20.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_unveils_titan_x_graphics_card_gdc_2015#comments Build a PC Gaming GDC 2015 graphics card Hardware nvidia Titan X Video Card News Wed, 04 Mar 2015 19:16:25 +0000 Paul Lilly 29540 at http://www.maximumpc.com EVGA Announces GeForce GTX 960 SuperSC with 4GB of Onboard Memory http://www.maximumpc.com/evga_announces_geforce_gtx_960_supersc_4gb_onboard_memory_2015 <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gtx_960_supersc.jpg" alt="EVGA GeForce GTX 960 SuperSC" title="EVGA GeForce GTX 960 SuperSC" width="228" height="148" style="float: right;" />Now with twice the GDDR5 memory</h3> <p>There were rumors earlier this year that 4GB versions of Nvidia's GeForce GTX 960 graphics card would show up in March, and it turns out they were right. <strong>EVGA has emerged as the first to cross into 4GB territory with its GeForce GTX 960 SuperSC graphics card announced today</strong>.
Though it's a mid-range card, EVGA is promoting the benefit of higher texture qualities and better 4K resolution gaming performance with the added memory.</p> <p>To keep things cool and quiet, EVGA has also outfitted its newest graphics card with its ACX 2.0+ custom cooler.</p> <p>"The new EVGA ACX 2.0+ cooler brings new features to the award winning EVGA ACX 2.0 cooling technology. A Memory MOSFET Cooling Plate (MMCP) reduces MOSFET temperatures up to 11C, and optimized Straight Heat Pipes (SHP) reduce GPU temperature by 5C," EVGA says. "ACX 2.0+ coolers also feature optimized Swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU."</p> <p>EVGA's GeForce GTX 960 SuperSC sports 1,279MHz base and 1,342MHz boost clockspeeds, which are overclocked from the reference design's 1,127MHz base and 1,178MHz boost specifications. The 4GB of GDDR5 memory stays at stock (7,010MHz) on a 128-bit bus.</p> <p>The card is also notable for its dual-BIOS design. Should something go wrong while tinkering, you can switch to a secondary BIOS with a quick flip of a switch.</p> <p>No word yet on when the 4GB card will be available or for how much.
There is, however, a "Notify Me" button on the card's <a href="http://www.evga.com/articles/00914/EVGA-GeForce-GTX-960-4GB/" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/evga_announces_geforce_gtx_960_supersc_4gb_onboard_memory_2015#comments 4GB Build a PC evga geforce gtx 960 graphics card Hardware nvidia Video Card News Tue, 03 Mar 2015 18:06:51 +0000 Paul Lilly 29527 at http://www.maximumpc.com Nvidia CEO is Mocked for Explanation of GeForce GTX 970 Memory Issue http://www.maximumpc.com/nvidia_ceo_mocked_explanation_geforce_gtx_970_memory_issue_2015 <!--paging_filter--><h3><img src="/files/u69/nvidia_meme.jpg" alt="Nvidia Meme" title="Nvidia Meme" width="228" height="151" style="float: right;" />Here come the memes</h3> <p>Nvidia ticked off a lot of people when it came to light that its GeForce GTX 970 graphics card was suffering from performance issues when games tried to access onboard memory above 3.5GB. Turns out it's the result of an architectural design decision, one that doesn't exist on the GTX 980, and one that wasn't communicated to Nvidia's internal marketing team or externally to reviewers.
There's been a lot of negativity surrounding the issue ever since, and in an attempt to defuse the situation, <strong>Nvidia CEO Jen-Hsun Huang has offered up an explanation of the GTX 970 memory issue</strong>.</p> <p>Before we get into that, we suggest reading <a href="http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015">this</a>, <a href="http://www.maximumpc.com/nvidia_will_help_disgruntled_gtx_970_owners_get_refund_says_driver_update_coming_2015">this</a>, and <a href="http://www.maximumpc.com/nvidia_slapped_lawsuit_over_misleading_gtx_970_performance_claims243">this</a> as primers to what's going on. If you're crunched for time, the CliffsNotes version is that the above scenario, along with the discovery that the GTX 970 has fewer ROPs and less L2 cache than advertised, has led to a class action lawsuit.</p> <p>Seeing that the backlash is growing, not shrinking, Huang tried explaining away the issue as a "feature" that should have been bragged about.</p> <p>"We invented a new memory architecture in Maxwell. This new capability was created so that reduced-configurations of Maxwell can have a larger framebuffer – i.e., so that GTX 970 is not limited to 3GB, and can have an additional 1GB," Huang stated in a blog post.</p> <p>"GTX 970 is a 4GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth. This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment," the CEO continued.</p> <p>According to Huang, the expectation was that users would be "excited" about this, but were ultimately "disappointed that we didn't better describe the segmented nature of the architecture."
He also admitted that the "new feature of Maxwell should have been clearly detailed from the beginning."</p> <p>Perhaps so, though looking at the comments to his blog post makes me think this was a ticking time bomb no matter how you slice it. For those holding Nvidia's feet to the fire over this, the bottom line here is that the GTX 970 is gimped compared to the GTX 980, which doesn't have an issue accessing all 4GB of VRAM, and that buyers were misled, both about the performance impact and about the advertised specs.</p> <p>"Yes, a 'new feature,' a 'good design' not included on GTX 980 because [it] decreases performance," a reader commented. Another stated, "I will most likely never buy from Nvidia again, they care nothing about their customer. And blatantly lie to our faces."</p> <p>Others took to posting memes and doctored videos, like this one:</p> <p><iframe src="https://www.youtube.com/embed/spZJrsssPA0" width="620" height="465" frameborder="0"></iframe></p> <p>It's hard to watch the above clip without busting a gut, though for Nvidia, this is no laughing matter. To Nvidia's credit, the performance issue seems to crop up only when gaming at high resolutions and shouldn't bother folks gaming at 1080p. And based on the benchmarks where the issue doesn't crop up, the bang-for-buck here is pretty high.</p> <p>But in the end, Nvidia is finding out that none of that matters, as its fan base feels it's been lied to.
It's going to take more than a blog post to win back their trust and/or make this go away.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_ceo_mocked_explanation_geforce_gtx_970_memory_issue_2015#comments Build a PC geforce gtx 970 graphics card Hardware jen-hsun huang nvidia Video Card News Wed, 25 Feb 2015 17:01:39 +0000 Paul Lilly 29469 at http://www.maximumpc.com How to Overclock Your Graphics Card http://www.maximumpc.com/how_overclock_your_graphics_card_2015 <!--paging_filter--><h3><span style="font-weight: normal;"><img src="/files/u162579/314023-nvidia-geforce-gtx-titan-angle.jpg" alt="Titan" title="Titan" width="250" height="245" style="float: right;" />Learn how to wring every last bit of performance out of your video card</span></h3> <p>Overclocking a graphics card used to be more trouble than it was worth, but things have changed. EVGA Precision X and MSI Afterburner are just two of the most popular choices for software overclocking. AMD even bundles its own overclocking solution—AMD OverDrive—with its Catalyst drivers. Wringing more performance out of your graphics card is now as simple as moving a few sliders and testing for stability with a benchmark.&nbsp;</p> <p>That’s not to say that the best overclocking practices are obvious. We’re here to help with a guide on how to overclock your graphics card. Be forewarned—even the most basic overclocks can end in tragedy. Although we’re willing to walk you through the steps, we can’t be responsible for any damaged hardware or problems arising during the overclocking process. If you’re willing to take the risk, read on to learn how to overclock your graphics card. Keep in mind that the procedure for each video card can be slightly different. 
If any part of the guide doesn’t make sense, ask for help in the comments or spend some time on Google.&nbsp;</p> <h3><span style="font-weight: normal;">1. Gearing Up</span></h3> <p style="text-align: center;"><img src="/files/u162579/afterburner.png" alt="MSI Afterburner" title="MSI Afterburner" width="500" height="338" /></p> <p style="text-align: center;"><strong>MSI Afterburner is capable overclocking software that works with most AMD and Nvidia cards.</strong></p> <p>Our favorite overclocking software is <a href="http://event.msi.com/vga/afterburner/download.htm" target="_blank">MSI Afterburner</a>. Your other options include <a href="http://www.evga.com/precision/" target="_blank">EVGA Precision X</a> for Nvidia cards and AMD OverDrive for AMD cards, but to keep things simple we’ll be working solely with MSI Afterburner.&nbsp;</p> <p>You’ll also need a benchmark like <a href="http://store.steampowered.com/app/223850/" target="_blank">3DMark</a>—download the demo—or <a href="http://unigine.com/products/heaven/" target="_blank">Unigine’s Heaven Benchmark</a> to make sure your overclocks are stable enough for daily use. They’re also useful for quantifying just how much more performance you’re getting out of your hardware.&nbsp;</p> <p><a href="http://www.techpowerup.com/gpuz/" target="_blank">GPU-Z</a> is the final piece of the puzzle, and although you don’t technically need it, it’s super helpful for checking your GPU and memory clock speeds.&nbsp;</p> <h3><span style="font-weight: normal;">2. Getting in the Know</span></h3> <p>Before you even start overclocking, it helps to know what sort of overclocks you can expect from your hardware. <a href="http://hwbot.org/" target="_blank">HWBOT</a> is the easiest way to look up what overclocks other users are achieving. 
Our test bench included the <a href="http://hwbot.org/hardware/videocard/geforce_gtx_650_ti/" target="_blank">GTX 650 Ti</a> and <a href="http://hwbot.org/hardware/videocard/radeon_hd_7850/" target="_blank">7850</a>, which have average overclocks listed on the site.&nbsp;</p> <p>It also helps to know how much real-world performance you’ll be getting out of your overclocks. Although you probably don’t need to run through an entire suite of benchmarks, having a baseline to refer to is useful. Run through 3DMark or Heaven Benchmark once to get your base scores.&nbsp;</p> <h3><span style="font-weight: normal;">3. Core Speed Overclocks</span></h3> <p style="text-align: center;"><img src="/files/u162579/heaven2.jpg" alt="Unigine Heaven" title="Unigine Heaven" width="600" height="338" /></p> <p style="text-align: center;"><strong>Unigine’s Heaven benchmark looks good and is packed with features.</strong></p> <p>Once you’ve got some averages in hand—for the 650 Ti: 1,179MHz GPU and 1,687MHz memory—you’re ready to start overclocking. Start by maxing out the Power Limit slider—this isn’t the same as overvolting; the power limit is simply how much power your card can draw. Then grab the Core Clock slider and move it up in 20MHz increments. After applying your changes, crank up the settings on Heaven Benchmark—quality at ultra, tessellation to extreme, anti-aliasing to 8x, and resolution at system—and run through it at least once by pressing F9 or clicking the “Benchmark” button. Keep an eye out for weird graphical artifacts—visual glitches that range from colorful lines of light to random off-color pixels across the screen—and for crashes. If the benchmark crashes to the desktop, seems to slow down dramatically, or gives you a lower frame rate or score upon completion, drop the clock speed by 10MHz until you can run through the benchmark without any problems.</p> <h3><span style="font-weight: normal;">4. 
Memory Speed Overclocks</span></h3> <p>When you’ve found the highest stable clock speed for your card, repeat the process from step three with the memory clock slider. Your memory clock speed generally won’t affect your frame rate or benchmark scores as much as the core clock speed, but it’ll help, especially if you’re running at a higher resolution.&nbsp;</p> <h3><span style="font-weight: normal;">5. Stability Check</span></h3> <p>Lock in both of your increased clock speeds, run through Heaven a final time, and you should be seeing higher frame rates and a higher score. Go wild and test out your overclocked card in your favorite games to make sure that it’s stable enough for daily use—if it isn’t, step down your GPU and memory clock speeds until it is. To be extra safe, you can leave Heaven running for a few hours to make sure you won’t run into any problems during an extended gaming session.</p> <p><em>Read on for information on overvolting, special situations, and the results of our overclocks.</em></p> <hr /> <h3><span style="font-weight: normal;">Overvolting</span></h3> <p>If you’re not satisfied with your card’s overclocking performance at standard voltages, some cards let you crank up the voltage to squeeze even more performance out of your hardware. Before you do anything, spend a few minutes on Google to look up what other users are reporting as safe voltages for your specific graphics card.&nbsp;</p> <p style="text-align: center;"><img src="/files/u162579/afterburner_voltage_control_settings.png" alt="MSI Afterburner Properties" title="MSI Afterburner Properties" width="350" height="628" /></p> <p style="text-align: center;"><strong>If you're feeling frisky, unlock voltage control and monitoring.</strong></p> <p>You have to dig into Afterburner's settings to gain access to your card’s voltage. 
Increase your voltage by 10mV at a time until your overclock is stable, your temperatures exceed 70 degrees Celsius, or you reach your card’s maximum safe voltage.&nbsp;</p> <p>Even if you’re operating within the maximum safe voltage, overvolting a card can have severe consequences, including general instability, decreased part lifespan, and unsafe temperatures. It’s usually a good idea to stick to stock voltages unless you really need every last bit of performance from your card.&nbsp;</p> <h3><span style="font-weight: normal;">Special Situations</span></h3> <p>Each and every video card overclocks differently. These differences aren’t limited to just how much you can push the card. Some cards, like the GTX 670 and 680, use GPU Boost to ramp up graphics performance when you need it. Those cards unlock special sliders in Precision X to manage when the boost is active. If you’re working with a card that has GPU Boost, you’ll want to play around with the Power Target slider, which determines when the boost is applied. Pump up the boost and your card won’t downclock as often—unless your temperatures are getting too high.</p> <h3><span style="font-weight: normal;">The Results</span></h3> <p style="text-align: center;"><img src="/files/u162579/overclocked_650ti.gif" alt="Nvidia GTX 650 Ti Overclock" title="Nvidia GTX 650 Ti Overclock" width="393" height="485" /></p> <p style="text-align: center;"><strong>We haven’t won any records, but we do have a respectable overclock.</strong></p> <p>In our Nvidia test system with an i5-3570K running at 3.4GHz and a GTX 650 Ti, we managed to overclock the graphics card to 1,161/1,600MHz from a stock 941/1,350MHz. 
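</p> <p>Those gains are easy to sanity-check: percent increase is just the clock speed delta divided by the stock clock. Here's a quick Python check using only the clock speeds from our results:</p>

```python
# Percent increase of an overclocked speed over stock: (new - stock) / stock * 100
def pct_gain(stock_mhz, oc_mhz):
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# GTX 650 Ti: core 941 -> 1,161MHz, memory 1,350 -> 1,600MHz
print(round(pct_gain(941, 1161), 1))    # 23.4
print(round(pct_gain(1350, 1600), 1))   # 18.5

# Radeon HD 7850: core 860 -> 1,050MHz, memory 1,200 -> 1,225MHz
print(round(pct_gain(860, 1050), 1))    # 22.1
print(round(pct_gain(1200, 1225), 1))   # 2.1
```

<p>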
That’s a 23% increase in GPU clock speed and a 19% increase in memory clock speed.&nbsp;</p> <p style="text-align: center;"><img src="/files/u162579/overclocked_7850.png" alt="AMD Radeon HD 7850 Overclock" title="AMD Radeon HD 7850 Overclock" width="393" height="485" /></p> <p style="text-align: center;"><strong>This 7850 didn’t play nice with memory overclocks, but a 190MHz increase in core clock speed isn’t bad at all.</strong></p> <p>Our AMD test system, with an i5-3570K running at 3.8GHz and a 7850, generated comparable results, with a default 860/1,200MHz pushed to 1,050/1,225MHz. That’s a 22% increase in GPU clock speed and a less impressive 2% bump in memory clock speed.</p> <div style="text-align: left;"> <table style="width: 615px; border-collapse: collapse;" border="0" cellspacing="0" cellpadding="6"> <thead> <tr style="background: #E7E7E7;"> <th style="text-align: left;">&nbsp;</th> <th style="text-align: left;">Stock GTX 650 Ti</th> <th style="text-align: left;">Overclocked GTX 650 Ti</th> <th style="text-align: left;">Stock 7850</th> <th style="text-align: left;">Overclocked 7850</th> </tr> </thead> <tbody> <tr style="background: #EDEDED;"> <td>3DMark Fire Strike</td> <td>2,990</td> <td>3,574</td> <td>4,119</td> <td>4,706</td> </tr> <tr style="background: #E7E7E7;"> <td>Unigine Heaven 4.0 (fps)</td> <td>15.6</td> <td>18.7</td> <td>20.5</td> <td>23.8</td> </tr> <tr style="background: #EDEDED;"> <td>BioShock Infinite (fps)</td> <td>36.6</td> <td>42.1</td> <td>42.4</td> <td>48.4</td> </tr> <tr style="background: #E7E7E7;"> <td>Tomb Raider (fps)</td> <td>25.2</td> <td>31.5</td> <td>31.3</td> <td>33.2</td> </tr> <tr style="background: #EDEDED;"> <td>Core/Memory Clock (MHz)</td> <td>941/1,350</td> <td>1,161/1,600</td> <td>860/1,200</td> <td>1,050/1,225</td> </tr> </tbody> </table> </div> http://www.maximumpc.com/how_overclock_your_graphics_card_2015#comments amd. 
graphics card gpu how to overclock nvidia overclocking performance Video Card Features How-Tos Fri, 06 Feb 2015 23:28:34 +0000 Ben Kim 27083 at http://www.maximumpc.com AMD Takes a Chip Shot at Nvidia's GTX 970 Controversy, Cuts Radeon R9 290X Pricing http://www.maximumpc.com/amd_takes_chip_shot_nvidias_gtx_970_controversy_cuts_radeon_r9_290x_pricing <!--paging_filter--><h3><img src="/files/u69/amd_4gb.jpg" alt="AMD 4GB" title="AMD 4GB" width="228" height="171" style="float: right;" />Did anybody <em>not</em> see this coming?</h3> <p>What do you do when you see your enemy <a href="http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015">twisting in the wind</a>? You strike, of course, and that's exactly what AMD predictably decided to do as rival Nvidia goes into damage control concerning the memory controversy on its GeForce GTX 970 graphics card. <strong>AMD and its partners have lowered the price of their Radeon R9 290X graphics cards to as low as $280 after rebate, or $300 without</strong>.</p> <p>Credit AMD for waiting until precisely the right time to drop pricing. Had AMD done this when news first broke that there were performance issues on the GTX 970 when accessing onboard memory above 3.5GB, it would have been jumping the gun. From a strategic standpoint, it's brilliant to roll out the price cuts immediately after an Nvidia employee said he would help GTX 970 customers <a href="http://www.maximumpc.com/nvidia_will_help_disgruntled_gtx_970_owners_get_refund_says_driver_update_coming_2015">obtain a refund</a> on their card, if they in fact decide to return it and are unable to get a refund on their own.</p> <p>AMD's price cut could end up being the deciding factor for anyone who was on the fence about keeping their GTX 970 card. In reality, Nvidia's card is still a fantastic GPU for the money, at least for most users, and Nvidia originally said it's working on a driver update that should improve memory performance. 
That bit has since been edited out of the original post, though we suspect Nvidia will still try to fine tune things.</p> <p>Nevertheless, picking up a competitive card for as much as $50 less than the GTX 970 will be tough for some gamers to ignore, especially those who have yet to upgrade. And to make sure the point is driven home, AMD's technical communications lead, Robert Hallock, took a jab at Nvidia on Twitter by <a href="https://twitter.com/Thracks/status/560511204951855104" target="_blank">posting a picture</a> of the Radeon R9 290 with the caption, "4GB means 4GB."</p> <p>Speaking of which, the Radeon R9 290 can be found on Newegg for as little as $250 after rebate, or $270 without.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amd_takes_chip_shot_nvidias_gtx_970_controversy_cuts_radeon_r9_290x_pricing#comments amd Build a PC Gaming geforce gtx 970 graphics card Hardware nvidia price cut radeon R9 290x Video Card News Fri, 30 Jan 2015 16:49:04 +0000 Paul Lilly 29341 at http://www.maximumpc.com Sapphire Adds Triple Fan Cooler to 8GB Radeon R9 290X, Tweaks Clocks and Lowers Cost http://www.maximumpc.com/sapphire_adds_triple_fan_cooler_8gb_radeon_r9_290x_tweaks_clocks_and_lowers_cost_2015 <!--paging_filter--><h3><img src="/files/u69/sapphire_radeon_r9_290x_8gb_0.jpg" alt="Sapphire Radeon R9 290X 8GB" title="Sapphire Radeon R9 290X 8GB" width="228" height="225" style="float: right;" />More than just a big frame buffer</h3> <p>Sapphire was the first company to release an 8GB version of AMD's Radeon R9 290X graphics card, though it's no longer the only one -- a handful of other graphics card players jumped on board after AMD gave them a <a 
href="http://www.maximumpc.com/amd_bumps_ram_8gb_radeon_r9_290x_announces_civilization_beyond_earth_bundle" target="_blank">reference design</a> to play with. Be that as it may, <strong>Sapphire is intent on standing out from the crowd, so it went and retooled its 8GB R9 290X with a triple fan cooler</strong> and some other changes.</p> <p>According to Sapphire, its Tri-X triple fan cooler is the first in the industry to use a central 10mm heatpipe in addition to four subsidiary heatpipes for even heat distribution throughout the heatsink. The fans themselves have dust-repelling bearings with dual ball races and are equipped with aerofoil-section blades. Topping it off is a fan cowling designed to guide the airflow for maximum cooling efficiency, Sapphire says.</p> <p>The company also points out that it builds its own PCB rather than outsourcing production. In this instance, it's using a 6-phase VDDC power design.</p> <p>You'll find 8GB of GDDR5 memory on the new card, along with a 512-bit interface. The memory is "now clocked at 1375MHz (5.5GHz effective) delivering higher bandwidth than earlier models."</p> <p>Other features include a dual BIOS design, two 8-pin power connectors, and an engine clock of up to 1,020MHz.</p> <p>As for pricing? Good question -- Sapphire said the card comes in at a "slightly lower cost" but didn't specify an exact price. It's also not showing up in retail yet, though we'll update this article when/if we hear back from them. 
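</p> <p>As for that bandwidth claim, the math is straightforward: peak memory bandwidth is the effective transfer rate times the bus width, divided by eight bits per byte. A quick Python sketch (the stock figure assumes the reference R9 290X's 5GHz effective memory clock):</p>

```python
# Peak memory bandwidth in GB/s: effective rate (MT/s) * bus width (bits) / 8 bits per byte / 1,000
def bandwidth_gbs(effective_mts, bus_bits):
    return effective_mts * bus_bits / 8 / 1000

print(bandwidth_gbs(5500, 512))  # Sapphire's 5.5GHz-effective card: 352.0
print(bandwidth_gbs(5000, 512))  # reference R9 290X at 5GHz effective: 320.0
```

<p>In other words, the 5.5GHz memory lifts peak bandwidth from 320GB/s to 352GB/s, which is presumably the "higher bandwidth than earlier models" Sapphire is touting, assuming those earlier cards ran the reference memory clock.</p> <p>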
In the meantime, you can see more of the card on its <a href="http://www.sapphiretech.com/presentation/product/?cid=1&amp;gid=3&amp;sgid=1227&amp;pid=2548&amp;psn=&amp;lid=1&amp;leg=0" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/sapphire_adds_triple_fan_cooler_8gb_radeon_r9_290x_tweaks_clocks_and_lowers_cost_2015#comments 8GB Build a PC Gaming graphics card Hardware radeon R9 290x sapphire Video Card News Thu, 29 Jan 2015 18:45:34 +0000 Paul Lilly 29334 at http://www.maximumpc.com