nvidia http://www.maximumpc.com/taxonomy/term/320/ en Nvidia Will Help Disgruntled GTX 970 Owners Get a Refund, Says a Driver Update is Coming http://www.maximumpc.com/nvidia_will_help_disgruntled_gtx_970_owners_get_refund_says_driver_update_coming_2015 <!--paging_filter--><h3><img src="/files/u69/gtx_970.jpg" alt="Nvidia GeForce GTX 970" title="Nvidia GeForce GTX 970" width="228" height="156" style="float: right;" />Upcoming driver could improve GTX 970's memory performance</h3> <p>Nvidia really stepped in a pile of PR poo when it was discovered that there was an internal communication gaffe over the way the GeForce GTX 970 handles its 4GB of onboard memory and the resulting specs. In short, the GTX 970 has 56 ROPs and 1,792KB of L2 cache instead of matching the GTX 980's 64 ROPs and 2,048KB of L2 cache as originally advertised. However, <strong>Nvidia wants to make things right and has offered to help GTX 970 owners obtain a refund</strong>, if need be. Should you go that route?</p> <p>In most cases, probably not. Before reading any further, however, we highly recommend familiarizing yourself with the situation by <a href="http://www.maximumpc.com/gamers_petition_geforce_gtx_970_refund_over_error_specs_2015" target="_blank">reading this</a>. Don't worry, we won't go anywhere -- we'll be right here when you get back.</p> <p>Finished? Great, now here's the deal. Nvidia stated on its forum that it's working on a driver update that will do a better job of managing the GTX 970's memory scheme, which it expects will improve performance. Granted, there's only so much that can be done on the software side to address a physical design, but given that Nvidia built the card the way it did, it stands to reason that it also knows how to properly tune it. We'll see.</p> <p>If you ultimately decide that you don't want the card, however, that's your choice, and Nvidia says it will help you obtain a refund if you're unable to do so on your own. 
Here's the <a href="https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090" target="_blank">full statement</a>.</p> <p style="padding-left: 30px;">"Hey,</p> <p style="padding-left: 30px;">First, I want you to know that I'm not just a mod, I work for Nvidia in Santa Clara</p> <p style="padding-left: 30px;">I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.</p> <p style="padding-left: 30px;">It sucks because we're really proud of this thing. The GTX 970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.</p> <p style="padding-left: 30px;">Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.</p> <p style="padding-left: 30px;">--Peter"</p> <p>It's important to note that Peter says he'll do his best to help, which is different than saying Nvidia will take care of things. In other words, if you're having trouble getting a refund, there's a chance you'll be stuck with it anyway. 
However, given the PR hit Nvidia's already taken on this one, we suspect those scenarios will be few and far between, if they happen at all.</p> <p>For most people, what this boils down to is that your GTX 970 is going to get even faster courtesy of some forthcoming optimizations.&nbsp; And for the few that are truly affected by the way the GTX 970 handles memory above 3.5GB, you now have someone at Nvidia who's willing to help you obtain a refund.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_will_help_disgruntled_gtx_970_owners_get_refund_says_driver_update_coming_2015#comments Build a PC driver geforce gtx 970 gpu graphics card Hardware nvidia Video Card News Wed, 28 Jan 2015 19:13:53 +0000 Paul Lilly 29330 at http://www.maximumpc.com Nvidia GeForce GTX 960 Video Card Review http://www.maximumpc.com/nvidia_geforce_gtx_960_video_card_review <!--paging_filter--><h3>Asus and EVGA represent, plus DSR and VSR benchmarks</h3> <p>One of the nice things about PCs is that there's a wide range of entry points for your budget. If you don't need the heavy lifting of <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">an Nvidia GeForce GTX 980</a> or an AMD Radeon R9 290X, you don't have to cough up hundreds of dollars for one. Both companies offer a variety of cards to fit your budget. Historically, Nvidia's cards ending in "60" -- like the 560, 660 and 760 -- have offered performance in between the premium cards and the more economical choices, putting them in a "Goldilocks zone" of balanced price and performance. Nvidia's latest, the GTX 960, is no exception. With a 128-bit memory bus, a little over a thousand shader cores, and 2GB of VRAM, it's not designed to be a giant leap over the GTX 660. 
But it's not designed to be modest, either.</p> <p>Let's take a look at the Asus Strix DirectCU II OC Edition of the GTX 960. (EVGA sent us a "Super Superclocked" version that uses the company's ACX 2.0 cooler, but Nvidia distributed the Asus card as the official one to test for review, so we'll talk about the Asus card first.) This mouthful of a card comes overclocked out of the box, and the company claims a 12% average increase in performance versus Nvidia's stock, or "reference," model. It features dual fans sitting on top of heatsinks that are fed by several heatpipes, and these fans are designed not to spin until the GPU core gets up to 65 degrees Celsius. When it does, the Strix fans are designed to operate quietly, yet still run the chip cooler than the stock version can. About 30% cooler, in fact.</p> <p><img src="/files/u160416/asus_960_620.jpg" alt="Asus Strix GTX 960" title="Asus Strix GTX 960" width="620" height="465" style="vertical-align: middle;" /></p> <p>The company also asserts that its cards will be free of coil whine, which is an annoying high-pitched squeal that some faulty electronics can emit, even when there are no moving parts. This is most commonly seen in power supplies but sometimes happens in video cards too. Since Asus says that its cards are free of this defect, you should be able to get a replacement if your card falls victim, rather than having the company dismiss it as normal behavior. It’s nice to see a company willing to address this issue. The cards also come with a 1-year "Premium" subscription to the Xsplit game broadcasting service, which lets you stream your gaming online. That would usually cost you over a hundred bucks.</p> <p>Next up is the EVGA SSC version that we mentioned earlier. This one is a bit longer than the Asus card, taping out at 10 inches or so, versus about 8.5 inches. But its height barely rises above the bracket, so its mounting screw will be easier to install in a cramped case. 
The EVGA card does not have a backplate, but it also costs a few bucks less. Notably, this SSC version uses an 8-pin PCI Express cable, instead of the 6-pin connector on the Asus card. That means that it could pull up to 225 watts instead of 150, hypothetically giving it a higher overclock ceiling. Its dual fans are also a bit larger, at 90mm versus 70mm. The SSC also uses a copper plate on top of the GPU core, which can move heat faster than the partly aluminum plate on the Asus card. However, the Asus MOSFET chips have small heatsinks attached with thermal pads, whereas the EVGA card's MOSFETs are sitting underneath a metal plate that runs the length of the card. In our experience, heatsinks generally perform better than plates.</p> <p><img src="/files/u160416/evga_960_620.jpg" alt="EVGA GTX 960 SSC" title="EVGA GTX 960 SSC" width="620" height="465" /></p> <p>Considering the relatively low amount of power that these cards draw, however, the differences in cooling design may not matter that much. EVGA claims that its straight heat pipes cool 6% better than the bent heat pipes that the Asus card uses. 
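Those 150-watt and 225-watt ceilings come from the PCI Express specification: the x16 slot can deliver up to 75W, a 6-pin auxiliary plug another 75W, and an 8-pin plug 150W. A quick sketch of that arithmetic (illustrative only; these are spec limits, not measured draw):

```python
# PCIe power-ceiling math for the two GTX 960 boards above.
# Figures are PCI Express specification limits, not actual consumption.
SLOT_W = 75                          # x16 slot allowance
CONN_W = {"6-pin": 75, "8-pin": 150} # auxiliary connector limits

def board_ceiling(connectors):
    """Slot allowance plus the limit of each auxiliary connector."""
    return SLOT_W + sum(CONN_W[c] for c in connectors)

print(board_ceiling(["6-pin"]))  # Asus Strix: 150
print(board_ceiling(["8-pin"]))  # EVGA SSC:   225
```

Since the GTX 960's TDP is only 120W, either configuration leaves headroom; the 8-pin just leaves more of it for overclocking.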
What kind of gaming should you expect, though, with the 960's architecture?</p> <p>Let's take a look at the spec chart:</p> <div class="spec-table orange"> <table style="width: 620px; height: 266px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td><strong>GTX 960</strong></td> <td><strong>GTX 970</strong></td> <td><strong>GTX 660<br /></strong></td> <td><strong>R9 285<br /></strong></td> <td><strong>R9 290<br /></strong></td> </tr> <tr> <td class="item">Generation</td> <td>&nbsp;GM206</td> <td>&nbsp;GM204&nbsp;</td> <td>&nbsp;GK106</td> <td>&nbsp;Tonga</td> <td class="item-dark">&nbsp;Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1228</td> <td>&nbsp;1050</td> <td>&nbsp;980</td> <td>&nbsp;928</td> <td>947</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1291</td> <td>&nbsp;1178</td> <td>&nbsp;1033</td> <td>&nbsp;~1GHz</td> <td class="item-dark">~1GHz</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7010</td> <td>&nbsp;7000</td> <td>&nbsp;6000</td> <td>&nbsp;5500</td> <td>5000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;2GB</td> <td>&nbsp;4GB</td> <td>&nbsp;2GB/3GB</td> <td>&nbsp;2GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;128-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;192-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;32</td> <td>&nbsp;56</td> <td>&nbsp;24</td> <td>&nbsp;32</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;64</td> <td>&nbsp;104</td> <td>&nbsp;80</td> <td>&nbsp;112</td> <td>160</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;1024</td> <td>&nbsp;1664</td> <td>&nbsp;960</td> <td>&nbsp;1792</td> <td>2560</td> </tr> <tr> <td>SMs</td> <td>&nbsp;8</td> <td>&nbsp;13</td> <td>&nbsp;5</td> <td>&nbsp;N/A</td> <td>&nbsp;N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;120</td> <td>&nbsp;145</td> <td>&nbsp;140</td> <td>&nbsp;190</td> <td>&nbsp;275</td> </tr> <tr> <td>Street Price</td> <td>&nbsp;$210</td> <td>&nbsp;$330</td> <td>&nbsp;$150</td> <td>&nbsp;$200</td> 
<td>&nbsp;$250</td> </tr> </tbody> </table> </div> <p>&nbsp;</p> <p>The GTX 960 price and clock speeds noted here are specifically for the Asus Strix version. (<strong>Update:</strong> Asus tells us that the MSRP of its card changed this morning from $215 to $210.) The EVGA SSC card has a launch price of $210, and Nvidia expects the average launch price across all cards to be closer to $200. The Asus card also has an "OC mode" setting that increases its core clock speed to 1253MHz and its boost clock to 1317MHz. The default clock speeds of the GTX 960 are 1126MHz and 1178MHz, respectively, so it's a sizeable jump.</p> <p>It's been a while since we've seen a mid-range gaming card with only a 128-bit memory bus. The GTX 660 was 192-bit, and the 560 was 256-bit. Nvidia tells us that its Maxwell chips use particularly good data compression techniques to increase effective memory bandwidth, though. Nvidia has also said in the past that the shader cores in this new Maxwell generation of GPUs are up to 40% faster than the ones that Kepler (the GTX 660) uses. In fact, EVGA says that its GTX 960 is up to 60% faster than a GTX 660. Still, 1024 shader cores doesn't seem like a lot. The 960 has half the shader cores of the GTX 980, half its memory bandwidth, and half its VRAM. But it still supports MFAA, VXGI, Dynamic Super Resolution, and DirectX 12. So its added feature set alone is compelling, even if it turns out to be "only" 30% faster than a GTX 660. The 960s are also launching at a lower price than the 660 and 760. 
(We're not putting the 760 or 770 cards in the spec chart because they are just refinements of the 670 and 680, respectively.)</p> <h4 style="text-align: right;"><a title="Nvidia GTX 960 Review, Page 2" href="http://www.maximumpc.com/nvidia_geforce_gtx_960_video_card_review?page=0,1" target="_blank">Click here for Page 2, to check out the benchmarks!</a></h4> <hr /> <p>Our test rig is as follows:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Corsair AX1200 (1,200 watts)</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p>Armed with this knowledge, let's see how the GTX 960 stacks up. All of these games were tested at or near their highest settings, with 4xMSAA. What do we mean by "near"? For example, we turned off PhysX so as not to tilt the score too much in Nvidia's favor, and we turned off TressFX so as not to tilt it in AMD's favor. We did not use Nvidia's proprietary TXAA or MFAA, either. Just straight-up 4xMSAA. (Tomb Raider does not have an MSAA setting, so we used 2xSSAA instead.) Lastly, we set the texture quality in Shadow of Mordor to medium, since the game itself says that higher settings are not intended for video cards that have less than 3GB of VRAM. Both the GTX 960 and the R9 285 have 2GB (though we may see 4GB versions later on). 
We wanted to look at GPU performance without the result being colored too much by brand-specific extensions. Our mix of games is intended to be a balance of Nvidia-friendly titles and AMD-friendly titles. We used these games' built-in benchmarks to conduct all tests, to keep the evaluated input the same each time.</p> <p>First off, here are the results at 1920x1080, which Nvidia considers the target resolution for the GTX 960. We're using the Asus Strix version of the GTX 960, since it's more or less the officially approved one. The R9 285 is a Sapphire ITX Compact version; the GTX 660 is an MSI Frozr II; the GTX 970 <a href="http://www.maximumpc.com/asus_points_shrink_ray_geforce_gtx_970_now_comes_mini_itx_form_2014" target="_blank">is an Asus Mini</a>; and the R9 290 is the reference model.</p> <h4>1920x1080 Benchmark Results, Average Frames Per Second</h4> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 960</td> <td>GTX 660</td> <td>GTX 970</td> <td>R9 285</td> <td>R9 290</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;90</td> <td>&nbsp;65</td> <td>&nbsp;135</td> <td class="item-dark">&nbsp;85</td> <td>&nbsp;126</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;53</td> <td>&nbsp;37</td> <td>&nbsp;72</td> <td>&nbsp;53</td> <td>&nbsp;72</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;67</td> <td>&nbsp;66</td> <td>&nbsp;102</td> <td>&nbsp;75</td> <td>&nbsp;103</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;39</td> <td>&nbsp;33</td> <td>&nbsp;61</td> <td>&nbsp;47</td> <td>&nbsp;63</td> </tr> <tr> <td>Shadow of Mordor</td> <td>&nbsp;53</td> <td>&nbsp;37</td> <td>&nbsp;78</td> <td>&nbsp;56</td> <td>&nbsp;80</td> </tr> <tr> <td>3DMark 2013 (score)</td> <td>&nbsp;6977</td> <td>&nbsp;4854</td> <td>&nbsp;9762</td> <td>&nbsp;6891</td> <td>&nbsp;9435</td> </tr> </tbody> </table> </div> <p>&nbsp;</p> <p>As we can see, the GTX 
960's relatively narrow memory bus does not appear to be the obstacle that we feared. In fact, the card edges out the R9 285 in Tomb Raider, a game that is customarily in AMD's camp. However, AMD seems to have found some optimizations <a href="http://www.maximumpc.com/amds_year_end_gift_gamers_catalyst_omega_special_edition_driver_2014" target="_blank">in its new Omega drivers</a> elsewhere, and the R9 285 overtakes the GTX 960 in <a href="http://www.maximumpc.com/batman_arkham_origins_review_2014" target="_blank">Batman: Arkham Origins</a>, a game that usually favors Nvidia gear by a healthy margin. The 960's performance there is a bit puzzling (since it pretty much matches that of the 660), but the score did not budge after several re-tests. The R9 290, meanwhile, quietly keeps pace with the GTX 970, falling short only in Tomb Raider. Before AMD released its Omega drivers, you could expect the 290 to be 5-10% slower than the GTX 970 across the board, but it looks like this is no longer the case.</p> <p>Overall, the GTX 960 and R9 285 cards do quite respectably at 1920x1080, with each game's visual settings cranked up. So everything is working as designed.</p> <p>Both GTX 960s are also very quiet cards. You have to look at the fans to know they're spinning, because you'll probably never hear them. The EVGA SSC card uses one 8-pin PCI Express cable instead of a 6-pin, but Nvidia tells us that this is for higher overclock potential, not because of a higher power requirement. The card never cracked 70 degrees Celsius during our tests, and the Asus card ran in the low 60s. The 285, for its part, operated in the high 60s.</p> <p>Next up, we're taking DSR and VSR for a spin. DSR stands for Dynamic Super Resolution. Technically, this uses ordered-grid super-sample anti-aliasing with a 13-tap Gaussian filter. 
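As a rough illustration of that supersample-then-filter idea, here's a sketch in NumPy. This is not Nvidia's actual filter -- the tap count comes from Nvidia's description, but the sigma, edge handling, and 2x factor are our own assumptions:

```python
import numpy as np

def gaussian_kernel(taps=13, sigma=2.0):
    """Normalized 1D Gaussian, e.g. a 13-tap filter like DSR reportedly uses."""
    x = np.arange(taps) - taps // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def dsr_downsample(frame, factor=2):
    """Blur a supersampled frame with a separable Gaussian, then subsample.
    `frame` is a 2D array rendered at `factor`x the display resolution."""
    k = gaussian_kernel()
    # Separable filtering: convolve each row, then each column.
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, frame)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred[::factor, ::factor]  # keep every `factor`-th pixel

# A small stand-in for a 2x-supersampled frame being squeezed onto the panel:
hi = np.random.rand(216, 384)
print(dsr_downsample(hi).shape)  # (108, 192)
```

The blur is what distinguishes DSR from naive subsampling: each output pixel blends information from a neighborhood of rendered pixels, which is where the edge smoothing comes from.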
In more straightforward terms, DSR takes a higher resolution than your monitor can display, squishes it down to fit, and applies a filter to enhance smoothness on the edges of objects in the game world. It can scale up to 3840x2160, also known as "4K," and can stop at points in between, such as 2560x1440. A 1440p monitor has roughly 80% more pixels than a 1080p monitor, and it's a common resolution for gamers with deeper pockets.</p> <p>This next set of benchmarks is run using the same 1080p monitor as before, just with DSR and VSR applied. We couldn't get either resolution tech to work correctly with Shadow of Mordor, however; it ran at 2880x1620 instead, which is 2.25 times as many pixels as 1920x1080. So the performance there will be a little lower than someone with a 1440p monitor should expect.</p> <h4>2560x1440 Benchmark Results (via DSR and VSR), Average FPS</h4> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 960</td> <td>GTX 660</td> <td>GTX 970</td> <td>R9 285</td> <td>R9 290</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;55</td> <td>&nbsp;37</td> <td>&nbsp;80</td> <td class="item-dark">&nbsp;54</td> <td>&nbsp;80</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;34</td> <td>&nbsp;23</td> <td>&nbsp;47</td> <td>&nbsp;34</td> <td>&nbsp;48</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;45</td> <td>&nbsp;36</td> <td>&nbsp;68</td> <td>&nbsp;51</td> <td>&nbsp;75</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;23</td> <td>&nbsp;17</td> <td>&nbsp;39</td> <td>&nbsp;30</td> <td>&nbsp;50</td> </tr> <tr> <td>Shadow of Mordor*</td> <td>&nbsp;30</td> <td>&nbsp;25</td> <td>&nbsp;44</td> <td>&nbsp;33</td> <td>&nbsp;44</td> </tr> </tbody> </table> </div> <p style="text-align: right;"><em>*2880x1620 resolution<br /></em></p> <p>Despite the much higher bandwidth requirements, Nvidia's GTX 960 scales up quite respectably 
(as does the comparable Radeon R9 285). <a href="http://www.maximumpc.com/hitman_absolution_review" target="_blank">Hitman: Absolution</a> continues to be a thorn in Nvidia's side, though elsewhere the 960 meets the R9 285 blow-for-blow. The R9 290 looks good once more, running in stride with the GTX 970 and even pushing decisively ahead of it in Hitman and Batman.</p> <p>Since we acquired two GTX 960s for this review, it would be a shame not to try them in SLI. We had two R9 285s on hand anyway, so we'd have to hand in our geek cards if we didn't give Crossfire a shot as well. The video cards are not identical in either case, but that's not necessary to get SLI or Crossfire to work. Both cards just need to be 960s or 285s. The one wrinkle is that the higher-clocked card will reduce its speed to match that of the lower-clocked card. So your results with identical pairs will be slightly different from what we got. Since SLI and CF don't add RAM together (the contents just get mirrored), Shadow of Mordor will remain running with "Medium" textures.</p> <p>For these benches, we paired our Sapphire R9 285 Compact with an Asus Strix R9 285. These cards in SLI/CF would be overkill for 1080p, so we focused on repeating our 1440p test set instead. 3DMark remained at 1920x1080, however. 
We wanted to make direct comparisons between this set of 3DMark scores and the earlier one.</p> <h4>SLI and Crossfire (via DSR and VSR), Average FPS</h4> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 960 SLI</td> <td>R9 285 CF</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;102</td> <td>&nbsp;64</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;55</td> <td>&nbsp;57</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;78</td> <td>&nbsp;94</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;39</td> <td>&nbsp;56</td> </tr> <tr> <td>Shadow of Mordor*</td> <td>&nbsp;52</td> <td>&nbsp;54</td> </tr> <tr> <td>3DMark 2013 (score)</td> <td>&nbsp;11,173</td> <td>&nbsp;11,531</td> </tr> </tbody> </table> </div> <p style="text-align: right;"><em>*2880x1620 resolution</em></p> <p>The 285s in Crossfire didn't seem to like Tomb Raider and remained locked at 64 FPS throughout the test. No amount of fiddling seemed to fix it. We know that the second card was being detected, because we were getting higher performance than would be possible with just one 285. The GTX 960's performance in Batman is also a bit puzzling. We'll have to dig into both of these issues further. Other than that, the 285 and 960 both scale pretty well when paired with a buddy, though the 285 shows better scaling overall in the games that correctly recognized the second card. The GTX 960 does really well in Tomb Raider, though, and given the 285's performance elsewhere, we doubt the 285 would pull ahead there even without the frame lock. The picture might change at 4K, but this tier of card is not advisable for resolutions that high anyway. We'd recommend at least two GTX 970s or two Radeon R9 290s.</p> <p>At around $250, though, the R9 290 is not a bad choice if you can stretch your budget a little. But we'd recommend a 600-watt power supply for a single 290, and an 850-watt PSU for two of them, so there may be additional costs involved. 
The GTX 960, meanwhile, is rated for a 400-watt PSU, so it can plug into a wide range of systems without needing additional upgrades. The 290 also does not do 4K VSR, maxing out instead at 3200x1800, because it uses the older Hawaii core instead of the newer Tonga core. The 960 and the 285 can go all the way. And definitely do not get the black-and-red "reference" version of the 290 with the single fan. It's really loud. A better choice would be the Sapphire Tri-X model or the Gigabyte GV-R929WF3-4GD. Both choices need a good 12 inches of space inside your case, though. If you have a mini-ITX case and want something both compact and beefy, Gigabyte and Asus both make shorty GTX 970s (and of course there's the compact Sapphire 285 that we used for this review).</p> <p>If you're a fan of the Green Team and have been holding off for a Maxwell card at this price point, Nvidia has delivered -- though AMD is no pushover, thanks in part to the optimizations in its new Omega drivers.</p> http://www.maximumpc.com/nvidia_geforce_gtx_960_video_card_review#comments 960 GeForce GTX graphics card nvidia Review Thu, 22 Jan 2015 14:01:20 +0000 Tom McNamara 29293 at http://www.maximumpc.com Pictures of Nvidia's GM200 GPU Leak to the Web http://www.maximumpc.com/pictures_nvidias_gm200_gpu_leak_web_2015 <!--paging_filter--><h3><img src="/files/u69/nvidia_gm200.jpg" alt="GM200" title="GM200" width="228" height="170" style="float: right;" />Maxwell unchained</h3> <p>What better way to end the work week than by spying a glimpse of the real-deal Maxwell part we've all been waiting for? Winning the lottery? Okay, you got us on that one, but this is a cool (not close) second. 
Assuming the pictures making the rounds in cyberspace are real, <strong>you can take a look at Nvidia's forthcoming GM200-400-A1 GPU</strong> nestled into an engineering board (180-1G600-1102-A04).</p> <p>The GM200-400-A1 part is Nvidia's newest Maxwell GPU, which will reportedly appear in future GeForce Series graphics cards. According to <a href="http://videocardz.com/54358/nvidia-maxwell-gm200-pictured" target="_blank"><em>Videocardz.com</em></a>, the part is expected to be used in the next line of Titan graphics cards, which the site says will debut next month.</p> <p>It's a significant part because it represents the full Maxwell experience. Rumor has it the chip will sport 3,072 CUDA cores, which is 50 percent more than the GM204.</p> <p>As for the mysterious reference board, it's using 24 Hynix H5GQ4H24MFR modules (12GB in all) clocked at 7GHz. There's no DVI port visible on the card, though you can see three DisplayPort connectors and an HDMI 2.0 output.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/pictures_nvidias_gm200_gpu_leak_web_2015#comments GM200 gpu graphics card Hardware maxwell nvidia Video Card News Fri, 16 Jan 2015 16:48:42 +0000 Paul Lilly 29265 at http://www.maximumpc.com Nvidia Announces X1 SoC http://www.maximumpc.com/nvidia_announces_x1_soc2015 <!--paging_filter--><h3>Will help drive auto-piloted cars</h3> <p>You can’t buy driverless cars just yet, but Nvidia is hoping to change that over time with its new Tegra X1 system-on-a-chip. The SoC is an offshoot of Nvidia’s K1 chip and is based on the company’s Maxwell GPU architecture, which is currently implemented in its GeForce GTX 980 and 970 graphics cards. 
While consumer-grade self-driving cars are still a ways off, the X1 is being designed to help chip away at that goal (no pun intended).&nbsp;</p> <h3 style="text-align: center;"><img src="/files/u154082/dsc02319.jpg" width="620" height="349" /></h3> <p style="text-align: center;"><strong>A look at Nvidia's new Drive CX interface</strong></p> <div>The chip features 256 CUDA cores and eight CPU cores. Nvidia is saying the X1 offers nearly twice the performance of its K1 SoC, which Nvidia used in its Shield tablet, while being nearly twice as energy efficient. Nvidia equated the power of the X1 to Microsoft’s Xbox One, while requiring roughly one-tenth the power draw. The green team touted the X1 as “the world’s first teraflops mobile processor.” While the first computer to hit one teraflop came out all the way back in 2001, it consumed a massive one million watts, which is orders of magnitude more than the X1’s roughly 10-watt TDP equivalent.</div> <div style="text-align: center;"><img src="/files/u154082/dsc02292.jpg" alt="Nvidia Drive CX" width="620" height="349" /></div> <div style="text-align: center;"><strong>Nvidia's Drive CX is the company's new digital cockpit computer&nbsp;</strong></div> <p>Nvidia is banking a lot on the future of cars, and company CEO Jen-hsun Huang was on stage to say that he believes they will offer the “most advanced computers in the world.” To drive some of this home (again, no pun intended), the company created its Drive CX cockpit computer, which uses the X1. The computer will be able to power the displays in the car, such as the front RPM HUD, and will be able to generate over 16 million pixels. 
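To put the teraflop-per-watt comparison above in concrete terms, here's the arithmetic using the round numbers quoted in the keynote (1 teraflop at roughly 10 watts for the X1 versus 1 teraflop at one million watts for that early-2000s machine):

```python
import math

TFLOPS = 1e12               # one teraflop, in floating-point ops per second
x1_eff = TFLOPS / 10        # X1: ~1e11 FLOPS per watt at its ~10W envelope
old_eff = TFLOPS / 1e6      # early teraflop machine: ~1e6 FLOPS per watt

ratio = x1_eff / old_eff
print(f"{ratio:,.0f}x more efficient (~{math.log10(ratio):.0f} orders of magnitude)")
# 100,000x more efficient (~5 orders of magnitude)
```

In other words, "orders of magnitude" here means about five of them, on Nvidia's own figures.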
This cockpit computer will also support surround-view cameras on the outside of the car, so that the car is well aware of its surroundings.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/dsc02302.jpg" width="620" height="349" /></p> <p style="text-align: center;"><strong>Nvidia's Drive PX is the company's newly announced auto-pilot car computer.</strong></p> <p>The road to self-driving cars is a difficult one, and the company has identified four major obstacles to overcome: these cars must be able to model the environment, maintain situational awareness, find paths, and learn. To combat these issues, Nvidia is working with car manufacturers to integrate its new Nvidia Drive PX system, which uses two X1 SoCs. With this system, these smart cars will offer 12 camera inputs, CUDA programmability, and what the company is referring to as a “deep neural network,” which will pull data together using a cloud-based system. With these tools, the car will be able to detect people, signs, and different car types (e.g., ambulances and police cars), and will also be able to tell if a civilian is stepping out of his or her car. According to Nvidia, this setup will be able to classify up to 150 objects at once.&nbsp;</p> <p>On stage, Nvidia revealed that it is working with car manufacturer Audi to implement these systems. While a lot of these tools are currently focused on assisting drivers, according to Audi’s Executive VP of Electronics Development Ricky Hudi, self-driving cars from the company are coming in the not-too-distant future.&nbsp;</p> <p>How do you feel about self-driving cars? Think they’ll happen? Let us know in the comments below. 
&nbsp;</p> http://www.maximumpc.com/nvidia_announces_x1_soc2015#comments android automated CES 2014 driverless cars Google nvidia tegra x1 News Mon, 05 Jan 2015 08:15:50 +0000 Jimmy Thang 29183 at http://www.maximumpc.com Acer Announces Two New Gaming Monitors, Touts Their ‘World First Designs’ http://www.maximumpc.com/acer_announces_two_new_gaming_monitors_touts_their_%E2%80%98world_first_designs%E2%80%99322 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/xg270hu-02.jpg" alt="ACER XG270HU: The World's First Gaming Monitor with Edge-to-edge Frameless Display" title="ACER XG270HU" width="228" height="152" style="float: right;" /></h3> <h3>One is the world’s maiden G-Sync enabled IPS monitor and the other the first with an edge-to-edge frameless display</h3> <p>Acer has officially announced the <a href="http://www.maximumpc.com/acer_readying_27-inch_wqhd_display_g-sync_support321" target="_blank">27-inch XB270HU gaming monitor that we told you about in November</a>. Specs-wise, it’s not exactly what we had expected it to be, though. What we thought would be another TN (twisted nematic) panel display with G-Sync has, to our pleasant surprise, turned out to be <strong>the world’s first G-SYNC enabled gaming monitor with an IPS display</strong>. But it is not the only upcoming Acer gaming monitor to pack a world-first design choice.</p> <p><img src="/files/u46168/xb270hu_right_angle.jpg" alt="Acer XB270HU IPS Gaming Display with Nvidia G-Sync" title="Acer XB270HU" width="228" height="155" style="float: left;" />A 27-incher with WQHD (2560 x 1440) resolution and 144Hz refresh rate like the XB270HU (pictured left), the XG270HU (right) is another sui generis monitor announced by the company. The XG270HU, which has a fast 1ms response time, is unique for its edge-to-edge frameless display — a world first, according to the company. 
Built from recycled ABS plastic, it features HDMI 2.0, DVI, and DisplayPort 1.2.</p> <p>Both models will be available around the world in March 2015, the company said in its announcement, without disclosing their prices.</p> <p><em>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></em></p> http://www.maximumpc.com/acer_announces_two_new_gaming_monitors_touts_their_%E2%80%98world_first_designs%E2%80%99322#comments Acer display g-sync Gaming gaming monitor ips g-sync nvidia XB270HU xg270hu News Mon, 05 Jan 2015 00:41:21 +0000 Pulkit Chandna 29180 at http://www.maximumpc.com Top 14 News Stories of 2014 http://www.maximumpc.com/top_14_news_stories_2014 <!--paging_filter--><h3><img src="/files/u69/2014_news.jpg" alt="2014 News" title="2014 News" width="228" height="153" style="float: right;" />Looking back at another wild year in the tech sector</h3> <p>Two years ago, the world was supposed to end, based on the Mayan calendar. And last year, we heard about the death of the PC ad nauseam. Of course, neither of those things happened, setting up yet another event-filled 12 months of technology news that ran the gamut from a major security flaw affecting nearly every website on the Internet, to Blizzard announcing its first new PC game franchise in 17 years, plus a whole lot more.</p> <p>We're more anxious than ever to see what's in store for 2015, both internally (we have a new Editor-in-Chief, <a href="http://www.maximumpc.com/meet_maximum_pcs_new_editor--chief2014">say hello to Tuan Nguyen!</a>) and of course externally, with a new Windows OS release on the horizon. Prices of solid-state drives continue to fall, graphics cards are getting faster and offering more bang for your buck, virtual reality is closer than ever to being a mainstream thing, and Intel and AMD keep piling on more CPU cores for that inevitable day when software developers finally take full advantage of multi-core processing. 
It's going to be an exciting year indeed!</p> <p>However, we're getting ahead of ourselves. Before we cross over into 2015, let's take a moment to look back and give 2014 a proper goodbye. To do that, <strong>we've put together a gallery highlighting 14 of the more interesting tech stories of the past year</strong>. It's a trip back in time, if you will, so grab a bottle of grog, sit back, and let's toast another fun year together before we embark on a new one!</p> http://www.maximumpc.com/top_14_news_stories_2014#comments blizzard features gallery haswell-e heartbleed hgst intel maxwell microsoft net neutrality news Nokia north korea nvidia windows 10 windows xp Features Mon, 29 Dec 2014 17:45:39 +0000 Paul Lilly 29129 at http://www.maximumpc.com Nvidia Rumored to Release GeForce GTX 960 on January 22, 2015 http://www.maximumpc.com/nvidia_rumored_release_geforce_gtx_960_january_22_2015 <!--paging_filter--><h3><img src="/files/u69/nvidia_graphics_card_2.jpg" alt="Nvidia Graphics Card" title="Nvidia Graphics Card" width="228" height="173" style="float: right;" />A new mid-range GTX 900 Series card may be imminent</h3> <p>We expect to see quite a few product announcements at the Consumer Electronics Show (CES) in Las Vegas next month, which runs from January 6-9. However, rumor has it one part that won't make the trip is Nvidia's GeForce GTX 960 graphics card. 
Instead, a Japanese-language website thinks it's privy to <strong>Nvidia's plan to launch the GeForce GTX 960 on January 22, 2015</strong>.</p> <p>Credit goes to <a href="http://www.fudzilla.com/home/item/36625-nvidia-gtx-960-launch-date-pinned-down" target="_blank"><em>Fudzilla</em></a> for the heads up from <a href="https://translate.google.com/translate?sl=auto&amp;tl=en&amp;js=y&amp;prev=_t&amp;hl=en&amp;ie=UTF-8&amp;u=http%3A%2F%2Fwww.gdm.or.jp%2Fvoices%2F2014%2F1220%2F97775&amp;edit-text=&amp;act=url" target="_blank"><em>Hermitage Akihabara</em></a>, which says that Nvidia is still working out the exact specifications for the forthcoming mid-range graphics card. About the only details the site has, other than the proposed launch date, are that it will sport a single 6-pin power connector and cost around 25,000 yen (~US$207).</p> <p>Based on other information from around the web, the card will feature Nvidia's Maxwell architecture with a GM206 GPU, details of which are pretty sparse. 
It's expected to debut with 2GB of GDDR5 on a 128-bit bus.</p> <p>The card was initially rumored to release in October 2014, though it's <a href="http://wccftech.com/nvidia-geforce-gtx-960-allegedly-postponed-q1-2015-due-strong-geforce-gtx-980-gtx-970-sales/" target="_blank">believed</a> that Nvidia postponed the launch because of strong sales for its GeForce GTX 980 and 970 graphics cards.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_rumored_release_geforce_gtx_960_january_22_2015#comments Build a PC geforce gtx 960 gpu graphics card Hardware nvidia rumor Video Card News Mon, 29 Dec 2014 16:09:39 +0000 Paul Lilly 29157 at http://www.maximumpc.com Heads Up: Nvidia GeForce 347.09 WHQL Driver is Available to Download http://www.maximumpc.com/heads_nvidia_geforce_34709_whql_driver_available_download_2014 <!--paging_filter--><h3><img src="/files/u69/nvidia_geforce_gtx_980_closeup.jpg" alt="GeForce GTX 980 Closeup" title="GeForce GTX 980 Closeup" width="228" height="184" style="float: right;" />GeForce GTX 980M SLI and GTX 970M SLI notebook users steer clear of this one</h3> <p><strong>Nvidia today made available its latest GeForce Game Ready driver, release 347.09 WHQL</strong>, which the company claims will give users the best possible gaming experience for Metal Gear Solid V: Ground Zeroes and Elite: Dangerous. Otherwise, there's not a ton to cover in the new driver update -- it includes the usual round of 3D Vision Profile updates, along with the addition of 3D Compatibility Mode for select titles. There's also a warning.</p> <p>"GeForce GTX 980M SLI and GTX 970M SLI notebooks are not supported with this driver. With SLI enabled on these notebooks, a black screen will occur upon uninstalling this driver. 
To workaround this issue, disable SLI before uninstalling the driver," Nvidia states in its <a href="http://us.download.nvidia.com/Windows/347.09/347.09-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">release notes (PDF)</a>.</p> <p>Our advice for GTX 980M SLI and GTX 970M SLI notebook owners is to stay away from the driver update unless you have a really compelling reason to install it, and give Nvidia time to work out the issue.</p> <p>As for 3D Compatibility Mode support, Nvidia gave "Excellent" ratings to Alien: Isolation, Elite: Dangerous, Escape Dead Island, and Far Cry 4, while tagging Middle-Earth - Shadow of Mordor with a "Good" rating.</p> <p>You can download the <a href="http://www.geforce.com/drivers/results/80913" target="_blank">new driver here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/heads_nvidia_geforce_34709_whql_driver_available_download_2014#comments driver geforce 347.09 nvidia Software whql News Tue, 23 Dec 2014 18:57:10 +0000 Paul Lilly 29145 at http://www.maximumpc.com Nvidia Releases GeForce 347.09 Beta Driver http://www.maximumpc.com/nvidia_releases_geforce_34709_beta_driver_2014 <!--paging_filter--><h3><img src="/files/u69/nvidia_1.jpg" alt="Nvidia" title="Nvidia" width="228" height="176" style="float: right;" />For best performance in Metal Gear Solid V: Ground Zeroes</h3> <p>Standard warning always applies to pre-release software and drivers -- if you're running a mission-critical rig or otherwise need to minimize the risk of buggy behavior, then stay away. Savvy? 
For everyone else, especially those who want to live on the cutting edge, be advised that <strong>Nvidia just made its GeForce Game Ready driver, release 347.09, available to download in beta form</strong>.</p> <p>According to Nvidia, 347.09 offers the best possible gaming experience for Metal Gear Solid V: Ground Zeroes and Elite: Dangerous. Otherwise, it's a pretty straightforward driver update with new and updated application and 3D Vision profiles, and updates to 3D compatibility mode in certain games.</p> <p>Specifically, Nvidia added an application profile for Project CARS, along with the following 3D Vision Profile updates:</p> <ul> <li>Alien: Isolation - Not recommended</li> <li>Elite: Dangerous - Not recommended</li> <li>Escape Dead Island - rated as Fair</li> <li>Far Cry 4 - Not recommended</li> <li>Metal Gear Solid: Ground Zeroes - Not recommended</li> <li>Middle-Earth - Shadow of Mordor - Not recommended</li> <li>Resident Evil Revelations 2 - Disabled</li> <li>The Vanishing of Ethan Carter - rated as Fair</li> </ul> <p>You can download the <a href="http://www.nvidia.com/Download/index.aspx?lang=en-us" target="_blank">driver here</a> and check out the <a href="http://us.download.nvidia.com/Windows/347.09/347.09-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">release notes here (PDF)</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_releases_geforce_34709_beta_driver_2014#comments Drivers elite: dangerous geforce 347.09 beta metal gear solid v: ground zeroes nvidia Software News Wed, 17 Dec 2014 18:54:37 +0000 Paul Lilly 29108 at http://www.maximumpc.com Nvidia Gives Shield Hub a Material Design Makeover http://www.maximumpc.com/nvidia_gives_shield_hub_material_design_makeover_2014 
<!--paging_filter--><h3><img src="/files/u69/grid_ui.jpg" alt="Grid UI" title="Grid UI" width="228" height="143" style="float: right;" />Nvidia was nestled all snug in its bed, while visions of Lollipop's Material Design danced in its head</h3> <p><strong>Nvidia today announced that its Shield Hub received some major updates along with a new look</strong>, improved GameStream experience in console mode, and better connectivity to GRID, the company's on-demand game streaming service that resides in the cloud. The new layout comes hot on the heels of Android 5.0 Lollipop making its way to Nvidia's Shield Tablet, which is where the "Material Design" comes into play.</p> <p>Following the redesign, you can see more of your games with less scrolling in the Shield Hub. It's supposed to be easier to navigate with large lists that make finding games a cinch. It all reflects Google's Material Design language introduced in Lollipop with subtle animations, layers, and updated iconography.</p> <p>There's also an improved GameStream experience in console mode. If you have a fast Wi-Fi connection (5GHz 802.11n/ac required), you can configure a display output of 1080p at 60 frames per second. 
Just go to My Games &gt; Streaming Setting (top right) &gt; Show Advanced Options &gt; Quality and set the Max Resolution to 1080p and Frame Rate to 60fps.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_gives_shield_hub_material_design_makeover_2014#comments games material design nvidia shield News Tue, 16 Dec 2014 20:10:02 +0000 Paul Lilly 29102 at http://www.maximumpc.com Asus Points Shrink Ray at GeForce GTX 970, Now Comes in Mini ITX Form http://www.maximumpc.com/asus_points_shrink_ray_geforce_gtx_970_now_comes_mini_itx_form_2014 <!--paging_filter--><h3><img src="/files/u69/asus_geforce_gtx_970_dc_mini.jpg" alt="Asus GeForce GTX 970 DC Mini" title="Asus GeForce GTX 970 DC Mini" width="228" height="178" style="float: right;" />Maxwell gets the mini ITX treatment</h3> <p>The mini ITX form factor has been gaining some serious street cred as of late. Of course, it was only a matter of time, with advances in technology leading to increasingly smaller parts that are much more powerful than their sizes suggest. <strong>The newest tiny treat for mini ITX builders is the <a href="http://www.asus.com/Graphics_Cards/GTX970DCMOC4GD5/" target="_blank">Asus GeForce GTX 970 DC Mini</a></strong>, a small form factor graphics card that will be right at home in your mini ITX motherboard.</p> <p>It measures just 17cm in length, making for an easy fit inside a compact gaming PC. And not only is the card rocking Nvidia's Maxwell architecture underneath the hood, it also received Asus' DirectCU custom cooling treatment. 
According to Asus, the DirectCU cooler with vapor chamber offers 20 percent cooler temps and a "vastly quieter" experience compared to reference.</p> <p>Asus didn't gimp the card, either. The GPU is actually overclocked to 1,088MHz base and 1,228MHz boost, up from Nvidia's reference blueprint that calls for 1,050MHz base and 1,178MHz boost clockspeeds. It's also paired with 4GB of GDDR5 memory clocked at 7,010MHz (effective) on a 256-bit bus for 224GB/s of memory bandwidth (same as reference).</p> <p>Connectivity options consist of DVI-I and DVI-D ports (one each), HDMI, and DisplayPort.</p> <p>No word yet on the card's price or availability. As points of reference, the Asus Strix GeForce GTX 970 (full size) commands about $350 street. Gigabyte also makes a mini ITX variant of the GTX 970 (GIGABYTE GV-N970IXOC-4GD), and that one streets for around $340.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/asus_points_shrink_ray_geforce_gtx_970_now_comes_mini_itx_form_2014#comments asus Build a PC geforce gtx 970 graphics card Hardware mini itx nvidia News Wed, 03 Dec 2014 19:37:27 +0000 Paul Lilly 29027 at http://www.maximumpc.com Nvidia Rolls Out Two Shield Bundles for Black Friday http://www.maximumpc.com/nvidia_rolls_out_two_shield_bundles_black_friday_2014 <!--paging_filter--><h3><img src="/files/u69/shield_devices.jpg" alt="Nvidia Shield Devices" title="Nvidia Shield Devices" width="228" height="144" style="float: right;" />Choose your weapon: Shield portable or Shield tablet</h3> <p>If you happen to be in the market for a <strong>Shield (in either handheld portable or tablet form)</strong>, or are otherwise looking for gift ideas this holiday season, hold your horses for just a few more days and you'll score a better 
deal. Like many, <strong>Nvidia is participating in the Black Friday festivities, and it's doing it with a couple of bundle options for the aforementioned mobile gaming devices</strong>, both of which will come with a few extra incentives.</p> <p>On Friday, you can buy the 32GB Shield tablet with LTE capabilities for $399 and receive a free Shield controller, Green Box bundle, and Nvidia GRID on-demand gaming service. Included in the Green Box bundle are Half-Life 2, Portal, and Half-Life 2: Episode One, which is the first time the latter has been available on mobile. While that's the same price as the 32GB Shield tablet with LTE normally sells for, the extras bump up the total value to $488.</p> <p>As for the handheld Shield device, it too will sell for its regular price of $199, but will come with a carrying case and glossy black armor, and of course Nvidia's GRID on-demand gaming service. All combined, that's a $258 value.</p> <p>That's not a bad score with the <a href="http://shield.nvidia.com/grid-game-streaming/" target="_blank">GRID service</a>, which is free for Shield owners until June 30, 2015. GRID's on-demand game library is worth over $400 and includes nearly two dozen titles, including Batman: Arkham City, Borderlands (1+2), Lego Batman 2, The Witcher 2: Assassins of Kings, and more.</p> <p>Nvidia's <a href="http://shield.nvidia.com/promotions/" target="_blank">Black Friday promotion</a> runs on November 28 from 12:01 AM PST to 11:59 PM PST, or while supplies last. 
And for more Black Friday deals, be sure to bookmark and frequently check our regularly updated <a href="http://www.maximumpc.com/black_friday_ads_2014_round">Black Friday Deals 2014 Round Up</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_rolls_out_two_shield_bundles_black_friday_2014#comments black friday games Hardware mobile nvidia shield shield tablet News Tue, 25 Nov 2014 17:25:00 +0000 Paul Lilly 28988 at http://www.maximumpc.com Samsung Files ITC Complaint Seeking Ban on Nvidia Chips in U.S. http://www.maximumpc.com/samsung_files_itc_complaint_seeking_ban_nvidia_chips_us300 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/samsung_logo.jpg" alt="Samsung Logo" title="Samsung Logo" width="228" height="197" style="float: right;" /></h3> <h3>Both companies are trying to get each other’s products banned</h3> <p>As companies faced with the prospect of having their products banned in the U.S. over a patent infringement complaint by an adversary are wont to do, <a href="http://www.bloomberg.com/news/2014-11-21/samsung-files-complaint-to-block-nvidia-chips-from-u-s-1-.html" target="_blank"><strong>Samsung has hit back at Nvidia with a sales ban request of its own</strong></a>. 
The world’s leading smartphone maker <a href="http://usitc.gov/petitions_and_complaints/3042.htm" target="_blank">filed a complaint with the United States International Trade Commission (USITC)</a> Friday, requesting that the latter institute an investigation against Nvidia “under section 337 of the Tariff Act of 1930, as amended, regarding certain graphics processing chips, Systems on a Chip, and products containing the same.”</p> <p>Section 337 of the said act deals with unfair import trade practices, which include, inter alia, the importation into the United States of goods that “infringe a valid and enforceable United States patent.” A successful section 337 complaint can result in such goods being banned from entering the States; the goods can even be excluded from entry during the course of the investigation if the ITC so deems necessary.</p> <p>This complaint shouldn’t come as a surprise to anyone as the <a href="http://www.pcworld.com/article/2851452/samsung-wants-the-itc-to-block-nvidia-chips-in-the-us.html" target="_blank">South Korean company pretty much telegraphed the move </a>when it filed a patent infringement lawsuit against Nvidia earlier this month.</p> <h4 style="text-align: left;"><strong>The Samsung-Nvidia Patent Battle So Far</strong></h4> <ul> <li>September 4, 2014: <a href="http://www.maximumpc.com/nvidia_initiates_patent_lawsuit_against_samsung_and_qualcomm_2014" target="_blank">Nvidia files patent infringement complaints against Samsung and Qualcomm</a> with both the ITC and the U.S. 
District Court, in Delaware, seeking a ban on Samsung devices containing Qualcomm’s Adreno, ARM’s Mali or Imagination’s PowerVR graphics chips.</li> <li>November 4, 2014: Samsung responds with a patent infringement lawsuit of its own against Nvidia, alleging that the latter infringes on six of its patents.</li> <li>November 11, 2014: Nvidia labels <a href="http://blogs.nvidia.com/blog/2014/11/11/nvidia-responds-to-samsung/" target="_blank">Samsung’s lawsuit a “predictable tactic.”</a></li> <li>November 21, 2014: Samsung follows up the lawsuit with a section 337 complaint with the ITC.</li> </ul> <p><em>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></em></p> http://www.maximumpc.com/samsung_files_itc_complaint_seeking_ban_nvidia_chips_us300#comments ban chips gpu itc legal nvidia Patent Infringement samsung samsung-nvidia legal battle soc usitc News Mon, 24 Nov 2014 05:45:54 +0000 Pulkit Chandna 28978 at http://www.maximumpc.com Nvidia GeForce 344.75 WHQL Drivers Now Available to Download http://www.maximumpc.com/nvidia_geforce_34475_whql_drivers_now_available_download <!--paging_filter--><h3><img src="/files/u69/geforce_gtx_980_gpu.jpg" alt="GeForce GTX 980 GPU" title="GeForce GTX 980 GPU" width="228" height="160" style="float: right;" />Incremental update boosts performance in Far Cry 4</h3> <p>Nvidia's been on a driver-releasing frenzy lately. Following two prior GeForce driver releases already this month, <strong>Nvidia has now made available its GeForce 344.75 WHQL driver</strong>, its third release in as many weeks. 
As with previous versions, the focus here is on new games -- specifically, installing the GeForce 344.75 driver is supposed to offer the best gaming performance for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.</p> <p>According to the release notes (PDF), the driver also introduces support for Multi-Frame Sampled Anti-Aliasing (MFAA) mode. Bug fixes are plentiful, too.</p> <p>The latest driver is supposed to fix an issue that was causing choppy video playback of 3D Blu-ray movies, which is just one of many fixes. However, there are still some open issues that need to be resolved, like video picture-in-picture content not being played within Internet Explorer in 32-bit versions of Windows Vista and Windows 7.</p> <p>You can download the latest driver <a href="http://www.nvidia.com/download/driverResults.aspx/79891/en-us" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_geforce_34475_whql_drivers_now_available_download#comments Drivers Gaming geforce 344.75 graphics nvidia whql News Tue, 18 Nov 2014 17:06:04 +0000 Paul Lilly 28949 at http://www.maximumpc.com Nvidia's Mighty Tesla K80 Accelerator Packs Two GPUs, 24GB of GDDR5 Memory http://www.maximumpc.com/nvidias_mightly_tesla_k80_accelerator_packs_two_gpus_24gb_gddr5_memory <!--paging_filter--><h3><img src="/files/u69/nvidia_tesla_k80.jpg" alt="Nvidia Tesla K80" title="Nvidia Tesla K80" width="228" height="184" style="float: right;" />Meet the world's fastest accelerator</h3> <p>If there are any kids within earshot, send them off to play or cover their ears before reading any further -- we wouldn't want their little ears being exposed to excited obscenities that may follow. 
As in, "Holy sh*t, how much RAM!?" Try 24GB of GDDR5 memory, which is how much <strong>Nvidia decided to use on its Tesla K80 dual-GPU accelerator</strong>, the new flagship offering in the Tesla family.</p> <p>According to Nvidia, the Tesla K80 offers nearly double the performance and memory bandwidth of its predecessor, the Tesla K40. It also boasts ten times higher performance than today's fastest CPUs, as it was designed for the most difficult computational challenges around -- astrophysics, genomics, quantum chemistry, data analytics, and so forth.</p> <p>With two GPUs on board, 24GB of GDDR5 memory, and 4,992 CUDA parallel processing cores, the Tesla K80 boasts 480GB/s of memory bandwidth, 8.74 teraflops of single-precision peak floating point performance, and up to 2.91 teraflops of double-precision peak floating point performance. For the sake of comparison, Nvidia's Maxwell-based GeForce GTX 980 offers 5 teraflops of single-precision performance.</p> <p>The Tesla K80 has begun shipping to server manufacturers -- no word on price.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidias_mightly_tesla_k80_accelerator_packs_two_gpus_24gb_gddr5_memory#comments Build a PC graphics card Hardware nvidia tesla k80 Video Card News Mon, 17 Nov 2014 17:08:44 +0000 Paul Lilly 28944 at http://www.maximumpc.com
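<p>If you like to sanity-check spec-sheet numbers like the ones quoted in these stories, the headline figures fall out of two back-of-the-envelope formulas: peak FLOPS is cores × clock × 2 (one fused multiply-add per cycle), and memory bandwidth is effective memory clock × bus width in bytes. A minimal sketch in Python; note that the K80's ~875MHz boost clock, its two 384-bit memory buses at 5GHz effective, and the 1/3 double-precision ratio are our assumptions here, not figures stated in the articles above:</p>

```python
# Back-of-the-envelope GPU spec math, checked against the figures
# quoted in the Tesla K80 and Asus GTX 970 DC Mini stories above.

def peak_tflops(cuda_cores, clock_mhz, ops_per_cycle=2):
    """Peak FLOPS = cores x clock x ops/cycle (2 for fused multiply-add)."""
    return cuda_cores * clock_mhz * 1e6 * ops_per_cycle / 1e12

def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Memory bandwidth = effective memory clock x bus width in bytes."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Tesla K80: 4,992 cores total; ~875MHz boost clock is an assumption.
sp = peak_tflops(4992, 875)        # ~8.74 TFLOPS single precision
dp = sp / 3                        # ~2.91 TFLOPS, assuming a 1/3 DP ratio
bw = 2 * bandwidth_gbs(5000, 384)  # two GPUs, assumed 384-bit @ 5GHz -> ~480 GB/s

# GTX 970 (from the Asus article): 7,010MHz effective on a 256-bit bus.
gtx970_bw = bandwidth_gbs(7010, 256)  # ~224 GB/s, matching the quoted spec

print(f"K80: {sp:.2f} TFLOPS SP, {dp:.2f} TFLOPS DP, {bw:.0f} GB/s")
print(f"GTX 970: {gtx970_bw:.1f} GB/s")
```

<p>Run it and the outputs line up with the quoted specs (8.74 and 2.91 teraflops, 480GB/s, and 224GB/s), which is a decent hint that those marketing numbers are theoretical peaks derived this way rather than measured throughput.</p>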