No BS Podcast #231: AMD and Origin PC Settle Past Dispute on the Show http://www.maximumpc.com/no_bs_podcast_231_amd_and_origin_pc_settle_past_dispute_show <h3>Plus: AMD's commitment to high-end CPUs, DDR4, 5-way GPU support, 20nm GPUs, and more!</h3> <p>In a bit of a surprise to us, <a title="amd" href="http://www.maximumpc.com/tags/amd" target="_blank">AMD</a> and <a title="origin pc" href="http://www.maximumpc.com/tags/Origin_PC" target="_blank">Origin PC</a> wanted to come into the podcast room together for <a title="No BS podcast 231" href="http://dl.maximumpc.com/maxpc_231_20140828.mp3" target="_blank"><strong>episode 231 of the No BS Podcast</strong></a>. As you may recall, this pairing is an unlikely one, considering that last October, Origin PC’s co-founder and CEO Kevin Wasielewski announced that the company would be <a title="origin pc drops amd gpus" href="http://www.maximumpc.com/origin_pc_now_dealing_exclusively_nvidia_graphics_claims_amd_gpus_are_problematic2013" target="_blank">dropping AMD graphics cards from its systems</a>, stating, “This decision was based on a combination of many factors including customer experiences, GPU performance/drivers/stability, and requests from our support staff.” He later added, “Based on our 15+ years of experience building and selling award winning high-performance PCs, we strongly feel the best PC gaming experience is on Nvidia GPUs.”</p> <p>Well, not only did we get Wasielewski in the room, but we also got AMD’s VP of Global Channel Sales Roy Taylor and AMD’s Director of Public Relations Chris Hook to come on. In the show, the two parties settle their past dispute, with Taylor suggesting that AMD is now committed to giving hardware partners like Origin PC more support and communication. He also outlines some of the strategies for doing so. 
Wasielewski also confirmed that you can now get AMD video cards in Origin PCs again and shot down any <a href="http://semiaccurate.com/2013/10/07/nvidias-program-get-oems-like-origin-pc-dump-amd-called-tier-0/" target="_blank">rumors</a> that Nvidia was compensating Origin PC to slander AMD at the time of last year's announcement.</p> <p>Taylor also asserts that AMD’s graphics drivers have gotten a lot better over the past year, but admits this wasn’t always the case and that the company is still getting burned by that bad reputation.&nbsp;</p> <p>While Gordon was away on vacation, he did submit several questions for the rest of the crew to ask on the air, and in the show we cover a ton of ground, with topics that include:</p> <ul> <li>The possibility of 5-way GPU support</li> <li>AMD’s renewed commitment to battling Intel in the high-end CPU market</li> <li>AMD’s plans to start using DDR4</li> <li>Origin PC and AMD’s thoughts on Valve’s upcoming <a title="maximum pc steam machine" href="http://www.maximumpc.com/everything_you_need_know_about_steam_machines_2014" target="_blank">Steam Machine</a> initiative</li> <li>AMD’s take on the&nbsp;<a title="oculus rift" href="http://www.maximumpc.com/tags/oculus_rift" target="_blank">Oculus Rift</a>/VR</li> <li>FreeSync monitor availability</li> <li>Why <a title="AMD ssd" href="http://www.maximumpc.com/amd_reportedly_gearing_sell_radeon-branded_line_ssds_2014" target="_blank">AMD is getting into the SSD market</a></li> <li>AMD’s presence (or lack thereof) in the laptop/gaming notebook segment</li> <li>20nm GPUs</li> <li>And then we of course top it off with your fan questions!&nbsp;</li> </ul> <p><iframe src="//www.youtube.com/embed/dTB2Uk43LKU" width="620" height="349" frameborder="0"></iframe></p> <p>The old format isn’t going away, and Gordon’s rants will return, but in the meantime, give this episode a listen, and let us know what you think!</p> <p><a title="Download Maximum PC Podcast #231 MP3" href="http://dl.maximumpc.com/maxpc_231_20140828.mp3" target="_blank"><img src="/files/u160416/rss-audiomp3.png" width="80" height="15" /></a>&nbsp;<a title="Maximum PC Podcast RSS Feed" href="http://feeds.feedburner.com/maximumpc/1337" target="_blank"><img src="/files/u160416/chicklet_rss-2_0.png" width="80" height="15" /></a>&nbsp;<a href="https://itunes.apple.com/podcast/maximum-pc-no-bs-podcast/id213247824"><img src="/files/u160416/chicklet_itunes.gif" alt="Subscribe to Maximum PC Podcast on iTunes" title="Subscribe to Maximum PC Podcast on iTunes" width="80" height="15" /></a></p> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Subscribe to the magazine for only 99 cents an issue:</h4> <h5><a title="Subscribe to Maximum PC Magazine" href="https://w1.buysub.com/pubs/IM/MAX/MAX_subscriptionpage.jsp?cds_page_id=63027" target="_blank">In print</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Zinio" href="https://www.zinio.com/checkout/publisher/?productId=500663614" target="_blank">On Zinio</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Google Play" href="https://play.google.com/store/newsstand/details/Maximum_PC?id=CAoww6lU&amp;hl=en" target="_blank">On Google Play</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on iTunes" href="http://goo.gl/UIkW4" target="_blank">On iTunes</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Amazon Kindle" 
href="http://www.amazon.com/Maximum-PC/dp/B005XD5144/ref=sr_1_1?ie=UTF8&amp;qid=1406326197">On the Amazon Kindle Store</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Your Nook" href="http://www.barnesandnoble.com/w/maximum-pc-future-us-future-publishing/1119741259" target="_blank">On the Barnes &amp; Noble Nook Store</a></h5> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Stalk us in a number of ways:</h4> <p>Become a fan&nbsp;<a title="Maximum PC Facebook page" href="https://www.facebook.com/maximumpc" target="_blank">on Facebook</a></p> <p>Follow us&nbsp;<a href="https://twitter.com/maximumpc" target="_blank">on Twitter</a></p> <p>Subscribe to us&nbsp;<a title="Maximum PC Youtube page" href="https://www.youtube.com/user/MaximumPCMag" target="_blank">on Youtube</a></p> <p>Subscribe&nbsp;<a title="Maximum PC RSS Feed" href="http://feeds.feedburner.com/maximumpc/1337">to our RSS feed</a></p> <p>Subscribe&nbsp;<a href="https://itunes.apple.com/us/podcast/maximum-pc-no-bs-podcast/id213247824" target="_blank">to the podcast on iTunes</a></p> <p>email us at:&nbsp;<a href="mailto:maximumpcpodcast@gmail.com">maximumpcpodcast AT gmail DOT com</a></p> <p>Leave us a voicemail at 877-404-1337 x1337</p> http://www.maximumpc.com/no_bs_podcast_231_amd_and_origin_pc_settle_past_dispute_show#comments 231 amd cpu ddr4 episode graphics cards maximum pc No BS Podcast nvidia origin pc rumors Gaming News No BS Podcast Thu, 28 Aug 2014 20:37:32 +0000 The Maximum PC Staff 28441 at http://www.maximumpc.com Nvidia Retains Lead in Discrete Graphics Card Business, Shipments Down Overall http://www.maximumpc.com/nvidia_retains_lead_discrete_graphics_card_business_shipments_down_overall_2014 <!--paging_filter--><h3><img src="/files/u69/nvidia_0.jpg" alt="Nvidia" title="Nvidia" width="228" height="171" style="float: right;" />Tablets and embedded graphics are eating into the add-in board market</h3> <p>The latest report from Jon Peddie Research (JPR) shows that <strong>graphics add-in board (AIB) shipments during the second quarter of 2014 declined 17.5 percent compared to the previous quarter</strong>. JPR says the market is behaving according to past years, though the decrease was more than the 10-year average. What's also interesting is that the drop in discrete graphics card shipments coincided with a 1.3 percent increase in desktop PC shipments.</p> <p><a href="http://jonpeddie.com/publications/add-in-board-report/" target="_blank">According to JPR</a>, tablets and embedded graphics caused part of the decline. However, "PC gaming momentum continues to build and is the bright spot in the AIB market," with Nvidia reaping the lion's share of the rewards.</p> <p>Nvidia's share of the discrete graphics card market slipped sequentially from 64.9 percent to 62 percent, though barely budged compared to the same quarter a year ago when Nvidia held a 61.9 percent share of the market. 
<p>Meanwhile, AMD ended the quarter with a 37.9 percent share, up from 35 percent in the previous quarter and down slightly from 38 percent a year ago.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> Digital Storm Targets Gamers with Bolt II Battle Box Titan Z Special Edition http://www.maximumpc.com/digital_storm_targets_gamers_bolt_ii_battle_box_titan_z_special_edition_2014 <h3><img src="/files/u69/digital_storm_battle_box.jpg" alt="Digital Storm Bolt II Battle Box" title="Digital Storm Bolt II Battle Box" width="228" height="183" style="float: right;" />Liquid cooled and ready for the heat of battle</h3> <p><strong>Digital Storm today unveiled its Bolt II Battle Box Titan Z Special Edition</strong>, which is a specially priced Bolt II small form factor (SFF) rig wielding a dual-GPU Nvidia GeForce GTX Titan Z graphics card. In addition to adding a Titan Z, Digital Storm went back to the drawing board and redesigned the Bolt II to accommodate a new Hardline Cooling System consisting of a 240mm radiator, pump, and "stunning" acrylic tubing with yellow coolant.</p> <p>"Nvidia launched the GTX Battle Box Program to allow gamers to play AAA, combat-focused games at max settings and super high resolutions," said Harjit Chana, Chief Brand Officer. "But gaming in 4K requires much more than simply upgrading components. Our Hardline Cooling System allows gamers to unlock the Bolt II’s full potential and experience games in ways they never thought possible."</p> <p>The Bolt II Battle Box is available now for just under $5,000, down from its regular price of $6,658. At that starting price, the Bolt II Battle Box comes with a painted chassis, an overclocked Intel Core i7 4790K processor, Asus Maximus VI Impact motherboard, 16GB of DDR3-1600 memory, Blu-ray player, 250GB Samsung 840 EVO SSD, 1TB Seagate HDD, Titan Z graphics card, liquid cooling, internal lighting, 700W power supply, and Windows 8.1 64-bit.</p> <p>You can find out more (and/or place an order) on <a href="https://www.digitalstormonline.com/configurator.asp?id=1034531" target="_blank">Digital Storm's website</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> Nvidia Shield Tablet Review http://www.maximumpc.com/nvidia_shield_tablet_review_2014 <h3>Updated: Now with video review!&nbsp;</h3> <p>Despite its problems, we actually liked <a title="Nvidia Shield review" href="http://www.maximumpc.com/nvidia_shield_review_2013" target="_blank">Nvidia’s original Shield Android gaming handheld</a>. 
Our biggest issue with it was that it was bulky and heavy. With rumors swirling around about a Shield 2, we were hoping to see a slimmer, lighter design. So consider us initially disappointed when we learned that the next iteration of Shield would just be yet another Android tablet. Yawn, right? The fact of the matter is that the Shield Tablet may be playing in an oversaturated market, but it’s still great at what it sets out to be.</p> <p><iframe src="//www.youtube.com/embed/dGigsxi9-K4" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>We've updated our review to include the video review above.</strong></p> <p>At eight inches, the Shield Tablet features a gorgeous 1920x1200 display, which shares the same resolution as Google’s flagship <a title="nexus 7 review" href="http://www.maximumpc.com/google_nexus_7_review_2013" target="_blank">Nexus 7</a> tablet. At 13.1 ounces, the Shield Tablet is about three ounces heavier than the Nexus 7 but still a lot lighter than the original’s 1 lb. 4.7 ounces.&nbsp;</p> <p>Part of the weight increase with the Shield Tablet over the Nexus 7 is due to the extra inch that you’re getting from the screen, but also because the Shield Tablet is passively cooled and has an extra thermal shield built inside to dissipate heat. It’s a little heavier than we like, but isn’t likely to cause any wrist problems. On the back of the Shield is an anti-slip surface and a 5MP camera, and on the front are another 5MP camera and two front-facing speakers. While the speakers are not going to blow away dedicated Bluetooth speakers, they sound excellent for a tablet. In addition to the speakers, the Shield Tablet has a 3.5mm headphone jack up at the top. Other ports include Micro USB, Mini HDMI out, and a MicroSD card slot capable of taking up to 128GB cards. Buttons on the Shield include a volume rocker and a power button, which we found to be a little too small and shallow for our liking.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_exploded_view_black_bckgr.jpg" alt="Nvidia Shield Tablet guts" title="Nvidia Shield Tablet guts" width="620" height="349" /></p> <p style="text-align: center;"><strong>The guts of the Nvidia Shield Tablet.</strong></p> <p>All of this is running on the latest version of Android KitKat (4.4). Nvidia says that it will update the tablet to Android L within a few weeks of Google’s official release. If Nvidia’s original Shield is any indication of how well the company keeps up with OS updates, you can expect to get the latest version of Android within a couple of weeks, if not a month or so, of release. Regardless, the Shield Tablet is running a pretty stock version of Android to begin with, the main difference being that Nvidia has pre-loaded the tablet with its Shield Hub, which is a 10-foot UI used to purchase, download, and launch games.</p> <p>Arguably, the real star of the tablet is Nvidia’s new Tegra K1 mobile superchip. The 2.2GHz quad-core A15 SoC features Nvidia’s Kepler GPU architecture and 192 CUDA cores, along with 2GB of low-power DDR3. K1 supports many of the graphical features commonplace in GeForce graphics cards, including tessellation, HDR lighting, global illumination, subsurface scattering, and more.</p> <p>In our performance benchmarks, the K1 killed it. 
Until now, the original Shield’s actively cooled Tegra 4 was arguably one of the most powerful, if not the most powerful, Android SoCs on the market, but the K1 slaughters it across the board. In the AnTuTu and GeekBench benchmarks, we saw modest gains of 12 percent to 23 percent in Shield vs. Shield Tablet action. But in PassMark and GFXBench’s T-Rex test, we saw nearly a 50 percent spread, and in 3DMark’s mobile Ice Storm Unlimited test, we saw an astounding 90 percent advantage for the Shield Tablet. This is incredible when you consider that the tablet has no fans and a two-watt TDP. Compared to the second-gen Nexus 7, the Shield Tablet benchmarks anywhere from 77 percent to 250 percent faster. This SoC is smoking fast.</p> <p>In terms of battery life, Nvidia claims you’ll get 10 hours watching/surfing the web and about five hours of gaming from its 19.75 Wh battery. This is up 3.75 Wh from Google’s Nexus 7 equivalent, and in our hands-on tests, we found those figures to be fairly accurate, if something of a best-case scenario. It will pretty much last you all day, but you'll still want to let it sip juice every night.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_war_thunder.jpg" alt="Shield Tablet review" title="Shield Tablet review" width="620" height="343" /></p> <p style="text-align: center;"><strong>The new wireless controller uses Wi-Fi Direct instead of Bluetooth for lower latency.</strong></p> <p>Of course, if you’re going to game with it, you’re going to need Nvidia’s new wireless Shield Controller. Sold separately for $60, the 11.2-ounce Shield Controller maintains the same button layout as the original Shield controller, but feels a lot lighter and is more comfortable to hold. While most Android game controllers operate over Bluetooth, Nvidia opted to go with Wi-Fi Direct, stating that it offers 2x faster response time and more bandwidth. The extra bandwidth allows you to plug a 3.5mm headset into the controller, and also allows you to link up to four controllers to the device, which is an appreciated feature when you hook up the tablet to your HDTV via the Shield Tablet’s <a title="shield console mode" href="http://www.maximumpc.com/nvidia_sweetens_shield_console_android_442_kitkat_price_drop_199_through_april" target="_blank">Console Mode</a>. Other unique features of the controller include capacitive-touch Android home, back, and play buttons. There’s also a big green Nvidia button that launches Shield Hub. The controller also has a small, triangle-shaped clickable touchpad that allows you to navigate your tablet from afar. One quibble is that we wish the trackpad were more square, to at least mimic the dimensions of the tablet; the triangle shape was a little awkward to interface with. Another problem that we initially had with the controller was that the + volume button stopped working after a while. We contacted Nvidia about this and the company sent us a new unit, which remedied the issue. One noticeable feature missing from the controller is rumble support. Nvidia said this was omitted on the original Shield to keep the weight down; its omission is a little more glaring this time around, however, since there's no screen attached to the device.</p> <p>The controller isn’t the only accessory that you’ll need to purchase separately if you want to tap into the full Shield Tablet experience. To effectively game with the tablet, you’ll need the Shield Tablet cover, which also acts as a stand. 
As with most tablets, a magnet in the cover shuts off the Shield Tablet when closed, but otherwise setting up the cover and getting it to act as a stand is initially pretty confusing. The cover currently only comes in black, and while we’re generally not big on marketing aesthetics, it would be nice to have an Nvidia green option to give the whole look a little more pop. We actually think the cover should just be thrown in gratis, especially considering that the cheapest 16GB model costs $300. On the upside, though, you do get Nvidia’s new passive DirectStylus 2, which stows away nicely in the body of the Shield Tablet. Nvidia has pre-installed note-writing software and its own Nvidia Dabbler painting program. The nice thing about Dabbler is that it leverages the K1’s GPU acceleration so that you can virtually paint and blend colors in real time. There’s also a realistic mode where the “paint” slowly drips down the virtual canvas like it would in real life.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_trine2_0.jpg" alt="Shield tablet review" title="Shield tablet review" width="620" height="404" /></p> <p style="text-align: center;"><strong>The Shield Controller is a lot lighter and less blocky than the original Shield Portable.</strong></p> <p>But that’s probably not why you’re interested in the Shield Tablet. This device is first and foremost a gaming tablet and even comes with a free Android copy of Trine 2. Trine 2 was originally a PC game, and it’s made a great transition to the Shield Tablet. While the game was never known to be a polygon pusher, it looks just as good as it did at its x86 debut.&nbsp;</p> <p>With gaming as the primary driver for the Shield Tablet, you may wonder why Nvidia didn’t bundle its new controller. The company likely learned from Microsoft’s mistake with Kinect and the Xbox One: Gamers don’t like being forced to spend money on extras, and getting the price as low as possible was surely on Nvidia’s mind. Of course, not everyone may even want a controller, given the general lack of support for them in games. Nvidia says there are now around 400 Android titles that support its controller, but that’s only a small percentage of Android games, and the straight truth is that the overwhelming majority of these games are garbage.&nbsp;</p> <p>Nvidia is making a push for Android gaming, however. The company worked with Valve to port Half-Life 2 and Portal to the Shield, and they look surprisingly fantastic and are easily the two prettiest games on Android at the moment. Whether Android will ever become a legitimate platform for hardcore gaming is anyone’s guess, but at least the Shield Tablet will net you a great front seat if that time ever arrives.</p> <p>Luckily, you won’t have to rely solely on the Google Play store to get your gaming fix. Emulators run just as well here as they did on the original Shield, and this iteration of Shield is also compatible with GameStream, Nvidia’s technology for streaming games from your PC to your Shield. GameStream, in theory, lets you play your controller-enabled PC games on a Shield.</p> <p>At this point, Nvidia says GameStream supports more than 100 games, such as Batman: Arkham Origins and Titanfall, from EA’s Origin and Valve’s Steam service. The problem, though, is that there are hundreds more games on Steam and Origin that support controllers—but not the Shield Tablet’s controller. 
For example, Final Fantasy VII, a game that we couldn’t get to work with the original Shield, still isn't supported even though it works with an Xbox controller on the PC. When GameStream does work, however, it’s relatively lag-free and kind of wonderful. The one caveat here is that you’ll have to get a 5GHz dual-band router to effectively get it working.&nbsp;</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/rh7fWdQT2eE" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Nvidia Shield Video demo.</strong></p> <p>Would we buy the Shield Tablet if we owned the original Shield (now renamed the Shield Portable)? Probably not. If we were looking for a new tablet and top-notch gaming performance was on the checklist, the Shield Tablet is easily the top contender today. We’d take it over the second-gen Nexus 7 in a heartbeat. While we understand why Nvidia decided to separate the cover and controller to keep the prices down and avoid the Kinect factor, we think a bundled package with a small price break as an alternative would have been nice. All things considered, though, consider us surprised. The Shield Tablet is pretty dang cool.&nbsp;</p> <p><strong>$300</strong></p> <p><em><strong>Update:</strong> The original article incorrectly labeled the Shield Portable benchmarks with the Nexus 7 figures. The issue has been resolved and both benchmark charts are listed below.&nbsp;</em></p> Free Copy of Borderlands: The Pre-Sequel with Purchase of Select GeForce GTX GPUs http://www.maximumpc.com/free_copy_borderlands_pre-sequel_purchase_select_geforce_gtx_gpus_2014 <h3><img src="/files/u166440/borderlands_the_pre-sequel.jpg" alt="Borderlands The PreSequel" title="Borderlands The PreSequel" width="200" height="113" style="float: right;" />Offer available at participating retailers</h3> <p>If you have been looking to upgrade your GPU, then now would be a good time to do so, especially if you are a fan of the Borderlands franchise. Those who <strong>purchase select Nvidia GeForce GTX GPUs will get a free copy of Borderlands: The Pre-Sequel</strong>.</p> <p>To get a free copy of the FPS, consumers will need to purchase the GeForce GTX Titan, <a title="MPC 780Ti benchmarks" href="http://www.maximumpc.com/nvidia_geforce_gtx_780_ti_benchmarks" target="_blank"><span style="color: #ff0000;">780Ti</span></a>, 780, or 770 desktop GPUs from a list of <a title="Nvidia website" href="http://www.geforce.com/GetBorderlands" target="_blank"><span style="color: #ff0000;">participating retailers</span></a>. However, the deal is only good while supplies last.&nbsp;</p> <p>Like Borderlands 2, Borderlands: The Pre-Sequel will make use of Nvidia’s PhysX technology. “If you have a high-end Nvidia GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions, and ice particles, and cloth and fluid simulation that blows me away every time I see it,” said Gearbox Software CEO Randy Pitchford.</p> <p>Borderlands: The Pre-Sequel takes place between the events of Borderlands and Borderlands 2. 
In it, players will learn how Borderlands 2’s villain, Handsome Jack, rose to power, while shooting and looting their way through the game as his henchmen.</p> <p>Borderlands: The Pre-Sequel will be available in North America on October 14 and internationally on October 17.&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> Falcon Northwest Tiki-Z Micro Tower Totes a Titan Z Graphics Card http://www.maximumpc.com/falcon_northwest_tiki-z_micro_tower_totes_titan_z_graphics_card_2014 <h3><img src="/files/u69/tiki-z.jpg" alt="Falcon Northwest Tiki Z" title="Falcon Northwest Tiki Z" width="228" height="155" style="float: right;" />A tiny system with the gaming performance of a Titan Z</h3> <p>Of all the systems featuring an <strong>Nvidia GeForce GTX Titan Z graphics card, the Tiki-Z Special Edition from Falcon Northwest</strong> might be the most impressive. That's because the Tiki-Z Special Edition is a micro-tower measuring just 4 inches wide and 13 inches tall -- the same size as the standard Tiki and roughly equivalent to the original Xbox console -- yet has enough space to accommodate Nvidia's Titan Z, which is powered by a pair of Kepler GPUs.</p> <p>"Tiki-Z gives our customers the dual GPU option they’ve wanted since Tiki was first released," said Kelt Reeves, president of Falcon Northwest. "They can now play truly demanding 3D games at 4K resolution in a slim PC that can easily fit on anyone’s desk. Tiki-Z takes our power-per-cubic-inch mission to an entirely new level."</p> <p>In order to make room for Nvidia's largest graphics card and keep it cool, Falcon Northwest had to make several modifications, including laser-cut venting with a special exhaust, and the addition of a side window with lighting, which also serves as a custom air intake duct. It also needed help from its hardware partners -- SilverStone created a new version of its tiny 600W PSU.</p> <p>Pricing for the Tiki-Z starts at $5,614 and, for a limited time, will come with an Asus PB287Q 28-inch 4K monitor at no extra charge. 
Other features include an Asus Z97I Plus motherboard, Intel Core i7 4790K processor, Asetek liquid cooling, 8GB of DDR3-1866 RAM, GeForce GTX Titan Z, Crucial M550 256GB SSD, DVD writer, Windows 8.1, and a three-year warranty.</p> <p>The Falcon Northwest Tiki-Z Special Edition is <a href="http://www.falcon-nw.com/promo/tiki-z" target="_blank">available now</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> Best Cheap Graphics Card http://www.maximumpc.com/best_cheap_graphics_card_2014 <h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models courtesy of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts, but that number fluctuates up and down according to spec. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. During 2013, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. 
The goal with its former Kepler architecture was better performance per watt than the previous architecture, named Fermi, and it succeeded. Maxwell takes that same philosophy even further, with the goal of being twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. </strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process Nvidia used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled. In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per unit. Therefore, it's reduced the number of cores per unit by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy. (We run the quick math on this below.)</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p> <h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speed, however, running at 5,400MHz.</p>
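<p>As promised, here's the quick math on Maxwell's reorganization. This is our own back-of-the-envelope sketch; the per-card core counts come from the spec table later in this feature.</p>
<pre>
# Kepler: one scheduler per 192-core SMX. Maxwell: four 32-core clusters
# per SM, so 128 cores per unit. Core counts from the spec table.
kepler_smx = 192              # CUDA cores per Kepler SMX
maxwell_sm = 4 * 32           # CUDA cores per Maxwell SM (128)

cards = {"GTX 750": 512, "GTX 750 Ti": 640, "GTX 650 Ti Boost": 768}
for name, cores in cards.items():
    unit = kepler_smx if "650" in name else maxwell_sm
    print(f"{name}: {cores} cores = {cores // unit} units of {unit}")
# GTX 750: 4 SMs; GTX 750 Ti: 5 SMs; GTX 650 Ti Boost: 4 Kepler SMXs
</pre>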
<p>The board sports 2GB of GDDR5 memory and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. Still, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160 (street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance. The only downside is the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card, though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes feeding dual 100mm fans to keep it cool. It also includes a switch that lets you boot from an older backup BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the minuscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office.</p>
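<p>That no-connector trick is simple power-budget math: a PCIe x16 slot supplies up to 75W on its own, and each six-pin plug adds up to 75W more. Here's a minimal sketch (our own illustration, using the TDPs from the spec table; board partners sometimes add a plug anyway, as Gigabyte did with its 750 Ti above):</p>
<pre>
# Which of these cards can skip the aux plug? The slot alone covers
# anything at or under 75W. TDPs from the spec table in this feature.
SLOT_W, SIX_PIN_W = 75, 75
tdps = {"GTX 750": 54, "GTX 750 Ti": 60, "R7 250": 65,
        "R7 250X": 80, "R7 260X": 115, "R7 265": 150}
for card, tdp in tdps.items():
    plugs = 1 if tdp > SLOT_W else 0  # none of these needs more than one
    print(f"{card}: {tdp}W -> {plugs} six-pin connector(s)")
</pre>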
<p>As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R9 270. This card actually has the same clock speeds as the R9 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. " title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. </strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test, courtesy of a blistering turn in Call of Duty: Ghosts, where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is.</p>
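<p>A big part of that win is the 256-bit memory bus. Here's a minimal sketch of the bandwidth math (our own calculation, using the memory clocks and bus widths from the spec table, and assuming GDDR5's four data transfers per clock; the 5,400MHz quoted for the WindForce card is this same 1,350MHz clock expressed as an effective rate):</p>
<pre>
# Peak memory bandwidth = clock x 4 (GDDR5) x bus width in bytes.
def gddr5_bandwidth_gbs(clock_mhz, bus_bits):
    return clock_mhz * 1e6 * 4 * (bus_bits / 8) / 1e9

print(f"R7 265:     {gddr5_bandwidth_gbs(1400, 256):.1f} GB/s")  # 179.2
print(f"GTX 750 Ti: {gddr5_bandwidth_gbs(1350, 128):.1f} GB/s")  # 86.4
</pre>
<p>Twice the bus width at a similar clock means roughly twice the bytes per second, which is exactly the kind of headroom that shows up at 1080p with everything maxed.</p>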
<p>The Dual-X cooler kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong>Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including support for TrueAudio, XDMA CrossFire, and Mantle (as in, it worked at launch), and it has the ability to drive up to three displays—all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings. Support for all of AMD's latest technology is what bolsters this card's credentials, though, and the ability to run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790. While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120 <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <h4> <hr />MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. 
For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but next to the specs of the Nvidia GTX 750, it is clearly outmatched. One other major problem, at least for those of us with big monitors, is we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $90, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC 2GB</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90 <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM, however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely. 
It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, respectively, and 34fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100," we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250. We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is that, even though it's one of the slowest cards in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the display's refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD. That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. 
Mantle is a lower-level API that lets developers optimize a game specifically for AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX.</p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, and their dominance is not in question. The most basic reason is that the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster at performing these operations than their Nvidia equivalents. In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Specifications</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI GeForce GTX 750 Gaming</td> <td>Gigabyte GeForce GTX 750 Ti</td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120 </td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curacao</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td> <td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40</td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td> 
<td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,000MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td> <td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI-S, VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td> </tr> </tbody> </table> <p><em>Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div> <h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. 
These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming, and you deserve to know whether some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance, since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn't going to help it. But if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy 22 percent increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs." title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of roughly 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84fps using DirectX to 98fps in Mantle.</p> <p>Overall, Mantle is legit, but it's kind of like PhysX or TressFX in that it's nice to have when it's supported, and does provide a boost, but it isn't something we'd count on being available in most games.</p>
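<p>To make the CPU-bound versus GPU-bound distinction concrete, here's a minimal model (our own back-of-the-envelope sketch with invented millisecond figures, not anything out of AMD's SDK). A frame ships only when both the CPU's draw-call submission work and the GPU's rendering work are done, so frame time is whichever of the two is slower, and Mantle only shrinks the CPU term.</p> <pre>
# Toy frame-time model: the slower of CPU submission work and GPU
# rendering work sets the frame time. All numbers are invented.

def fps(cpu_ms, gpu_ms):
    return round(1000.0 / max(cpu_ms, gpu_ms), 1)

# CPU-bound scene: cutting API overhead on the CPU raises fps.
print(fps(cpu_ms=28, gpu_ms=20))  # DirectX-like overhead: 35.7 fps
print(fps(cpu_ms=16, gpu_ms=20))  # Mantle-like overhead:  50.0 fps

# GPU-bound scene: the GPU term dominates, so Mantle changes nothing.
print(fps(cpu_ms=10, gpu_ms=25))  # DirectX-like overhead: 40.0 fps
print(fps(cpu_ms=6, gpu_ms=25))   # Mantle-like overhead:  40.0 fps
</pre> <p>Note that it's the ratio between the two terms that matters, not the absolute CPU speed, which is why even our six-core rig doubled its Star Swarm score: paired with a 290X, the CPU was still the bottleneck in that draw-call-heavy demo.</p>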
<h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don't usually dabble in GPU waters this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we'd have to admit that, given these cards' price points, we had low expectations but thought they'd all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child's play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080p, meaning the barrier to entry for "sweet gaming" has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is this: if you have $150 to spend on a GPU, buy the Sapphire Radeon R7 265, as it's the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here's some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, and thus the only card in this crowd that nearly satisfied our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia's trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there's no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent and delivered playable frame rates in every test we ran.</p> <p>The bottom rung, consisting of the R7 250 and 250X, wasn't playable at 1080p at max settings, so avoid those cards. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man's land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Strike</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30</strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40</strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded.
Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games are run at 1920x1080 with no AA, except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/best_cheap_graphics_card_2014#comments 1080p affordable amd benchmarks budget cheap cheap graphics card gpu Hardware Hardware maximum pc may 2014 nvidia Video Card Features Tue, 12 Aug 2014 21:43:32 +0000 Josh Norem 28304 at http://www.maximumpc.com Nvidia Tegra K1 Claims Fame as First 64-Bit ARM Chip for Android http://www.maximumpc.com/nvidia_tegra_k1_claims_fame_first_64-bit_arm_chip_android_2014 <!--paging_filter--><h3><img src="/files/u69/tegra_k1.jpg" alt="Nvidia Tegra K1" title="Nvidia Tegra K1" width="228" height="163" style="float: right;" />Android enters the 64-bit ARM era</h3> <p>Say hello to <strong>"Denver," the codename for Nvidia's 64-bit Tegra K1 System-on-Chip (SoC), which also happens to be the first 64-bit ARM processor for Android</strong>. The new version of Nvidia's Tegra K1 SoC pairs the company's Kepler architecture-based GPU with its own custom-designed, 64-bit, dual-core "Project Denver" CPU, which Nvidia says is fully ARMv8 architecture compatible.</p> <p>So, what's special about this chip besides a 64-bit instruction set? Nvidia designed Denver to offer the highest single-core CPU throughput and industry-leading dual-core performance. Each of the two Denver cores sports a 7-way superscalar microarchitecture and includes a 128KB 4-way L1 instruction cache and a 64KB 4-way L1 data cache, with a 2MB 16-way L2 cache servicing both cores.</p> <p>Using a process called Dynamic Code Optimization, Denver optimizes frequently used software routines at runtime into dense, highly tuned microcode-equivalent routines stored in a dedicated 128MB main-memory-based optimization cache. This allows for faster access and execution, which translates into faster performance, in part because it lessens the need to re-optimize the same routine.</p>
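<p>As a rough illustration of what Dynamic Code Optimization does, consider the software analogy below (our own Python sketch; Denver performs this in hardware and microcode against running ARM code, and the real profiling heuristics are Nvidia's secret sauce, so the threshold and the "optimizer" here are invented placeholders). The idea: count executions, translate a routine once it proves hot, and serve the tuned version from the optimization cache ever after.</p> <pre>
# Software analogy for Dynamic Code Optimization: profile routines as
# they run; once one crosses a hotness threshold, pay a one-time cost
# to produce an optimized version, then reuse it from the cache.

HOT_THRESHOLD = 1000     # invented; Denver's real heuristics differ
counts = {}
tuned = {}               # stands in for the 128MB optimization cache

def optimize(fn):
    # Placeholder for translation into "microcode-equivalent" form.
    return fn

def run(fn, *args):
    if fn in tuned:                     # cache hit: no re-optimizing
        return tuned[fn](*args)
    counts[fn] = counts.get(fn, 0) + 1
    if counts[fn] >= HOT_THRESHOLD:     # hot: translate once, cache it
        tuned[fn] = optimize(fn)
    return fn(*args)                    # cold path: plain execution

def blend(a, b):
    return (a + b) // 2

for _ in range(1500):                   # "translated" at call 1,000
    run(blend, 64, 192)
</pre> <p>The payoff on the real chip is the same as in the toy: the expensive analysis happens once per hot routine rather than once per execution, which is why a persistent optimization cache matters.</p>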
<p>Denver will also benefit Android platforms with new low-latency power-state transitions, in addition to extensive power-gating and dynamic voltage and clock scaling based on workload. The end result is more efficient power usage, which allows Denver's performance to rival even some mainstream PC-class CPUs at significantly reduced power consumption, <a href="http://blogs.nvidia.com/blog/2014/08/11/tegra-k1-denver-64-bit-for-android/" target="_blank">Nvidia says</a>.</p> <p>If you want to dig even further into the architecture, you can get more details <a href="http://www.tiriasresearch.com/downloads/nvidia-charts-its-own-path-to-armv8/" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_tegra_k1_claims_fame_first_64-bit_arm_chip_android_2014#comments 64-bit android ARM cpu denver Hardware nvidia processor tegra k1 News Tue, 12 Aug 2014 17:32:21 +0000 Paul Lilly 28334 at http://www.maximumpc.com Origin Rolls Out Battlebox Titan Z Systems for 4K Gaming http://www.maximumpc.com/origin_rolls_out_battlebox_titan_z_systems_4k_gaming_2014 <!--paging_filter--><h3><img src="/files/u69/battlebox_0.jpg" alt="Battlebox" title="Battlebox" width="228" height="171" style="float: right;" />Refreshed desktops offer 4K gaming performance starting at under $4,000</h3> <p>The 4K era is in its very early stages, and though the technology still has room for improvement (especially on the monitor side), you can make the leap if you're determined. Boutique builder <strong>Origin PC is all too happy to satisfy your 4K gaming needs with its Nvidia Battlebox Titan Z systems</strong> that are now available. These are basically refreshed Genesis, Millennium, and Chronos machines equipped with Nvidia GeForce Titan Z graphics cards.</p> <p>"Whether you're new to 4K gaming or not, each Origin PC Battlebox Titan Z system was designed to provide the best 4K gaming experience right out of the box," Origin PC explains. "With a wide variety of special bundled options for each system, such as the inclusion of a 4K monitor bundled with a Titan Z graphics card, only Origin PC's Battlebox TITAN Z systems can deliver the ultimate 4K gaming experience at an incredible value."</p> <p>All three setups start at under $4,000 with a single Titan Z graphics card. Each one also offers different starting configurations, though depending on the model and config, you can find yourself above the $4,000 starting point in a hurry. For example, the Chronos Z is available with a single Titan Z for under $4,000, or with a Titan Z and an Asus PB287Q 4K Ultra HD 28-inch monitor; the latter bundle still stays under $4,000 on the Chronos Z, while the other two machines climb above that mark, depending on which setup you start with.</p> <p>The same options are available for the Millennium Z, plus a third configuration consisting of dual Titan Z graphics cards for the price of one.
Same goes for the Genesis Z, though it adds a fourth option -- dual Cryogenic liquid-cooled Titan Z graphics cards for the price of one.</p> <p>If you're interested, just head over to the special <a href="http://www.originpc.com/promotion/4k-titan-z/" target="_blank">4K Battlebox landing page</a> on Origin's website.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/origin_rolls_out_battlebox_titan_z_systems_4k_gaming_2014#comments 4k battlebox chronos z Gaming genesis z Hardware millennium z nvidia OEM origin rigs titan z ultra hd News Mon, 11 Aug 2014 15:40:25 +0000 Paul Lilly 28324 at http://www.maximumpc.com Nvidia Stacks Two Cheap LCD Panels to Quadruple Display Resolution http://www.maximumpc.com/nvidia_stacks_two_cheap_lcd_panels_quadruple_display_resolution_2014 <!--paging_filter--><h3><img src="/files/u166440/cascaded_diplays.jpg" alt="Cascaded Display Parts" title="Cascaded Display Parts" width="200" height="110" style="float: right;" />Company focusing on head-mounted displays</h3> <p>Developers of head-mounted displays (HMDs) could benefit from Nvidia's recent efforts sometime in the future. <strong>Nvidia was able to quadruple display resolution by stacking two cheap LCD panels</strong> on top of one another.</p> <p>Called cascaded displays, the technique involved the use of two 7-inch 1280x800 LCD monitors. The LCD panels were removed from their casings, and the backlight was removed from one panel. Both panels were then placed on top of each other with a slight offset, about a quarter-pixel, and a quarter-wave film between them. According to the company, the offset lets one panel act like a "shutter" over each cluster of four pixels in the other, which is how the resolution is quadrupled. In addition, both panels are able to provide refresh rates over 60Hz.</p>
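<p>Here's where the extra resolution comes from, in a toy one-dimensional model (our own sketch with invented transmittance values; we treat the offset as half a pixel along one axis, which is the 1D analogue of the quarter-pixel diagonal shift Nvidia describes). Light has to pass through both stacked panels, so each fine region of the image is the product of one rear pixel and one offset front pixel, doubling the addressable regions per axis, or quadrupling them in 2D.</p> <pre>
# Toy 1-D model of cascaded displays: two stacked LCDs, the front one
# shifted half a pixel, multiply their transmittances. Three rear
# pixels and four front pixels yield six addressable fine cells.

rear = [0.2, 0.9, 0.6]        # rear-panel pixel transmittances
front = [1.0, 0.5, 0.8, 0.3]  # front panel, offset half a pixel

def effective(rear, front):
    fine = []
    for j in range(2 * len(rear)):     # grid twice as fine per axis
        r = rear[j // 2]               # rear pixel over fine cell j
        f = front[(j + 1) // 2]        # offset front pixel over cell j
        fine.append(r * f)
    return fine

print(effective(rear, front))  # [0.2, 0.1, 0.45, 0.72, 0.48, 0.18]
</pre> <p>Displaying a target image then becomes a factorization problem: choose the two panels' values, and alternate them across frames (which is where those over-60Hz refresh rates come in), so the products approximate the high-res target. That, roughly, is the problem Nvidia's special software has to solve.</p>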
<p>During its research, Nvidia created its own HMD prototype and special software to take advantage of the cascaded displays, and provided screenshots comparing how a game looks on a conventional LCD HMD versus a cascaded one. The result was that text and details were a lot clearer on the cascaded displays than on a conventional one.</p> <p style="text-align: center;"><img src="/files/u166440/cascaded_displays_001.jpg" alt="Cascaded Display screenshot" title="Cascaded Display screenshot" width="600" height="338" /></p> <p>According to Nvidia's research paper, the technique is an alternative to the "brute force solution of addressable pixel count" that results in 4K monitors and even mobile displays.</p> <p>If you want to know about cascaded displays in full detail, check out <a title="Nvidia cascaded display research" href="https://research.nvidia.com/publication/cascaded-displays-spatiotemporal-superresolution-using-offset-pixel-layers" target="_blank"><span style="color: #ff0000;">Nvidia's research</span></a> and <a title="Cascaded display demo" href="https://www.youtube.com/watch?v=0XwaARRMbSA" target="_blank"><span style="color: #ff0000;">YouTube demo</span></a>.</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/nvidia_stacks_two_cheap_lcd_panels_quadruple_display_resolution_2014#comments cascaded displays nvidia nvidia cascaded displays nvidia HMD quadruple display resolution News Tue, 05 Aug 2014 03:30:08 +0000 Sean D Knight 28288 at http://www.maximumpc.com Nvidia GeForce 340.52 WHQL Drivers Sit Just a Few Clicks Away http://www.maximumpc.com/nvidia_geforce_34052_whql_drivers_sit_just_few_clicks_away <!--paging_filter--><h3><img src="/files/u69/geforce_1.jpg" alt="GeForce" title="GeForce" width="228" height="198" style="float: right;" />Latest GeForce drivers add a bunch of SLI profiles</h3> <p>Attention GeForce graphics card owners -- <strong>you can now download new GeForce 340.52 WHQL drivers</strong> from Nvidia's website, or update automatically through GeForce Experience. Either way, new drivers are available, and with them, you can take advantage of GameStream technology to stream PC games to the new Shield tablet, which launches today to e-tailers and retailers, Nvidia says.</p> <p>That's really the big reason for the new drivers, though if you're running multiple GPUs in SLI, you'll potentially benefit from a number of SLI profiles that have been added. Specific to the 340.52 release are profiles for Battlefield: Hardline, Dark Souls II, the 3DMark Sky Diver subtest, Divinity: Original Sin, Elder Scrolls Online, GRID Autosport, LuDaShi Benchmark, and WildStar.</p> <p>There are also some new 3D Vision profiles for stereoscopic 3D gamers.
They include Banished, BioShock Infinite: Burial at Sea, and Krater.</p> <p>You can find out more in the <a href="http://us.download.nvidia.com/Windows/340.52/340.52-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">Release Notes (PDF)</a> and can download the new drivers <a href="http://www.geforce.com/drivers" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_geforce_34052_whql_drivers_sit_just_few_clicks_away#comments Drivers Gaming geforce 340.52 nvidia Software News Tue, 29 Jul 2014 19:01:52 +0000 Paul Lilly 28262 at http://www.maximumpc.com No BS Podcast #229: Nvidia Responds to AMD's Allegations (and Shows off Shield Tablet) http://www.maximumpc.com/no_bs_podcast_229_nvidia_responds_amds_allegations_and_shows_shield_tablet <!--paging_filter--><h3><img src="/files/u160416/gameworks.jpg" width="339" height="104" class="img-float-right" /></h3> <h3>Nvidia defends itself against AMD's cheating allegations</h3> <p>A few weeks back, in <a title="Maximum PC No BS Podcast #226" href="http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics_guru_richard_huddy">the Maximum PC No BS Podcast #226</a>, AMD's newly arrived Gaming Scientist Richard Huddy made some bold accusations about Nvidia's developer relations, including that the company hands out "black box" files designed to make Radeon cards look bad and uses sketchy contract clauses. Nvidia's Distinguished Engineer Tom Petersen and Senior Director of Engineering Rev Lebaredian came on to <a title="no bs podcast 229" href="http://dl.maximumpc.com/maxpc_229_20140724.mp3" target="_blank">podcast 229</a> to tell Nvidia's side of the story (<strong>Spoilers:</strong>&nbsp;they deny the cheating allegations). Also, they bring in the newly launched <a href="http://www.maximumpc.com/nvidia_shield_tablet_revealed_2014">Shield Tablet</a> and talk about it for a bit.
They also answer several reader questions.</p> <p><iframe src="//www.youtube.com/embed/aG2kIUerD4c" width="560" height="315" frameborder="0"></iframe></p> <p><a title="Download Maximum PC Podcast #229 MP3" href="http://dl.maximumpc.com/maxpc_229_20140724.mp3" target="_blank"><img src="/files/u160416/rss-audiomp3.png" width="80" height="15" /></a> <a title="Maximum PC Podcast RSS Feed" href="http://feeds.feedburner.com/maximumpc/1337" target="_blank"><img src="/files/u160416/chicklet_rss-2_0.png" width="80" height="15" /></a> <a href="https://itunes.apple.com/podcast/maximum-pc-no-bs-podcast/id213247824"><img src="/files/u160416/chicklet_itunes.gif" alt="Subscribe to Maximum PC Podcast on iTunes" title="Subscribe to Maximum PC Podcast on iTunes" width="80" height="15" /></a></p> http://www.maximumpc.com/no_bs_podcast_229_nvidia_responds_amds_allegations_and_shows_shield_tablet#comments 229 amd gameworks Gaming maximum pc No BS Podcast nvidia Podcast rev lebaredian richard huddy shield tablet tom petersen No BS Podcast Fri, 25 Jul 2014 23:08:48 +0000 The Maximum PC Staff 28243 at http://www.maximumpc.com Asus ROG Readies Swift PG278Q Monitor with Nvidia G-Sync for August Release
http://www.maximumpc.com/asus_rog_readies_swift_pg278q_monitor_nvidia_g-sync_august_release <!--paging_filter--><h3><img src="/files/u69/asus_swift_pg278q.jpg" alt="Asus Swift PG278Q" title="Asus Swift PG278Q" width="228" height="219" style="float: right;" />This might be the gaming monitor you've been looking for</h3> <p>Slow your roll, early adopter -- before you go checking out with that 4K Ultra HD monitor in your virtual shopping cart, you should familiarize yourself with Asus ROG's Swift PG278Q panel. Teased earlier this year, <strong>Asus ROG officially announced the Swift PG278Q this week</strong>, a 27-inch display with a 144Hz refresh rate, blazing-fast 1ms response time, and Nvidia G-Sync technology. It isn't 4K, but it is prepped and primed for gaming at high res.</p> <p>Users can cycle through 60Hz, 120Hz, or 144Hz display modes courtesy of the panel's Turbo key for one-click switching. With a compatible graphics card, Nvidia's G-Sync technology will synchronize the display's refresh rate to the GPU to eliminate screen tearing and minimize display stutter and input lag. And regardless of refresh rate, you'll enjoy a 2560x1440 resolution spread out over 27 inches of screen real estate, for a pixel density of 109 ppi.</p>
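<p>For a sense of what syncing the refresh to the GPU actually buys, here's a small timing sketch (ours alone, with invented frame times). On a fixed 60Hz panel with v-sync, a finished frame waits for the next 16.7ms tick, so frames land on screen late and by varying amounts; with G-Sync, the panel refreshes the moment a frame is ready.</p> <pre>
# Invented frame-completion times (ms) from some hypothetical GPU.
import math

VSYNC_MS = 1000.0 / 60.0              # fixed 60Hz refresh interval
frame_done = [15.0, 20.0, 55.0, 70.0]

def fixed_refresh(times):
    # v-sync: each frame appears at the next refresh tick after it
    # completes, so delivery is quantized to multiples of 16.7ms.
    return [round(math.ceil(t / VSYNC_MS) * VSYNC_MS, 1) for t in times]

def g_sync(times):
    # Adaptive refresh: the panel flips as soon as the frame is done.
    return times[:]

print(fixed_refresh(frame_done))  # [16.7, 33.3, 66.7, 83.3]
print(g_sync(frame_done))         # [15.0, 20.0, 55.0, 70.0]
</pre> <p>Those fixed-refresh frames arrive anywhere from 1.7ms to 13.3ms after they were actually finished, and those uneven waits are the stutter and added input lag G-Sync removes; skipping v-sync on a fixed-refresh panel avoids the waiting but trades it for tearing instead.</p>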
<p>"The ROG Swift PG278Q Gaming Monitor features the ASUS-exclusive GamePlus hotkey with crosshair overlay and timer functions. The former gives gamers a choice of four different crosshairs to suit their gaming environment, while the latter provides an on-screen timer to help keep track of spawn and build times," <a href="http://rog.asus.com/340122014/gaming-monitors/pr-asus-republic-of-gamers-announces-swift-pg278q-gaming-monitor/" target="_blank">Asus explains</a>.</p> <p>There's also a 5-way navigation joystick for easy access to the OSD settings, a smart air-vent design that helps dissipate heat after extended gaming sessions, a slim profile with a 6mm bezel, a cable-management feature on the back of the panel, a built-in USB 3.0 hub (2 down, 1 up), and an ergonomic stand that supports tilt, swivel, and pivot.</p> <p>The Swift PG278Q will be available in North America by the end of August for $799.</p> <p><iframe src="//www.youtube.com/embed/ekXHvEK2bbI" width="620" height="349" frameborder="0"></iframe></p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/asus_rog_readies_swift_pg278q_monitor_nvidia_g-sync_august_release#comments asus display g-sync Hardware monitor nvidia panel pg278q Republic of Gamers ROG swift News Fri, 25 Jul 2014 16:05:57 +0000 Paul Lilly 28238 at http://www.maximumpc.com Nvidia Shield Tablet Revealed http://www.maximumpc.com/nvidia_shield_tablet_revealed_2014 <!--paging_filter--><h3>8-inch tablet, Wi-Fi Direct controller, and Tegra K1</h3> <p>After many rumors of a new Shield device, Nvidia has revealed its new Shield Tablet. Powered by Android, the 8-inch gaming tablet succeeds Nvidia's original Shield handheld gaming device, which is now dubbed the Shield Portable.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_trine2.jpg" alt="shield tablet" title="shield tablet" width="620" height="404" /></p> <p style="text-align: center;"><strong>Introducing the Nvidia Shield Tablet</strong></p> <p>At the heart of the tablet is the company's new top-tier mobile SoC, the Nvidia Tegra K1. The quad-core ARM chip runs at up to 2.3GHz and features a GPU with 192 CUDA cores. The K1 supports a variety of APIs and features, including OpenGL ES 3.1, AEP, OpenGL 4.4, DX12, tessellation, and CUDA 6.0. Nvidia claims that the K1 can do all of this while consuming less than two watts of power.</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/rh7fWdQT2eE" width="560" height="315" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Play the video above to see an Nvidia rep give us a demo of the Shield Tablet.</strong></p> <p>For the display, the Shield Tablet uses a 1920x1200-resolution IPS panel. In terms of content, Nvidia has created a new Shield hub that lets you access your game library, purchase more games, or launch other media apps. You'll also be able to stream PC games through the hub; the Shield Tablet will support both Nvidia GameStream (local PC streaming) and Nvidia Grid (remote PC streaming). In addition, Nvidia has partnered with Twitch to let you stream your Shield gameplay sessions to the social gaming site.</p> <h3 style="text-align: center;"><img src="/files/u154082/dsc00799.jpg" alt="nvidia shield tablet" title="nvidia shield tablet" width="620" height="349" /></h3> <p><strong>Lots of users said they wanted a bigger screen on the Shield, so Nvidia decided to detach the screen from the controller.</strong></p> <p>Of course, all of this wouldn't sound the least bit appealing if you didn't have a controller to game with (because playing real games with touchscreen controls is just…eww). Luckily, the Shield Tablet is not only compatible with Bluetooth controllers, but Nvidia is also making a new Wi-Fi Direct controller for the device. The company claims that Wi-Fi Direct has 2x lower latency than Bluetooth and offers more bandwidth. What will this extra bandwidth allow you to do? For starters, you'll be able to plug headphones into the controller. Secondly, you'll be able to connect up to four Shield controllers to the tablet. This is ideal for when you plug the Shield Tablet into your HDTV via HDMI while in "Console Mode."</p> <p>While you could use the device as a console of sorts, it is first and foremost a tablet. If you're out and about watching movies, Nvidia says you should be able to expect 10 hours of battery life; five to six hours is what the company claims when gaming.</p> <p>What's interesting about the device is that, unlike the first Shield, which used a fan for active cooling, the Shield Tablet is passively cooled. To get away with this, Nvidia installed a thermal shield (pun not intended) to dissipate more heat.
This adds about a millimeter of thickness to the device, but fortunately the Shield Tablet isn't unbearably thick as a result.</p> <p style="text-align: center;"><img src="/files/u154082/dsc00811.jpg" alt="nvidia shield tablet" title="nvidia shield tablet" width="620" height="349" style="text-align: center;" /></p> <p><strong>The new Shield controller features a touchpad at the bottom and is much lighter and more comfortable to hold than the original Shield.</strong></p> <p>Other features of the device include two 5MP cameras (one on the front, one on the back), an included stylus, a microSD card slot, and support for 4K output at 30fps.</p> <p>In terms of pricing, the Shield Tablet will come in two flavors: the 16GB version will retail for $299, whereas the 32GB LTE version will retail for $399. Unfortunately, the controller is sold separately and will retail for $59. In addition, there will be a $39 Shield Tablet cover that can also act as a stand.</p> <p>The Shield Tablet launches July 29 in the US and Canada, arrives in Europe in mid-August, and hits other regions in the fall.</p> <p>Expect a full review of the tablet soon after we get one in. Are you intrigued by the Shield Tablet? Let us know in the comments below!</p> http://www.maximumpc.com/nvidia_shield_tablet_revealed_2014#comments controller gaming tablet nvidia shield tablet tegra k1 News Tue, 22 Jul 2014 13:01:27 +0000 Jimmy Thang 28209 at http://www.maximumpc.com Nvidia Supposedly Working on New PC-Streaming Device http://www.maximumpc.com/nvidia_supposedly_working_new_pc-streaming_device_2014 <!--paging_filter--><h3><img src="/files/u166440/nvidia_geforce_logo.jpg" alt="Nvidia GeForce logo" title="Nvidia GeForce logo" width="200" height="193" style="float: right;" />Another contender for the living room</h3> <p>Looks like Nvidia isn't done trying to get into the living room. According to the <a title="BBC News" href="http://www.bbc.com/news/technology-28290861" target="_blank"><span style="color: #ff0000;">BBC</span></a>, <strong>Nvidia is developing a new device that will play PC games</strong> on televisions, making use of the company's GeForce Experience software. It will also run Android software and, the BBC reports, will have a "budget-priced separate controller."</p> <p>Purported to be powered by Nvidia's Tegra K1 chip, the unnamed device boasts a 192-core GPU and was shown last month running a demo of Unreal Engine 4 on the Android L mobile operating system. While it will run Android games natively, it is also rumored to be able to stream PC games via Nvidia's GeForce Experience. However, if the device relies on that software, it would be restricted to systems with the company's more recent cards.</p> <p>Aside from the additional cost of purchasing a new GPU to take full advantage of the device, it would also be in contention with Valve's own in-home streaming service.
Not to mention that it would go up against Valve's Steam Machines, which have been pushed <a title="Steam Machine delay" href="http://www.maximumpc.com/controller_tweaks_prompt_valve_delay_steam_machines_until_2015" target="_blank"><span style="color: #ff0000;">back to 2015</span></a>.</p> <p>For now, Nvidia has declined to confirm the existence of the new device.</p> <p>Could this device be a sequel to the Nvidia Shield, which hasn't been very successful, or will it be a brand-new product?</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/nvidia_supposedly_working_new_pc-streaming_device_2014#comments geforce experience Hardware nvidia Nvidia gaming device shield Gaming News Tue, 15 Jul 2014 23:59:58 +0000 Sean D Knight 28172 at http://www.maximumpc.com