Maximum PC - All Articles http://www.maximumpc.com/articles/all/feed en ARChon Hack Brings Android Apps to Chrome for Linux, OS X and Windows http://www.maximumpc.com/archron_hack_brings_android_apps_chrome_linux_os_x_and_windows <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/chrome-os-android-app.png" alt="ARChon" title="ARChon" width="228" height="151" style="float: right;" /></h3> <h3>Theoretically, almost any app can be ported</h3> <p>When Google first announced Chrome OS in 2009, among the few people polite enough not to dismiss it outright or predict either a stillbirth or an early demise for it were those who saw a merger with Android as its ultimate fate. Of course, far from a full-blown merger, we have yet to see even substantial interplay between the two platforms. The best we have seen, all these years down the line, is the <strong>ability to run a grand total of four Android apps on Chrome OS</strong> — and even that is a very recent development. Even now, Google is only working with “a select group of Android developers” and is unlikely to bring more than a handful of mobile apps to Chrome OS in the near future. Well, that’s what hacks are for, right?</p> <p>A developer named Vlad Filippov (a.k.a. Vladikoff) has not only figured out a <a href="http://arstechnica.com/gadgets/2014/09/hack-runs-android-apps-on-windows-mac-and-linux-computers/?utm_medium=referral&amp;utm_source=pulsenews" target="_blank">way to run virtually any Android app on Chrome OS</a>, he has also found a way to do so using the Chrome browser on major desktop OSes like OS X, Linux and Windows. To this end, he has released a custom version of <a href="http://chrome.blogspot.in/2014/09/first-set-of-android-apps-coming-to.html" target="_blank">App Runtime for Chrome (ARC)</a>, the Native Client-based Chrome OS extension that enables Android apps to run on Chrome once the APK has been converted into a Chrome extension. Unlike ARC, which is only compatible with Chrome OS, the hacked version, called <a href="https://github.com/vladikoff/chromeos-apk/blob/master/archon.md">ARChon</a>, can be used to run Android apps inside the Chrome browser. 
As for converting APKs into Chrome extensions, the Toronto-based developer has released a tool called “<a href="https://github.com/vladikoff/chromeos-apk" target="_blank">chromeos-apk</a>”.</p> <p>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></p> http://www.maximumpc.com/archron_hack_brings_android_apps_chrome_linux_os_x_and_windows#comments app runtime for chrome apps archron chrome chrome os hack linux nacl native client OS OS X Software Windows News Mon, 22 Sep 2014 06:54:08 +0000 Pulkit Chandna 28578 at http://www.maximumpc.com New ‘Crescent Bay’ Rift Prototype Impresses with Better Display, 360-degree Tracking http://www.maximumpc.com/new_%E2%80%98crescent_bay%E2%80%99_rift_prototype_impresses_better_display_360-degree_tracking <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/oculus-crescent-bay-prototype.png" alt="Crescent Bay Oculus Rift Prototype" title="Crescent Bay Oculus Rift Prototype" width="228" height="129" style="float: right;" /></h3> <h3>Oculus making steady progress on road to consumer Rift</h3> <p>At its two-day Oculus Connect developer conference in Los Angeles this week, Facebook-owned Oculus VR introduced a new, improved version of its Rift virtual reality head mounted display (HMD).<strong> Called Crescent Bay, this latest prototype packs a number of improvements over the DK2 model.&nbsp;</strong> These improvements, the company says, are enough to ensure a level of immersion “that’s impossible to achieve with DK2.”</p> <p><a href="http://www.oculus.com/blog/oculus-connect-2014/" target="_blank">Crescent Bay</a> is said to have a higher-resolution display than the DK2, which began shipping in July with a full HD OLED display (960 x 1080 pixels per eye). Although the company has chosen to keep Crescent Bay’s specs under wraps, most reports point to a 1440p screen or better. According to <a href="http://www.theverge.com/2014/9/20/6661525/oculus-crescent-bay-prototype-headset-hands-on" target="_blank">The Verge</a>, a higher resolution is not the only notable improvement where the display is concerned but there are a number of other improvements that make it better overall than the DK2.</p> <p>Besides the improved display, the Crescent Bay also packs 360° head tracking, expanded positional tracking volume, improved weight and ergonomics, and integrated audio.</p> <p>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></p> http://www.maximumpc.com/new_%E2%80%98crescent_bay%E2%80%99_rift_prototype_impresses_better_display_360-degree_tracking#comments crescent bay Gaming headset hmd oculus rift virtual reality vr News Mon, 22 Sep 2014 04:14:07 +0000 Pulkit Chandna 28577 at http://www.maximumpc.com Acer’s XB280HK 4K G-Sync Display Arriving Next Month for $800 (Updated) http://www.maximumpc.com/acer%E2%80%99s_xb280hk_4k_g-sync_display_now_available_north_america764 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u69/acer_monitor_2.jpg" alt="Acer 4K G-Sync Display" title="Acer 4K G-Sync Display" width="228" height="192" style="float: right;" /></h3> <h3>First 4K UHD display with G-Sync</h3> <p><em>Correction: An earlier version of this article wrongly claimed that the Acer XB280HK was already up for sale in North America when, in fact, it's the full HD <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16824009657" target="_blank">XB270H</a> ($599) and not the 4K XB280HK that has gone on sale. 
The error is regretted.</em></p> <p>The <a href="http://www.maximumpc.com/acer_rolls_out_worlds_first_4k_display_nvidia_g-sync_technology_2014">28-inch 4K Ultra HD (3840 x 2160) computer monitor Acer announced back in May</a> will be available in the United States and Canada sometime next month, the company announced in a <a href="http://us.acer.com/ac/en/US/press/2014/82208" target="_blank">press release</a> Friday. The <strong>Acer XB280HK is the first 4K monitor to come with an Nvidia G-Sync module.</strong></p> <p>The XB280HK is a TN (twisted nematic) display with a 60Hz refresh rate. Of course, support for Nvidia’s G-Sync technology means that this refresh rate can be adjusted automatically to match that of the GPU (<a href="http://www.geforce.com/hardware/technology/g-sync/system-requirements" target="_blank">see eligible models</a>), thereby obviating any screen tearing and keeping both stutter and input lag in check.</p> <p>“The incredibly sharp and smooth images provided by NVIDIA G-SYNC technology are sure to thrill the most avid gamers. Combined with Acer’s highly flexibly ergonomic stand, non-glare ComfyView panel and low dimming technology, users are assured long hours of both comfortable and visually stunning game play,” Ronald Lau, Acer America business manager, said in the press release.</p> <p>The US$799 ($849.99 CAD) XB280HK delivers 170/170 viewing angles (horizontal and vertical), 1.07 billion colors, 1000:1 native contrast ratio, 300 nits brightness, and 72 percent NTSC color saturation. Further, it is equipped with DisplayPort and as many as four USB 3.0 ports.</p> <p>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></p> http://www.maximumpc.com/acer%E2%80%99s_xb280hk_4k_g-sync_display_now_available_north_america764#comments Acer display Gaming Hardware monitor nvidia g-sync tn XB280HK News Mon, 22 Sep 2014 03:51:29 +0000 Pulkit Chandna 28576 at http://www.maximumpc.com Syber’s Upcoming Vapor Extreme Gaming Console Packs Nvidia’s GTX 980 GPU http://www.maximumpc.com/syber%E2%80%99s_upcoming_vapor_extreme_gaming_console_packs_nvidia%E2%80%99s_gtx_980_gpu890 <!--paging_filter--><h3><img src="http://www.maximumpc.com/files/u46168/cyberpowerpc_syber_vapor_extreme.jpg" alt="Syber Vapor Extreme" title="Syber Vapor Extreme" width="228" height="171" style="float: right;" /></h3> <h3>Runs Windows 8.1 but boots straight into Steam’s Big Picture Mode</h3> <p>Syber, a division of California-based boutique system builder CyberPowerPC, has announced an <strong>Nvidia GeForce GTX 980-packing gaming PC that boots directly into Steam’s Big Picture Mode</strong>. The Syber Vapor Extreme, as the machine is called, is not an altogether new product, but the latest addition to an <a href="http://www.maximumpc.com/e3_2014_cyberpower_pc_shows_syber_console_big_picture_mode_video" target="_blank">upcoming line of Windows 8.1-based gaming consoles the company unveiled at E3 2014 back in June. </a></p> <p>If it isn’t already apparent from its “Extreme” moniker, this particular product is going to be the most powerful member of Syber’s upcoming Vapor family of Windows-based gaming consoles — devices that would have almost certainly hit the market as Steam Machines had that platform not been <a href="http://www.maximumpc.com/controller_tweaks_prompt_valve_delay_steam_machines_until_2015" target="_blank">delayed by Valve</a>. 
Since we’ve already covered the other two SKUs in the Vapor line, we are going to limit ourselves to the Vapor Extreme here. </p> <p>Inside the 36-lb Vapor Extreme you will find a 4.0GHz Intel Core i7-4790K processor, 8GB of RAM, an Nvidia GTX 980 with 4GB of GDDR5 VRAM, a 1TB SATA III hard drive, one USB 3.0 port, two USB 2.0 ports, HDMI port, DVI-D port, Gigabit Ethernet, and 802.11ac/g/n. It will come with a 450W power supply, a Logitech F710 wireless gamepad, and a wireless keyboard with touchpad.</p> <p>The <a href="http://www.sybergaming.com/specifications.aspx" target="_blank">$1,500 Vapor Extreme</a> is listed as “coming soon” on Syber’s website. However, the two other SKUs in this line of SteamOS-ready machines, the <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16883229610" target="_blank">$700 Vapor I</a> and the <a href="http://www.amazon.com/Syber-Vapor-A-Athlon-Desktop-SVAB100/dp/B00MW7UUK4/ref=sr_sp-atf_title_1_2?ie=UTF8&amp;qid=1411342785&amp;sr=8-2&amp;keywords=syber+vapor" target="_blank">$600 Vapor A</a>, are both available for pre-order now and should begin shipping in October.</p> <p>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></p> http://www.maximumpc.com/syber%E2%80%99s_upcoming_vapor_extreme_gaming_console_packs_nvidia%E2%80%99s_gtx_980_gpu890#comments cyberpowerpc Gaming geforce gtx 980 Steam Big Picture Mode steam machine steamos syber vapor extreme Valve vapor a vapor i windows 8.1 News Sun, 21 Sep 2014 23:44:52 +0000 Pulkit Chandna 28575 at http://www.maximumpc.com Asus Rolls Out Overclocked and Custom Cooled Strix GTX 980 and GTX 970 Cards http://www.maximumpc.com/asus_rolls_out_overclocked_and_custom_cooled_strix_gtx_980_and_gtx_970_cards <!--paging_filter--><h3><img src="/files/u69/asus_strix_gtx_980.jpg" alt="Asus Strix GTX 980" title="Asus Strix GTX 980" width="228" height="205" style="float: right;" />Cooler and quieter than reference</h3> <p>Are you of the opinion that speed limits are more like suggestions than enforced rules? Do you believe reference designs are for suckers? If you answered yes to one or both questions, you might be interested in the new <strong>Strix GTX 980 and Strix GTX 970 graphics cards from Asus</strong>. Both of these <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014">Maxwell-powered</a> parts sport factory overclocks and custom cooling solutions that are supposedly 30 percent quieter than Nvidia's reference cooler.</p> <p>As Asus is prone to do, it outfitted its new cards with its Digi+ voltage-regulator module (VRM) technology and Super Alloy Power components -- fancy terms for higher-end components intended to increase durability and cooling. Both cards also feature a DisplayPort interface to support gaming at up to 4K Ultra HD.</p> <p>The Strix GTX 980 is overclocked to 1279MHz, up from 1126MHz, and the Strix GTX 970 is goosed to 1253MHz, up from 1050MHz. Both sport 4GB of GDDR5 memory that runs at boosted speeds of up to 7010MHz, which is essentially reference.</p> <p>With regard to cooling, the cards use Asus' DirectCU II cooling technology that consists of 10mm heat pipes capable of transporting 40 percent more heat away from the GPU. The design also offers 220 percent more heat dissipation. 
<a href="http://rog.asus.com/362672014/gaming-graphics-cards-2/press-release-asus-announces-strix-gtx-980-and-strix-gtx-970/" target="_blank">According to Asus</a>, the cards run 30 percent cooler and 3x quieter than reference.</p> <p>Both cards should be available today. At the time of this writing, Newegg shows the Strix 970 as out of stock and priced at $340 (plus $6.31 shipping); there's no listing yet for the Strix 980.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/asus_rolls_out_overclocked_and_custom_cooled_strix_gtx_980_and_gtx_970_cards#comments asus Build a PC Gaming graphics card Hardware strix gtx 970 strix gtx 980 Video Card News Fri, 19 Sep 2014 16:58:35 +0000 Paul Lilly 28572 at http://www.maximumpc.com Turtle Beach Launches Ear Force Z60, Claims First PC Gaming Headset with DTS Headphone:X 7.1 http://www.maximumpc.com/turtle_beach_launches_ear_force_z60_claims_first_pc_gaming_headset_dts_headphonex_71 <!--paging_filter--><h3><img src="/files/u69/turtle_beach_ear_force_z60.jpg" alt="Turtle Beach Ear Force Z60" title="Turtle Beach Ear Force Z60" width="228" height="192" style="float: right;" />Now available in stores</h3> <p>Turtle Beach hasn't forgotten about PC gamers. In fact, <strong>Turtle Beach just announced the availability of its Ear Force Z60 PC gaming headset</strong> that was first unveiled at the Consumer Electronics Show (CES) earlier this year in Las Vegas. According to Turtle Beach, the Ear Force Z60 is the first and only PC gaming headset to utilize the new DTS Headphone:X 7.1 surround sound.</p> <p>DTS Headphone:X is supposed to make games, movies, and music sound better over headphones by enabling discrete control over speaker position and angles. Turtle Beach said it created audio presets using DTS Headphone:X for more realistic sound. How so?</p> <p>"Most surround sound mixes put the center channel about eight feet in front of the listener perceptually, which is the default for TV and film because the dialogue is coming from the screen," <a href="http://www.turtlebeach.com/forum/viewtopic.php?f=7&amp;t=20850" target="_blank">Turtle Beach explains</a>. "With a first person shooter and many other game genres, much of the center channel audio comes from the player's character for example the sound of the player’s feet walking in the snow, their inbound radio, and the sound of spent cartridges being ejected from their gun. Using DTS Headphone:X, the audio engineers at Turtle Beach created gaming presets that pull the center channel in towards the players' midsection to make the placement of those sounds more realistic."</p> <p>The Ear Force Z60 boasts large 60mm speakers. 
It also features an inline control unit to adjust chat and game audio individually, surround sound modes, and microphone mute.</p> <p>You can purchase the Ear Force Z60 now for <a href="http://www.turtlebeach.com/product-detail/pc-headsets/ear-force-z60/496" target="_blank">$120 MSRP</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/turtle_beach_launches_ear_force_z60_claims_first_pc_gaming_headset_dts_headphonex_71#comments audio ear force z60 Gaming Hardware headphone:x 7.1 Peripherals Sound turtle beach News Fri, 19 Sep 2014 16:16:58 +0000 Paul Lilly 28571 at http://www.maximumpc.com NFL TV Announcers Still Referring to Microsoft Surface Tablets as iPads http://www.maximumpc.com/nfl_tv_announcers_still_referring_microsoft_surface_tablets_ipads_2014 <!--paging_filter--><h3><img src="/files/u69/surface_2_3.jpg" alt="Surface 2" title="Surface 2" width="228" height="148" style="float: right;" />Microsoft has some coaching to do</h3> <p>Microsoft's archrival Apple is receiving some free publicity for its iPad line during NFL games. That's because TV announcers can't seem to fathom that not all tablets are iPads. Take the Surface, for example. After paying the NFL $400 million for Surface to be the official tablet of the league, <strong>Microsoft is understandably ticked that its slate keeps being referred to as an iPad on national television</strong>.</p> <p>It happened on more than one occasion through the <a href="http://www.maximumpc.com/nfL_announcers_slip_400_million_surface_deal_calls_microsofts_tablets_ipads_2014" target="_blank">end of last week</a>, and then again this past Sunday when the San Diego Chargers played against the defending champion Seattle Seahawks. In that instance, it was a local TV announcer who was confused when told the teams were using Surface tablets.</p> <p>"What? I thought it was an iPad," the announcer said, according to a <a href="http://www.latimes.com/business/la-fi-microsoft-nfl-surface-ipad-20140913-story.html#page=1" target="_blank">report in the <em>Los Angeles Times</em></a>.</p> <p>In addition to paying the NFL a reported $400 million to feature its Surface as the official tablet of the league, Microsoft also provided the NFL with hundreds of Surface 2 systems. Each team received 13 Surface tablets to use on the sideline and 12 to use in the coaches' booth. They're not connected to the Internet and teams are only allowed to view photos, not videos. 
The NFL then collects them after each game to prevent tampering.</p> <p>"Despite the majority of our friends in the booth correctly identifying the Surface on NFL sidelines, we're working with the league to coach up a select few," a Microsoft spokesman said.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nfl_tv_announcers_still_referring_microsoft_surface_tablets_ipads_2014#comments microsoft NFL Surface tablet News Fri, 19 Sep 2014 15:57:17 +0000 Paul Lilly 28570 at http://www.maximumpc.com Qualcomm's Awesome Vuforia SDK Brings Augmented Reality to Eyewear http://www.maximumpc.com/qualcomms_awesome_vuforia_sdk_brings_augmented_reality_eyewear_2014 <!--paging_filter--><p><img src="/files/u69/vuforia_eyewear.jpg" alt="Vuforia Eyewear" title="Vuforia Eyewear" width="228" height="151" style="float: right;" />Some exciting things are happening in the world of virtual reality<a href="http://hothardware.com/Tags/vr.aspx"></a>, and we're not just talking about the Oculus Rift. Multiple companies are jumping on board with the VR movement, including chip maker <strong>Qualcomm, which unveiled its Vuforia mobile vision platform</strong> that developers can use to build augmented reality (AR) applications for a new generation of digital hardware.</p> <p>The Vuforia Software Development Kit (SDK) for Digital Eyewear offers a "major advance in user experience," Qualcomm says. Using the SDK, developers can program apps with interactive 3D content that's visually aligned with the underlying world. This has all kinds of potential uses -- everything from gaming to shopping and education, etc.</p> <p>"The promise of digital eyewear is to create a heads-up display for our daily lives. While the realization of this promise remains in the future, Vuforia is taking a big step in the right direction by enabling a first generation of applications for consumer and enterprise use," <a href="https://www.qualcomm.com/news/releases/2014/09/18/qualcomm-announces-vuforia-digital-eyewear-powering-next-generation" target="_blank">said Jay Wright</a>, vice president of product management for Qualcomm Connected Experiences, Inc. "Developers will now have the tools required to build experiences that will drive the adoption of the digital eyewear category. We look forward to seeing what developers make possible."</p> <p><iframe src="//www.youtube.com/embed/HVTEGz2Rf74" width="620" height="349" frameborder="0"></iframe></p> <p>Two new features of the Vuforia SDK include Smart Terrain and HD Camera View. Short and to the point, these technologies allow users to build their own gaming environment using everyday objects. Smart Terrain then creates a 3D map of the environment you created in real-time, which in turn gives apps the ability to interact with those objects. In the video above, you can see various McDonald's objects -- cola cup, fry box, burger box -- placed on a table. A soccer app then maps the objects and allows you to make trick shots off of those items.</p> <p><iframe src="//www.youtube.com/embed/vfCQWjI9sYc" width="620" height="349" frameborder="0"></iframe></p> <p>As it pertains to wearable headsets, a lot of neat things are possible with the Vuforia SDK, and not just for gaming. 
For example, you could be staring at an object like a water pump through a digital headset and see how to tear it down, step-by-step. Or an engine or any other object.</p> <p>The Vuforia SDK for Digital Eyewear will be available this fall in beta form for a select group of developers.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/qualcomms_awesome_vuforia_sdk_brings_augmented_reality_eyewear_2014#comments augmented reality qualcomm SDK virtual reality vr vuforia News Fri, 19 Sep 2014 15:00:13 +0000 Paul Lilly 28569 at http://www.maximumpc.com Newegg Daily Deals: PNY Optima 240GB SSD, AMD A8-5600K Trinity Quad-Core APU, and More! http://www.maximumpc.com/newegg_daily_deals_pny_optima_240gb_ssd_amd_a8-5600k_trinity_quad-core_apu_and_more <!--paging_filter--><p><img src="/files/u69/pny_optima_240gb.jpg" alt="PNY Optima 240GB" title="PNY Optima 240GB" width="300" height="240" style="float: right;" /><img src="/files/u154082/newegg_logo_small.png" alt="newegg logo" title="newegg logo" width="200" height="80" /></p> <p><strong>Top Deal:</strong></p> <p>Say it with us now: "Fast is better than slow. Fast is better than slow. Fast is better than slow." It's true, right? Of course it is! That's why solid state drives exist. Don't you wish you had one to replace that aging and slow hard drive? A wish and a few pieces of eight is all it takes. Just wait until you take a gander at today's top deal for a <strong>PNY Optima 2.5-inch 240GB SATA III SSD</strong> for $100 with free shippping (additional $20 mail-in-rebate available). 
That's just $80 after rebate!</p> <p><strong>Other Deals:</strong></p> <p><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16819113281&amp;cm_mmc=BAC-MaximumPC-_-DailyDeals-_-CPU-N82E16819113281-_-0919&amp;nm_mc=ExtBanner&amp;AID=5555555" target="_blank">AMD A8-5600K Trinity Quad-Core 3.6GHz (3.9GHz Turbo) Socket FM2 100W Desktop APU</a> for <strong>$80</strong> with free shipping (normally $100 - use coupon code: [<strong>EMCPAWB32</strong>])</p> <p><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16820231486&amp;cm_mmc=BAC-MaximumPC-_-DailyDeals-_-RAM-N82E16820231486-_-0919&amp;nm_mc=ExtBanner&amp;AID=5555555" target="_blank">G.Skill Ripjaws X Series 16GB (2 x 8GB) 240-Pin DDR3 1333 (PC3 10666) Desktop Memory</a> for <strong>$144</strong> with free shipping (normally $160 - use coupon code: [<strong>EMCPAWB36</strong>])</p> <p><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16824009483&amp;cm_mmc=BAC-MaximumPC-_-DailyDeals-_-DISPLAY-N82E16824009483-_-0919&amp;nm_mc=ExtBanner&amp;AID=5555555" target="_blank">Acer H6 Series H236HLbid Black 23-inch 5ms (GTG) HDMI Widescreen LED Backlit LCD Monitor</a> for <strong>$130</strong> with free shipping (normally $160 - use coupon code: [<strong>EMCPAWB224</strong>])</p> <p><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16822236519&amp;cm_mmc=BAC-MaximumPC-_-DailyDeals-_-EXT-HDD-N82E16822236519-_-0919&amp;nm_mc=ExtBanner&amp;AID=5555555" target="_blank">WD 2TB WD Elements Portable USB 3.0 Hard Drive Storage</a> for <strong>$90</strong> with free shipping (normally $100 - use coupon code: [<strong>EMCPAWB54</strong>])</p> http://www.maximumpc.com/newegg_daily_deals_pny_optima_240gb_ssd_amd_a8-5600k_trinity_quad-core_apu_and_more#comments Daily Deals daily deals Newegg Fri, 19 Sep 2014 13:14:27 +0000 The Maximum PC Staff 28568 at http://www.maximumpc.com Nvidia GeForce GTX 980 Review http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014 <!--paging_filter--><h3><span style="font-size: 1.17em;">4K and SLI tested on Nvidia's high-end Maxwell card</span></h3> <p>Sometimes things don't go according to plan. Both AMD and Nvidia were supposed to have shifted to 20-nanometer parts by now. In theory, that's supposed to get you lower temperatures, higher clock speeds and quieter operation. Due to circumstances largely out of its control, Nvidia has had to go ahead with a 28nm high-end Maxwell part instead, dubbed GM204. This is not a direct successor to the GTX 780, which has more transistors, texture mapping units, and things like that. The 980 is actually the next step beyond the GTX 680, aka GK104, which was launched in March 2012.</p> <p>Despite that, our testing indicates that the GTX 980 can still be meaningfully faster than the GTX 780 and 780 Ti (and AMD’s Radeon R9 290 and 290X, for that matter, though there are a couple of games better optimized for Radeon hardware). When 20nm processes become available sometime next year, we'll probably see the actual successor to the GTX 780. But right now, the GTX 980 is here, and comes in at $549. That seems high at first, but recall that the GTX 680, 580, and 480 all launched at around $500. And keep in mind that it's a faster card than the 780 and 780 Ti, which currently cost more. (As we wrote this, AMD announced that it was dropping the base price of the R9 290X from $500 to $450, so that war rages on.) 
The GTX 970 at $329 may be a better deal, but we have not yet obtained one of those for testing.</p> <p>In other news, Nvidia told us that they were dropping the price of the GTX 760 to $219, and the GTX 780 Ti, 780 and 770 are being officially discontinued. So if you need a second one of those for SLI, now is a good time.</p> <p>Let's take a look at the specs:</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>GM204</td> <td>GM204&nbsp;</td> <td>GK104&nbsp;</td> <td>GK104&nbsp;</td> <td class="item-dark">GK110</td> <td>&nbsp;Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1126</td> <td>&nbsp;1050</td> <td>&nbsp;1006</td> <td>&nbsp;863</td> <td>876</td> <td>&nbsp;"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1216</td> <td>&nbsp;1178</td> <td>&nbsp;1058</td> <td>&nbsp;900</td> <td class="item-dark">928</td> <td>&nbsp;N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7000</td> <td>&nbsp;7000</td> <td>&nbsp;6000</td> <td>&nbsp;6000</td> <td>7000</td> <td>&nbsp;5000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;2GB/4GB</td> <td>&nbsp;3GB/6GB</td> <td>3GB</td> <td>&nbsp;4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;64</td> <td>&nbsp;64</td> <td>&nbsp;32</td> <td>&nbsp;48</td> <td>48</td> <td>&nbsp;64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;128</td> <td>&nbsp;192</td> <td>240</td> <td>&nbsp;176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;2048</td> <td>&nbsp;1664</td> <td>&nbsp;1536</td> <td>&nbsp;2304</td> <td>2880</td> <td>&nbsp;2816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;8</td> <td>&nbsp;12</td> <td>&nbsp;15</td> <td>&nbsp;N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;195</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;290</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$499</td> <td>&nbsp;$649</td> <td>&nbsp;$699</td> <td>&nbsp;$549</td> </tr> </tbody> </table> </div> <p>On paper, the 980 and 970 don't look like much of a jump from the 680. In fact, the 980 has only 128 shaders (aka "CUDA cores") per streaming multi-processor (SM). Performance tends to increase with a higher number of shaders per SM, so how did the 980 GTX perform so well in our benches, despite having a worse ratio than all the other cards? Well, Nvidia claims that they've improved the performance of each CUDA core by 40%. Provided that this calculation is accurate, the GTX 980 effectively has about as many CUDA cores as a 780 Ti. Add the GTX 980's bigger clock speeds, and performance should be higher.</p> <p><img src="/files/u160416/7g0a0209_620.jpg" width="620" height="349" /></p> <p>You probably also noticed the unusually low price for the GTX 970. The GTX 670 launched at $400 in May 2012, and the GTX 570 launched at $350 in December 2010. These earlier two cards were also had more similar specs compared to their bigger brothers. For example, the GTX 570 had 480 CUDA cores, while the 580 had 512 cores. 
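</p>

<p>(Here's the quick arithmetic behind those comparisons, in a few lines of Python. Keep in mind that the 40 percent per-core improvement is Nvidia's own claim, so the "effective" core count below is a rough estimate, not a measurement.)</p>

<pre>
# Back-of-envelope math for the core-count comparisons in this review.
cores = {"GTX 570": 480, "GTX 580": 512,
         "GTX 970": 1664, "GTX 980": 2048, "GTX 780 Ti": 2880}

gap_570 = 1 - cores["GTX 570"] / cores["GTX 580"]   # 0.0625 -> 6.25%
gap_970 = 1 - cores["GTX 970"] / cores["GTX 980"]   # 0.1875 -> nearly 20%

# Nvidia's claim: each Maxwell CUDA core does ~40% more work than a Kepler core.
effective_980 = cores["GTX 980"] * 1.4               # ~2867, close to the 780 Ti's 2880

print(f"570 vs. 580 core deficit: {gap_570:.2%}")
print(f"970 vs. 980 core deficit: {gap_970:.2%}")
print(f"GTX 980 'Kepler-equivalent' cores: {effective_980:.0f}")
</pre>

<p>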
That works out to a difference of just 6.25%, although the memory bus was reduced from 384 bits to 320 bits. In contrast, the 970 gets nearly 20% fewer CUDA cores than the 980, though its memory bus remains unchanged. As we said, we haven't gotten a 970 in yet, but, based on its specs, we doubt that we can compensate with overclocking, as we've been able to do in the past with the GTX 670 and 760, and the Radeon R9 290.</p> <p>Nvidia also says that the official boost clock on these new Maxwell cards is not set in stone. We witnessed our cards boosting up to 1,253MHz for extended periods of time (e.g., 20 seconds here, 30 seconds there). When the cards hit their thermal limit of 80 degrees Celsius, they would fall as low as 1,165MHz, but we never saw them throttle below the official base clock of 1,126MHz. In SLI, we also noted that the upper card would go up to 84 C. According to Nvidia, these cards have an upper boundary of 95 C, at which point they will throttle below the base clock to avoid going up in smoke. We were not inclined to test that theory, for now.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,1" target="_blank">Next Page: Voxels, new anti-aliasing, and VR</a></h4> <hr /> <p>The company also says that its delta color compression algorithms have improved bandwidth requirements by about 25 percent on average (it varies from game to game). This extra headroom leaves more bandwidth for higher frame rates. Since DCC directly affects pixels, this effect should scale with your resolution, becoming increasingly helpful as you crank your res higher.</p> <p>You can also combine these gains with Nvidia’s new Multi-Frame Sampled Anti-Aliasing (MFAA). This technique rotates a pixel’s sampling points from one frame to the next, so that two of these points can simulate the visual results of four sampling points whose locations remain static. (A toy illustration of this rotating-sample idea appears after the VXGI discussion below.) The effect starts to shimmer at about 20FPS, whereupon it’s automatically disabled. But when running well, Nvidia claims that it can be 30 percent faster, on average, than the visually equivalent level of Multi-Sample Anti-Aliasing (MSAA). Like TXAA (Temporal Anti-Aliasing), this technique won’t be available on AMD cards (or if it is, it will be built by AMD from the ground up and called something else).</p> <p><img src="/files/u160416/7g0a0238_resize.jpg" width="620" height="349" /></p> <p>Unfortunately, MFAA was not available in the version 344.07 beta drivers given to us for testing, but Nvidia said it would be in the driver after this one. This means that the package will not be complete on launch day. Support will trickle down to the older Kepler cards later on. Nvidia hasn’t been specific about the timeline for individual cards, but it sounded like the 750 and 750 Ti (also technically Maxwell cards) will not be invited to this party.</p> <p>Another major upgrade is Voxel Global Illumination, or VXGI. Nvidia positions this as the next step beyond ambient occlusion. With VXGI, light bounces off of surfaces to illuminate nooks and crannies that would otherwise not be lit realistically, in real time. Ordinarily, light does not bounce around in a 3D game engine like it does in meatspace. It simply hits a surface, illuminates it, and that’s the end. Sometimes the lighting effect is just painted onto the texture. So there’s a lot more calculation going on with VXGI.</p>
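<p>(To make the MFAA idea above a little more concrete, here is a toy sketch in Python. This is not Nvidia's actual algorithm, just the general principle: estimate how much of a pixel an edge covers by point-sampling it, take two samples per frame, and rotate which two positions you use every other frame. For a static edge, averaging two consecutive frames gives exactly the same answer as taking all four samples at once; motion is what breaks the trick, which is why the effect shuts off at low frame rates.)</p>

<pre>
# Toy model of rotating sample positions across frames (conceptual only).
# A pixel spans [0, 1) x [0, 1); the "scene" is a single half-plane edge.
def covered(x, y):
    return x + y < 1.3          # True if this sample point falls inside the geometry

# Four sample positions inside the pixel (a rotated-grid-style pattern).
SAMPLES = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]

def coverage(points):
    hits = sum(1 for (x, y) in points if covered(x, y))
    return hits / len(points)

msaa_4x   = coverage(SAMPLES)         # four samples every frame
frame_a   = coverage(SAMPLES[:2])     # even frames: first two positions
frame_b   = coverage(SAMPLES[2:])     # odd frames: the other two positions
mfaa_like = (frame_a + frame_b) / 2   # temporal blend of two 2x frames

print(msaa_4x, mfaa_like)             # identical for a static scene: 0.75 0.75
</pre>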
<p><img src="/files/u160416/maxwell_die_620.jpg" width="620" height="349" /></p> <p>But Nvidia has not made specific performance claims for VXGI because the effect is highly scalable. A developer can choose how many cones of light to use and the resolution of the bounced light (anything from diffused, blurry spots of light to a reflection that’s nearly a mirror image of the bounced surface), and balance the result against a performance target. Since this is something that has to be coded into the game engine, we won’t see that effect right away by forcing it in the drivers, like Nvidia users can with ambient occlusion.</p> <p>Next is Dynamic Super Resolution (in the 344.11 drivers released today, so we'll be giving this one a peek soon). This tech combines super-sampling with a custom filter. Super-sampling renders at a higher resolution than your monitor can display and squishes it down. This is a popular form of anti-aliasing, but the performance hit is pretty steep. The 13-tap Gaussian filter that the card lays on top can further smooth out jaggies. It's a post-process effect that's thankfully very light, and you can also scale DSR down from 3840x2160 to 2560x1440. It's our understanding that this effect is only available to owners of the 980 and 970, at least for now, but we'll be checking on that ASAP. (A bare-bones sketch of the supersample-and-filter idea appears just ahead of the benchmark section below.)</p> <p>Nvidia is also investing more deeply into VR headsets with an initiative called VR Direct. Their main bullet point is a reduction in average latency from 50ms to 25ms, using a combination of code optimization, MFAA, and another new feature called Auto Asynchronous Warp (AAW). This displays frames at 60fps even when performance drops below that. Since each eye is getting an independently rendered scene, your PC effectively needs to maintain 120FPS otherwise, which isn’t going to be common with more demanding games. AAW takes care of the difference. However, we haven’t had the opportunity to test the GTX 980 with VR-enabled games yet.</p> <p>Speaking of which, Nvidia is also introducing another new feature called Auto Stereo. As its name implies, it forces stereoscopic rendering in games that were not built with VR headsets in mind. We look forward to testing VR Direct at a later date.</p> <p>Lastly, we also noticed that GeForce Experience can now record at 3840x2160; it was previously limited to 2560x1600.</p> <p>Until we get our hands on MFAA and DSR, we have some general benchmarks to tide you over. We tested the GTX 980 in two-way SLI and by itself, at 2560x1600 and 3840x2160. 
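</p>

<p>(As promised above, here is a bare-bones illustration of the DSR idea in Python: render at a higher resolution, blur with a separable Gaussian, then decimate down to the display resolution. The 13-tap width comes from Nvidia's description; the actual filter coefficients and scaling behavior are not public, so treat this as a generic supersampling sketch rather than a reimplementation.)</p>

<pre>
import numpy as np

def gaussian_kernel(taps=13, sigma=2.0):
    # Symmetric 1D Gaussian, normalized so the weights sum to 1.
    x = np.arange(taps) - (taps - 1) / 2.0
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def supersample_resolve(frame, factor=2, taps=13, sigma=2.0):
    # frame: float array shaped (H, W, 3), rendered at `factor` times the
    # display resolution. Returns an array shaped (H // factor, W // factor, 3).
    k = gaussian_kernel(taps, sigma)
    out = np.empty_like(frame)
    for c in range(frame.shape[2]):
        # Separable filter: convolve each row, then each column.
        rows = np.apply_along_axis(np.convolve, 1, frame[:, :, c], k, mode="same")
        out[:, :, c] = np.apply_along_axis(np.convolve, 0, rows, k, mode="same")
    return out[::factor, ::factor, :]

# Tiny stand-in for a rendered frame; an actual DSR pass would be 3840x2160 on the GPU.
rng = np.random.default_rng(0)
fake_frame = rng.random((216, 384, 3))
print(supersample_resolve(fake_frame).shape)   # (108, 192, 3)
</pre>

<p>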
We compared it to roughly equivalent cards that we've also run in solo and two-way configs.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,2" target="_blank">Next Page: SLI Benchmarks!</a></h4> <hr /> <p>Here's the system that we've been using for all of our recent GPU benchmarks:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Thermaltake Toughpower Grand (1,050 watts)</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 Update 1</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p class="MsoNormal" style="text-align: left;"><span style="text-align: center;">Now, let’s take a look at our results at 2560x1600 with 4xMSAA. For reference, this is twice as many pixels as 1920x1080. So gamers playing at 1080p on a similar PC can expect roughly twice the framerate, if they use the same graphical settings. We customarily use the highest preset provided by the game itself; for example, <em>Hitman: Absolution</em> is benchmarked with the “Ultra” setting. 3DMark runs the Firestrike test at 1080p, however. We also enable TressFX in Tomb Raider, and PhysX in Metro: Last Light.</span></p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>33</strong></td> <td>&nbsp;19</td> <td>25</td> <td class="item-dark">&nbsp;27</td> <td>&nbsp;26</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>46</strong></td> <td>&nbsp;21</td> <td>&nbsp;22</td> <td>&nbsp;32</td> <td>&nbsp;30</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;75</td> <td>&nbsp;51</td> <td>&nbsp;65</td> <td>&nbsp;<strong>78</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;27</td> <td>&nbsp;40</td> <td>&nbsp;45</td> <td>&nbsp;<strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;45</td> <td>&nbsp;30</td> <td>&nbsp;43</td> <td>&nbsp;<strong>48</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;64</td> <td>&nbsp;35</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;34</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>11,490</strong></td> <td>&nbsp;6,719</td> <td>&nbsp;8,482</td> <td>&nbsp;9,976</td> <td>9,837</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong><strong>)</strong></p> <p class="MsoNormal" style="text-align: left;">To synthesize the results into a few sentences, we would say that the 980 is doing very well for its price. It’s not leapfrogging over the 780 and 780 Ti, but Nvidia indicates that it’s not supposed to anyway. 
It dominates the GTX 680, but that card is also two years old and discontinued, so the difference is not unexpected or likely to change buying habits. The R9 290X, meanwhile, is hitting $430, while the not-much-slower 290 can be had for as little as $340. And you can pick up a 780 Ti for $560. So the GTX 980's price at launch is going to be a bit of a hurdle for Nvidia.</p> <p class="MsoNormal" style="text-align: left;">Performance in Metro: Last Light has also vastly improved. (We run that benchmark with “Advanced PhysX” enabled, indicating that Nvidia has made some optimizations there. Further testing is needed.) Loyal Radeon fans will probably not be swayed to switch camps, at least on the basis of pure performance. Hitman in particular does not appear to favor the Green Team.</p> <p class="MsoNormal" style="text-align: left;">We were fortunate enough to obtain a second GTX 980, so we decided to set them up in SLI, at the same resolution of 2560x1600. Here, the differences are more distinct. We’ve honed the comparison down to the most competitive cards that we have SLI/CF benchmarks for. (Unfortunately, we do not have a second GTX 680 in hand at this time. But judging by its single-card performance, it's very unlikely to suddenly pull ahead.) For this special occasion, we brought in the Radeon R9 295X2, which has two 290X GPUs on one card and has been retailing lately for about a thousand bucks.</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>66</strong></td> <td>&nbsp;45</td> <td>&nbsp;56</td> <td>&nbsp;50</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>70</strong></td> <td>&nbsp;52</td> <td>&nbsp;53</td> <td>&nbsp;48</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;131</td> <td>&nbsp;122</td> <td>&nbsp;<strong>143</strong></td> <td>&nbsp;90</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;77</td> <td>&nbsp;74</td> <td>&nbsp;<strong>79</strong></td> <td>&nbsp;79</td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;80</td> <td>&nbsp;72</td> <td>&nbsp;<strong>87</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;73</td> <td>&nbsp;60</td> <td><strong>&nbsp;77</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>17,490</strong></td> <td>&nbsp;14,336</td> <td>&nbsp;16,830</td> <td>&nbsp;15,656</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p class="MsoNormal" style="text-align: left;">While a solo 980 GTX is already a respectable competitor for the price, its success is more pronounced when we add a second card—as is the gap between it and the 780 Ti. It still continues to best the GTX 780, getting us over 60 FPS in each game with all visual effects cranked up. That's an ideal threshold. It also looks like Nvidia's claim of 40 percent improved CUDA core performance may not be happening consistently. Future driver releases should reveal if this is a matter of software optimization, or if it's a limitation in hardware. Or just a random cosmic anomaly.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,3" target="_blank">Next Page: 4K benchmarks and conclusion</a></h4> <hr /> <p class="MsoNormal" style="text-align: left;">So, what happens when we scale up to 3840x2160, also known as “4K”? 
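</p>

<p class="MsoNormal" style="text-align: left;">(Before the numbers, some quick back-of-envelope arithmetic on what 4K asks of these cards, using the bus widths and memory clocks from the spec table earlier in the review. The bandwidth figures are theoretical peaks, not measurements.)</p>

<pre>
# Pixel counts for the resolutions used in this review.
for name, px in {"1920x1080": 1920 * 1080, "2560x1600": 2560 * 1600,
                 "3840x2160": 3840 * 2160}.items():
    print(f"{name}: {px / 1e6:.2f} Mpixels")
# 3840x2160 pushes roughly 2x the pixels of 2560x1600 and exactly 4x those of 1080p.

# Theoretical peak memory bandwidth = (bus width in bytes) x (effective VRAM rate in MT/s).
for name, (bus_bits, mts) in {"GTX 980": (256, 7000), "GTX 780 Ti": (384, 7000),
                              "GTX 780": (384, 6000), "R9 290X": (512, 5000)}.items():
    print(f"{name}: {bus_bits // 8 * mts / 1000:.0f} GB/s")
# GTX 980: 224 GB/s vs. 336 GB/s (780 Ti), 288 GB/s (780), and 320 GB/s (290X).
</pre>

<p class="MsoNormal" style="text-align: left;">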
Here we have almost twice as many pixels as 2560x1600, and four times as many as 1080p. Can the GTX 980’s 256-bit bus really handle this much bandwidth?</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;16</td> <td>&nbsp;8.7*</td> <td>&nbsp;26</td> <td class="item-dark">&nbsp;<strong>28</strong></td> <td>&nbsp;28</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>36</strong></td> <td>&nbsp;12</td> <td>&nbsp;18</td> <td>&nbsp;19</td> <td>&nbsp;18</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;35</td> <td>&nbsp;25</td> <td>&nbsp;33</td> <td>&nbsp;<strong>38</strong></td> <td>&nbsp;38</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;20</td> <td>&nbsp;15</td> <td>&nbsp;20</td> <td>&nbsp;24</td> <td><strong>&nbsp;28</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;19</td> <td>&nbsp;15</td> <td>&nbsp;<strong>30</strong></td> <td><strong>&nbsp;30</strong></td> <td>&nbsp;26</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;19</td> <td>&nbsp;11</td> <td>&nbsp;<strong>23</strong></td> <td><strong>&nbsp;23</strong></td> <td>&nbsp;18</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>*TressFX disabled</p> <p>The 980 is still scaling well, but the 384-bit 780 and 780 Ti are clearly scaling better, as is the 512-bit 290X. (<strong>Update:</strong>&nbsp;We've re-checked our test results for Hitman: Absolution, and the AMD cards weren't doing nearly as well as we originally thought, though they're still the best option for that particular game. The Batman tests have been re-done as well.) We had to disable TressFX when benchmarking the 680, because the test would crash otherwise, and it was operating at less than 1FPS anyway. At 4K, that card basically meets its match, and almost its maker.</p> <p>Here's 4K SLI/Crossfire. All tests are still conducted at 4xMSAA, which is total overkill at 4K, but we want to see just how hard we can push these cards. (Ironically, we have most of the SLI results for the 290X here, but not for 2560x1600. 
That's a paddlin'.)</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;33</td> <td>&nbsp;41</td> <td>&nbsp;44</td> <td class="item-dark">&nbsp;52</td> <td>&nbsp;<strong>53</strong></td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;21</td> <td>&nbsp;27</td> <td>&nbsp;29</td> <td>&nbsp;26</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;<strong>68</strong></td> <td>&nbsp;60</td> <td>&nbsp;65</td> <td>&nbsp;67</td> <td>&nbsp;66</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;40</td> <td>&nbsp;44</td> <td><strong>&nbsp;53</strong></td> <td><strong>&nbsp;</strong><strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;39</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;40</td> <td>&nbsp;24</td> <td>&nbsp;19</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;34</td> <td>&nbsp;33</td> <td>&nbsp;<strong>44</strong></td> <td>&nbsp;17</td> <td>&nbsp;34</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>It does appear that the raw memory bandwidth of the 780, 780 Ti, and 290X come in handy at this resolution, despite the optimizations of Maxwell CUDA cores. That Metro: Last Light score remains pretty interesting. It's the only one we run with PhysX enabled (to balance out using TressFX in Tomb Raider). It really does look like Maxwell is much better at PhysX than any other GPU before it. That tech isn't quite common enough to change the game. But if the difference is as good as our testing indicates, more developers may pick it up.</p> <p>Even a blisteringly fast card can be brought down by high noise levels or prodigious heat. Thankfully, this reference cooler is up to the task. Keep in mind that this card draws up to 165 watts, and its cooler is designed to handle cards that go up to 250W. But even with the fan spinning up to nearly 3,000rpm, it’s not unpleasant. With the case side panels on, you can still hear the fan going like crazy, but we didn’t find it distracting. These acoustics only happened in SLI, by the way. Without the primary card sucking in hot air from the card right below it, its fan behaved much more quietly. The GTX 980’s cooling is nothing like the reference design of the Radeon R9 290 or 290X.</p> <p><img src="/files/u160416/key_visual_620.jpg" width="620" height="349" /></p> <p>With a TDP of just 165W, a respectable 650-watt power supply should have no trouble powering two 980 GTXs. Meanwhile, the 290-watt R9 290X really needs a nice 850-watt unit to have some breathing room, and even more power would not be unwelcome.</p> <p>Since MFAA and DSR were not available in the driver that was supplied for testing, there’s more story for us to tell over the coming weeks. (<strong>Update</strong>: DSR settings are actually in this driver, just not in the location that we were expecting.) And we still need to do some testing with VR. But as it stands right now, the GTX 980 is another impressive showing for Nvidia. Its 4K scaling isn't as good as we'd like, especially since Maxwell is currently the only tech that will have Dynamic Super Resolution. If you want to play at that level, it looks like the 290 and 290X are better choices, price-wise, while the overall performance crown at 4K still belongs to the 780 and 780 Ti. 
But considering the price difference between the 980 and the 780, its similar performance is commendable.</p> <p>For 2560x1600 or lower resolutions, the GTX 980 emerges as a compelling option, but we're not convinced that it's over $100 better than a 290X. Then again, you have MFAA, DSR, and VR Direct (and the overall GeForce Experience package, which is a bit slicker than AMD's Gaming Evolved), which might win over some people, and the card makes sense for Nvidia loyalists who've been waiting for an upgrade from their 680 that's not quite as expensive as the 780 or 780 Ti.</p> <p><a href="http://www.pcgamer.com/2014/09/19/nvidia-gtx-980-tested-sli-4k-and-single-gpu-benchmarks-and-impressions/" target="_blank">Our amigo Wes Fenlon over at PC Gamer has a write-up of his own</a>, so go check it out.</p> http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014#comments 4k 980 GTX benchmarks comparison geforce gpu nvidia performance Review sli Video Card Videocards Fri, 19 Sep 2014 03:04:15 +0000 Tom McNamara 28564 at http://www.maximumpc.com