nvidia http://www.maximumpc.com/taxonomy/term/320/ en MSI Injects Gaming All-in-One Systems with Nvidia's Latest Mobile GPUs http://www.maximumpc.com/msi_injects_gaming_all--one_systems_nvidias_latest_mobile_gpus_2014 <!--paging_filter--><h3><img src="/files/u69/2014-10_product_ag270_1.jpg" alt="MSI AIO" title="MSI AIO" width="228" height="155" style="float: right;" />High power gaming on an all-in-one</h3> <p><strong>MSI has gone and upgraded its 27-inch all-in-one gaming PCs with Nvidia's recently announced Maxwell-based mobile GPUs</strong>, the GeForce GTX 970M and 980M. These are supposedly the first AIO systems to feature Maxwell in mobile form, though the story doesn't end there -- they also feature a 4th generation Intel Core i7 4860HQ quad-core processor clocked at 2.4GHz (up to 3.6GHz via Turbo) and up to 16GB of DDR3L-1600 RAM.</p> <p>The 27-inch display on both the <a href="http://eu.msi.com/product/aio/AG270-2QC.html#hero-overview" target="_blank">AG270 2QC</a> (GTX 970M) and <a href="http://eu.msi.com/product/aio/AG270-2QE.html#hero-overview" target="_blank">AG270 2QE</a> (GTX 980M) models features a Full HD 1080p (1920x1080) resolution with multi-touch support, anti-flicker technology, and "Less Blue Light" technology applied to its anti-glare implementation -- <a href="http://game.msi.com/us/news?List=42&amp;N=3326" target="_blank">according to MSI</a>, the end result is less eyestrain during extended gaming sessions.</p> <p>"To provide gamers with an even better gaming experience, the AG270 uses an anti-glare matte display featuring Anti-Flicker technology, which stabilizes the electrical current to prevent serious flickering seen in standard displays. 
Together with Less Blue Light technology, this helps to reduce eye fatigue after extended use while also enhancing the quality of the gaming environment," MSI explains.</p> <p>Other features include Killer E2200 LAN, up to three mSATA SSDs in RAID 0, 3.5-inch HDD (various options), Blu-ray writer, dual Yamaha 5W speakers, 802.11ac Wi-Fi, Bluetooth, four USB 3.0 ports (one with Super Charger technology), two USB 2.0 ports, a 3-in-1 card reader, 2MP webcam, HDMI input, HDMI output, VGA output, microphone and headphone jacks, and Windows 8.1.</p> <p>Depending on the exact configuration, these systems are pretty pricey. We've only spotted a few so far online, which ranged from around $2,100 to $2,700.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/msi_injects_gaming_all--one_systems_nvidias_latest_mobile_gpus_2014#comments aio all-in-one Gaming geforce gtx 970m geforcegtx 980m Hardware msi nvidia OEM rigs News Thu, 16 Oct 2014 15:41:38 +0000 Paul Lilly 28729 at http://www.maximumpc.com Google Unveils Nexus 9 Tablet with 64-bit Tegra K1 Inside http://www.maximumpc.com/google_unveils_nexus_9_tablet_64-bit_tegra_k1_inside_2014 <!--paging_filter--><h3><img src="/files/u69/nexus_9_tablet.jpg" alt="Nexus 9" title="Nexus 9" width="228" height="172" style="float: right;" />World's first Android 5.0 Lollipop tablet</h3> <p>Today's a big day for Google and its Android platform. In addition to launching the big-size <a href="http://www.maximumpc.com/motorola_rolls_out_nexus_6_handset_google_featuring_android_50_lollipop_2014">Nexus 6 handset</a> built by Motorola, <strong>Google today also unveiled the Nexus 9 tablet built by HTC</strong>. 
Like the Nexus 6 smartphone, the Nexus 9 rocks the newest build of Google's mobile operating system, Android 5.0, now known as Lollipop. Unlike the Nexus 6, the Nexus 9 sports a 64-bit Nvidia Tegra K1 processor clocked at 2.3GHz.</p> <p>The Nexus 9 features an 8.9-inch IPS display with a 2048x1536 resolution (QXGA) and 4:3 aspect ratio. It also boasts 2GB of RAM, 16GB or 32GB of built-in storage (non-expandable), a 1.6-megapixel front-facing camera, an 8-megapixel rear-facing camera, and front-firing HTC BoomSound speakers.</p> <p>Brushed metal sides, clean lines, and a thin bezel give the tablet a sleek look -- at least, that's our impression from the press photos we've seen. There's also a soft grip back and "subtle curves," Google says. Optionally, you can add a keyboard folio that magnetically attaches to the Nexus 9 -- it folds into two different angles and is supposed to rest on your lap like a laptop.</p> <p>The 16GB ($399) and 32GB ($479) <a href="http://www.google.com/nexus/9/" target="_blank">Nexus 9</a> will go up for pre-order on October 17th.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/google_unveils_nexus_9_tablet_64-bit_tegra_k1_inside_2014#comments android 5.0 Google Hardware htc lollipop mobile nexus 9 nvidia slate tablet tegra k1 News Wed, 15 Oct 2014 19:42:18 +0000 Paul Lilly 28725 at http://www.maximumpc.com AMD Claims its GPUs are Great at Tackling VR Latency http://www.maximumpc.com/amd_claims_its_gpus_are_great_tackling_vr_latency_2014 <!--paging_filter--><h3>Looks like Nvidia isn't the only GPU company equipped to take on VR latency</h3> <p>While PC gamers are excited about the release of Nvidia’s <a title="980 review" 
href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">GeForce GTX 980</a> graphics card, it is the Oculus community that is gushing over the GPU. The 980 has become a darling of the VR community with Nvidia’s claims that the new Maxwell-architecture video card will cut latency by up to 50%. It also helps that Oculus VR used the GTX 980 on its systems at its inaugural <a title="oculus connect" href="http://www.maximumpc.com/new_%E2%80%98crescent_bay%E2%80%99_rift_prototype_impresses_better_display_360-degree_tracking" target="_blank">Oculus Connect</a>&nbsp;event held in September.</p> <p style="text-align: center;"><img src="http://international.download.nvidia.com/geforce-com/international/images/nvidia-virtual-reality-technology/nvidia-virtual-reality-latency-reduction-technology-640px.png" alt="Nvidia VR latency" title="Nvidia VR latency" width="640" height="358" /></p> <p style="text-align: center;"><strong>Nvidia claims that its GPU technology can potentially reduce latency from 50ms to a much more pleasant 25ms.&nbsp;</strong></p> <p>For the uninitiated, latency is a difficult challenge for VR companies like Oculus to solve because too much latency can lead to nausea–inducing head tracking and undesired motion blur, which detract from the experience.</p> <p>While this sounds like a big win for the green team and a missed opportunity for AMD, Oculus VR Software Architect Tom Forsyth told <a href="http://www.tomshardware.com/news/oculus-oculus-connect-vr-amd-nvidia,27729.html" target="_blank">Tom’s Hardware</a> that these technological capabilities were already in AMD graphics cards.</p> <p style="text-align: center;"><img src="http://international.download.nvidia.com/geforce-com/international/images/geforce-gtx-980-970-feature-article/vr-direct.png" alt="Nvidia" title="Nvidia" width="640" height="355" /></p> <p style="text-align: center;"><strong>Nvidia has been promoting the VR aspect of its new Maxwell GPUs</strong></p> <p>When we 
followed up with AMD to see if it could back up those claims, the company confirmed Forsyth’s assertion and told us, “In comments to Tom’s Hardware made by Oculus VR’s Tom Forsyth, AMD Radeon hardware already supports reduced-latency VR rendering through ‘asynchronous timewarp.’ Asynchronous timewarp can be exposed in AMD Radeon hardware via the Asynchronous Compute Engines (ACE), which can schedule and execute compute and display operations independently of the graphics queue. The ACE is a fundamental architectural building block of AMD Radeon GPUs utilizing the Graphics Core Next architecture.”</p> <p>While the jury is still out on which graphics-card company will provide the best GPUs for VR moving forward, it sounds like you shouldn’t count out AMD in the latency department just yet. We've got a DK2 on order, so expect more VR-related stories as soon as we get it in!</p> http://www.maximumpc.com/amd_claims_its_gpus_are_great_tackling_vr_latency_2014#comments amd asynchronous time warp gcn GeForce 980 graphics core next maximum pc nvidia oculus rift vr Gaming News Wed, 15 Oct 2014 17:57:21 +0000 Jimmy Thang 28724 at http://www.maximumpc.com Zotac GeForce GTX 970 AMP! Extreme Unboxing (Video) http://www.maximumpc.com/zotac_geforce_gtx_970_amp_extreme_unboxing_video <!--paging_filter--><h3>Check out Zotac's extremely fancy GeForce GTX 970</h3> <p>Tom's back again with another video, since being on camera he has become drunk with power. This time, he's showing off Zotac's shiny AMP! Extreme Edition of the GTX 970, with boosted clock speeds, big cooling, and even a carbon fiber-esque backplate. This card uses Nvidia's new "Maxwell" architecture, which improves power efficiency and performance, in addition to adding features like Voxel Global Illumination and Multi-Frame Sampled Anti-Aliasing. 
<a title="Nvidia GeForce GTX 980 review" href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">You can read all about that in our review of the GTX 980</a>, which is the 970's big brother (as its numbering probably indicated).</p> <p><img src="/files/u160416/zotac970.jpg" alt="Zotac GeForce GTX 970 AMP! Extreme Edition" title="Zotac GeForce GTX 970 AMP! Extreme Edition" width="620" height="390" /></p> <p>The AMP! Extreme Edition is very fancy and costs $410 (up from the GTX 970's normal $330 asking price). Zotac isn't generally known for high-performance variants. MSI has "Lightning," ASUS has "Republic of Gamers," and Sapphire has "Vapor-X," to name a few. After checking out this card, we wonder if Zotac will get an enthusiast spotlight of its own. Check out our video for the details on this guy.</p> <p><iframe src="//www.youtube.com/embed/y3mh5pb1tnM" width="560" height="315" frameborder="0"></iframe></p> http://www.maximumpc.com/zotac_geforce_gtx_970_amp_extreme_unboxing_video#comments geforce graphics GTX 970 MPCTV nvidia unboxing video Video Card zotac News Wed, 08 Oct 2014 19:26:58 +0000 Tom McNamara 28685 at http://www.maximumpc.com Maxwell Goes Mobile as Nvidia Launches GeForce GTX 970M and 980M GPUs http://www.maximumpc.com/maxwell_goes_mobile_nvidia_launches_geforce_gtx_970m_and_980m_gpus <!--paging_filter--><h3><img src="/files/u69/nvidia_geforce_gtx_980m.jpg" alt="Nvidia GeForce GTX 980M" title="Nvidia GeForce GTX 980M" width="228" height="126" style="float: right;" />Update: Now with more screens and information on Nvidia Battery Boost</h3> <p>When Nvidia unveiled its first Maxwell-based graphics cards during its <a href="http://www.maximumpc.com/nvidia_gets_ready_game24_first_ever_24-hour_pc_gaming_celebration_2014">GAME24 event</a>, the company trumpeted <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014">increased performance</a> alongside power efficiency, allowing for high-end video cards that run 
cooler and quieter. That's the kind of combination that's ideal for mobile gamers, and if you've been waiting for Maxwell to arrive on laptops, your wait is over -- <strong>Nvidia this morning launched its GeForce GTX 970M and 980M notebook GPUs</strong>.</p> <p>"Let’s start with some history. Nvidia’s 8th-generation GPU architecture, Fermi, delivered about 40 percent of the desktop equivalent in 2010. Kepler, our 9th generation GPU, launched in 2012, closed the gap to 60 percent, giving gamers 1080p resolution and 'ultra' settings for the first time in a notebook," Nvidia explained in a <a href="http://blogs.nvidia.com/blog/2014/10/07/maxwell-comes-to-notebooks/" target="_blank">blog post</a>.</p> <p>"With Maxwell, that gap shrinks to 80 percent of the desktop equivalent and pushes the resolution well beyond 1080p. It’s an astonishing achievement when you compare the thermal and power differences in a desktop tower and a notebook chassis," Nvidia continued.</p> <p style="text-align: center;"><strong><img src="/files/u154082/geforce_gtx_980_block_diagram_final.png" alt="980M block diagram" title="980M block diagram" width="620" height="578" /></strong></p> <p style="text-align: center;"><strong>980M block diagram</strong></p> <p>The GeForce GTX 980M wields 1536 CUDA cores with a base clockspeed of 1038MHz and an unspecified boost clockspeed. It uses GDDR5 memory clocked at 2500MHz on a 256-bit bus, which translates into 160GB/s of memory bandwidth. All the latest APIs and technologies are supported, such as Optimus, GameStream, ShadowPlay, DirectX 12, OpenGL 4.4, OpenCL 1.1, PCI Express 3.0, and so forth.</p> <p>Nvidia's GTX 970M is slightly toned down, with 1280 CUDA cores at a base clockspeed of 924MHz, also with an unspecified boost clock. 
The biggest difference between the two is that the 970M's memory travels through a 192-bit bus and tops out at 120GB/s of memory bandwidth.</p> <p style="text-align: center;"><strong><img src="/files/u154082/battery_boost_980m.png" alt="980m battery boost" title="980m battery boost" width="620" height="374" /></strong></p> <p style="text-align: center;"><strong>Nvidia promises better battery boost with its 900-series mobile cards</strong></p> <p>One returning feature that Nvidia is touting is Battery Boost, which throttles GPU performance down to as low as 30FPS to optimize for battery life. If you’ve been following our coverage of Battery Boost, however, you might remember how disappointed we’ve been in it. From our testing, Battery Boost seems to have little-to-no impact on battery life. When we met with Nvidia, the company acknowledged that it hasn’t done the best job of working with OEMs to implement the feature, which it says it is now working hard to rectify. Nvidia claims that, according to its internal tests, Battery Boost can increase battery life anywhere from 29-55% in games such as League of Legends and Tomb Raider.&nbsp;</p> <p><iframe src="//www.youtube.com/embed/F1jXswkHxdI" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Nvidia's GeForce GTX 980M with battery boost demo</strong></p> <p>According to Nvidia, both GPUs are capable of rendering games at up to 4K resolution. Since most laptops don't support resolutions above 1920x1080, the Maxwell parts use something called <a href="http://www.geforce.com/whats-new/articles/dynamic-super-resolution-instantly-improves-your-games-with-4k-quality-graphics" target="_blank">Dynamic Super Resolution</a> (DSR) to render games at the higher res and then scale them down. 
Nvidia says this results in superior image quality compared to rendering directly at 1080p, and considers it Maxwell's most exciting new technology.</p> <p><img src="/files/u154082/maxwell_laptops.png" alt="upcoming gaming laptops with 900 series mobile GPUs" title="upcoming gaming laptops with 900 series mobile GPUs" width="620" height="372" /></p> <p>Look for notebooks equipped with the new <a href="http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-980m" target="_blank">GeForce GTX 980M</a> and <a href="http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-970m" target="_blank">970M</a> GPUs to start shipping today.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/maxwell_goes_mobile_nvidia_launches_geforce_gtx_970m_and_980m_gpus#comments Gaming geforce gtx 970m geforce gtx 980m gpu Hardware laptop maxwell mobile notebook nvidia News Wed, 08 Oct 2014 00:26:31 +0000 Paul Lilly and Jimmy Thang 28673 at http://www.maximumpc.com Acer 4K G-Sync Monitor Tested with a GTX 980 (Video) http://www.maximumpc.com/acer_4k_g-sync_monitor_tested_gtx_980_video <!--paging_filter--><p><img src="/files/u160416/xb280hk.png" alt="Acer XB280HK 4K Monitor" title="Acer XB280HK 4K Monitor" width="250" height="250" style="float: right;" /></p> <h3>Acer joins the G-Sync party</h3> <p>We got Acer's XB280HK monitor in, which is the company's 28-inch 4K G-Sync unit. For now, it's the only 4K G-Sync unit that you can buy. G-Sync is a technology from Nvidia that synchronizes your monitor's refresh rate with your video card's frame rate, which eliminates screen tearing (but it's not compatible with all GeForce cards. 
Here's a <a title="g sync gpus" href="http://www.geforce.com/hardware/technology/g-sync/supported-gpus" target="_blank">list of supported G-Sync GPUs</a>). 4K resolution, at 3840x2160, is four times as many pixels as 1920x1080, so it needs a lot of horsepower to play a game. We tested the monitor <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014" target="_blank">on one of our GeForce GTX 980 </a>video cards running <a href="http://www.maximumpc.com/batman_arkham_origins_review_2014" target="_blank">Batman: Arkham Origins</a>, a game that's optimized for Nvidia graphics. Your guide in the YouTube video is Tom McNamara, our Technical Editor.</p> <p>This monitor retails for about $800, which is the same price as <a href="http://www.maximumpc.com/asus_rog_readies_swift_pg278q_monitor_nvidia_g-sync_august_release" target="_blank">the Asus ROG Swift, a 2560x1440 G-Sync panel</a>. However, the Swift can go up to a 144Hz refresh rate, while the XB280HK maxes out at 60Hz. Both monitors use a TN panel instead of IPS. IPS tends to have better image quality, but TN can have much lower latency. If you're thinking of picking one of these up, also be aware that they are DisplayPort-only. HDMI and DVI can't provide enough bandwidth. (HDMI 2.0 does, but the monitor has to have support for that built in.) 
These monitors come with the correct cable, and the compatible cards all have DisplayPort -- but some cards may only have "mini" DisplayPort, so you'll need an adapter in those scenarios.&nbsp;</p> <p><iframe src="//www.youtube.com/embed/rXHliHqUPA0" width="560" height="315" frameborder="0"></iframe></p> http://www.maximumpc.com/acer_4k_g-sync_monitor_tested_gtx_980_video#comments 4K monitor g-sync GTX 980 hands-on maxwell MPCTV nvidia video XB280HK News Features Tue, 07 Oct 2014 20:38:42 +0000 Tom McNamara 28680 at http://www.maximumpc.com AMD Marks Down Radeon R9 290 and 290X Cards to Compete with Maxwell http://www.maximumpc.com/amd_marks_down_radeon_r9_290_and_290x_cards_compete_maxwell <!--paging_filter--><h3><img src="/files/u69/radeon_r9_290x_0.jpg" alt="Radeon R9 290X" title="Radeon R9 290X" width="228" height="158" style="float: right;" />Taking a trip to 'Hawaii' just got a bit more affordable</h3> <p>Competition is fierce in the graphics card market, and while we've seen AMD and Nvidia duke it out with bundled game offers, it's the price wars that truly get our attention. Speaking of which, Nvidia certainly got AMD's attention when it launched Maxwell during the company's <a href="http://www.maximumpc.com/13_million_people_tuned_nvidias_game24_pc_gaming_celebration">GAME24 event</a>, which saw the release of the GeForce GTX 970 and <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014">GTX 980</a> for $329 and $549, respectively (MSRPs). <strong>AMD has just responded by cutting the price of its Radeon R9 290 and 290X Hawaii cards</strong>.</p> <p>We'll give <a href="http://www.fudzilla.com/home/item/35946-amd-drops-290-290x-hawaii-prices" target="_blank"><em>Fudzilla</em> credit</a> for the heads up on this one. According to the news and rumor site, AMD is also planning to slash the price of its Radeon R9 280 graphics card, though at present, it's the other two cards that are marked down. 
Specifically, you can now find the Radeon R9 290X selling for $399 at Newegg, down from its original launch price of $549, and the Radeon R9 290 going for $299, down from $399. In some cases, prices are even lower if you factor in any applicable mail-in-rebate offers.</p> <p>As for the Radeon R9 280X, <em>Fudzilla</em> claims to have heard that it's dropping to as low as $229, though at the time of this writing, the cheapest we found one selling for new (as opposed to refurbished or open box) is $270 ($250 after mail-in-rebate).</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amd_marks_down_radeon_r9_290_and_290x_cards_compete_maxwell#comments amd Build a PC Gaming graphics card Hardware maxwell nvidia r9 290 r9 290x radeon Video Card News Mon, 06 Oct 2014 15:52:03 +0000 Paul Lilly 28666 at http://www.maximumpc.com 1.3 Million People Tuned into Nvidia's GAME24 PC Gaming Celebration http://www.maximumpc.com/13_million_people_tuned_nvidias_game24_pc_gaming_celebration <!--paging_filter--><h3><img src="/files/u69/game24.jpg" alt="GAME24" title="GAME24" width="228" height="143" style="float: right;" />PC gaming is alive and well</h3> <p>Nvidia earlier this month launched a 24 hour celebration of PC gaming called GAME24. In addition to being the first GAME24 of (hopefully) many more to come, it was also the first live streamed 24-hour global celebration of PC gaming. 
By the numbers, it was a success -- <strong>GAME24 attracted more than 1.3 million gamers from nearly 150 countries</strong> who tuned into the live stream to see tech talks, a 24-hour modding competition, and more.</p> <p>GAME24 was home to a Dota 2 tournament, product giveaways, and the launch of new hardware -- Nvidia used GAME24 as a launchpad for its flagship Maxwell-based GeForce GTX 970 and 980 graphics cards.</p> <p>"We wanted to bring the PC gaming industry together so we could celebrate all the great things that gaming has to offer. And gamers responded, crowding into events in Shanghai, London, Los Angeles, Chicago, Indianapolis, Mission Viejo and Stockholm," Nvidia stated in a <a href="http://blogs.nvidia.com/blog/2014/10/02/game24-wrapup/" target="_blank">blog post</a>. "Even more showed up online."</p> <p>Those who tuned in didn't just take a peek and leave -- the average time spent was 15 minutes per person, with a total of 30,000,000 minutes watched collectively. GAME24 also raised $50,000 to donate to the fight against cancer, Nvidia said.</p> <p>If you missed it, you can check out some of the highlights below:</p> <p><iframe src="//www.youtube.com/embed/dZw9jFf_n-Y?list=PLZHnYvH1qtOYgUepQXLz5lwWmQwnYr5sV" width="620" height="349" frameborder="0"></iframe></p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/13_million_people_tuned_nvidias_game24_pc_gaming_celebration#comments convention game24 Gaming nvidia pc streaming News Thu, 02 Oct 2014 17:36:48 +0000 Paul Lilly 28653 at http://www.maximumpc.com NVIDIA Shield 32GB LTE Gaming Tablet Now Available for $399 http://www.maximumpc.com/nvidia_shield_32gb_lte_gaming_tablet_now_available_399 <!--paging_filter--><h3><img 
src="/files/u69/nvidia_shield_1.jpg" alt="Nvidia Shield tablet" title="Nvidia Shield tablet" width="228" height="177" style="float: right;" />Double the storage, plus LTE connectivity</h3> <p>For those of you who've been waiting for Nvidia to start shipping its Shield gaming tablet with 32GB of on-board storage and 4G LTE connectivity baked in, the wait is now over -- <strong>you can order the Shield 32GB LTE gaming tablet for $399</strong> direct from Nvidia's website. That's twice as much storage as the 16GB model, plus you get LTE connectivity in addition to Wi-Fi access, additions that come at a $100 premium.</p> <p>Otherwise, both Shield tablets are the same. Both versions pack an 8-inch Full HD (1920x1200) display, Tegra K1 SoC (192-core Kepler GPU, 2.2GHz ARM Cortex A15 CPU), 2GB of RAM, front-facing stereo speakers with dual bass reflex port and built-in microphone, 802.11n Wi-Fi (2x2 MIMO), Bluetooth 4.0, 3.5mm headphone/microphone combo port, micro-SIM, 5MP front-facing camera, 5MP rear-facing camera with auto-focus, mini HDMI output, micro USB 2.0 port, microSD card slot, DirectStylus 2 with 3D Paint, and various software bits, including Android KitKat.</p> <p>In addition to the tablet itself, there are some accessories available, including a wireless controller, tablet cover, an additional stylus, and an additional AC adapter.</p> <p>You can order the Shield tablet in 16GB Wi-Fi ($299) and 32GB Wi-Fi + LTE ($399) from <a href="http://store.nvidia.com/store/nvidia/en_US/custom/pbPage.ShieldTabletPage" target="_blank">Nvidia's website</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_shield_32gb_lte_gaming_tablet_now_available_399#comments Gaming Hardware lte mobile nvidia shield tablet 
News Wed, 01 Oct 2014 17:16:01 +0000 Paul Lilly 28647 at http://www.maximumpc.com Nvidia GeForce 344.16 WHQL Drivers Add Support for GTX 980 and 970 Maxwell Cards http://www.maximumpc.com/nvidia_geforce_34416_whql_drivers_add_support_gtx_980_and_970_maxwell_cards <!--paging_filter--><h3><img src="/files/u69/gtx_980.jpg" alt="Nvidia GeForce GTX 980" title="Nvidia GeForce GTX 980" width="228" height="174" style="float: right;" />New GeForce drivers now available to download</h3> <p>So, you've gone out and acquired one or two of Nvidia's new Maxwell-based GeForce GTX 980 or GTX 970 graphics cards, is that right? As <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014">our benchmarks show</a>, you'll be happy with your purchase, and you may even see better results than we did when applying the latest WHQL-certified drivers. No need to wait -- <strong>Nvidia's GeForce 344.16 WHQL drivers are now available to download</strong> and install.</p> <p>In addition to adding support for Nvidia's latest Maxwell cards, the new driver release purportedly offers the best gaming experience for a handful of titles, including Borderlands: The Pre-Sequel, The Evil Within, F1 2014, and Alien: Isolation, according to the accompanying <a href="http://us.download.nvidia.com/Windows/344.16/344.16-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">release notes (PDF)</a>.</p> <p>You'll also find added or updated application profiles and 3D Vision profiles for a bunch of games, plus 3D compatibility mode support for Assassin's Creed: Freedom Cry, Halo: Spartan Assault, Murdered: Soul Suspect, and Sniper Elite 3.</p> <p>Hit up <a href="http://www.nvidia.com/Download/index.aspx?lang=en-us" target="_blank">Nvidia's website</a> when you're ready to grab the new drivers.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a 
href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_geforce_34416_whql_drivers_add_support_gtx_980_and_970_maxwell_cards#comments Drivers geforce 344.16 graphics card GTX 970 GTX 980 maxwell nvidia Software Video Card whql News Wed, 24 Sep 2014 16:22:07 +0000 Paul Lilly 28601 at http://www.maximumpc.com Nvidia GeForce GTX 980 Review http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014 <!--paging_filter--><h3><span style="font-size: 1.17em;">4K and SLI tested on Nvidia's high-end Maxwell card</span></h3> <p>Sometimes things don't go according to plan. Both AMD and Nvidia were supposed to have shifted to 20-nanometer parts by now. In theory, that's supposed to get you lower temperatures, higher clock speeds and quieter operation. Due to circumstances largely out of its control, Nvidia has had to go ahead with a 28nm high-end Maxwell part instead, dubbed GM204. This is not a direct successor to the GTX 780, which has more transistors and texture mapping units. The 980 is actually the next step beyond the GTX 680, aka GK104, which was launched in March 2012.</p> <p>Despite that, our testing indicates that the GTX 980 can still be meaningfully faster than the GTX 780 and 780 Ti (and AMD’s Radeon R9 290 and 290X, for that matter, though there are a couple of games better optimized for Radeon hardware). When 20nm processes become available sometime next year, we’ll probably see the actual successor to the GTX 780. But right now, the GTX 980 is here, and comes in at $549. That seems high at first, but recall that the GTX 680, 580, and 480 all launched at around $500. And keep in mind that it’s a faster card than the 780 and 780 Ti, which currently cost more. (As we wrote this, AMD announced that it was dropping the base price of the R9 290X from $500 to $450, so that war rages on.) 
The GTX 970 at $329 may be a better deal, but we have not yet obtained one of those for testing.</p> <p>In other news, Nvidia told us that it was dropping the price of the GTX 760 to $219, and the GTX 780 Ti, 780 and 770 are being officially discontinued. So if you need a second one of those for SLI, now is a good time.</p> <p>Let's take a look at the specs:</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>GM204</td> <td>GM204&nbsp;</td> <td>GK104&nbsp;</td> <td>GK110&nbsp;</td> <td class="item-dark">GK110</td> <td>&nbsp;Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1126</td> <td>&nbsp;1050</td> <td>&nbsp;1006</td> <td>&nbsp;863</td> <td>876</td> <td>&nbsp;"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1216</td> <td>&nbsp;1178</td> <td>&nbsp;1058</td> <td>&nbsp;900</td> <td class="item-dark">928</td> <td>&nbsp;N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7000</td> <td>&nbsp;7000</td> <td>&nbsp;6000</td> <td>&nbsp;6000</td> <td>7000</td> <td>&nbsp;5000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;2GB/4GB</td> <td>&nbsp;3GB/6GB</td> <td>3GB</td> <td>&nbsp;4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;64</td> <td>&nbsp;64</td> <td>&nbsp;32</td> <td>&nbsp;48</td> <td>48</td> <td>&nbsp;64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;128</td> <td>&nbsp;192</td> <td>240</td> <td>&nbsp;176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;2048</td> <td>&nbsp;1664</td> <td>&nbsp;1536</td> <td>&nbsp;2304</td> <td>2880</td> <td>&nbsp;2816</td> </tr> <tr> 
<td>SMs</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;8</td> <td>&nbsp;12</td> <td>&nbsp;15</td> <td>&nbsp;N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;195</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;290</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$499</td> <td>&nbsp;$649</td> <td>&nbsp;$699</td> <td>&nbsp;$549</td> </tr> </tbody> </table> </div> <p>On paper, the 980 and 970 don't look like much of a jump from the 680. In fact, the 980 has only 128 shaders (aka "CUDA cores") per streaming multi-processor (SM). Performance tends to increase with a higher number of shaders per SM, so how did the GTX 980 perform so well in our benches, despite having a worse ratio than all the other cards? Well, Nvidia claims to have improved the performance of each CUDA core by 40%. Provided that this calculation is accurate, the GTX 980 effectively has about as many CUDA cores as a 780 Ti. Add in the GTX 980's higher clock speeds, and performance should be higher.</p> <p><img src="/files/u160416/7g0a0209_620.jpg" width="620" height="349" /></p> <p>You probably also noticed the unusually low price for the GTX 970. The GTX 670 launched at $400 in May 2012, and the GTX 570 launched at $350 in December 2010. These earlier two cards also had specs more similar to their bigger brothers'. For example, the GTX 570 had 480 CUDA cores, while the 580 had 512 cores. This is a difference of just 6.25%, although the memory bus was reduced from 384-bit to 320-bit. In contrast, the 970 gets nearly 20% fewer CUDA cores than the 980, though its memory bus remains unchanged. As we said, we haven't gotten a 970 in yet, but, based on its specs, we doubt that we can compensate with overclocking, as we've been able to do in the past with the GTX 670 and 760, and the Radeon R9 290.</p> <p>Nvidia also says that the official boost clock on these new Maxwell cards is not set in stone. 
We witnessed our cards boosting up to 1,253MHz for extended periods of time (i.e., 20 seconds here, 30 seconds there). When the cards hit their thermal limit of 80 degrees Celsius, they would drop as low as 1,165MHz, but we never saw them throttle below the official base clock of 1,126MHz. In SLI, we also noted that the upper card would go up to 84 C. According to Nvidia, these cards have an upper boundary of 95 C, at which point they will throttle below the base clock to avoid going up in smoke. We were not inclined to test that theory, for now.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,1" target="_blank">Next Page: Voxels, new anti-aliasing, and VR</a></h4> <hr /> <p>The company also says that its delta color compression algorithms have cut memory bandwidth requirements by about 25 percent on average (it varies from game to game). The freed-up bandwidth leaves more headroom for higher frame rates. Since DCC directly affects pixels, this effect should scale with your resolution, becoming increasingly helpful as you crank your res higher.</p> <p>You can also combine these gains with Nvidia’s new Multi-Frame Sampled Anti-Aliasing (MFAA). This technique rotates a pixel’s sampling points from one frame to the next, so that two of these points can simulate the visual results of four sampling points whose locations remain static. The effect starts to shimmer at about 20FPS, whereupon it’s automatically disabled. But when it’s running well, Nvidia claims that it can be 30 percent faster, on average, than the visually equivalent level of Multi-Sample Anti-Aliasing (MSAA).
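The sampling trick is easier to see with a toy example (our own illustration of the general idea, not Nvidia's actual algorithm): two alternating 2-sample patterns, averaged across consecutive frames, land on the same coverage estimate as a static 4-sample pattern.

```python
# Toy sketch of temporal sample rotation. "Coverage" is the fraction of a
# pixel's sample points that land inside a primitive's edge.
def coverage(samples, inside):
    return sum(inside(x, y) for x, y in samples) / len(samples)

# Edge test for this example: everything left of x = 0.6 is "inside"
inside = lambda x, y: x < 0.6

msaa_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
frame_a = [(0.25, 0.25), (0.75, 0.75)]  # pattern used on even frames
frame_b = [(0.75, 0.25), (0.25, 0.75)]  # rotated pattern on odd frames

static = coverage(msaa_4x, inside)
temporal = (coverage(frame_a, inside) + coverage(frame_b, inside)) / 2
print(static, temporal)  # 0.5 0.5 -- same estimate from half the per-frame samples
```

This also hints at why the effect falls apart at low frame rates: the average only holds up when consecutive frames arrive quickly enough to blend together.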
Like TXAA (Temporal Anti-Aliasing), this technique won’t be available on AMD cards (or if it is, it will be built by AMD from the ground up and called something else).</p> <p><img src="/files/u160416/7g0a0238_resize.jpg" width="620" height="349" /></p> <p>Unfortunately, MFAA was not available in the version 344.07 beta drivers given to us for testing, but Nvidia said it would be in the driver after this one. This means that the package will not be complete on launch day. Support will trickle down to the older Kepler cards later on. Nvidia hasn’t been specific about timelines for individual cards, but it sounded like the 750 and 750 Ti (also technically Maxwell cards) will not be invited to this party.</p> <p>Another major upgrade is Voxel Global Illumination, or VXGI. Nvidia positions this as the next step beyond ambient occlusion. With VXGI, light bounces off surfaces to illuminate nooks and crannies that would otherwise not be lit realistically, in real time. Ordinarily, light does not bounce around in a 3D game engine like it does in meatspace. It simply hits a surface, illuminates it, and that’s the end. Sometimes the lighting effect is just painted onto the texture. So there’s a lot more calculation going on with VXGI.</p> <p><img src="/files/u160416/maxwell_die_620.jpg" width="620" height="349" /></p> <p>But Nvidia has not made specific performance claims because the effect is highly scalable. A developer can choose how many cones of light they want to use, and the degree of bounced-light resolution (you can go for diffused/blurry spots of light, or a reflection that’s nearly a mirror image of the bounced surface), and they balance this result against a performance target.
Since this is something that has to be coded into the game engine, we won’t see that effect right away by forcing it in the drivers, like Nvidia users can with ambient occlusion.</p> <p>Next is Dynamic Super Resolution (in the 344.11 drivers released today, so we'll be giving this one a peek soon). This tech combines super-sampling with a custom filter. Super-sampling renders at a higher resolution than your monitor can display and squishes it down. This is a popular form of anti-aliasing, but the performance hit is pretty steep. The 13-tap Gaussian filter that the card lays on top can further smooth out jaggies. It's a post-process effect that's thankfully very light, and you can also scale DSR down from 3840x2160 to 2560x1440. It's our understanding that this effect is only available to owners of the 980 and 970, at least for now, but we'll be checking on that ASAP.</p> <p>Nvidia is also investing more deeply into VR headsets with an initiative called VR Direct. Their main bullet point is a reduction in average latency from 50ms to 25ms, using a combination of code optimization, MFAA, and another new feature called Auto Asynchronous Warp (AAW). This displays frames at 60fps even when performance drops below that. Since each eye is getting an independently rendered scene, your PC effectively needs to maintain 120FPS otherwise, which isn’t going to be common with more demanding games. AAW takes care of the difference. However, we haven’t had the opportunity to test the GTX 980 with VR-enabled games yet.</p> <p>Speaking of which, Nvidia is also introducing another new feature called Auto Stereo. As its name implies, it forces stereoscopic rendering in games that were not built with VR headsets in mind. We look forward to testing VR Direct at a later date.</p> <p>Lastly, we also noticed that GeForce Experience can now record at 3840x2160.
It was previously limited to 2560x1600.</p> <p>Until we get our hands on MFAA and DSR, we have some general benchmarks to tide you over. We tested the GTX 980 in two-way SLI and by itself, at 2560x1600 and 3840x2160. We compared it to roughly equivalent cards that we've also run in solo and two-way configs.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,2" target="_blank">Next Page: SLI Benchmarks!</a></h4> <hr /> <p>Here's the system that we've been using for all of our recent GPU benchmarks:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Thermaltake Toughpower Grand (1,050 watts)</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 Update 1</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p class="MsoNormal" style="text-align: left;"><span style="text-align: center;">Now, let’s take a look at our results at 2560x1600 with 4xMSAA. For reference, this is nearly twice as many pixels as 1920x1080. So gamers playing at 1080p on a similar PC can expect roughly twice the framerate, if they use the same graphical settings. We customarily use the highest preset provided by the game itself; for example, <em>Hitman: Absolution</em> is benchmarked with the “Ultra” setting. 3DMark runs the Firestrike test at 1080p, however.
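The resolution comparisons throughout this review reduce to raw pixel counts (a quick sketch):

```python
# Pixel counts for the three resolutions discussed in this review
pixels = {
    "1080p": 1920 * 1080,      # 2,073,600
    "2560x1600": 2560 * 1600,  # 4,096,000
    "4K": 3840 * 2160,         # 8,294,400
}
print(pixels["2560x1600"] / pixels["1080p"])  # ~1.98 -- "nearly twice" 1080p
print(pixels["4K"] / pixels["1080p"])         # 4.0 -- four 1080p monitors' worth
print(pixels["4K"] / pixels["2560x1600"])     # 2.025 -- just over double again
```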
We also enable TressFX in Tomb Raider, and PhysX in Metro: Last Light.</span></p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>33</strong></td> <td>&nbsp;19</td> <td>25</td> <td class="item-dark">&nbsp;27</td> <td>&nbsp;26</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>46</strong></td> <td>&nbsp;21</td> <td>&nbsp;22</td> <td>&nbsp;32</td> <td>&nbsp;30</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;75</td> <td>&nbsp;51</td> <td>&nbsp;65</td> <td>&nbsp;<strong>78</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;27</td> <td>&nbsp;40</td> <td>&nbsp;45</td> <td>&nbsp;<strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;45</td> <td>&nbsp;30</td> <td>&nbsp;43</td> <td>&nbsp;<strong>48</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;64</td> <td>&nbsp;35</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;34</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>11,490</strong></td> <td>&nbsp;6,719</td> <td>&nbsp;8,482</td> <td>&nbsp;9,976</td> <td>9,837</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong><strong>)</strong></p> <p class="MsoNormal" style="text-align: left;">To synthesize the results into a few sentences, we would say that the 980 is doing very well for its price. It’s not leapfrogging over the 780 and 780 Ti, but Nvidia indicates that it’s not supposed to anyway. It dominates the GTX 680, but that card is also two years old and discontinued, so the difference is not unexpected or likely to change buying habits. The R9 290X, meanwhile, is hitting $430, while the not-much-slower 290 can be had for as little as $340. 
And you can pick up a 780 Ti for $560. So the GTX 980's price at launch is going to be a bit of a hurdle for Nvidia.</p> <p class="MsoNormal" style="text-align: left;">Performance in Metro: Last Light has also vastly improved. (We run that benchmark with “Advanced PhysX” enabled, which suggests that Nvidia has made some optimizations there. Further testing is needed.) Loyal Radeon fans will probably not be swayed to switch camps, at least on the basis of pure performance. Hitman in particular does not appear to favor the Green Team.</p> <p class="MsoNormal" style="text-align: left;">We were fortunate enough to obtain a second GTX 980, so we decided to set them up in SLI, at the same resolution of 2560x1600. Here, the differences are more distinct. We’ve narrowed the comparison down to the most competitive cards that we have SLI/CF benchmarks for. (Unfortunately, we do not have a second GTX 680 in hand at this time. But judging by its single-card performance, it's very unlikely to suddenly pull ahead.)
For this special occasion, we brought in the Radeon R9 295X2, which has two 290X GPUs on one card and has been retailing lately for about a thousand bucks.</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>66</strong></td> <td>&nbsp;45</td> <td>&nbsp;56</td> <td>&nbsp;50</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>70</strong></td> <td>&nbsp;52</td> <td>&nbsp;53</td> <td>&nbsp;48</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;131</td> <td>&nbsp;122</td> <td>&nbsp;<strong>143</strong></td> <td>&nbsp;90</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;77</td> <td>&nbsp;74</td> <td>&nbsp;<strong>79</strong></td> <td>&nbsp;79</td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;80</td> <td>&nbsp;72</td> <td>&nbsp;<strong>87</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;73</td> <td>&nbsp;60</td> <td><strong>&nbsp;77</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>17,490</strong></td> <td>&nbsp;14,336</td> <td>&nbsp;16,830</td> <td>&nbsp;15,656</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p class="MsoNormal" style="text-align: left;">While a solo 980 GTX is already a respectable competitor for the price, its success is more pronounced when we add a second card—as is the gap between it and the 780 Ti. It still continues to best the GTX 780, getting us over 60 FPS in each game with all visual effects cranked up. That's an ideal threshold. It also looks like Nvidia's claim of 40 percent improved CUDA core performance may not be happening consistently. Future driver releases should reveal if this is a matter of software optimization, or if it's a limitation in hardware. 
Or just a random cosmic anomaly.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,3" target="_blank">Next Page: 4K benchmarks and conclusion</a></h4> <hr /> <p class="MsoNormal" style="text-align: left;">So, what happens when we scale up to 3840x2160, also known as “4K”? Here we have just over twice as many pixels as 2560x1600, and four times as many as 1080p. Can the GTX 980’s 256-bit bus really handle this much bandwidth?</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;16</td> <td>&nbsp;8.7*</td> <td>&nbsp;26</td> <td class="item-dark">&nbsp;<strong>28</strong></td> <td>&nbsp;28</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>36</strong></td> <td>&nbsp;12</td> <td>&nbsp;18</td> <td>&nbsp;19</td> <td>&nbsp;18</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;35</td> <td>&nbsp;25</td> <td>&nbsp;33</td> <td>&nbsp;<strong>38</strong></td> <td>&nbsp;38</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;20</td> <td>&nbsp;15</td> <td>&nbsp;20</td> <td>&nbsp;24</td> <td><strong>&nbsp;28</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;19</td> <td>&nbsp;15</td> <td>&nbsp;<strong>30</strong></td> <td><strong>&nbsp;30</strong></td> <td>&nbsp;26</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;19</td> <td>&nbsp;11</td> <td>&nbsp;<strong>23</strong></td> <td><strong>&nbsp;23</strong></td> <td>&nbsp;18</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>*TressFX disabled</p> <p>The 980 is still scaling well, but the 384-bit 780 and 780 Ti are clearly scaling better, as is the 512-bit 290X.
(<strong>Update:</strong>&nbsp;We've re-checked our test results for Hitman: Absolution, and the AMD cards weren't doing nearly as well as we originally thought, though they're still the best option for that particular game. The Batman tests have been re-done as well.) We had to disable TressFX when benchmarking the 680, because the test would crash otherwise, and it was operating at less than 1FPS anyway. At 4K, that card basically meets its match, and almost its maker.</p> <p>Here's 4K SLI/Crossfire. All tests are still conducted at 4xMSAA, which is total overkill at 4K, but we want to see just how hard we can push these cards. (Ironically, we have most of the SLI results for the 290X here, but not for 2560x1600. That's a paddlin'.)</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;33</td> <td>&nbsp;41</td> <td>&nbsp;44</td> <td class="item-dark">&nbsp;52</td> <td>&nbsp;<strong>53</strong></td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;21</td> <td>&nbsp;27</td> <td>&nbsp;29</td> <td>&nbsp;26</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;<strong>68</strong></td> <td>&nbsp;60</td> <td>&nbsp;65</td> <td>&nbsp;67</td> <td>&nbsp;66</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;40</td> <td>&nbsp;44</td> <td><strong>&nbsp;53</strong></td> <td><strong>&nbsp;</strong><strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;39</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;40</td> <td>&nbsp;24</td> <td>&nbsp;19</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;34</td> <td>&nbsp;33</td> <td>&nbsp;<strong>44</strong></td> <td>&nbsp;17</td> <td>&nbsp;34</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>It does appear that the 
raw memory bandwidth of the 780, 780 Ti, and 290X comes in handy at this resolution, despite the optimizations of Maxwell CUDA cores. That Metro: Last Light score remains pretty interesting. It's the only one we run with PhysX enabled (to balance out using TressFX in Tomb Raider). It really does look like Maxwell is much better at PhysX than any other GPU before it. That tech isn't quite common enough to change the game. But if the difference is as good as our testing indicates, more developers may pick it up.</p> <p>Even a blisteringly fast card can be brought down by high noise levels or prodigious heat. Thankfully, this reference cooler is up to the task. Keep in mind that this card draws up to 165 watts, and its cooler is designed to handle cards that go up to 250W. But even with the fan spinning up to nearly 3,000rpm, it’s not unpleasant. With the case side panels on, you can still hear the fan going like crazy, but we didn’t find it distracting. These acoustics only happened in SLI, by the way. Without the primary card sucking in hot air from the card right below it, its fan ran much more quietly. The GTX 980’s cooling is nothing like the reference design of the Radeon R9 290 or 290X.</p> <p><img src="/files/u160416/key_visual_620.jpg" width="620" height="349" /></p> <p>With a TDP of just 165W, a respectable 650-watt power supply should have no trouble powering two 980 GTXs. Meanwhile, the 290-watt R9 290X really needs a nice 850-watt unit to have some breathing room, and even more power would not be unwelcome.</p> <p>Since MFAA and DSR were not available in the driver that was supplied for testing, there’s more story for us to tell over the coming weeks. (<strong>Update</strong>: DSR settings are actually in this driver, just not in the location that we were expecting.) And we still need to do some testing with VR. But as it stands right now, the GTX 980 is another impressive showing for Nvidia.
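That power-supply guidance boils down to simple headroom math for a two-card setup (a rough sketch; the 250W allowance for the rest of the system is our own ballpark, not a measured figure):

```python
def psu_headroom(gpu_tdp_watts, gpu_count, psu_watts, rest_of_system=250):
    # rest_of_system is a generous ballpark for CPU, motherboard, drives, and fans
    return psu_watts - (gpu_tdp_watts * gpu_count + rest_of_system)

print(psu_headroom(165, 2, 650))  # 70 -- two GTX 980s fit on a 650W unit
print(psu_headroom(290, 2, 850))  # 20 -- two 290Xs leave little slack even at 850W
```

Real-world draw rarely pins every component at its rated maximum simultaneously, so these are conservative nominal figures rather than predictions of wall power.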
Its 4K scaling isn't as good as we'd like, especially since Maxwell is currently the only tech that will have Dynamic Super Resolution. If you want to play at that level, it looks like the 290 and 290X are better choices, price-wise, while the overall performance crown at 4K still belongs to the 780 and 780 Ti. But considering the price difference between the 980 and the 780, its similar performance is commendable.</p> <p>For 2560x1600 or lower resolutions, the 980 GTX emerges as a compelling option, but we're not convinced that it's over $100 better than a 290X. Then again, you get MFAA, DSR, and VR Direct (and the overall GeForce Experience package, which is a bit slicker than AMD's Gaming Evolved), which might win over some people, as well as Nvidia loyalists who've been waiting for an upgrade from their 680 that's not quite as expensive as the 780 or 780 Ti.</p> <p><a href="http://www.pcgamer.com/2014/09/19/nvidia-gtx-980-tested-sli-4k-and-single-gpu-benchmarks-and-impressions/" target="_blank">Our amigo Wes Fenlon over at PC Gamer has a write-up of his own</a>, so go check it out.</p> http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014#comments 4k 980 GTX benchmarks comparison geforce gpu nvidia performance Review sli Video Card Videocards Fri, 19 Sep 2014 03:04:15 +0000 Tom McNamara 28564 at http://www.maximumpc.com Leaked Press Photo of GeForce GTX 970 Suggests Nvidia is Skipping 800 Series http://www.maximumpc.com/leaked_press_photo_geforce_gtx_970_suggests_nvidia_skipping_800_series <!--paging_filter--><h3><img src="/files/u69/zotac_geforce_gtx_970.jpg" alt="Zotac GeForce GTX 970" title="Zotac GeForce GTX 970" width="228" height="173" style="float: right;" />Thank you Zotac for the confirmation!</h3> <p>Supposed benchmarks of Nvidia's forthcoming GeForce GTX 980, GTX 970, and GTX 980M GPUs were <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked" target="_blank">leaked to the web</a> earlier this week,
and presuming they were real, it would seem that Nvidia is planning to skip right over the 800 Series and jump right into the 900s. Lest there be any lingering doubt, what looks to be <strong>an official press image of Zotac's GeForce GTX 970 graphics card is making its way through cyberspace</strong>.</p> <p>We spotted the image over at <a href="http://www.fudzilla.com/home/item/35737-nvidia-aic-partner-gtx-970-pictured-with-the-box" target="_blank"><em>Fudzilla</em></a>, which led us over to <a href="http://videocardz.com/52282/zotac-geforce-gtx-970-pictured-the-ultimate-proof-there-are-no-800-series" target="_blank"><em>VideoCardz.com</em></a>. The site says the image was leaked by a Philippine store called PCHUB that was content to consider it a "sneak peek," though we're sure Zotac (and Nvidia) aren't super thrilled about it.</p> <p>In any event, the GeForce GTX 970 is rumored to feature 1,664 CUDA cores, 138 TMUs, and 32 ROPs with a 1051MHz GPU base clockspeed and 1178MHz GPU boost clockspeed. The Zotac card will have 4GB of GDDR5 memory, presumably clocked at 7012MHz on a 256-bit bus.</p> <p>We can also see that Zotac is deviating from the reference cooler in favor of its own custom solution. 
Since all we have is a photo to go on, there's no word yet on a price or release date, though it's rumored the 900 Series will launch on September 19.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/leaked_press_photo_geforce_gtx_970_suggests_nvidia_skipping_800_series#comments Build a PC geforce gtx 970 graphics card Hardware nvidia Video Card zotac News Fri, 12 Sep 2014 16:13:13 +0000 Paul Lilly 28527 at http://www.maximumpc.com Nvidia GeForce GTX 980, GTX 970, and GTX 980M Benchmarks Purportedly Leaked http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked <!--paging_filter--><h3><img src="/files/u69/nvidia_card_0.jpg" alt="Nvidia Card" title="Nvidia Card" width="330" height="241" style="float: right;" />Here's a look at how Nvidia's next batch of graphics cards might perform</h3> <p>How about we kick off the work week with some rumors, speculation, and purportedly leaked info, shall we? Sure, why not! What we have tumbling out of the rumor mill today is the notion that Nvidia is going to launch its GeForce 900 Series cards based on its Maxwell architecture on September 19. Specifications are hard to come by, but in the meantime, <strong>some supposed benchmark scores of Nvidia's forthcoming GeForce GTX 980, GTX 970, and GTX 980M are making the rounds in cyberspace</strong>.</p> <p>The folks at <a href="http://videocardz.com/52166/nvidia-geforce-gtx-980-gtx-970-gtx-980m-gtx-970m-3dmark-performance" target="_blank"><em>Videocardz.com</em></a> posted what they claim are benchmarks of the aforementioned cards, which they then assembled into a neat chart fleshed out with several existing graphics cards.
In 3DMark Fire Strike, the GeForce GTX 980 sits pretty high with a score of 13,005 and is only trumped by dual GPU configurations. As a point of reference, the GeForce GTX 780 Ti posted a score of 12,702. There are three different clockspeeds posted for the GTX 980, and that's because <em>Videocardz.com</em> was unable to confirm which is the actual reference clock. The 13,005 score represents the fastest clocked version (1190MHz core). It's surmised that the card sports 4GB of GDDR5 memory on a 256-bit bus and a 7GHz memory clock.</p> <div>As for the GTX 970, it scored slightly above a GTX 780 (10,282 versus 10,008, respectively).</div> <div>What's most impressive, however, is the purported performance gain of the GTX 980M. In 3DMark Fire Strike, the 980M scored 9,364, about twice as high as the 870M (4,697) and well above the 880M (5,980). <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> </div> http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked#comments Build a PC geforce gpu graphics card GTX 970 GTX 980 GTX 980M Hardware nvidia Video Card News Mon, 08 Sep 2014 19:58:02 +0000 Paul Lilly 28497 at http://www.maximumpc.com Nvidia Initiates Patent Lawsuit Against Samsung and Qualcomm http://www.maximumpc.com/nvidia_initiates_patent_lawsuit_against_samsung_and_qualcomm_2014 <!--paging_filter--><h3><img src="/files/u166440/nvidia_logo.png" alt="Nvidia Logo" title="Nvidia Logo" width="200" height="155" style="float: right;" />Nvidia looking to block shipments of Samsung products</h3> <p>It appears that Samsung is in for some rough times ahead. <strong>Nvidia announced today that it has filed a patent lawsuit against Samsung and Qualcomm</strong> with the U.S. International Trade Commission and the U.S.
District Court in Delaware.&nbsp;</p> <p>On its <a title="Nvidia Blog" href="http://blogs.nvidia.com/blog/2014/09/04/nvidia-launches-patent-suits/" target="_blank"><span style="color: #ff0000;">blog</span></a>, Nvidia claims that Samsung mobile phones and tablets contain Qualcomm’s Adreno, ARM’s Mali, or Imagination’s PowerVR graphics architectures. To that effect, the company is requesting that the ITC block shipments of Samsung’s mobile phones and tablets, and is asking the Delaware court to award damages for the patent infringements.&nbsp;</p> <p>Nvidia provided a list of Samsung products it says infringe its patents, which includes the Galaxy Note Edge, Galaxy Note 4, Galaxy S5, Galaxy Note 3 and Galaxy S4 mobile phones; and the Galaxy Tab S, Galaxy Note Pro and Galaxy Tab 2 computer tablets. These devices use Qualcomm’s mobile processors such as the Snapdragon S4, 400, 600, 800, 801 and 805.</p> <p>“Without licensing NVIDIA’s patented GPU technology, Samsung and Qualcomm have chosen to deploy our IP without proper compensation to us,” said Shannon.
“This is inconsistent with our strategy to earn an appropriate return on our investment.”</p> <p>According to Nvidia executive vice-president David Shannon, the GPU manufacturer approached Samsung to reach an agreement but Samsung responded that this was mostly its suppliers’ problem.&nbsp;</p> <p>Of the 7,000 issued and pending patents Nvidia has under its belt, Shannon says the company has chosen seven of those to bring up in these cases against Samsung and Qualcomm saying, “Those patents include our foundational invention, the GPU, which puts onto a single chip all the functions necessary to process graphics and light up screens; our invention of programmable shading, which allows non-experts to program sophisticated graphics; our invention of unified shaders, which allow every processing unit in the GPU to be used for different purposes; and our invention of multithreaded parallel processing in GPUs, which enables processing to occur concurrently on separate threads while accessing the same memory and other resources.”</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/nvidia_initiates_patent_lawsuit_against_samsung_and_qualcomm_2014#comments David Shannon nvidia Patent Infringement qualcomm samsung Samsung patent infringement News Fri, 05 Sep 2014 03:14:25 +0000 Sean D Knight 28482 at http://www.maximumpc.com Build it: Real-World 4K Gaming Test Bench http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014 <!--paging_filter--><h3>This month, we find out what it takes to run games at 4K, and do so using a 
sweet open-air test bench</h3> <p>The computer world loves it when specs double from one generation to the next. We’ve gone from 16-bit to 32-bit, and finally 64-bit computing. We had 2GB RAM sticks, then 4GB, then 8GB. With monitor resolutions, 1920x1080 has been the standard for a while, and we never quite doubled it; 2560x1600 was only a half-step. Now that 4K resolution has arrived, it’s effectively been exactly doubled, with the panels released so far being 3840x2160. We know it’s not actually 4,000 pixels wide, but everyone is still calling it “4K.” Though resolution is doubled over 1080p, it’s the equivalent number of pixels as four 1080p monitors, so it takes a lot of horsepower to play games smoothly. For example, our 2013 Dream Machine used four Nvidia GeForce GTX Titans and a CPU overclocked to 5GHz to handle it. Those cards cost $4,000 altogether though, so it wasn’t a scenario for mere mortals. This month, we wanted to see what 4K gaming is like with more-affordable parts. We also wanted to try a distinctive-looking open test bench from DimasTech. This type of case is perfect for SLI testing, too, since it makes component installation and swapping much quicker.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/beauty_shot_small_29.jpg"><img src="/files/u152332/beauty_shot_small_28.jpg" width="620" height="417" /></a></p> <h3>Triple Threat</h3> <p>Instead of GTX Titans, we’re stepping it down a couple of notches to Nvidia GTX 780s. They provide similar gaming performance, but at half the cost. We’re also using “only” three cards instead of four, so the price difference from Dream Machine to this rig is a whopping $2500 (even more if you count the fact that the Dream Machine cards were water-cooled). These cards still need a lot of bandwidth, though, so we’re sticking with an Intel LGA 2011 motherboard, this time an Asus X79 Deluxe. It’s feature-packed and can overclock a CPU like nobody’s business.
The X79 Deluxe is running Intel’s Core i7-4960X CPU, which has six cores and twelve processing threads. It’s kind of a beast. We’re cooling it with a Cooler Master Glacer 240L water cooler, which comes with a 240mm radiator.</p> <p>We’ll also need a boatload of power, so we grabbed a Corsair AX1200 PSU which, as its name suggests, supplies up to 1200 watts. It’s also fully modular, meaning that its cables are all detachable. Since we’re only using one storage device in this build, we can keep a lot of spare cables tucked away in a bag, instead of cluttering up the lower tray.</p> <p>All of this is being assembled on a DimasTech Easy V3 test bench, which is a laser-cut steel, hand-welded beauty made in Italy and painted glossy red. It can handle either a 360mm or 280mm radiator as well, and it comes with an articulating arm to move a case fan around to specific areas. It seems like the ultimate open-air test bench, so we’re eager to see what we can do with it.</p> <h4>1. Case Working</h4> <p>The DimasTech Easy V3 comes in separate parts, but the bulk of it is an upper and lower tray. You slide the lower one in and secure it with a bundled set of six aluminum screws. The case’s fasteners come in a handy plastic container with a screw-on lid. Shown in the photo are the two chromed power and reset buttons, which are the last pieces to be attached. They have pre-attached hexagonal washers, which can be a bit tricky to remove. We had to use pliers on one of them. You’ll need to wire them up yourself, but there’s a diagram included. Then, connect the other end to the motherboard’s front panel header, which has its own diagram printed on the board.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/a_small_29.jpg"><img src="/files/u152332/a_small_28.jpg" title="Image A" width="620" height="413" /></a></p> <h4>2.
Getting Testy</h4> <p>Unfortunately, the Easy V3 does not ship with a 2.5-inch drive bay, nor do standard 3.5-inch to 2.5-inch adapters fit inside the bays. If you want to install a solid-state drive, you need to purchase the correctly sized bay or adapter separately from DimasTech. Since this is an open test bench, which is designed for swapping parts quickly, we chose to just leave the drive unsecured. It has no moving parts, so it doesn’t need to be screwed down or even laid flat to work properly. We also moved the 5.25-inch drive bay from the front to the back, to leave as much room as possible to work with our bundle of PSU cables. The lower tray has a number of pre-drilled holes to customize drive bay placement. Meanwhile, our power supply must be oriented just like this to properly attach to the case’s specified bracket. It’s not bad, though, because this positions the power switch higher up, where it’s less likely to get bumped accidentally.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/b_small_24.jpg"><img src="/files/u152332/b_small_23.jpg" title="Image B" width="620" height="413" /></a></p> <h4>3. Able Cables</h4> <p>The best way to install a modular power supply is to attach your required cables first. This time, we got a kit from Corsair that has individually sleeved wires. It costs $40, and also comes in red, white, or blue. Each of these kits is designed to work with a specific Corsair power supply. They look fancier than the stock un-sleeved cables, and the ones for motherboard and CPU power are a lot more flexible than the stock versions. All of the connectors are keyed, so you can’t accidentally plug them into the wrong socket. We used a few black twist ties to gather in the PCI Express cables.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/c_small_27.jpg"><img src="/files/u152332/c_small_26.jpg" title="Image C" width="620" height="413" /></a></p> <h4>4. 
Taking a Stand(off)</h4> <p>The Easy V3 comes with an unusually tall set of metal motherboard standoffs. These widgets prevent the motherboard from touching the tray below and possibly creating a short circuit. You can screw these in by hand, optionally tightening them up with a pair of pliers. Once those were in, we used some thumbscrews bundled with the case to screw the board down on the standoffs. You can use more standard screws, but we had plenty to spare, and we liked the look. The tall standoffs also work nicely with custom liquid cooling loops, because there is enough clearance to send thick tubing underneath (we’ve seen plenty of photos of such setups online). For us, it provided enough room to install a right-angle SATA cable and send it through the oval cut-out in the tray and down to the SSD below.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/d_small_23.jpg"><img src="/files/u152332/d_small_22.jpg" title="Image D" width="620" height="413" /></a></p> <p style="text-align: center;">&nbsp;</p> <hr /> <p>&nbsp;</p> <h4>5. Triple Play</h4> <p>This bench has a black bracket that holds your PCIe cards and can be slid parallel to the motherboard to accommodate different board layouts. It will take up to four two-slot cards, and DimasTech sells a longer 10-slot bracket on its website for workstation boards. We had to use the provided aluminum thumbscrews to secure the cards, since all of the screws we had in The Lab were either too coarsely threaded or not the right diameter, which is unusual. Installing cards is easy, because your view of the board slot is not blocked by a case. The video cards will end up sandwiched right next to each other, though, so you’ll need a tool to release the slot-locking mechanism on two of them (we used a PCI slot cover).
The upper two cards can get quite toasty, so we moved the bench’s built-in flexible fan arm right in front of their rear intake area, and we told the motherboard to max out its RPM. We saw an immediate FPS boost in our tests, because by default these cards will throttle once they get to about 83 C.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/e_small_21.jpg"><img src="/files/u152332/e_small_20.jpg" title="Image E" width="620" height="413" /></a></p> <h4>6. Cool Under Pressure</h4> <p>Since the Glacer 240L cooler has integrated tubing that’s relatively short, the orientation pictured was our only option. We could have put the fans on the other side of the radiator, but since performance was already superb, we decided we liked the look of them with the grilles on top. To mount the radiator, we used the bundled screws, which became the right length when we added some rubber gaskets, also included. The radiator actually doesn’t give off much heat, even when the CPU is overclocked and firing on all cylinders, so we didn’t have to worry about the nearby power supply fan pulling in a lot of hot intake. In fact, the CPU never crossed 65 C in all of our benchmarks, even when overclocked to 4.5GHz. We even threw Prime95 at it, and it didn’t break a sweat. Temperatures are also affected by ambient conditions, though. With our open-air layout, heat coming out of the GPUs doesn’t get anywhere near the radiator, and The Lab’s air conditioning helps keep temperatures low, so it’s pretty much an ideal environment, short of being installed in a refrigerator.
Your mileage may vary.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/f_small_22.jpg"><img src="/files/u152332/f_small_21.jpg" title="Image F" width="620" height="413" /></a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/main_image_small_18.jpg"><img src="/files/u152332/main_image_small_17.jpg" title="Main Image" width="620" height="382" /></a></p> <h3>A Golden Triangle</h3> <p>Despite our penchant for extreme performance, we rarely build triple-GPU systems, so we weren’t sure how well they’d handle 4K, but we figured they’d kick ass. Thankfully, they handled UHD quite well. So well, in fact, that we also tested the system with “only” two GTX 780s and still got respectable gaming performance. For example, with two cards, the Bioshock Infinite benchmark reported an average of a little over 60 FPS on its highest settings. In Tomb Raider, we disabled anti-aliasing and TressFX, maxing out all the other settings, and we still averaged 62 FPS. We benchmarked the opening sequence of Assassin’s Creed 4 with AA and PhysX disabled and everything else maxed out, and we averaged 47 FPS. The Metro: Last Light benchmark, however, averaged 25 FPS on max settings, even with PhysX disabled.</p> <p>Unfortunately, we had trouble getting Hitman: Absolution and Metro: Last Light to recognize the third card. This issue is not unheard of, and it made us think: if you stick with two GPUs, you no longer need the PCI Express bandwidth of expensive LGA 2011 CPUs, or their equally expensive motherboards, or a huge power supply. That potentially cuts the cost of this system in half, from around $4200 to roughly $2100. You could also save money by going with, say, a Core i7-4930K instead, along with a less expensive LGA 2011 motherboard and a smaller SSD.
But it’s still a pretty steep climb in price when going from two cards to three.</p> <p>The test bench itself feels sturdy and looks sweet, but we wish that it accepted standard computer-type screws, and that it came with a 2.5-inch drive bay or could at least fit a standard 3.5-to-2.5 adapter. We’d also recommend getting a second articulating fan arm if you’re liquid-cooling, so that one could provide airflow to the voltage regulators around the CPU, and the other could blow directly on your video cards. With the fan aimed at our cards, we instantly gained another 10 FPS in the Tomb Raider benchmark.</p> <p>The Seagate 600 SSD was nice and speedy, although unzipping compressed files seemed to take longer than usual. The X79 Deluxe motherboard gave us no trouble, and the bundled “Asus AI Suite III” software offers lots of fine-grained options for performance tuning and monitoring, and it looks nice. Overall, this build was not only successful but educational, too.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong><br /> <div class="spec-table orange"> <table style="width: 627px; height: 270px;" border="0"> <thead> <tr> <th class="head-empty"> </th> <th class="head-light"> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>ZERO</strong></p> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>POINT</strong></p> </th> <th></th> </tr> </thead> <tbody> <tr> <td class="item">Premiere Pro CS6 (sec)</td> <td class="item-dark">2,000</td> <td>1,694</td> </tr> <tr> <td>Stitch.Efx 2.0 (sec)</td> <td>831</td> <td>707</td> </tr> <tr> <td class="item">ProShow Producer 5.0 (sec)</td> <td class="item-dark">1,446</td> <td>1,246</td> </tr> <tr> <td>x264 HD 5.0 (fps)</td> <td>21.1</td> <td>25.6</td> </tr> <tr> <td>Batman: Arkham City
(fps)</td> <td>76</td> <td>169</td> </tr> <tr> <td class="item">3DMark11 Extreme</td> <td class="item-dark">5,847</td> <td>12,193</td> </tr> </tbody> </table> </div> </div> <p><span style="font-size: 10px; font-weight: bold;"><em>The zero-point machine compared here consists of a 3.2GHz Core i7-3930K and 16GB of Corsair DDR3/1600 on an Asus P9X79 Deluxe motherboard. It has a GeForce GTX 690, a Corsair Neutron GTX SSD, and 64-bit Windows 7 Professional.</em></span></p> http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014#comments 4k computer gaming pc geforce Hardware maximum pc May issues 2014 nvidia open Test Bench Features Wed, 03 Sep 2014 19:29:01 +0000 Tom McNamara 28364 at http://www.maximumpc.com