nvidia http://www.maximumpc.com/taxonomy/term/320/ en Leaked Press Photo of GeForce GTX 970 Suggests Nvidia is Skipping 800 Series http://www.maximumpc.com/leaked_press_photo_geforce_gtx_970_suggests_nvidia_skipping_800_series <!--paging_filter--><h3><img src="/files/u69/zotac_geforce_gtx_970.jpg" alt="Zotac GeForce GTX 970" title="Zotac GeForce GTX 970" width="228" height="173" style="float: right;" />Thank you Zotac for the confirmation!</h3> <p>Supposed benchmarks of Nvidia's forthcoming GeForce GTX 980, GTX 970, and GTX 980M GPUs were <a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked" target="_blank">leaked to the web</a> earlier this week, and presuming they were real, it would seem that Nvidia is planning to skip right over the 800 Series and jump right into the 900s. Lest there be any lingering doubt, what looks to be <strong>an official press image of Zotac's GeForce GTX 970 graphics card is making its way through cyberspace</strong>.</p> <p>We spotted the image over at <a href="http://www.fudzilla.com/home/item/35737-nvidia-aic-partner-gtx-970-pictured-with-the-box" target="_blank"><em>Fudzilla</em></a>, which led us over to <a href="http://videocardz.com/52282/zotac-geforce-gtx-970-pictured-the-ultimate-proof-there-are-no-800-series" target="_blank"><em>VideoCardz.com</em></a>. The site says the image was leaked by a Philippine store called PCHUB that was content to consider it a "sneak peek," though we're sure Zotac (and Nvidia) aren't super thrilled about it.</p> <p>In any event, the GeForce GTX 970 is rumored to feature 1,664 CUDA cores, 138 TMUs, and 32 ROPs with a 1051MHz GPU base clockspeed and 1178MHz GPU boost clockspeed. The Zotac card will have 4GB of GDDR5 memory, presumably clocked at 7012MHz on a 256-bit bus.</p> <p>We can also see that Zotac is deviating from the reference cooler in favor of its own custom solution. Since all we have is a photo to go on, there's no word yet of a price or release date, though it's rumored the 900 Series will launch on September 19.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/leaked_press_photo_geforce_gtx_970_suggests_nvidia_skipping_800_series#comments Build a PC geforce gtx 970 graphics card Hardware nvidia Video Card zotac News Fri, 12 Sep 2014 16:13:13 +0000 Paul Lilly 28527 at http://www.maximumpc.com Nvidia GeForce GTX 980, GTX 970, and GTX 980M Benchmarks Purportedly Leaked http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked <!--paging_filter--><h3><img src="/files/u69/nvidia_card_0.jpg" alt="Nvidia Card" title="Nvidia Card" width="330" height="241" style="float: right;" />Here's a look at how Nvidia's next batch of graphics cards might perform</h3> <p>How about we kick off the work week with some rumors, speculation, and purportedly leaked info, shall we? Sure, why not! What we have tumbling out of the rumor mill today is the notion that Nvidia is going to launch its GeForce 900 Series cards based on its Maxwell architecture on September 19. 
Specifications are hard to come by, but in the meantime, <strong>some supposed benchmark scores of Nvidia's forthcoming GeForce GTX 980, GTX 970, and GTX 980M are making the rounds in cyberspace</strong>.</p> <p>The folks at <a href="http://videocardz.com/52166/nvidia-geforce-gtx-980-gtx-970-gtx-980m-gtx-970m-3dmark-performance" target="_blank"><em>Videocardz.com</em></a> posted what they claim are benchmarks of the aforementioned cards, which they then assembled into a neat chart fleshed out with several existing graphics cards. In 3DMark Fire Strike, the GeForce GTX 980 sits pretty high with a score of 13,005 and is only trumped by dual GPU configurations. As a point of reference, the GeForce GTX 780 Ti posted a score of 12,702. There are three different clockspeeds posted for the GTX 980, and that's because <em>Videocardz.com</em> was unable to confirm which is the actual reference clock. The 13,005 score represents the fastest clocked version (1190MHz core). It's surmised that the card sports 4GB of GDDR5 memory on a 256-bit bus and a 7GHz memory clock.</p> <div>As for the GTX 970, it scored slightly above a GTX 780 (10,282 versus 10,008, respectively).</div> <div>What's most impressive, however, is the purported performance gain of the GTX 980M. In 3DMark Fire Strike, the 980M scored 9,364, about twice as high as the 870M (4,697) and well above the 880M (5,980). <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> </div> http://www.maximumpc.com/nvidia_geforce_gtx_980_gtx_970_and_gtx_980m_benchmarks_purportedly_leaked#comments Build a PC geforce gpu graphics card GTX 970 GTX 980 GTX 980M Hardware nvidia Video Card News Mon, 08 Sep 2014 19:58:02 +0000 Paul Lilly 28497 at http://www.maximumpc.com Nvidia Initiates Patent Lawsuit Against Samsung and Qualcomm http://www.maximumpc.com/nvidia_initiates_patent_lawsuit_against_samsung_and_qualcomm_2014 <!--paging_filter--><h3><img src="/files/u166440/nvidia_logo.png" alt="Nvidia Logo" title="Nvidia Logo" width="200" height="155" style="float: right;" />Nvidia looking to block shipments of Samsung products</h3> <p>It appears that Samsung is in for some rough times ahead. <strong>Nvidia announced today that it has filed a patent lawsuit against Samsung and Qualcomm</strong> with the U.S. International Trade Commission and the U.S. District Court in Delaware.&nbsp;</p> <p>On its <a title="Nvidia Blog" href="http://blogs.nvidia.com/blog/2014/09/04/nvidia-launches-patent-suits/" target="_blank"><span style="color: #ff0000;">blog</span></a>, Nvidia claims that Samsung mobile phones and tablets contain Qualcomm’s Adreno, ARM’s Mali, or Imagination’s PowerVR graphics architectures. To that effect, the company is requesting that the ITC block shipments of Samsung’s mobile phones and tablets, and is asking the Delaware court to award damages for the patent infringements.&nbsp;</p> <p>Nvidia provided a list of Samsung products it says infringe its patents, including the Galaxy Note Edge, Galaxy Note 4, Galaxy S5, Galaxy Note 3, and Galaxy S4 mobile phones, and the Galaxy Tab S, Galaxy Note Pro, and Galaxy Tab 2 tablets. 
These devices use Qualcomm’s mobile processors such as the Snapdragon S4, 400, 600, 800, 801 and 805.</p> <p>“Without licensing NVIDIA’s patented GPU technology, Samsung and Qualcomm have chosen to deploy our IP without proper compensation to us,” said Shannon. “This is inconsistent with our strategy to earn an appropriate return on our investment.”</p> <p>According to Nvidia executive vice-president David Shannon, the GPU manufacturer approached Samsung to reach an agreement but Samsung responded that this was mostly its suppliers’ problem.&nbsp;</p> <p>Of the 7,000 issued and pending patents Nvidia has under its belt, Shannon says the company has chosen seven of those to bring up in these cases against Samsung and Qualcomm saying, “Those patents include our foundational invention, the GPU, which puts onto a single chip all the functions necessary to process graphics and light up screens; our invention of programmable shading, which allows non-experts to program sophisticated graphics; our invention of unified shaders, which allow every processing unit in the GPU to be used for different purposes; and our invention of multithreaded parallel processing in GPUs, which enables processing to occur concurrently on separate threads while accessing the same memory and other resources.”</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/nvidia_initiates_patent_lawsuit_against_samsung_and_qualcomm_2014#comments David Shannon nvidia Patent Infringement qualcomm samsung Samsung patent infringement News Fri, 05 Sep 2014 03:14:25 +0000 Sean D Knight 28482 at http://www.maximumpc.com Build it: Real-World 4K Gaming Test Bench http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014 <!--paging_filter--><h3>This month, we find out what it takes to run games at 4K, and do so using a sweet open-air test bench</h3> <p>The computer world loves it when specs double from one generation to the next. We’ve gone from 16-bit to 32-bit, and finally 64-bit computing. We had 2GB RAM sticks, then 4GB, then 8GB. With monitor resolutions, 1920x1080 has been the standard for a while, but we never quite doubled it, as 2560x1600 was a half-step, but now that 4K resolution has arrived, it’s effectively been exactly doubled, with the panels released so far being 3840x2160. We know it’s not actually 4,000 pixels, but everyone is still calling it “4K.” Though resolution is doubled over 1080p, it’s the equivalent number of pixels as four 1080p monitors, so it takes a lot of horsepower to play games smoothly. For example, our 2013 Dream Machine used four Nvidia GeForce GTX Titans and a CPU overclocked to 5GHz to handle it. Those cards cost $4,000 altogether though, so it wasn’t a scenario for mere mortals. This month, we wanted to see what 4K gaming is like with more-affordable parts. We also wanted to try a distinctive-looking open test bench from DimasTech. 
This type of case is perfect for SLI testing, too, since it makes component installation and swapping much quicker.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/beauty_shot_small_29.jpg"><img src="/files/u152332/beauty_shot_small_28.jpg" width="620" height="417" /></a></p> <h3>Triple Threat</h3> <p>Instead of GTX Titans, we’re stepping it down a couple of notches to Nvidia GTX 780s. They provide similar gaming performance, but at half the cost. We’re also using “only” three cards instead of four, so the price difference from Dream Machine to this rig is a whopping $2,500 (even more if you count the fact that the Dream Machine cards were water-cooled). These cards still need a lot of bandwidth, though, so we’re sticking with an Intel LGA 2011 motherboard, this time an Asus X79 Deluxe. It’s feature-packed and can overclock a CPU like nobody’s business. The X79 Deluxe is running Intel’s Core i7-4960X CPU, which has six cores and twelve processing threads. It’s kind of a beast. We’re cooling it with a Cooler Master Glacer 240L water cooler, which comes with a 240mm radiator.</p> <p>We’ll also need a boatload of power, so we grabbed a Corsair AX1200 PSU which, as its name suggests, supplies up to 1200 watts. It’s also fully modular, meaning that its cables are all detachable. Since we’re only using one storage device in this build, we can keep a lot of spare cables tucked away in a bag, instead of cluttering up the lower tray.</p> <p>All of this is being assembled on a DimasTech Easy V3 test bench, which is a laser-cut steel, hand-welded beauty made in Italy and painted glossy red. It can handle either a 360mm or 280mm radiator as well, and it comes with an articulating arm to move a case fan around to specific areas. It seems like the ultimate open-air test bench, so we’re eager to see what we can do with it.</p> <h4>1. Case Working</h4> <p>The DimasTech Easy V3 comes in separate parts, but the bulk of it is an upper and lower tray. You slide the lower one in and secure it with a bundled set of six aluminum screws. The case’s fasteners come in a handy plastic container with a screw-on lid. Shown in the photo are the two chromed power and reset buttons, which are the last pieces to be attached. They have pre-attached hexagonal washers, which can be a bit tricky to remove. We had to use pliers on one of them. You’ll need to wire them up yourself, but there’s a diagram included. Then, connect the other end to the motherboard’s front panel header, which has its own diagram printed on the board.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/a_small_29.jpg"><img src="/files/u152332/a_small_28.jpg" title="Image A" width="620" height="413" /></a></p> <h4>2. Getting Testy</h4> <p>Unfortunately, the Easy V3 does not ship with a 2.5-inch drive bay, nor do standard 3.5-inch to 2.5-inch adapters fit inside the bays. If you want to install a solid-state drive, you need to purchase the correctly sized bay or adapter separately from DimasTech. Since this is an open test bench, which is designed for swapping parts quickly, we chose to just leave the drive unsecured. It has no moving parts, so it doesn’t need to be screwed down or even laid flat to work properly. We also moved the 5.25-inch drive bay from the front to the back, to leave as much room as possible to work with our bundle of PSU cables. The lower tray has a number of pre-drilled holes to customize drive bay placement. 
Meanwhile, our power supply must be oriented just like this to properly attach to the case’s specified bracket. It’s not bad, though, because this positions the power switch higher up, where it’s less likely to get bumped accidentally.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/b_small_24.jpg"><img src="/files/u152332/b_small_23.jpg" title="Image B" width="620" height="413" /></a></p> <h4>3. Able Cables</h4> <p>The best way to install a modular power supply is to attach your required cables first. This time, we got a kit from Corsair that has individually sleeved wires. It costs $40, and also comes in red, white, or blue. Each of these kits is designed to work with a specific Corsair power supply. They look fancier than the stock un-sleeved cables, and the ones for motherboard and CPU power are a lot more flexible than the stock versions. All of the connectors are keyed, so you can’t accidentally plug them into the wrong socket. We used a few black twist ties to gather in the PCI Express cables.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/c_small_27.jpg"><img src="/files/u152332/c_small_26.jpg" title="Image C" width="620" height="413" /></a></p> <h4>4. Taking a Stand(off)</h4> <p>The Easy V3 comes with an unusually tall set of metal motherboard standoffs. These widgets prevent the motherboard from touching the tray below and possibly creating a short circuit. You can screw these in by hand, optionally tightening them up with a pair of pliers. Once those were in, we actually used some thumbscrews bundled with the case to screw the board down on the standoffs. You can use more standard screws, but we had plenty to spare, and we liked the look. The tall standoffs also work nicely with custom liquid cooling loops, because there is enough clearance to send thick tubing underneath (and we’ve seen lots of photos on the Internet of such setups). For us, it provided enough room to install a right-angle SATA cable and send it through the oval cut-out in the tray and down to the SSD below.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/d_small_23.jpg"><img src="/files/u152332/d_small_22.jpg" title="Image D" width="620" height="413" /></a></p> <p style="text-align: center;">&nbsp;</p> <hr /> <p>&nbsp;</p> <h4>5. Triple Play</h4> <p>This bench has a black bracket that holds your PCIe cards and can be slid parallel to the motherboard to accommodate different board layouts. It will take up to four two-slot cards, and DimasTech sells a longer 10-slot bracket on its website for workstation boards. We had to use the provided aluminum thumbscrews to secure the cards, since all of the screws we had in The Lab were either too coarsely threaded or not the right diameter, which is unusual. Installing cards is easy, because your view of the board slot is not blocked by a case. The video cards will end up sandwiched right next to each other, though, so you’ll need a tool to release the slot-locking mechanism on two of them (we used a PCI slot cover). The upper two cards can get quite toasty, so we moved the bench’s built-in flexible fan arm right in front of their rear intake area, and we told the motherboard to max out its RPM. 
We saw an immediate FPS boost in our tests, because by default these cards will throttle once they get to about 83 C.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/e_small_21.jpg"><img src="/files/u152332/e_small_20.jpg" title="Image E" width="620" height="413" /></a></p> <h4>6. Cool Under Pressure</h4> <p>Since the Glacer 240L cooler has integrated tubing that’s relatively short, the orientation pictured was our only option. We could have put the fans on the other side of the radiator, but since performance was already superb, we decided we liked the look of them with the grills on top. To mount the radiator, we used the bundled screws, which became the right length when we added some rubber gaskets, also included.&nbsp; The radiator actually doesn’t give off much heat, even when the CPU is overclocked and firing on all cylinders, so we didn’t have to worry about the nearby power supply fan pulling in a lot of hot intake. In fact, the CPU never crossed 65 C in all of our benchmarks, even when overclocked to 4.5GHz. We even threw Prime95 at it, and it didn’t break a sweat. Temperatures are also affected by ambient temperatures, though. With our open-air layout, heat coming out of the GPUs doesn’t get anywhere near the radiator, and The Lab’s air conditioning helps keep temperatures low, so it’s pretty much an ideal environment, short of being installed in a refrigerator. Your mileage may vary.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/f_small_22.jpg"><img src="/files/u152332/f_small_21.jpg" title="Image F" width="620" height="413" /></a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/main_image_small_18.jpg"><img src="/files/u152332/main_image_small_17.jpg" title="Main Image" width="620" height="382" /></a></p> <h3>A Golden Triangle</h3> <p>Despite our penchant for extreme performance, we rarely build triple-GPU systems, so we weren’t sure how well they’d handle 4K, but we figured they’d kick ass. Thankfully, they handled UHD quite well. So well, in fact, that we also tested the system with “only” two GTX 780s and still got respectable gaming performance. For example, with two cards, the BioShock Infinite benchmark reported an average of a little over 60 FPS on its highest settings. In Tomb Raider, we disabled anti-aliasing and TressFX, maxing out all the other settings, and we still averaged 62 FPS. We benchmarked the opening sequence of Assassin’s Creed 4 with AA and PhysX disabled and everything else maxed out, and we averaged 47 FPS. The Metro: Last Light benchmark, however, averaged 25 FPS on max settings, even with PhysX disabled.</p> <p>Unfortunately, we had trouble getting Hitman: Absolution and Metro: Last Light to recognize the third card. This issue is not unheard of, and made us think: If you stick with two GPUs, you no longer need the PCI Express bandwidth of expensive LGA 2011 CPUs, or their equally expensive motherboards, or a huge power supply. That potentially cuts the cost of this system in half, from around $4,200 to roughly $2,100. You could also save money by going with, say, a Core i7-4930K instead, and a less expensive LGA 2011 motherboard and a smaller SSD. But it’s still a pretty steep climb in price when going from two cards to three.</p> <p>The test bench itself feels sturdy and looks sweet, but we wish that it accepted standard computer-type screws, and that it came with a 2.5-inch drive bay or could at least fit a standard 3.5-to-2.5 adapter. 
We’d also recommend getting a second articulating fan arm if you’re liquid-cooling, so that one could provide airflow to the voltage regulators around the CPU, and the other could blow directly on your video cards. With the fan aimed at our cards, we instantly gained another 10 FPS in the Tomb Raider benchmark.</p> <p>The Seagate 600 SSD was nice and speedy, although unzipping compressed files seemed to take longer than usual. The X79 Deluxe motherboard gave us no trouble, and the bundled “Asus AI Suite III” software has lots of fine-grained options for performance tuning and monitoring, and it looks nice. Overall, this build was not only successful but educational, too.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong><br /> <div class="spec-table orange"> <table style="width: 627px; height: 270px;" border="0"> <thead> <tr> <th class="head-empty"> </th> <th class="head-light"> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>ZERO</strong></p> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>POINT</strong></p> </th> <th></th> </tr> </thead> <tbody> <tr> <td class="item">Premiere Pro CS6 (sec)</td> <td class="item-dark">2,000</td> <td><span style="text-align: center;">1,694</span><strong>&nbsp;</strong></td> </tr> <tr> <td>Stitch.Efx 2.0 (sec)</td> <td>831</td> <td><span style="text-align: center;">707</span><strong>&nbsp;</strong></td> </tr> <tr> <td class="item">ProShow Producer 5.0 (sec)</td> <td class="item-dark">1,446</td> <td>1,246</td> </tr> <tr> <td>x264 HD 5.0 (fps)</td> <td>21.1</td> <td>25.6<strong></strong></td> </tr> <tr> <td>Batman: Arkham City (fps)</td> <td>76</td> <td>169<strong></strong></td> </tr> <tr> <td class="item">3DMark11 Extreme</td> <td class="item-dark">5,847&nbsp;</td> <td>12,193</td> </tr> </tbody> </table> </div> </div> <p><span style="font-size: 10px; font-weight: bold;"><em>The zero-point machine compared here consists of a 3.2GHz Core i7-3930K and 16GB of Corsair DDR3/1600 on an Asus P9X79 Deluxe motherboard. It has a GeForce GTX 690, a Corsair Neutron GTX SSD, and 64-bit Windows 7 Professional.</em></span></p> http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014#comments 4k computer gaming pc geforce Hardware maximum pc May issues 2014 nvidia open Test Bench Features Wed, 03 Sep 2014 19:29:01 +0000 Tom McNamara 28364 at http://www.maximumpc.com Nvidia Gets Ready for GAME24, First Ever 24-Hour PC Gaming Celebration http://www.maximumpc.com/nvidia_gets_ready_game24_first_ever_24-hour_pc_gaming_celebration_2014 <!--paging_filter--><h3><img src="/files/u69/nvidia_female_gamer.jpg" alt="Nvidia Female Gamer" title="Nvidia Female Gamer" width="228" height="168" style="float: right;" />It's about time we celebrated how awesome PC gaming is!</h3> <p>We love video games. Many of you reading this also love video games. And if you're a fan of <em>Maximum PC</em>, chances are you prefer gaming on a PC. It is, after all, the superior platform for gaming -- we love our console gaming brethren, but they'll never convince us otherwise -- so why not celebrate this hobby of ours? That's exactly what <strong>Nvidia</strong> plans to do -- the GPU maker <strong>just sent us details about GAME24, the first ever global PC gaming celebration</strong>.</p> <p>"The goal of this event is simple – to celebrate this thing we all love called PC gaming. 
GAME24 is a combination of many local events around the world, and it will all be broadcast live on the internet," Nvidia says.</p> <p>GAME24 will commence on September 18th at 6PM PST and last for 24 hours. Events will take place in Chicago, Los Angeles, Mission Viejo, London, Indianapolis, Shanghai, and Stockholm, along with a virtual stage around the world. If you can't make it to one of the venues, you can catch the broadcast on <a href="http://game24.nvidia.com/" target="_blank">game24.nvidia.com</a>.</p> <p>You'll be able to interact with the livestream at home and win prizes, share gaming experiences, and more. And while this hasn't been confirmed, we've heard rumors that Nvidia might show off a new GPU during the 24-hour celebration.</p> <p>More details will be available on <a href="http://blogs.nvidia.com/" target="_blank">Nvidia's blog</a> later today. In the meantime, you can <a href="http://www.nvidia.com/content/game24/game24-registration.html" target="_blank">register for the livestream</a> or one of the physical locations.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_gets_ready_game24_first_ever_24-hour_pc_gaming_celebration_2014#comments game24 games nvidia News Tue, 02 Sep 2014 17:20:40 +0000 Paul Lilly 28464 at http://www.maximumpc.com No BS Podcast #231: AMD and Origin PC Settle Past Dispute on the Show http://www.maximumpc.com/no_bs_podcast_231_amd_and_origin_pc_settle_past_dispute_show <!--paging_filter--><h3>Plus: AMD's commitment to high-end CPUs, DDR4, 5-way GPU support, 20nm GPUs, and more!</h3> <p>In a bit of a surprise to us, <a title="amd" href="http://www.maximumpc.com/tags/amd" target="_blank">AMD</a> and <a title="origin pc" href="http://www.maximumpc.com/tags/Origin_PC" target="_blank">Origin PC</a> wanted to come into the podcast room together for <a title="No BS podcast 231" href="http://dl.maximumpc.com/maxpc_231_20140828.mp3" target="_blank"><strong>episode 231 of the No BS Podcast</strong></a>. As you may recall, this pairing is kind of surprising considering that last October, Origin PC’s co-founder and CEO Kevin Wasielewski announced that the company would be <a title="origin pc drops amd gpus" href="http://www.maximumpc.com/origin_pc_now_dealing_exclusively_nvidia_graphics_claims_amd_gpus_are_problematic2013" target="_blank">dropping AMD graphics cards from its systems</a>, stating, “This decision was based on a combination of many factors including customer experiences, GPU performance/drivers/stability, and requests from our support staff.” He then later added, “Based on our 15+ years of experience building and selling award winning high-performance PCs, we strongly feel the best PC gaming experience is on Nvidia GPUs.”</p> <p>Well, not only did we get Wasielewski in the room, but we also got AMD’s VP of Global Channel Sales Roy Taylor and AMD’s Director of Public Relations Chris Hook to come on. In the show, the two parties settle their past dispute with Taylor suggesting that AMD is now committed to giving hardware partners like Origin PC more support and communication. In the podcast, he outlines some of the strategies to do so. 
Wasielewski also confirmed that you can now get AMD video cards in Origin PCs again and shot down any <a href="http://semiaccurate.com/2013/10/07/nvidias-program-get-oems-like-origin-pc-dump-amd-called-tier-0/" target="_blank">rumors</a> that Nvidia was compensating Origin PC to slander AMD late last year when the announcement came about.</p> <p>Taylor also asserts that AMD’s graphics drivers have gotten a lot better over the past year, but admits this wasn’t always the case and that the company is still getting burned by that bad reputation.&nbsp;</p> <p>While Gordon was away on vacation, he did submit several questions for the rest of the crew to ask on the air, and in the show we cover a ton of ground from topics that range from:</p> <ul> <li>The possibility of 5-way GPU support</li> <li>AMD’s renewed commitment to battling Intel at the high-end CPU market</li> <li>AMD’s plans to start using DDR4</li> <li>Origin PC and AMD’s thoughts on Valve’s upcoming <a title="maximum pc steam machine" href="http://www.maximumpc.com/everything_you_need_know_about_steam_machines_2014" target="_blank">Steam Machine</a> initiative</li> <li>AMD’s take on the&nbsp;<a title="oculus rift" href="http://www.maximumpc.com/tags/oculus_rift" target="_blank">Oculus Rift</a>/VR</li> <li>Freesync monitor availability</li> <li>Why <a title="AMD ssd" href="http://www.maximumpc.com/amd_reportedly_gearing_sell_radeon-branded_line_ssds_2014" target="_blank">AMD is getting into the SSD market</a></li> <li>AMD’s presence (or lack thereof) in the laptop/gaming notebook segment</li> <li>20nm GPUs</li> <li>And then we of course top it off with your fan questions!&nbsp;</li> </ul> <p><iframe src="//www.youtube.com/embed/dTB2Uk43LKU" width="620" height="349" frameborder="0"></iframe></p> <p>The old format isn’t going away, and Gordon’s rants will return, but in the meantime, give this episode a listen, and let us know what you think!</p> <p><a title="Download Maximum PC Podcast #231 MP3" href="http://dl.maximumpc.com/maxpc_231_20140828.mp3" target="_blank"><img src="/files/u160416/rss-audiomp3.png" width="80" height="15" /></a>&nbsp;<a title="Maximum PC Podcast RSS Feed" href="http://feeds.feedburner.com/maximumpc/1337" target="_blank"><img src="/files/u160416/chicklet_rss-2_0.png" width="80" height="15" /></a>&nbsp;<a href="https://itunes.apple.com/podcast/maximum-pc-no-bs-podcast/id213247824"><img src="/files/u160416/chicklet_itunes.gif" alt="Subscribe to Maximum PC Podcast on iTunes" title="Subscribe to Maximum PC Podcast on iTunes" width="80" height="15" /></a></p> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Subscribe to the magazine for only 99 cents an issue:</h4> <h5><a title="Subscribe to Maximum PC Magazine" href="https://w1.buysub.com/pubs/IM/MAX/MAX_subscriptionpage.jsp?cds_page_id=63027" target="_blank">In print</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Zinio" href="https://www.zinio.com/checkout/publisher/?productId=500663614" target="_blank">On Zinio</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Google Play" href="https://play.google.com/store/newsstand/details/Maximum_PC?id=CAoww6lU&amp;hl=en" target="_blank">On Google Play</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on iTunes" href="http://goo.gl/UIkW4" target="_blank">On iTunes</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Amazon Kindle" 
href="http://www.amazon.com/Maximum-PC/dp/B005XD5144/ref=sr_1_1?ie=UTF8&amp;qid=1406326197">On the Amazon Kindle Store</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Your Nook" href="http://www.barnesandnoble.com/w/maximum-pc-future-us-future-publishing/1119741259" target="_blank">On the Barnes &amp; Noble Nook Store</a></h5> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Stalk us in a number of ways:</h4> <p>Become a fan&nbsp;<a title="Maximum PC Facebook page" href="https://www.facebook.com/maximumpc" target="_blank">on Facebook</a></p> <p>Follow us&nbsp;<a href="https://twitter.com/maximumpc" target="_blank">on Twitter</a></p> <p>Subscribe to us&nbsp;<a title="Maximum PC Youtube page" href="https://www.youtube.com/user/MaximumPCMag" target="_blank">on Youtube</a></p> <p>Subscribe&nbsp;<a title="Maximum PC RSS Feed" href="http://feeds.feedburner.com/maximumpc/1337">to our RSS feed</a></p> <p>Subscribe&nbsp;<a href="https://itunes.apple.com/us/podcast/maximum-pc-no-bs-podcast/id213247824" target="_blank">to the podcast on iTunes</a></p> <p>email us at:&nbsp;<a href="mailto:maximumpcpodcast@gmail.com">maximumpcpodcast AT gmail DOT com</a></p> <p>Leave us a voicemail at 877-404-1337 x1337</p> http://www.maximumpc.com/no_bs_podcast_231_amd_and_origin_pc_settle_past_dispute_show#comments 231 amd cpu ddr4 episode graphics cards maximum pc No BS Podcast nvidia origin pc rumors Gaming News No BS Podcast Thu, 28 Aug 2014 20:37:32 +0000 The Maximum PC Staff 28441 at http://www.maximumpc.com Nvidia Retains Lead in Discrete Graphics Card Business, Shipments Down Overall http://www.maximumpc.com/nvidia_retains_lead_discrete_graphics_card_business_shipments_down_overall_2014 <!--paging_filter--><h3><img src="/files/u69/nvidia_0.jpg" alt="Nvidia" title="Nvidia" width="228" height="171" style="float: right;" />Tablets and embedded graphics are eating into the add-in board market</h3> <p>The latest report from Jon Peddie Research (JPR) shows that <strong>graphics add-in board (AIB) shipments during the second quarter of 2014 declined 17.5 percent compared to the previous quarter</strong>. JPR says the market is behaving according to past years, though the decrease was more than the 10-year average. What's also interesting is that the drop in discrete graphics card shipments coincided with a 1.3 percent increase in desktop PC shipments.</p> <p><a href="http://jonpeddie.com/publications/add-in-board-report/" target="_blank">According to JPR</a>, tablets and embedded graphics caused part of the decline. However, "PC gaming momentum continues to build and is the bright spot in the AIB market," with Nvidia reaping the lion's share of the rewards.</p> <p>Nvidia's share of the discrete graphics card market slipped sequentially from 64.9 percent to 62 percent, though barely budged compared to the same quarter a year ago when Nvidia held a 61.9 percent share of the market. 
Meanwhile, AMD ended the quarter with a 37.9 percent share, up from 35 percent in the previous quarter and down slightly from 38 percent a year ago.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_retains_lead_discrete_graphics_card_business_shipments_down_overall_2014#comments amd graphics card jon peddie research jpr nvidia Video Card News Tue, 26 Aug 2014 15:11:58 +0000 Paul Lilly 28419 at http://www.maximumpc.com Digital Storm Targets Gamers with Bolt II Battle Box Titan Z Special Edition http://www.maximumpc.com/digital_storm_targets_gamers_bolt_ii_battle_box_titan_z_special_edition_2014 <!--paging_filter--><h3><img src="/files/u69/digital_storm_battle_box.jpg" alt="Digital Storm Bolt II Battle Box" title="Digital Storm Bolt II Battle Box" width="228" height="183" style="float: right;" />Liquid cooled and ready for the heat of battle</h3> <p><strong>Digital Storm today unveiled its Bolt II Battle Box Titan Z Special Edition</strong>, which is a specially priced Bolt II small form factor (SFF) rig wielding a dual-GPU Nvidia GeForce GTX Titan Z graphics card. In addition to adding a Titan Z, Digital Storm went back to the drawing board and redesigned the Bolt II to accommodate a new Hardline Cooling System consisting of a 240mm radiator, pump, and "stunning" acrylic tubing with yellow coolant.</p> <p>"Nvidia launched the GTX Battle Box Program to allow gamers to play AAA, combat-focused games at max settings and super high resolutions," said Harjit Chana, Chief Brand Officer. "But gaming in 4K requires much more than simply upgrading components. Our Hardline Cooling System allows gamers to unlock the Bolt II’s full potential and experience games in ways they never thought possible."</p> <p>The Bolt II Battle Box is available now for just under $5,000, down from its regular selling price of $6,658. At that starting price, the Bolt II Battle Box comes with a painted chassis, an overclocked Intel Core i7 4790K processor, Asus Maximus VI Impact motherboard, 16GB of DDR3-1600 memory, Blu-ray player, 250GB Samsung 840 EVO SSD, 1TB Seagate HDD, Titan Z graphics card, liquid cooling, internal lighting, 700W power supply, and Windows 8.1 64-bit.</p> <p>You can find out more (and/or place an order) on <a href="https://www.digitalstormonline.com/configurator.asp?id=1034531" target="_blank">Digital Storm's website</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/digital_storm_targets_gamers_bolt_ii_battle_box_titan_z_special_edition_2014#comments battle box bolt ii Digital Storm Gaming geforce gtx titan z Hardware nvidia OEM rigs News Tue, 26 Aug 2014 14:35:10 +0000 Paul Lilly 28418 at http://www.maximumpc.com Nvidia Shield Tablet Review http://www.maximumpc.com/nvidia_shield_tablet_review_2014 <!--paging_filter--><h3>Updated: Now with video review!&nbsp;</h3> <p>Despite its problems, we actually liked <a title="Nvidia Shield review" href="http://www.maximumpc.com/nvidia_shield_review_2013" target="_blank">Nvidia’s original Shield Android gaming handheld</a>. 
Our biggest issue with it was that it was bulky and heavy. With rumors swirling around about a Shield 2, we were hoping to see a slimmer, lighter design. So consider us initially disappointed when we learned that the next iteration of Shield would just be yet another Android tablet. Yawn, right? The fact of the matter is that the Shield Tablet may be playing in an oversaturated market, but it’s still great at what it sets out to be.</p> <p><iframe src="//www.youtube.com/embed/dGigsxi9-K4" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>We've updated our review to include the video review above.</strong></p> <p>At eight inches, the Shield Tablet features a gorgeous 1920x1200 display, which shares the same resolution as Google’s flagship <a title="nexus 7 review" href="http://www.maximumpc.com/google_nexus_7_review_2013" target="_blank">Nexus 7</a> tablet. At 13.1 ounces, the Shield Tablet is about three ounces heavier than the Nexus 7 but still a lot lighter than the original’s 1 lb. 4.7 ounces.&nbsp;</p> <p>Part of the weight increase with the Shield Tablet over the Nexus 7 is due to the extra inch that you’re getting from the screen, but also because the Shield Tablet is passively cooled and has an extra thermal shield built inside to dissipate heat. It’s a little heavier than we like, but isn’t likely to cause any wrist problems. On the back of the Shield is an anti-slip surface and a 5MP camera, and on the front of the tablet is a front-facing 5MP camera and two front-facing speakers. While the speakers are not going to blow away dedicated Bluetooth speakers, they sound excellent for a tablet. In addition to the speakers, the Shield Tablet has a 3.5mm headphone jack up at the top. Other ports include Micro USB, Mini HDMI out, and a MicroSD card slot capable of taking up to 128GB cards. Buttons on the Shield include a volume rocker and a power button, which we found to be a little small and shallow for our liking.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_exploded_view_black_bckgr.jpg" alt="Nvidia Shield Tablet guts" title="Nvidia Shield Tablet guts" width="620" height="349" /></p> <p style="text-align: center;"><strong>The guts of the Nvidia Shield Tablet.</strong></p> <p>All of this is running on the latest version of Android KitKat (4.4). Nvidia says that it will update the tablet to Android L within a few weeks of Google’s official release. If Nvidia’s original Shield is any indication of how well the company keeps up with OS updates, you can expect to get the latest version of Android within a couple of weeks, if not a month, of release. Regardless, the Shield Tablet is running a pretty stock version of Android to begin with, the main difference being that Nvidia has pre-loaded the tablet with its Shield Hub, which is a 10-foot UI used to purchase, download, and launch games.</p> <p>Arguably, the real star of the tablet is Nvidia’s new Tegra K1 mobile superchip. The 2.2GHz quad-core A15 SOC features Nvidia’s Kepler GPU architecture and 192 CUDA cores along with 2GB of low-power DDR3. K1 supports many of the graphical features commonplace in GeForce graphics cards, including tessellation, HDR lighting, global illumination, subsurface scattering, and more.</p> <p>In our performance benchmarks, the K1 killed it. 
Up until now, the original Shield’s actively cooled Tegra 4 was arguably one of the most, if not <em>the</em> most, powerful Android SOCs on the market, but the K1 slaughters it across the board. In the AnTuTu and GeekBench benchmarks, we saw modest gains of 12 percent to 23 percent in Shield vs. Shield Tablet action. But in Passmark and GFX Bench’s T-Rex test, we saw nearly a 50 percent spread, and in 3DMark’s mobile Ice Storm Unlimited test, we saw an astounding 90 percent advantage for the Shield Tablet. This is incredible when you consider that the tablet has no fans and a two-watt TDP. Compared to the second-gen Nexus 7, the Shield Tablet benchmarks anywhere from 77 percent to 250 percent faster. This SOC is smoking fast.</p> <p>In terms of battery life, Nvidia claims you’ll get 10 hours watching/surfing the web and about five hours from gaming with its 19.75 Wh battery. This is up 3.75 Wh from Google’s Nexus 7 equivalent, and from our experiential tests, we found those figures to be fairly accurate if not a best-case scenario. It will pretty much last you all day, but you'll still want to let it sip juice every night.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_war_thunder.jpg" alt="Shield Tablet review" title="Shield Tablet review" width="620" height="343" /></p> <p style="text-align: center;"><strong>The new wireless controller uses Wi-Fi Direct instead of Bluetooth for lower latency.</strong></p> <p>Of course, if you’re going to game with it, you’re going to need Nvidia’s new wireless Shield Controller. Sold separately for $60, the 11.2-ounce Shield Controller maintains the same button layout as the original Shield controller, but feels a lot lighter and is more comfortable to hold. While most Android game controllers operate over Bluetooth, Nvidia opted to go with Wi-Fi Direct, stating that it offers 2x faster response time and more bandwidth. The extra bandwidth allows you to plug a 3.5mm headphone into the controller and also allows you to link up to four controllers to the device, which is an appreciated feature when you hook up the tablet to your HDTV via the Shield Tablet’s <a title="shield console mode" href="http://www.maximumpc.com/nvidia_sweetens_shield_console_android_442_kitkat_price_drop_199_through_april" target="_blank">Console Mode</a>. Other unique features of the controller include capacitive-touch buttons for Android’s home, back, and play buttons. There’s also a big green Nvidia button that launches Shield Hub. The controller also has a small, triangle-shaped clickable touch pad which allows you to navigate your tablet from afar. One quibble with it is that we wish the trackpad was more square, to at least mimic the dimensions of the tablet; the triangle shape was a little awkward to interface with. Another problem that we initially had with the controller was that the + volume button stopped working after a while. We contacted Nvidia about this and the company sent us a new unit, which remedied the issue. One noticeable feature missing from the controller is rumble support. Nvidia said this was omitted on the original Shield to keep the weight down; its omission is a little more glaring this time around, however, since there's no screen attached to the device.</p> <p>The controller isn’t the only accessory that you’ll need to purchase separately if you want to tap into the full Shield Tablet experience. To effectively game with the tablet, you’ll need the Shield Tablet cover, which also acts as a stand. 
As with most tablets, a magnet in the cover shuts off the Shield Tablet when it's closed, but setting up the cover and getting it to act as a stand is initially pretty confusing. The cover currently only comes in black, and while we’re generally not big on marketing aesthetics, it would be nice to have an Nvidia green option to give the whole look a little more pop. We actually think the cover should just be thrown in gratis, especially considering that the cheapest 16GB model costs $300. On the upside though, you do get Nvidia’s new passive DirectStylus 2 that stows away nicely in the body of the Shield Tablet. Nvidia has pre-installed note-writing software and its own Nvidia Dabbler painting program. The nice thing about Dabbler is that it leverages the K1’s GPU acceleration so that you can virtually paint and blend colors in real time. There’s also a realistic mode where the “paint” slowly drips down the virtual canvas like it would in real life.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_trine2_0.jpg" alt="Shield tablet review" title="Shield tablet review" width="620" height="404" /></p> <p style="text-align: center;"><strong>The Shield Controller is a lot lighter and less blocky than the original Shield Portable.</strong></p> <p>But that’s probably not why you’re interested in the Shield Tablet. This device is first and foremost a gaming tablet and even comes with a free Android copy of Trine 2. Trine 2 was originally a PC game and it’s made a great transition to the Shield Tablet. While the game was never known to be a polygon pusher, it looks just as good as it ever did on its x86 debut.</p> <p>With gaming as the primary driver for Shield Tablet, you may wonder why Nvidia didn’t bundle its new controller. The company likely learned from Microsoft’s mistake with Kinect and the Xbox One: Gamers don’t like to spend money and getting the price as low as possible was likely on Nvidia’s mind. Of course, not everyone may even want a controller, with the general lack of support for them in games. Nvidia says there are now around 400 Android titles that support its controller, but that’s only a small percentage of Android games and the straight truth is that the overwhelming majority of these games are garbage.&nbsp;</p> <p>Nvidia is making a push for Android gaming, however. The company worked with Valve to port over Half-Life 2 and Portal to the Shield and they look surprisingly fantastic and are easily the two prettiest games on Android at the moment. Whether Android will ever become a legitimate platform for hardcore gaming is anyone’s guess, but at least the Shield Tablet will net you a great front seat if the time ever arises.</p> <p>Luckily, you won’t have to rely solely on the Google Play store to get your gaming fix. Emulators run just as well here as they did on the original Shield and this iteration of Shield is also compatible with Gamestream, which is Nvidia’s streaming technology that allows you to stream games from your PC to your Shield. Gamestream, in theory, lets you play your controller-enabled PC games on a Shield.</p> <p>At this point, Nvidia says Gamestream supports more than 100 games such as Batman: Arkham Origins and Titanfall from EA’s Origin and Valve’s Steam service. The problem, though, is that there are hundreds more games on Steam and Origin that support controllers—but not the Shield Tablet’s controller. 
For example, Final Fantasy VII, a game that we couldn’t get to work with the original Shield, still isn't supported even though it works with an Xbox controller on the PC. When Gamestream does work, however, it’s relatively lag-free and kind of wonderful. The one caveat here is that you’ll have to get a 5GHz dual-band router to effectively get it working.&nbsp;</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/rh7fWdQT2eE" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Nvidia Shield Video demo.</strong></p> <p>Would we buy the Shield Tablet if we owned the original Shield (now renamed the Shield Portable)? Probably not. If we were looking for a new tablet and top-notch gaming performance was on the checklist, the Shield Tablet would easily be the top contender today. We’d take it over the second-gen Nexus 7 in a heartbeat. While we understand why Nvidia decided to separate the cover and controller to keep the prices down and avoid the Kinect factor, we think a bundled package with a small price break as an alternative would have been nice. All things considered though, consider us surprised. The Shield Tablet is pretty dang cool.&nbsp;</p> <p><strong>$300</strong></p> <p><em><strong>Update:</strong> The original article incorrectly labeled the Shield Portable benchmarks with the Nexus 7 figures. The issue has been resolved and both benchmark charts are listed below.&nbsp;</em></p> http://www.maximumpc.com/nvidia_shield_tablet_review_2014#comments android Google Hardware KitKat maximum pc nvidia portable Review shield tablet wireless controller News Reviews Tablets Mon, 18 Aug 2014 21:36:57 +0000 Jimmy Thang 28263 at http://www.maximumpc.com Free Copy of Borderlands: The Pre-Sequel with Purchase of Select GeForce GTX GPUs http://www.maximumpc.com/free_copy_borderlands_pre-sequel_purchase_select_geforce_gtx_gpus_2014 <!--paging_filter--><h3><img src="/files/u166440/borderlands_the_pre-sequel.jpg" alt="Borderlands The PreSequel" title="Borderlands The PreSequel" width="200" height="113" style="float: right;" />Offer available at participating retailers</h3> <p>If you have been looking to upgrade your GPU, then now would be a good time to do so, especially if you are a fan of the Borderlands franchise. Those who <strong>purchase select Nvidia GeForce GTX GPUs will get a free copy of Borderlands: The Pre-Sequel</strong>.</p> <p>To get a free copy of the FPS, consumers will need to purchase the GeForce GTX Titan, <a title="MPC 780Ti benchmarks" href="http://www.maximumpc.com/nvidia_geforce_gtx_780_ti_benchmarks" target="_blank"><span style="color: #ff0000;">780Ti</span></a>, 780, or 770 desktop GPUs from a list of <a title="Nvidia website" href="http://www.geforce.com/GetBorderlands" target="_blank"><span style="color: #ff0000;">participating retailers</span></a>. &nbsp;However, the deal is only good while supplies last.&nbsp;</p> <p>Like Borderlands 2, Borderlands: The Pre-Sequel will make use of Nvidia’s PhysX technology. “If you have a high-end Nvidia GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions, and ice particles, and cloth and fluid simulation that blows me away every time I see it,” said Gearbox Software CEO Randy Pitchford.</p> <p>Borderlands: The Pre-Sequel takes place during Borderlands and Borderlands 2. 
In it, players will learn about Borderlands 2’s villain, Handsome Jack, and how he rose to power as they shoot and loot their way through the game as his henchmen.</p> <p>Borderlands: The Pre-Sequel will be available in North America on October 14 and internationally on October 17.&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/free_copy_borderlands_pre-sequel_purchase_select_geforce_gtx_gpus_2014#comments Borderlands The Pre-Sequel free copy Borderlands Pre-Sequel GeForce GTX GPU nvidia News Wed, 13 Aug 2014 22:45:24 +0000 Sean D Knight 28342 at http://www.maximumpc.com Falcon Northwest Tiki-Z Micro Tower Totes a Titan Z Graphics Card http://www.maximumpc.com/falcon_northwest_tiki-z_micro_tower_totes_titan_z_graphics_card_2014 <!--paging_filter--><h3><img src="/files/u69/tiki-z.jpg" alt="Falcon Northwest Tiki Z" title="Falcon Northwest Tiki Z" width="228" height="155" style="float: right;" />A tiny system with the gaming performance of a Titan Z</h3> <p>Of all the systems featuring an <strong>Nvidia GeForce GTX Titan Z graphics card, the Tiki-Z Special Edition from Falcon Northwest </strong>might be the most impressive. That's because the Tiki-Z Special Edition is a micro-tower measuring just 4 inches wide and 13 inches tall -- the same size as the standard Tiki and roughly equivalent to the original Xbox console -- yet has enough space to accommodate Nvidia's Titan Z, which is powered by a pair of Kepler GPUs.</p> <p>"Tiki-Z gives our customers the dual GPU option they’ve wanted since Tiki was first released," said Kelt Reeves, president of Falcon Northwest. "They can now play truly demanding 3D games at 4K resolution in a slim PC that can easily fit on anyone’s desk. Tiki-Z takes our power-per-cubic-inch mission to an entirely new level."</p> <p>In order to make room for Nvidia's largest graphics card and keep it cool, Falcon Northwest had to make several modifications, including laser-cut venting with a special exhaust, and the addition of a side window with lighting, which also serves as a custom air intake duct. It also needed help from its hardware partners -- SilverStone created a new version of its tiny 600W PSU.</p> <p>Pricing for the Tiki-Z starts at $5,614 and, for a limited time, will come with an Asus PB287Q 28-inch 4K monitor at no extra charge. 
Other features include an Asus Z97I Plus motherboard, Intel Core i7 4790K processor, Asetek liquid cooling, 8GB of DDR3-1866 RAM, GeForce GTX Titan Z, Crucial M550 256GB SSD, DVD writer, Windows 8.1, and a three-year warranty.</p> <p>The Falcon Northwest Tiki-Z Special Edition is <a href="http://www.falcon-nw.com/promo/tiki-z" target="_blank">available now</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/falcon_northwest_tiki-z_micro_tower_totes_titan_z_graphics_card_2014#comments falcon northwest geforce gtx titan z graphics card Hardware nvidia OEM rigs tiki-z special edition Video Card News Wed, 13 Aug 2014 13:33:27 +0000 Paul Lilly 28337 at http://www.maximumpc.com Best Cheap Graphics Card http://www.maximumpc.com/best_cheap_graphics_card_2014 <!--paging_filter--><h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models care of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts or so, but that number fluctuates up and down according to spec. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. During 2013, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. 
The goal with its former Kepler architecture was to have better performance per watt compared to the previous architecture named Fermi, and it succeeded, but it's taken that same philosophy even further with Maxwell, which had as its goal to be twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. </strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process it used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled. In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per block. Therefore, it's reduced the number of cores per block by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p> <h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speeds however, running at 5,400MHz. 
The board sports 2GB of GDDR5 memory, and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. Still, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160 (Street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance. The only downside is that the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes and dual 100mm fans to keep it cool, as well. It also includes a switch that lets you toggle between booting from an older BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the minuscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office.
As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. " title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. </strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test, which was a blistering turn in Call of Duty: Ghosts where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. 
The Dual-X cooler also kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong> Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including support for TrueAudio, XDMA Crossfire, Mantle (as in, it worked at launch), and it has the ability to drive up to three displays —all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings. The fact that it supports all the latest technology from AMD is what bolsters this card's credentials, though. And the fact that it can run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790. While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120 <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <h4> <hr />MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. 
For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but when looking at the specs of the Nvidia GTX 750, it is clearly outmatched. One other major problem, at least for those of us with big monitors, is we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90 <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM, however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely.
It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, and 34 fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100" we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250. We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD. That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it, do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. 
Mantle is a lower-level API that allows developers to optimize a game specifically for AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX. <strong>&nbsp;</strong></p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, and their dominance is not in question. The most basic reason is that the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster in performing these operations than their Nvidia equivalent. In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI GeForce GTX 750 Gaming</td> <td>Gigabyte GeForce GTX 750 Ti </td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120 </td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curacao</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td> <td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40</td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td>
<td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,000MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td> <td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI-S, VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td> </tr> </tbody> </table> <p><em>Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div> <h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. 
These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn’t going to help it. However, if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs." title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84 fps using DirectX to 98 fps in Mantle.</p> <p>Overall, Mantle is legit, but it’s kind of like PhysX or TressFX in that it’s nice to have when it’s supported, and does provide a boost, but it isn’t something we’d count on being available in most games.</p> <h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don’t usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we’d have to admit that given these cards’ price points, we had low expectations but thought they’d all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child’s play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for “sweet gaming” has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it’s the best card for gaming at this price point, end of discussion. 
OK, thanks for reading.</p> <p>Oh, are you still here? OK, here’s some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia’s trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there’s no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The cards on the bottom rung, the R7 250 and R7 250X, were not playable at 1080p at max settings, so avoid them. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man’s land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Strike</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30</strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40</strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded.
Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games are run at 1920x1080 with no AA except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/best_cheap_graphics_card_2014#comments 1080p affordable amd benchmarks budget cheap cheap graphics card gpu Hardware maximum pc may 2014 nvidia Video Card Features Tue, 12 Aug 2014 21:43:32 +0000 Josh Norem 28304 at http://www.maximumpc.com Nvidia Tegra K1 Claims Fame as First 64-Bit ARM Chip for Android http://www.maximumpc.com/nvidia_tegra_k1_claims_fame_first_64-bit_arm_chip_android_2014 <!--paging_filter--><h3><img src="/files/u69/tegra_k1.jpg" alt="Nvidia Tegra K1" title="Nvidia Tegra K1" width="228" height="163" style="float: right;" />Android enters the 64-bit ARM era</h3> <p>Say hello to <strong>"Denver," the codename for Nvidia's 64-bit Tegra K1 System-on-Chip (SoC), which also happens to be the first 64-bit ARM processor for Android</strong>. The new version of Nvidia's Tegra K1 SoC pairs the company's Kepler architecture-based GPU with its own custom-designed, 64-bit, dual-core "Project Denver" CPU, which Nvidia says is fully ARMv8 architecture compatible.</p> <p>So, what's special about this chip besides a 64-bit instruction set? Nvidia designed Denver to offer the highest single-core CPU throughput and industry-leading dual-core performance. Each Denver core (and there are two) sports a 7-way superscalar microarchitecture and includes a 128KB 4-way L1 instruction cache, a 64KB 4-way L1 data cache, and a 2MB 16-way L2 cache that services both cores.</p> <p>Using a process called Dynamic Code Optimization, Denver optimizes frequently used software routines at runtime into dense, highly tuned microcode-equivalent routines stored in a dedicated 128MB main-memory-based optimization cache. This allows for faster access and execution, which translates into faster performance, in part because it lessens the need to re-optimize the software routine.</p> <p>Denver will also benefit Android platforms with new low-latency power-state transitions. This is in addition to extensive power-gating and dynamic voltage and clock scaling routines based on workloads.
The end result is more efficient power usage, which allows Denver's performance to rival even some mainstream PC-class CPUs at significantly reduced power consumption, <a href="http://blogs.nvidia.com/blog/2014/08/11/tegra-k1-denver-64-bit-for-android/" target="_blank">Nvidia says</a>.</p> <p>If you want to dig even further into the architecture, you can get more details <a href="http://www.tiriasresearch.com/downloads/nvidia-charts-its-own-path-to-armv8/" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_tegra_k1_claims_fame_first_64-bit_arm_chip_android_2014#comments 64-bit android ARM cpu denver Hardware nvidia processor tegra k1 News Tue, 12 Aug 2014 17:32:21 +0000 Paul Lilly 28334 at http://www.maximumpc.com Origin Rolls Out Battlebox Titan Z Systems for 4K Gaming http://www.maximumpc.com/origin_rolls_out_battlebox_titan_z_systems_4k_gaming_2014 <!--paging_filter--><h3><img src="/files/u69/battlebox_0.jpg" alt="Battlebox" title="Battlebox" width="228" height="171" style="float: right;" />Refreshed desktops offer 4K gaming performance starting at under $4,000</h3> <p>The 4K era is in its very early stages, and though the technology still has room for improvement (especially on the monitor side), you can make the leap if you're determined. Boutique builder <strong>Origin PC is all too happy to satisfy your 4K gaming needs with its Nvidia Battlebox Titan Z systems</strong> that are now available. These are basically refreshed Genesis, Millennium, and Chronos machines equipped with Nvidia GeForce Titan Z graphics cards.</p> <p>"Whether you’re new to 4K gaming or not, each Origin PC Battlebox Titan Z system was designed to provide the best 4K gaming experience right out of the box," Origin PC explains. "With a wide variety of special bundled options for each system, such as the inclusion of a 4K monitor bundled with a Titan Z graphics card, only Origin PC’s Battlebox TITAN Z systems can deliver the ultimate 4K gaming experience at an incredible value."</p> <p>All three setups start at under $4,000 with a single Titan Z graphics card. Each one also offers different starting configurations, though depending on the model and config, you can find yourself above the $4,000 starting point in a hurry. For example, the Chronos Z is available with a single Titan Z for under $4,000 or with a Titan Z and Asus PB287Q 4K Ultra HD 28-inch monitor, though in this case, the latter option still stays under $4,000 - it's the other two that bump up above, depending on which setup you start with.</p> <p>The same options are available for the Millennium Z, plus a third configuration consisting of dual Titan Z graphics cards for the price of one. 
Same goes for the Genesis Z, though it adds a fourth option -- dual Cryogenic liquid-cooled Titan Z graphics cards for the price of one.</p> <p>If you're interested, just head over to the special <a href="http://www.originpc.com/promotion/4k-titan-z/" target="_blank">4K Battlebox landing page</a> on Origin's website.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/origin_rolls_out_battlebox_titan_z_systems_4k_gaming_2014#comments 4k battlebox chronos z Gaming genesis z Hardware millennium z nvidia OEM origin rigs titan z ultra hd News Mon, 11 Aug 2014 15:40:25 +0000 Paul Lilly 28324 at http://www.maximumpc.com Nvidia Stacks Two Cheap LCD Panels to Quadruple Display Resolution http://www.maximumpc.com/nvidia_stacks_two_cheap_lcd_panels_quadruple_display_resolution_2014 <!--paging_filter--><h3><img src="/files/u166440/cascaded_diplays.jpg" alt="Cascaded Display Parts" title="Cascaded Display Parts" width="200" height="110" style="float: right;" />Company focusing on head-mounted displays</h3> <p>Developers of head-mounted displays (HMDs) could benefit from Nvidia's recent efforts sometime in the future. <strong>Nvidia was able to quadruple display resolution by stacking two cheap LCD panels</strong> on top of one another.</p> <p>Called cascaded displays, the technique involved the use of two 7-inch 1280x800 LCD monitors. The LCD panels were removed from the casings and the backlight removed from one panel. Both panels were then placed slightly offset, about a quarter-pixel, on top of each other with a quarter-wave film placed between them. The reason the panels were offset, according to the company, is that the offset layer acts like a "shutter" over each cluster of four pixels. This is how the resolution is quadrupled (a quick sketch of that pixel math appears at the end of this post). In addition, both panels are able to provide refresh rates over 60Hz.&nbsp;</p> <p>During its research, Nvidia created its own HMD prototype and special software to take advantage of the cascaded displays, and provided screenshots that compared how a game would look on a conventional LCD HMD and a cascaded LCD HMD.
The result was that text and details were a lot clearer on the cascaded displays than on a conventional one.</p> <p style="text-align: center;"><img src="/files/u166440/cascaded_displays_001.jpg" alt="Cascaded Display screenshot" title="Cascaded Display screenshot" width="600" height="338" /></p> <p>According to Nvidia's research paper, the technique is an alternative to the "brute force solution of addressable pixel count" that results in 4K monitors and even mobile displays.&nbsp;</p> <p>If you want to know about cascaded displays in full detail, check out <a title="Nvidia cascaded display research" href="https://research.nvidia.com/publication/cascaded-displays-spatiotemporal-superresolution-using-offset-pixel-layers" target="_blank"><span style="color: #ff0000;">Nvidia's research</span></a> and <a title="Cascaded display demo" href="https://www.youtube.com/watch?v=0XwaARRMbSA" target="_blank"><span style="color: #ff0000;">YouTube demo</span></a>.&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/nvidia_stacks_two_cheap_lcd_panels_quadruple_display_resolution_2014#comments cascaded displays nvidia nvidia cascaded displays nvidia HMD quadruple display resolution News Tue, 05 Aug 2014 03:30:08 +0000 Sean D Knight 28288 at http://www.maximumpc.com
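<p>For anyone who wants to sanity-check the "quadrupled" claim from the cascaded-display story above, the back-of-the-envelope pixel math is simple. The short Python snippet below is purely illustrative: it is not Nvidia's code, and treating the overlapped regions as a uniform 2x2 sub-grid per native pixel is a simplifying assumption on our part.</p> <pre>
# Illustrative sketch only (not Nvidia's code). The panel size comes from the
# article above; the uniform 2x2 sub-grid per pixel is an assumption.
panel_w, panel_h = 1280, 800          # one 7-inch LCD layer

# Offsetting the second layer by a fraction of a pixel splits every native
# pixel into a 2x2 cluster of separately addressable sub-regions.
subregions_per_pixel = 4

native_pixels = panel_w * panel_h
effective_pixels = native_pixels * subregions_per_pixel

print(native_pixels)                  # 1024000 native pixels
print(effective_pixels)               # 4096000, i.e. four times as many
print(panel_w * 2, "x", panel_h * 2)  # 2560 x 1600 equivalent grid
</pre> <p>In other words, two cheap 1280x800 panels end up addressing roughly as many points as a single 2560x1600 display, which is exactly the appeal for head-mounted displays.</p>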