Video Card http://www.maximumpc.com/taxonomy/term/384/ en Falcon Northwest Tiki-Z Micro Tower Totes a Titan Z Graphics Card http://www.maximumpc.com/falcon_northwest_tiki-z_micro_tower_totes_titan_z_graphics_card_2014 <!--paging_filter--><h3><img src="/files/u69/tiki-z.jpg" alt="Falcon Northwest Tiki Z" title="Falcon Northwest Tiki Z" width="228" height="155" style="float: right;" />A tiny system with the gaming performance of a Titan Z</h3> <p>Of all the systems featuring an <strong>Nvidia GeForce GTX Titan Z graphics card, the Tiki-Z Special Edition from Falcon Northwest </strong>might be the most impressive. That's because the Tiki-Z Special Edition is a micro-tower measuring just 4 inches wide and 13 inches tall -- the same size as the standard Tiki and roughly equivalent to the original Xbox console -- yet has enough space to accommodate Nvidia's Titan Z, which is powered by a pair of Kepler GPUs.</p> <p>"Tiki-Z gives our customers the dual GPU option they’ve wanted since Tiki was first released," said Kelt Reeves, president of Falcon Northwest. "They can now play truly demanding 3D games at 4K resolution in a slim PC that can easily fit on anyone’s desk. Tiki-Z takes our power-per-cubic-inch mission to an entirely new level."</p> <p>In order to make room for Nvidia's largest graphics card and keep it cool, Falcon Northwest had to make several modifications, including laser-cut venting with a special exhaust, and the addition of a side window with lighting, which also serves as a custom air intake duct. It also needed help from its hardware partners -- SilverStone created a new version of its tiny 600W PSU.</p> <p>The Tiki-Z starts at $5,614 and, for a limited time, comes with an Asus PB287Q 28-inch 4K monitor at no extra charge. Other features include an Asus Z97I Plus motherboard, Intel Core i7 4790K processor, Asetek liquid cooling, 8GB of DDR3-1866 RAM, GeForce GTX Titan Z, Crucial M550 256GB SSD, DVD writer, Windows 8.1, and a three-year warranty.</p> <p>The Falcon Northwest Tiki-Z Special Edition is <a href="http://www.falcon-nw.com/promo/tiki-z" target="_blank">available now</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/falcon_northwest_tiki-z_micro_tower_totes_titan_z_graphics_card_2014#comments falcon northwest geforce gtx titan z graphics card Hardware nvidia OEM rigs tiki-z special edition Video Card News Wed, 13 Aug 2014 13:33:27 +0000 Paul Lilly 28337 at http://www.maximumpc.com EVGA Announces Passively Cooled GeForce GT 720 Graphics Cards http://www.maximumpc.com/evga_announces_passively_cooled_geforce_gt_720_graphics_cards_2014 <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gt_720.jpg" alt="EVGA GeForce GT 720" title="EVGA GeForce GT 720" width="228" height="228" style="float: right;" />Available in 1GB and 2GB models</h3> <p><strong>EVGA this week added the GeForce GT 720 with passive cooling to its graphics card lineup</strong>. Compared to integrated graphics, Nvidia says you can expect up to 2x faster web browsing, 5x faster video editing, and 8x faster photo editing. 
And when it comes time to game, performance can be up to 70 percent faster, all while taking up just a single slot in your PC, Nvidia says.</p> <p>The EVGA GeForce GT 720 comes with <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/#2722" target="_blank">1GB</a> or <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/#2724" target="_blank">2GB</a> of GDDR5 memory and is available in low-profile and full-height form factors. Other than the amount of RAM and physical size, the specs are the same -- 192 CUDA cores, 797MHz base clock, 1800MHz memory clock, 64-bit bus, 1.43ns memory speed, and 14.4GB/s of memory bandwidth.</p> <p>Connectivity options include VGA, DVI, and HDMI. If you're so inclined, you can drive up to three separate displays at the same time using a single card, <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/" target="_blank">Nvidia says</a>.</p> <p>The EVGA GeForce GT 720 will be available soon. No word yet on price.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/evga_announces_passively_cooled_geforce_gt_720_graphics_cards_2014#comments Build a PC evga geforce gt 720 gpu graphics card Hardware Video Card News Wed, 13 Aug 2014 13:31:27 +0000 Paul Lilly 28336 at http://www.maximumpc.com Best Cheap Graphics Card http://www.maximumpc.com/best_cheap_graphics_card_2014 <!--paging_filter--><h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models courtesy of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts or so, but that number fluctuates up and down from one model to the next. 
For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. During 2013, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. The goal with its former Kepler architecture was to have better performance per watt compared to the previous architecture named Fermi, and it succeeded, but it's taken that same philosophy even further with Maxwell, whose goal was to be twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. </strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process it used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled. In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per unit. Therefore, it's reduced the number of cores per unit by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p>
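<p>To put that reorganization into concrete numbers, here's a quick sanity check of the core counts. This is purely illustrative Python arithmetic on our part (not anything from Nvidia's toolchain), using the per-card totals from the spec table later in this story:</p> <pre>
# Kepler: one scheduler block per 192 CUDA cores.
# Maxwell: four independently scheduled 32-core clusters per SM.
KEPLER_CORES_PER_SMX = 192
MAXWELL_CORES_PER_CLUSTER = 32
MAXWELL_CLUSTERS_PER_SM = 4

maxwell_cores_per_sm = MAXWELL_CORES_PER_CLUSTER * MAXWELL_CLUSTERS_PER_SM
print(maxwell_cores_per_sm, "vs", KEPLER_CORES_PER_SMX, "on Kepler")  # 128 vs 192

# Working backward from the roundup's spec table:
for name, total_cores in [("GTX 750", 512), ("GTX 750 Ti", 640)]:
    print(name, total_cores // maxwell_cores_per_sm, "SM units")
# GTX 750 -> 4 SM units, GTX 750 Ti -> 5 SM units
</pre>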
<h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speeds, however, running at 5,400MHz. The board sports 2GB of GDDR5 memory, and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. Still, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160 (street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance. The only downside is that the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes and dual 100mm fans to keep it cool, as well. 
It also includes a switch that lets you boot from an older backup BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at a 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the minuscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office. As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. " title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. 
</strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test; its most blistering turn came in Call of Duty: Ghosts, where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. The Dual-X cooler kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong> Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including support for TrueAudio, XDMA Crossfire, Mantle (as in, it worked at launch), and it has the ability to drive up to three displays—all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings. The fact that it supports all the latest technology from AMD is what bolsters this card's credentials, though. And the fact that it can run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790. 
While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120, <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <h4> <hr />MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but a glance at the specs of the Nvidia GTX 750 shows it's clearly outmatched. One other major problem, at least for those of us with big monitors, is that we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90, <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM, however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. 
It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely. It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, and 34fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100," we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250. We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is that even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is that you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD. That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. 
It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. Mantle is a lower-level API that allows developers to optimize a game specifically for AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX.</p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, and their dominance is not in question. The most basic reason is that the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster in performing these operations than their Nvidia equivalents. 
In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI GeForce GTX 750 Gaming</td> <td>Gigabyte GeForce GTX 750 Ti </td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120 </td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curacao</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td> <td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40</td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td> <td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,100MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td> <td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI (single-link), VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td> </tr> </tbody> </table> <p><em>* Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div>
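<p>A handy way to read the memory rows in that table: multiply the memory clock by four (GDDR5 moves data four times per clock), then by the bus width in bytes, to get bandwidth. The short Python sketch below is our own illustrative arithmetic, not a vendor tool:</p> <pre>
# Bandwidth (GB/s) from a GDDR5 card's memory clock and bus width.
def gddr5_bandwidth_gbs(base_clock_mhz, bus_width_bits):
    effective_mts = base_clock_mhz * 4       # GDDR5 is quad-pumped
    bytes_per_transfer = bus_width_bits / 8  # bits -> bytes
    return effective_mts * bytes_per_transfer / 1000

print(gddr5_bandwidth_gbs(1350, 128))  # GTX 750 Ti: 86.4 GB/s
print(gddr5_bandwidth_gbs(1400, 256))  # R7 265: 179.2 GB/s
</pre> <p>That 256-bit bus is why the R7 265 enjoys roughly twice the memory bandwidth of anything else in this roundup, which helps explain its benchmark wins at 1080p.</p>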
<h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn’t going to help it. On the other hand, if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs." title="Battlefield 4" width="620" height="207" /></p>
title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84 fps using DirectX to 98 fps in Mantle.</p> <p>Overall, Mantle is legit, but it’s kind of like PhysX or TressFX in that it’s nice to have when it’s supported, and does provide a boost, but it isn’t something we’d count on being available in most games.</p> <h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don’t usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we’d have to admit that given these cards’ price points, we had low expectations but thought they’d all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child’s play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for “sweet gaming” has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it’s the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here’s some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia’s trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there’s no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The bottom rung of cards, which consists of the R7 250(X) cards, were not playable at 1080p at max settings, so avoid them. 
<h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don’t usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we’d have to admit that given these cards’ price points, we had low expectations but thought they’d all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child’s play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for “sweet gaming” has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it’s the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here’s some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia’s trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there’s no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The cards on the bottom rung, the R7 250(X) boards, were not playable at 1080p at max settings, so avoid them. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man’s land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Strike</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30<br /></strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40<br /></strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. 
All games are run at 1920x1080 with no AA except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/best_cheap_graphics_card_2014#comments 1080p affordable amd benchmarks budget cheap cheap graphics card gpu Hardware Hardware maximum pc may 2014 nvidia Video Card Features Tue, 12 Aug 2014 21:43:32 +0000 Josh Norem 28304 at http://www.maximumpc.com Asus Announces Semi Passive Strix GeForce GTX 750 Ti OC Graphics Card http://www.maximumpc.com/asus_announces_semi_passive_strix_geforce_gtx_750_ti_oc_graphics_card <!--paging_filter--><h3><img src="/files/u69/asus_strix_gtx_750.jpg" alt="Asus Strix GTX 750 Ti OC" title="Asus Strix GTX 750 Ti OC" width="228" height="232" style="float: right;" />Work in peace and quiet before jumping into a game</h3> <p><strong>Asus today unveiled its new Strix GeForce GTX 750 Ti OC graphics card</strong>, and the first time you install it, you might be inclined to think something's wrong when the fans don't start spinning. Don't fret though, that's by design. Using the company's semi-passive Strix technology, the card's fans will sit there motionless and let the rest of the cooler passively chill the card until thermals reach 65C.</p> <p>That means you can work all day long in silence -- unless your case cooling is atrocious, you're unlikely to reach 65C on the GPU by just typing out TPS reports and surfing the web. Once you fire up a demanding game, however, things are likely to heat up in a hurry, and when they do, the fans will kick on to prevent the card from cooking itself. For games like Counter-Strike, if you're playing at 1920x1080, Asus says temps will probably hover around 50C, thus allowing the card to run silent.</p> <p><a href="http://rog.asus.com/338932014/gaming-graphics-cards-2/pr-asus-announces-strix-gtx-750-ti-oc/" target="_blank">According to Asus</a>, its cooling solution keeps the card up to 58 percent chillier than a reference cooler and is three times quieter. Using the company's DirectCU II cooling technology, 6mm copper cooling pipes come in direct contact with the GPU for superior heat dissipation. The heatsink area is also 190 percent larger than reference.</p> <p>Other specs include a 1,124MHz base clock speed, a 1,202MHz boost clock speed, and 2GB of GDDR5 memory clocked at 5,400MHz (effective) on a 128-bit bus.</p> <p>The Strix GTX 750 Ti OC will be available by the end of July. 
No word yet on price.</p> <p><iframe src="//www.youtube.com/embed/27hBljJjmGI" width="620" height="349" frameborder="0"></iframe></p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/asus_announces_semi_passive_strix_geforce_gtx_750_ti_oc_graphics_card#comments asus Build a PC Gaming geforce gtx 750 ti oc graphics card Hardware strix Video Card News Wed, 16 Jul 2014 17:33:43 +0000 Paul Lilly 28178 at http://www.maximumpc.com Overclocked EVGA GeForce GTX 780 Ti Graphics Card Tops 2GHz, Sets 3DMark Record http://www.maximumpc.com/overclocked_evga_geforce_gtx_780_ti_graphics_card_tops_2ghz_sets_3dmark_record_2014 <!--paging_filter--><h3><img src="/files/u69/kingpin_classified_780_ti.jpg" alt="EVGA GeForce GTX 780 Ti Classified Kingpin" title="EVGA GeForce GTX 780 Ti Classified Kingpin" width="228" height="147" style="float: right;" />Blowing past the 2GHz barrier</h3> <p><strong>A pair of renowned overclockers used an EVGA graphics card to blast through the 2GHz barrier</strong> en route to setting a new 3DMark Fire Strike Extreme world record. Vince "K|NGP|N" Lucido and Illya "Tin" Tsemenko accomplished the feat with an EVGA GeForce GTX 780 Ti graphics card plugged into an EVGA X79 Dark motherboard and powered by an EVGA brand (what else?) power supply.</p> <p>In doing so, the overclocking duo was able to coax the GPU to run at 2,025MHz, which set a record in and of itself. At that frequency, the team completed a successful 3DMark Fire Strike Extreme run and posted a <a href="http://www.3dmark.com/fs/2382812" target="_blank">record-breaking score</a> of 8,793 points. Here's how it scored by category:</p> <ul> <li>Graphics Score: 9,230</li> <li>Physics Score: 20,896</li> <li>Combined Score: 3,954</li> </ul> <p>"These accomplishments once again prove EVGA's dedication to the enthusiast community, and why EVGA hardware is the number one choice for gamers and extreme overclockers," EVGA was quick to boast.</p> <p>Other parts of the record-breaking configuration included an Intel Core i7 4960X processor overclocked to 5.6GHz, 16GB of G.Skill DDR3-1600 RAM, a 120GB G.Skill Phoenix III SSD, and Windows 7 64-bit.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/overclocked_evga_geforce_gtx_780_ti_graphics_card_tops_2ghz_sets_3dmark_record_2014#comments 3dmark Build a PC evga geforce gtx 780 ti graphics card Hardware overclocking Video Card News Tue, 08 Jul 2014 16:16:41 +0000 Paul Lilly 28127 at http://www.maximumpc.com A Quick History of Multi-GPU Video Cards http://www.maximumpc.com/quick_history_multi-gpu_video_cards_2014 <!--paging_filter--><h3><span style="font-weight: normal;"><img src="/files/u162579/voodoo2creatfb.jpg" alt="Voodoo2" title="Voodoo2" width="250" height="124" style="float: right;" />Join us as we look back at the storied history of multi-GPU cards</span></h3> <p>The Voodoo line of graphics cards might be long gone, but its impact is still felt today. 
They ushered in a new era of consumer PCs with relatively powerful video cards that could power the ultra-demanding games of yesteryear like Quake and Unreal. It all started with the 3Dfx Voodoo2 and has continued with modern cards like the Titan Z and R9 295X2.</p> <p>Some of these boards were more important, popular, and successful than others, but they all have a place in the history of consumer graphics cards.</p> <p>Before we run down the list, it's important that we explain what exactly a GPU is. The term was first coined as part of Nvidia's marketing for the GeForce 256. The company defined it as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second." For our purposes we're sticking with the idea that a GPU is any processor that's specifically made to render pixels.</p> <p>Do you own any dual-GPU cards?</p> http://www.maximumpc.com/quick_history_multi-gpu_video_cards_2014#comments amd ati dual-gpu fastest Gaming graphics card history multi nvidia two graphics cards Video Card voodoo Features Fri, 30 May 2014 17:08:23 +0000 Ben Kim 27876 at http://www.maximumpc.com Nvidia's Dual GPU GeForce GTX Titan Z Graphics Card Arrives http://www.maximumpc.com/nvidias_dual_gpu_geforce_gtx_titan_z_graphics_card_hits_retail_2014 <!--paging_filter--><h3><img src="/files/u69/titan_z.jpg" alt="Nvidia GeForce GTX Titan Z" title="Nvidia GeForce GTX Titan Z" width="228" height="146" style="float: right;" />Two-headed beast from Team Nvidia is ready to hit the town</h3> <p>We're getting bombarded with press releases from <a href="http://www.maximumpc.com/maingear_now_offering_nvidia_geforce_gtx_titan_z_graphics_card_options_all_desktops_2014">boutique builders</a> and graphics card makers announcing the availability of Nvidia's GeForce GTX Titan Z, and with good reason. Today is the day <strong>Nvidia is launching the dual-GPU Titan Z</strong>, which brings tons of pixel-pushing power to the gaming and high-end graphics scene. If you really want to make a statement (and a dent in your bank account), you can grab two and rock a quad-SLI rig.</p> <p>"GTX Titan Z is the fastest and most advanced graphics card we’ve ever made. A technical masterpiece, designed from top to bottom for record breaking performance, the innovatively-designed GTX Titan Z has 12 GB of 7Gbps video memory, a 12 phase power supply with dynamic power balancing, full speed double precision support, 5,760 CUDA cores, and two GK110 GTX Titan Black GPUs to power 3840x2160 resolutions," Nvidia says.</p> <p>According to Nvidia, the double-precision computational power of multiple GTX Titan Z-accelerated systems now eclipses that of multi-million dollar supercomputers, and does it while using a fraction of the power and space. Pretty impressive.</p> <p><iframe src="//www.youtube.com/embed/gw7yFAdBZu8" width="620" height="349" frameborder="0"></iframe></p> <p>You have to pay to play in this kind of high-end territory with an MSRP that sits at $2,999. 
Expect to see custom-cooled models going for even more.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidias_dual_gpu_geforce_gtx_titan_z_graphics_card_hits_retail_2014#comments Build a PC dual-gpu geforce gtx titan z graphics card Hardware nvidia Video Card News Wed, 28 May 2014 15:26:03 +0000 Paul Lilly 27892 at http://www.maximumpc.com Nvidia GeForce 337.88 Driver Now Available to Download http://www.maximumpc.com/nvidia_geforce_33788_driver_now_available_download <!--paging_filter--><h3><img src="/files/u69/geforce_close.jpg" alt="GeForce Close" title="GeForce Close" width="228" height="152" style="float: right;" />New drivers coincide with Watch Dogs launch</h3> <p><strong>Nvidia on Monday launched new GeForce 337.88 WHQL-certified drivers</strong> in preparation for today's release of Ubisoft's much-anticipated Watch Dogs title. According to Nvidia, this latest release "ensures you'll have the best possible gaming experience for Watch Dogs." In addition, Nvidia promises performance gains of 10 percent or more in several titles at 2560x1440 and 3840x2160 (4K) resolutions.</p> <p>Some of these include Call of Duty: Ghosts, F1 2013, Hitman Absolution, Sniper Elite V2, DiRT 3, Just Cause 2, Team Fortress 2, Sleeping Dogs, Thief, and a few others.</p> <p>Nvidia also said it made some key DirectX optimizations that should result in lower game loading times and "significant performance increases" in a bunch of titles compared to the previous 335.23 WHQL drivers. You can also expect CPU overhead reductions, which should improve performance across the board.</p> <p>You can find out more in the <a href="http://us.download.nvidia.com/Windows/337.88/337.88-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">Release Notes (PDF)</a> and grab the updated drivers direct from <a href="http://www.nvidia.com/Download/Find.aspx?lang=en-us" target="_blank">Nvidia</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_geforce_33788_driver_now_available_download#comments 337.88 driver Gaming geforce gpu graphics card nvidia Software Video Card watch dogs News Tue, 27 May 2014 15:38:03 +0000 Paul Lilly 27882 at http://www.maximumpc.com AMD Gets a Handle on Production, Pricing for Radeon R9 Graphics Cards No Longer Inflated http://www.maximumpc.com/amd_gets_handle_production_pricing_radeon_r9_graphics_cards_no_longer_inflated_2014 <!--paging_filter--><h3><img src="/files/u69/r9_270.jpg" alt="AMD Radeon R9" title="AMD Radeon R9" width="228" height="222" style="float: right;" />It's time to go GPU shopping again</h3> <p>Have you put off upgrading your graphics card because you're interested in AMD's R9 series but didn't like the inflated price points (compared to MSRP)? Well, good news, folks -- apparently that's no longer going to be a concern. 
<strong>AMD is reportedly putting the word out that its entire line of R9 video cards is available</strong>, in stock, and with street prices back down to where they should be.</p> <p>The <a href="http://www.forbes.com/sites/jasonevangelho/2014/05/13/retail-pricing-for-radeon-r9-graphics-cards-back-to-normal-amd-promises-stability/" target="_blank">news comes from <em>Forbes</em></a>, which says it received the communication direct from AMD's headquarters in a "very carefully" worded manner so as not to come right out and blame the shortage of parts (and subsequent price hikes) on virtual coin mining. However, it's no secret virtual coin mining is the root cause -- AMD's newest GPUs are simply better than Nvidia's at mining cryptocurrencies like Litecoin and Dogecoin, and miners were quick to gobble up inventory.</p> <p>Regardless, R9 cards are now back in stock and priced appropriately. For example, one of the lowest priced Radeon R9 270 graphics cards on Newegg currently is the <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16814131545" target="_blank">PowerColor model</a> that sells for $170 (and comes with two games). Much higher on the totem pole is a <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16814131522" target="_blank">Sapphire Radeon R9 290X</a> graphics card for $522 shipped -- it comes with three games.</p> <p>In any event, if it's been some time since you last checked AMD GPU pricing, give it a look -- you might be surprised.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amd_gets_handle_production_pricing_radeon_r9_graphics_cards_no_longer_inflated_2014#comments amd buiild a pc Gaming graphics card Hardware r9 radeon Video Card News Thu, 15 May 2014 16:47:55 +0000 Paul Lilly 27817 at http://www.maximumpc.com PowerColor Teases New Devil 13 Graphics Card, Leaked Photos Suggest Radeon R9 295X2 http://www.maximumpc.com/powercolor_teases_new_devil_13_graphics_card_leaked_photos_suggest_radeon_r9_295x2 <!--paging_filter--><h3><img src="/files/u69/devil_13.jpg" alt="PowerColor Devil 13 Radeon R9 295X2" title="PowerColor Devil 13 Radeon R9 295X2" width="228" height="152" style="float: right;" />PowerColor's next Devil 13 graphics card may require four 8-pin PCI-E power connectors</h3> <p>You might have expected hell would freeze over before you'd ever see a graphics card with the audacity to demand four -- yes FOUR! -- 8-pin PCI-Express power connectors. You'd also be wrong. Maybe, anyway -- if leaked photos posted to a Chinese language web forum turn out to be legitimate, then PowerColor's upcoming Devil 13 Radeon R9 295X2 dual-GPU graphics card will have a hellish thirst for electricity.</p> <p>Here's what we know for sure -- PowerColor is conjuring up another Devil 13 graphics card. We know this because the company teased a Devil 13 photograph on its <a href="https://www.facebook.com/PowerColor.Europe/posts/10152100216771381?stream_ref=10" target="_blank">Facebook page</a> with the caption, "Something very special is coming in May... dare to guess?"</p> <p>Guesses might not be necessary, as a <a href="http://www.chiphell.com/thread-1027201-1-1.html" target="_blank">handful of blurry photos</a> (aren't they always?) 
posted to Chiphell show what the poster claims is PowerColor's Devil 13 Radeon R9 295X2 graphics card. One of the photos shows a row of 8-pin PCI-E connectors, four in all, waiting to be filled. That's twice as many as found on a reference Radeon R9 295X2.</p> <p style="text-align: center;"><img src="/files/u69/devil_13_r9_295x2_connectors.jpg" alt="PowerColor Devil 13 Radeon R9 295X2 Connectors" title="PowerColor Devil 13 Radeon R9 295X2 Connectors" width="580" height="393" /></p> <p>The rest of the card, which sports an aggressive triple-slot design, looks authentic, so we're inclined to believe this is the real deal.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/powercolor_teases_new_devil_13_graphics_card_leaked_photos_suggest_radeon_r9_295x2#comments Build a PC devil 13 gpu graphics card Hardware powercolor r9 295x2 radeon Video Card News Wed, 07 May 2014 17:59:03 +0000 Paul Lilly 27770 at http://www.maximumpc.com Grab a Select EVGA GeForce GTX Graphics Card, Score Watch Dogs for Free http://www.maximumpc.com/grab_select_evga_geforce_gtx_graphics_card_score_watch_dogs_free_2014 <!--paging_filter--><h3><img src="/files/u69/watch_dogs_0.jpg" alt="Watch Dogs" title="Watch Dogs" width="228" height="158" style="float: right;" />A sweet deal from EVGA and Ubisoft</h3> <p>The GPU wars have certainly heated up in the past year or so, and they don't show any signs of cooling down. It's not just a game of playing leapfrog for the performance crown anymore, either. When all things are equal (or close to being equal), game bundles can be the nudge you need if you're on the fence. With that in mind, <strong>EVGA let us know that it's giving away copies of Ubisoft's Watch Dogs with the purchase of select GeForce GTX graphics cards</strong>.</p> <p>The promotion applies to EVGA-brand GeForce GTX 660 or higher video cards. Purchases must have been made on or after April 29, 2014 from EVGA's online store.</p> <p>To collect your game code, you need to register your card with EVGA, upload your invoice showing a qualifying purchase, and request a Watch Dogs code by filling out a form.</p> <p>Easy cheesy, right?
Full details for the promotion can be found on a <a href="http://www.evga.com/articles/00836/" target="_blank">special page on EVGA's website</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/grab_select_evga_geforce_gtx_graphics_card_score_watch_dogs_free_2014#comments Build a PC evga games GeForce GTX graphics card Hardware nvidia Software ubisoft Video Card watch dogs News Wed, 30 Apr 2014 15:46:37 +0000 Paul Lilly 27723 at http://www.maximumpc.com Testing the Speed Limit: Sapphire Launches First Overclocked Radeon R9 295X2 Graphics Card http://www.maximumpc.com/testing_speed_limit_sapphire_launches_first_overclocked_radeon_r9_295x2_graphics_card <!--paging_filter--><h3><img src="/files/u69/sappire_295x2_oc.jpg" alt="Sapphire Radeon R9 295X2 OC" title="Sapphire Radeon R9 295X2 OC" width="228" height="197" style="float: right;" />When two stock-clocked GPUs just aren't enough</h3> <p>Kudos to AMD for releasing a monster dual-GPU graphics card that comes standard with a liquid cooling setup, and props to Sapphire for having the gems to overclock AMD's two-headed beast. Available as a limited edition part, <strong>Sapphire said it's now shipping the R9 295X2 OC</strong>, which features two goosed GPUs running at 1030MHz (core) with 8GB of GDDR5 memory pushed up to 5200MHz (effective).</p> <p>Sapphire's also shipping a vanilla version (if you can call it that) of the R9 295X2 that isn't overclocked. Either one will drive your 4K gaming needs, or get you ready for 4K gaming if you're holding out for an ultra-high definition monitor that isn't plagued with goofy issues.</p> <p>As a reminder, AMD just <a href="http://www.maximumpc.com/amd_revitalizes_never_settle_game_bundle_save_codes_future_use_2014">recently refreshed</a> its Never Settle Forever game bundle with new titles and the ability to save codes until a later date. Just bear in mind that you have to purchase your card from a qualifying vendor, details of which you can find on AMD's <a href="http://sites.amd.com/us/promo/never-settle/Pages/nsreloadedforever.aspx" target="_blank">Never Settle Forever page</a>.</p> <p>We've spotted Sapphire's stock-clocked model online for around $1,500 (MSRP and street) but have yet to dig up the <a href="http://www.sapphiretech.com/presentation/product/?cid=1&amp;gid=3&amp;sgid=1227&amp;pid=2293&amp;psn=&amp;lid=1&amp;leg=0" target="_blank">OC version</a>.
No word from Sapphire on how much it's supposed to cost.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/testing_speed_limit_sapphire_launches_first_overclocked_radeon_r9_295x2_graphics_card#comments Build a PC graphics card Hardware overclocking radeon r9 295x2 sapphire Video Card News Mon, 28 Apr 2014 18:10:28 +0000 Paul Lilly 27707 at http://www.maximumpc.com Sapphire Radeon R7 265 Dual-X Review http://www.maximumpc.com/sapphire_radeon_r7_265_dual-x_review_2014 <!--paging_filter--><p><img src="/files/u163784/sapphire_radeon_r7_265_dualx_2gb.jpg" alt="Sapphire Radeon R7 265 Dual-X" title="Sapphire Radeon R7 265 Dual-X" width="250" height="190" style="float: right;" /></p> <h3>Meet the new king of Budget GPUs</h3> <p>In the roundup of budget GPUs from the May 2014 issue, the Sapphire Radeon R7 265 is the odds-on favorite due to its impressive specs and the fact that it consumes more than twice the power of <a title="Nvidia GTX 750 Ti Benchmarks" href="http://www.maximumpc.com/GTX_750ti_Benchmarks_2014?page=0,0" target="_blank">Nvidia cards</a>. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270.</p> <p>The R7 265 Dual-X actually has the same clock speeds as the R7 270, but features fewer stream processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. Its TDP is a very high 150W, or at least it seems high given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part though, the R7 265 is older silicon and thus does not support TrueAudio or XDMA CrossFire (bridgeless CrossFire, basically). It will support the Mantle API some day, however.</p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps mark in any test -- a blistering 67fps in Call of Duty: Ghosts at 1080p on Ultra settings. That's damn impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is.
The Dual-X cooler also kept temps and noise in check, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong>$160,&nbsp;</strong><a title="Sapphire Technology" href="http://www.sapphiretech.com" target="_blank">www.sapphiretech.com</a></p> http://www.maximumpc.com/sapphire_radeon_r7_265_dual-x_review_2014#comments amd computer hardware Kick-Ass Award R7 265 radeon Review sapphire Video Card Reviews Fri, 18 Apr 2014 18:12:56 +0000 Josh Norem 27662 at http://www.maximumpc.com AMD's Flagship FirePro W9100 Graphics Card Boasts 16GB of GDDR5 Memory http://www.maximumpc.com/amds_flagship_firepro_w9100_graphics_card_boasts_16gb_gddr5_memory <!--paging_filter--><h3><img src="/files/u69/amd_firepro_w9100.jpg" alt="AMD FirePro W9100" title="AMD FirePro W9100" width="228" height="230" style="float: right;" />A professional graphics card designed for next-generation 4K workstations</h3> <p>What do you get when you pile heaps of GDDR5 memory onto a slab of silicon rocking a fast GPU? You end up with <strong>AMD's FirePro W9100, a monster graphics card featuring an industry-first 16GB of GDDR5 onboard memory</strong>. According to AMD, the FirePro W9100 spits out up to 2.62 TFLOPS of double precision GPU compute performance, up to 5.24 TFLOPS of peak single precision GPU compute performance, and is prepped and primed for getting work done on 4K ultra high resolution workstations.</p> <p>With that much onboard memory, AMD says professionals will have no trouble multitasking across up to half a dozen displays, where they can load massive assemblies and data sets to manipulate, edit, color-correct, and layer in multiple effects on 4K video projects, all in real time.</p> <p>"Now is the time when 4K displays are more readily available and accessible," said Matt Skynner, corporate vice president and general manager of AMD Graphics. "We’re seeing even more applications demand increased memory support while pushing the limits of real-time 4K video production and rendering. AMD has delivered a product at the right time to meet these needs -- the new AMD FirePro W9100 professional graphics card -- designed for the most demanding workflows in next-generation workstations."</p> <p><img src="/files/u69/amd_firepro_w9100_displayports.jpg" alt="AMD FirePro W9100 DisplayPorts" title="AMD FirePro W9100 DisplayPorts" width="620" height="217" /></p> <p>The new card sports 2,816 stream processors, a 512-bit memory interface, 320GB/s of memory bandwidth, DirectGMA support, and six mini DisplayPort 1.2 outputs.
Its 28nm GPU is clocked at 930MHz.</p> <p>AMD says the FirePro W9100 will be available this spring from <a href="http://www.sapphirepgs.com/productdetail.asp?IDno=75&amp;lang=eng" target="_blank">Sapphire Technology</a>, AMD FirePro Ultra Workstation providers, and in HP Z820 and Z620 workstations.<span style="text-decoration: line-through;"> No word on how much it will cost to adopt this beast.</span> AMD tells us the MSRP is $3,999.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amds_flagship_firepro_w9100_graphics_card_boasts_16gb_gddr5_memory#comments 4k 9100 amd Build a PC firepro graphics card Hardware Video Card News Mon, 07 Apr 2014 16:46:17 +0000 Paul Lilly 27580 at http://www.maximumpc.com Sapphire Announces Overclocked Radeon R9 290 Vapor-X OC Graphics Card http://www.maximumpc.com/sapphire_announces_overclocked_radeon_r9_290_vapor-x_oc_graphics_card <!--paging_filter--><h3><img src="/files/u69/sapphire_r9_290_vapor_x_oc.jpg" alt="Sapphire R9 290 Vapor-X OC" title="Sapphire R9 290 Vapor-X OC" width="228" height="222" style="float: right;" /></h3> <h3>Factory overclocked and custom cooled</h3> <p>AMD's Radeon R9 290 graphics card already runs fast, but kicking things up a notch is <strong>Sapphire, which just launched its R9 290 Vapor-X OC</strong> with a custom cooling solution. It has 2,560 stream processors, 4GB of GDDR5 memory running at 1,400MHz (5.6Gb/s effective), and an overclocked GPU that's been goosed to 1,030MHz, up from a maximum clock speed of 947MHz on reference cards.</p> <p>To keep things stable, Sapphire combined its vapor chamber cooler with its Tri-X fans. The vapor chamber is mounted between the GPU itself and the base of the heatsink and cooler assembly to pull as much heat as possible away from the GPU and into the cooler.</p> <p>Sapphire's Tri-X cooling solution consists of three 90mm fans with dust-repelling ball bearings and five heat pipes, including a 10mm core heat pipe flanked by two 8mm pipes and two 6mm pipes snaking through the cooler.</p> <p>"The stylish fan shroud also contains molded guides to control the airflow across and through the cooler, including over the Power Control circuitry," <a href="http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&amp;articleID=5683&amp;lid=1" target="_blank">Sapphire explains</a>.
"The combination of Vapor-X and Tri-X technologies results in GPU temperatures between 5 and 10 degrees cooler than with Tri-X alone – delivering industry leading cooling for this generation of graphics cards."</p> <p><img src="/files/u69/sapphire_r9_290_vapor-x_oc_front.jpg" alt="Sapphire R9 290 Vapor-X OC" title="Sapphire R9 290 Vapor-X OC" width="620" height="279" /></p> <p>No word yet on when the <a href="http://www.sapphiretech.com/presentation/product/?cid=1&amp;gid=3&amp;sgid=1227&amp;pid=2167&amp;psn=&amp;lid=1&amp;leg=0" target="_blank">Sapphire R9 290 Vapor-X OC</a> will be available or for how much.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/sapphire_announces_overclocked_radeon_r9_290_vapor-x_oc_graphics_card#comments Build a PC gpu graphics card Hardware overclocking r9 290 vapor-x oc radeon sapphire Video Card News Fri, 04 Apr 2014 15:21:38 +0000 Paul Lilly 27566 at http://www.maximumpc.com