EVGA Announces Passively Cooled GeForce GT 720 Graphics Cards http://www.maximumpc.com/evga_announces_passively_cooled_geforce_gt_720_graphics_cards_2014 <!--paging_filter--><h3><img src="/files/u69/evga_geforce_gt_720.jpg" alt="EVGA GeForce GT 720" title="EVGA GeForce GT 720" width="228" height="228" style="float: right;" />Available in 1GB and 2GB models</h3> <p><strong>EVGA this week added the GeForce GT 720 with passive cooling to its graphics card lineup</strong>. Compared to integrated graphics, Nvidia says you can expect up to 2x faster web browsing, 5x faster video editing, and 8x faster photo editing. And when it comes time to game, the jump in performance can be up to 70 percent faster, all while taking up just a single slot in your PC, Nvidia says.</p> <p>The EVGA GeForce GT 720 comes with <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/#2722" target="_blank">1GB</a> or <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/#2724" target="_blank">2GB</a> of GDDR5 memory and is available in low profile and full height form factors. Other than the amount of RAM and physical size, the specs are the same -- 192 CUDA cores, 797MHz base clock, 1800MHz memory clock, 64-bit bus, 1.43ns memory speed, and 14.4GB/s of memory bandwidth.</p> <p>Connectivity options include VGA, DVI, and HDMI. If you're so inclined, you can drive up to three separate displays at the same time using a single card, <a href="http://www.evga.com/articles/00864/EVGA-GeForce-GT-720/" target="_blank">Nvidia says</a>.</p> <p>The EVGA GeForce GT 720 will be available soon. No word yet on price.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/evga_announces_passively_cooled_geforce_gt_720_graphics_cards_2014#comments Build a PC evga geforce gt 720 gpu graphics card Hardware Video Card News Wed, 13 Aug 2014 13:31:27 +0000 Paul Lilly 28336 at http://www.maximumpc.com Best Cheap Graphics Card http://www.maximumpc.com/best_cheap_graphics_card_2014 <!--paging_filter--><h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models care of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on.
Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts or so, but that number fluctuates up and down according to spec. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. Over the past two years, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. The goal with its former Kepler architecture was to have better performance per watt compared to the previous architecture named Fermi, and it succeeded, but it's taken that same philosophy even further with Maxwell, whose goal was to be twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. </strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process Nvidia used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled. In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units.
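</p> <p>To put numbers on that reorganization, here's a quick back-of-the-envelope sketch in Python. The per-card SM counts below are our own inference from the CUDA-core totals in the spec table later in this story, not an official Nvidia breakdown:</p> <pre>
# Kepler scheduling: one slab of control logic per 192-core SMX unit
kepler_cores_per_smx = 192

# Maxwell scheduling: four 32-core clusters per SM unit
maxwell_cores_per_sm = 4 * 32  # 128 cores per SM

# Unit counts implied by the CUDA-core totals in the spec table (our inference):
print("GTX 750:    %d cores" % (4 * maxwell_cores_per_sm))  # 4 SMs -> 512
print("GTX 750 Ti: %d cores" % (5 * maxwell_cores_per_sm))  # 5 SMs -> 640
print("Cores per unit vs. Kepler: %d fewer" % (kepler_cores_per_smx - maxwell_cores_per_sm))  # 64
</pre> <p>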
If you're confused, the simple version is this: rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per SM unit. Therefore, Nvidia has reduced the number of cores per unit by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p> <h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speeds, however, running at 5,400MHz. The board sports 2GB of GDDR5 memory, and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. All told, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160 (Street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance.
The only downside is that the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes and dual 100mm fans to keep it cool, as well. It also includes a switch that lets you boot from an older backup BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the minuscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office. As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically).
However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. " title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. </strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps mark, thanks to a blistering turn in Call of Duty: Ghosts, where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. The Dual-X cooler kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong> Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including TrueAudio, XDMA CrossFire, and Mantle (as in, it worked at launch), and it can drive up to three displays, all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings.
What really bolsters this card's credentials, though, is that it supports all the latest technology from AMD. Its ability to run Mantle with no problems is a big plus for Battlefield 4 players, in particular. We like this card a lot, just like we enjoyed the HD 7790. While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120 <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <hr /><h4>MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but a look at the specs of the Nvidia GTX 750 shows it is clearly outmatched. One other major problem, at least for those of us with big monitors, is that we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90 <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM, however, and a decent 1,000MHz boost clock.
It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU, or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely. It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, and 34fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100," we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250. We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is that even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in its hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is that you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD.
That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. Mantle is a lower-level API that allows developers to optimize a game specifically for AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX.</p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, and their dominance is not in question. The most basic reason is the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster in performing these operations than their Nvidia equivalent.
In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI GeForce GTX 750 Gaming</td> <td>Gigabyte GeForce GTX 750 Ti</td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120</td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curacao</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td> <td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40</td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td> <td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,100MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td> <td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI-S, VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td>
</tr> </tbody> </table> <p><em>Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div> <h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn’t going to help it. However, if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs." 
title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84 fps using DirectX to 98 fps in Mantle.</p> <p>Overall, Mantle is legit, but it’s kind of like PhysX or TressFX in that it’s nice to have when it’s supported, and does provide a boost, but it isn’t something we’d count on being available in most games.</p> <h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don’t usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we’d have to admit that given these cards’ price points, we had low expectations but thought they’d all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child’s play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for “sweet gaming” has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it’s the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here’s some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia’s trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there’s no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The bottom rung of cards, which consists of the R7 250(X) cards, were not playable at 1080p at max settings, so avoid them. 
They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man’s land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Strike</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30</strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40</strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8.
All games are run at 1920x1080 with no AA except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/best_cheap_graphics_card_2014#comments 1080p affordable amd benchmarks budget cheap cheap graphics card gpu Hardware Hardware maximum pc may 2014 nvidia Video Card Features Tue, 12 Aug 2014 21:43:32 +0000 Josh Norem 28304 at http://www.maximumpc.com AMD FirePro S9150 Brings 2.53 TFLOPS of Double Precision Performance to Servers http://www.maximumpc.com/amd_firepro_s9150_brings_253_tflops_double_precision_performance_servers <!--paging_filter--><h3><img src="/files/u69/amd_firepro_s9150.jpg" alt="AMD FirePro S9150" title="AMD FirePro S9150" width="228" height="191" style="float: right;" />Busting through the 2.0 TFLOPS barrier</h3> <p><strong>AMD on Wednesday let loose its FirePro S9150 server card</strong>, supposedly the most powerful server GPU ever built for High Performance Computing (HPC) and the first to break the 2.0 TFLOPS double-precision barrier. Based on AMD's Graphics Core Next (GCN) architecture, the FirePro S9150 is specifically designed for compute workloads and is aided by 16GB of GDDR5 memory on a 512-bit memory interface for up to 320GB/s of memory bandwidth.</p> <p>It has a maximum power consumption of 235 watts, and at full blast, the card is capable of 5.07 TFLOPS of peak single-precision floating point performance, which is up to 18 percent more than the competition, AMD says. Double-precision floating point performance peaks at 2.53 TFLOPS. It's made up of 2,816 stream processors (44 GCN compute units) and is ready to support OpenCL 2.0.</p> <p>"Today’s supercomputers feature an increasing mix of GPUs, CPUs and co-processors to achieve great performance, and many of them are being implemented in an environmentally responsible manner to help reduce power and water consumption," <a href="http://www.amd.com/en-us/press-releases/Pages/worlds-most-powerful-2014aug06.aspx" target="_blank">said David Cummings</a>, senior director and general manager, professional graphics, AMD. "Designed for large scale multi-GPU support and unmatched compute performance, AMD FirePro S9150 ushers in a new era of supercomputing. Its memory configuration, compute capabilities and performance per watt are unmatched in its class, and can help take supercomputers to the next level of performance and energy efficiency."</p> <p>AMD also rolled out its FirePro S9050 GPU with 12GB of GDDR5 memory on a 384-bit bus for up to 264GB/s of memory bandwidth, 1,792 stream processors (28 GCN compute units), and 225W maximum power consumption.</p> <p>Both the <a href="http://www.amd.com/en-us/products/graphics/workstation/firepro-remote-graphics/s9150" target="_blank">FirePro S9150</a> and <a href="http://www.amd.com/en-us/products/graphics/workstation/firepro-remote-graphics/s9050" target="_blank">FirePro S9050</a> will be available in the third quarter of this year.
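</p> <p>The quoted figures hang together arithmetically, too. As a quick sanity check (our own back-of-the-envelope math, assuming the usual two FLOPs per stream processor per clock via fused multiply-add; AMD's announcement doesn't quote a clock speed):</p> <pre>
# Sanity check of AMD's quoted FirePro S9150 figures (our arithmetic)
stream_processors = 2816          # 44 GCN compute units x 64 SPs each
peak_sp_tflops = 5.07             # quoted single-precision peak
peak_dp_tflops = 2.53             # quoted double-precision peak

# Peak FLOPS = SPs x 2 FLOPs per clock (FMA) x clock speed, so the
# quoted SP number implies roughly a 900MHz engine clock:
implied_clock_mhz = peak_sp_tflops * 1e12 / (stream_processors * 2) / 1e6
print(round(implied_clock_mhz))                   # ~900 (MHz)
print(round(peak_dp_tflops / peak_sp_tflops, 2))  # 0.5 -> half-rate double precision
</pre> <p>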
No word yet on price.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amd_firepro_s9150_brings_253_tflops_double_precision_performance_servers#comments amd firepro s9150 gpu graphics card Hardware server News Wed, 06 Aug 2014 16:34:32 +0000 Paul Lilly 28300 at http://www.maximumpc.com Gigabyte Radeon R9 290X OC Review http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review <!--paging_filter--><h3>As good as it gets, if you can find one to buy</h3> <p>Aftermarket Radeon R9 290X GPUs are beginning to make the rounds, and this month we had a WindForce-cooled behemoth from <a title="gigabyte" href="http://www.maximumpc.com/tags/Gigabyte" target="_blank">Gigabyte</a> strutting its stuff in the lab. Unlike last month’s <a title="sapphire tri x r9 290x" href="http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review" target="_blank">Sapphire Tri-X R9 290X</a>, this board features a custom PCB in addition to the custom cooler, whereas the Sapphire slapped a huge cooler onto the reference design circuit board. Theoretically, this could allow for higher overclocks on the Gigabyte due to better-quality components, but more on that later.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/windforce14052_small_0.jpg"><img src="/files/u152332/windforce14052_small.jpg" alt="Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable." title="Gigabyte Radeon R9 290X OC" width="620" height="476" /></a></p> <p style="text-align: center;"><strong>Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable.</strong></p> <p>This is the overclocked version of the card, so it clocks up to 1,040MHz under load, which is a mere 40MHz over stock. These boards always have conservative overclocks out of the box, though, and that is by no means the final clock speed for this card. We’ve covered its WindForce cooler in past reviews, so we won’t go into all the details, but it’s a three-fan cooler that only takes up two PCIe slots and uses six heat pipes with inclined heatsinks to better dissipate the heat. It’s good for 450W of heat dispersal, according to Gigabyte, and since the R9 290X is roughly a 300W card (AMD has never given a TDP for this particular model for some reason), the WindForce cooler should be more than up to the job.</p> <p>Like all Radeon R9 290X boards, this sucker is big and long, measuring 11.5 inches. Gigabyte recommends you use at least a 600W power supply with it, and it sports two dual-link DVI ports for 2560x1600 gaming, as well as HDMI 1.4 and DisplayPort 1.2a if you want to run 4K. The card comes bundled with a free set of headphones. It used to include a free copy of Battlefield 4, but the company told us it was no longer offering the game bundle because it had run out of coupons. The MSRP of the board is $620, but some stores had it for $599 while others marked it up to $700.</p> <p>Once we had this Windy Bad Boy in the lab, we were very curious to compare it to the Sapphire Tri-X R9 290X we tested last month.
Since both cards feature enormous aftermarket coolers, have the exact same specs and clocks, and are roughly the same price, we weren’t surprised to find that they performed identically for the most part.</p> <p>If you look at the benchmark chart, in every test the two cards are almost exactly the same—the only exception being Metro, but since that’s a PhysX game, AMD cards can get a bit wonky sometimes. In every other test, the two cards are within a few frames-per-second difference, making them interchangeable. Both cards also run in the mid–70 C zone under load, which is 20 C cooler than the reference design. We were able to overclock both cards to just a smidge over 1,100MHz, as well.</p> <p>“Okay,” you are saying to yourself. “I’m ready to buy!” Well, that’s where we run into a small problem. Gigabyte’s MSRP for this card is $620—the same as the Sapphire Tri-X card—but at press time, the cheapest we could find it for was $700 on Newegg. We can’t ding Gigabyte for Newegg’s pricing, but it’s a real shame these R9 290X cards are so damned expensive.</p> <p><strong>$620,</strong> <a href="http://www.gigabyte.us/">www.gigabyte.us</a></p> http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review#comments Air Cooling amd april issues 2014 Gigabyte Radeon R9 290X OC gpu graphics card Hardware maximum pc Review Reviews Tue, 05 Aug 2014 19:52:42 +0000 Josh Norem 28227 at http://www.maximumpc.com Sapphire Tri-X Radeon R9 290X Review http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review <!--paging_filter--><h3>A real gem of a GPU</h3> <p>For those who haven’t kept up with current events: Late last year AMD launched its all-new Hawaii GPUs, starting with its flagship Radeon R9 290X that featured a blower-type cooler designed by AMD. In testing, it ran hotter than any GPU we’ve ever tested, hitting 94 C at full load, which is about 20 C higher than normal. AMD assured everyone this was no problemo, and that the board was designed to run those temps until the meerkats came home. It was stable at 94 C, but the GPU throttled performance at those temps. The stock fan was also a bit loud at max revs, so though the card offered kick-ass performance, it was clearly being held back by the reference cooler.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_13650_small_0.jpg"><img src="/files/u152332/sapphire_13650_small.jpg" alt="The Tri-X throws off AMD’s meh cooler." title="Sapphire Tri-X Radeon R9 290X" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The Tri-X throws off AMD’s meh cooler.</strong></p> <p>Therefore, we all eagerly awaited the arrival of cards with aftermarket coolers, and this month we received the first aftermarket Radeon R9 290X—the massive triple-fan Tri-X model from Sapphire; and we must say, all of our Radeon prayers have been answered by this card.</p> <p>Not only does it run totally cool and quiet at all times, but because it runs so chilly it has plenty of room to overclock, making it a card that addresses every single one of our complaints about the reference design from AMD. There is one caveat: price. The Sapphire card is $50 more expensive than the reference card at $600, but you are obviously getting quite a bit of additional horsepower for your ducats.</p> <p>When we first fired it up, we were amazed to see it hit 1,040MHz under load, and stay there throughout testing. Even more surprising were the temps we were seeing. 
Since the reference card hits 94 C all day long, this is obviously a really hot GPU, but the Sapphire Tri-X cooler was holding it down at a chilly 75 C. The card was whisper-quiet too, which was also a pleasant surprise given the noise level of the reference cooler. We were also able to overclock it to 1,113MHz, which is a turnaround in that we could not overclock the reference board at all since it throttles at stock settings.</p> <p><strong>$600,</strong> <a href="http://www.sapphiretech.com/landing.aspx?lid=1">www.sapphiretech.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the March 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review#comments Air Cooling amd gpu graphics card Hardware March issues 2014 maximum pc Review Sapphire Tri-X Radeon R9 290X Reviews Thu, 24 Jul 2014 22:09:13 +0000 Josh Norem 28024 at http://www.maximumpc.com Nvidia GeForce 337.88 Driver Now Available to Download http://www.maximumpc.com/nvidia_geforce_33788_driver_now_available_download <!--paging_filter--><h3><img src="/files/u69/geforce_close.jpg" alt="GeForce Close" title="GeForce Close" width="228" height="152" style="float: right;" />New drivers coincide with Watch Dogs launch</h3> <p><strong>Nvidia on Monday launched new GeForce 337.88 WHQL certified drivers</strong> in preparation for today's release of Ubisoft's much anticipated Watch Dogs title. According to Nvidia, this latest release "ensures you'll have the best possible gaming experience for Watch Dogs." In addition, Nvidia promises performance gains of 10 percent or more in several titles at 2560x1440 and 3840x2160 (4K) resolutions.</p> <p>Some of these include Call of Duty: Ghosts, F1 2013, Hitman Absolution, Sniper Elite V2, DiRT 3, Just Cause 2, Team Fortress 2, Sleeping Dogs, Thief, and a few others.</p> <p>Nvidia also said it made some key DirectX optimizations that should result in lower game loading times and "significant performance increases" in a bunch of titles compared to the previous 335.23 WHQL drivers.
You can also expect CPU overhead reductions, which should improve performance across the board.</p> <p>You can find out more in the <a href="http://us.download.nvidia.com/Windows/337.88/337.88-win8-win7-winvista-desktop-release-notes.pdf" target="_blank">Release Notes (PDF)</a> and grab the updated drivers direct from <a href="http://www.nvidia.com/Download/Find.aspx?lang=en-us" target="_blank">Nvidia</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_geforce_33788_driver_now_available_download#comments 337.88 driver Gaming geforce gpu graphics card nvidia Software Video Card watch dogs News Tue, 27 May 2014 15:38:03 +0000 Paul Lilly 27882 at http://www.maximumpc.com Graphics Cards Shipments Down Slightly During Seasonal Lull http://www.maximumpc.com/graphics_cards_shipments_down_slightly_during_seasonal_lull_2014 <!--paging_filter--><h3><img src="/files/u69/gpu.jpg" alt="Nvidia GPU" title="Nvidia GPU" width="228" height="152" style="float: right;" />Add-in board graphics is mostly a two-horse race</h3> <p>Market research firm <strong>Jon Peddie Research (JPR) said the decline in add-in graphics boards (discrete graphics cards, in other words, as opposed to integrated GPUs) during the first quarter of 2014 was "disappointing, but seasonally understandable."</strong> On a sequential basis, AIB shipments dropped 6.7 percent, though on a year-to-year basis, they're only down 0.8 percent, compared to desktop PCs as a whole, which declined 1.1 percent.</p> <p>The attach rate of AIBs to desktop PCs has fallen sharply over the last several years, dropping from 63 percent in Q1 2008 to 45 percent in Q1 2014. That's not surprising when you consider that both Intel and AMD include integrated graphics on their processors these days. In addition, the attach rate is actually up compared to last quarter, when it was 43.8 percent.</p> <p>According to JPR, Nvidia's share of the AIB market is now 65 percent, up a tick from 64.9 percent last quarter and 64.2 percent a year ago. Meanwhile, AMD's share is holding steady at 35 percent, the same as it was last quarter and down a smidgen from 35.6 percent a year ago.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/graphics_cards_shipments_down_slightly_during_seasonal_lull_2014#comments amd Build a PC gpu graphics cards Hardware Jon Peddie Research jpr nvidia Video cards News Thu, 22 May 2014 16:14:54 +0000 Paul Lilly 27856 at http://www.maximumpc.com PowerColor Teases New Devil 13 Graphics Card, Leaked Photos Suggest Radeon R9 295X2 http://www.maximumpc.com/powercolor_teases_new_devil_13_graphics_card_leaked_photos_suggest_radeon_r9_295x2 <!--paging_filter--><h3><img src="/files/u69/devil_13.jpg" alt="PowerColor Devil 13 Radeon R9 295X2" title="PowerColor Devil 13 Radeon R9 295X2" width="228" height="152" style="float: right;" />PowerColor's next Devil 13 graphics card may require four 8-pin PCI-E power connectors</h3> <p>You might have expected hell would freeze over before you'd ever see a graphics card with the audacity to demand four -- yes FOUR!
-- 8-pin PCI-Express power connectors. You'd also be wrong. Maybe, anyway -- if leaked photos posted to a Chinese-language web forum turn out to be legitimate, then PowerColor's upcoming Devil 13 Radeon R9 295X2 dual-GPU graphics card will have a hellish thirst for electricity.</p> <p>Here's what we know for sure -- PowerColor is conjuring up another Devil 13 graphics card. We know this because the company teased a Devil 13 photograph on its <a href="https://www.facebook.com/PowerColor.Europe/posts/10152100216771381?stream_ref=10" target="_blank">Facebook page</a> with the caption, "Something very special is coming in May... dare to guess?"</p> <p>Guesses might not be necessary, as a <a href="http://www.chiphell.com/thread-1027201-1-1.html" target="_blank">handful of blurry photos</a> (aren't they always?) posted to Chip Hell show what the poster claims is PowerColor's Devil 13 Radeon R9 295X2 graphics card. One of the photos shows a row of 8-pin PCI-E connectors, four in all, waiting to be filled. That's twice as many as found on a reference Radeon R9 295X2.</p> <p style="text-align: center;"><img src="/files/u69/devil_13_r9_295x2_connectors.jpg" alt="PowerColor Devil 13 Radeon R9 295X2 Connectors" title="PowerColor Devil 13 Radeon R9 295X2 Connectors" width="580" height="393" /></p> <p>The rest of the card sports an aggressive design that takes up three slots and looks authentic, so we're inclined to believe this is the real deal.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/powercolor_teases_new_devil_13_graphics_card_leaked_photos_suggest_radeon_r9_295x2#comments Build a PC devil 13 gpu graphics card Hardware powercolor r9 295x2 radeon Video Card News Wed, 07 May 2014 17:59:03 +0000 Paul Lilly 27770 at http://www.maximumpc.com PAX East 2014: Gigabyte Shows Off Nvidia GeForce GTX 800M-Powered Notebooks [Video] http://www.maximumpc.com/pax_east_2014_gigabyte_shows_nvidia_geforce_gtx_800m-powered_notebooks_video <!--paging_filter--><h3>Even as rivals opt for higher-res options, Gigabyte sticks to full HD</h3> <p>When Nvidia launched its latest GeForce GTX 800M Series GPUs last month, Gigabyte wasted little time in announcing an entire <a href="http://www.maximumpc.com/gigabyte_releases_several_gaming_laptops_nvidia_gtx_800m_series_graphics#slide-2" target="_blank"><strong>lineup of gaming notebooks built around the new mobile graphics cards</strong></a>. Maximum PC’s Jimmy Thang found Gigabyte parading some of the new notebooks at PAX East.</p> <p><iframe src="//www.youtube.com/embed/eMpGxjAk8YE?feature=player_detailpage" width="640" height="360" frameborder="0"></iframe></p> <p><strong>Gigabyte P25X v2:</strong> Starting at $2,099, the 15.6-inch P25X v2 is a high-end gaming laptop that packs quite a powerful punch with its Intel Core i7-4810MQ CPU and GeForce GTX 880M GDDR5 8GB graphics card. Other specs include full HD IPS matte display, tri-storage slots with RAID 0 support, backlit gaming keyboard, subwoofer, and Blu-ray writer. It is expected to begin shipping later this month.</p> <p><strong>Gigabyte P35W v2:</strong> Like the P25X v2, this one also sports a 15.6-inch full HD display.
The P35W v2 has enough room in its 20.9mm chassis to accommodate a quad-core Core i7 CPU, a GeForce GTX 870M GDDR5 6GB graphics card, two 512GB mSATA SSDs, and two 1.5TB hard drives. It is expected to hit the market later this month starting at $1,599.</p> <p><strong>Gigabyte P34G v2:</strong> The smallest of the three notebooks we saw, this upcoming thin-and-light (20.9mm, 3.57 lbs) machine features a 14-inch full HD TN matte display, Intel Core i7-4700HQ CPU, GeForce GTX 860M GDDR5 4GB GPU, 8GB RAM, 1TB HDD, and 128GB SSD. The P34G v2 starts at $1,499.</p> <p>Follow Pulkit on <a href="https://plus.google.com/107395408525066230351?rel=author" target="_blank">Google+</a></p> http://www.maximumpc.com/pax_east_2014_gigabyte_shows_nvidia_geforce_gtx_800m-powered_notebooks_video#comments Gaming gigabyte gpu Hardware notebooks nvidia geforce gtx 800m pax east 2014 News Mon, 14 Apr 2014 07:37:55 +0000 Pulkit Chandna and Jimmy Thang 27626 at http://www.maximumpc.com Sapphire Announces Overclocked Radeon R9 290 Vapor-X OC Graphics Card http://www.maximumpc.com/sapphire_announces_overclocked_radeon_r9_290_vapor-x_oc_graphics_card <!--paging_filter--><h3><img src="/files/u69/sapphire_r9_290_vapor_x_oc.jpg" alt="Sapphire R9 290 Vapor-X OC" title="Sapphire R9 290 Vapor-X OC" width="228" height="222" style="float: right;" />Factory overclocked and custom cooled</h3> <p>AMD's Radeon R9 290 graphics card already runs fast, but kicking things up a notch is <strong>Sapphire, which just launched its R9 290 Vapor-X OC</strong> with a custom cooling solution. It has 2,560 Stream Processors, 4GB of GDDR5 memory running at 1,400MHz (5.6Gbps effective), and an overclocked GPU that's been goosed to 1,030MHz, up from a maximum clockspeed of 947MHz on reference cards.</p>
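<p>Those memory figures line up if you run the numbers: GDDR5 transfers four bits per pin per clock, so 1,400MHz works out to 5.6Gbps effective, and total bandwidth follows from the bus width. A quick sketch -- our arithmetic, with the R9 290's stock 512-bit bus assumed since Sapphire doesn't restate it here:</p> <pre><code>
# GDDR5 is quad-pumped: effective data rate = memory clock x 4.
# The 512-bit bus width is the R9 290's reference spec (assumed,
# not listed in Sapphire's spec sheet above).
MEM_CLOCK_MHZ = 1400
BUS_WIDTH_BITS = 512

effective_gbps = MEM_CLOCK_MHZ * 4 / 1000             # 5.6 Gbps per pin
bandwidth_gb_s = effective_gbps * BUS_WIDTH_BITS / 8  # 358.4 GB/s total

print(effective_gbps, "Gbps effective,", bandwidth_gb_s, "GB/s total")
</code></pre> <p>That 358.4GB/s total is a healthy bump over the 320GB/s a reference R9 290 manages at its stock 1,250MHz memory clock.</p>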
"The combination of Vapor-X and Tri-X technologies results in GPU temperatures between 5 and 10 degrees cooler than with Tri-X alone – delivering industry leading cooling for this generation of graphics cards."</p> <p><img src="/files/u69/sapphire_r9_290_vapor-x_oc_front.jpg" alt="Sapphire R9 290 Vapor-X OC" title="Sapphire R9 290 Vapor-X OC" width="620" height="279" /></p> <p>No word yet on when the <a href="http://www.sapphiretech.com/presentation/product/?cid=1&amp;gid=3&amp;sgid=1227&amp;pid=2167&amp;psn=&amp;lid=1&amp;leg=0" target="_blank">Sapphire R9 290 Vapor-X OC</a> will be available or for how much.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/sapphire_announces_overclocked_radeon_r9_290_vapor-x_oc_graphics_card#comments Build a PC gpu graphics card Hardware overclocking r9 290 vapor-x oc radeon sapphire Video Card News Fri, 04 Apr 2014 15:21:38 +0000 Paul Lilly 27566 at http://www.maximumpc.com AMD Tweaks Wafer Agreement with GlobalFoundries to Include GPUs http://www.maximumpc.com/amd_tweaks_wafer_agreement_globalfoundries_include_gpus_2014 <!--paging_filter--><h3><img src="/files/u69/amd_gpu.jpg" alt="AMD GPU" title="AMD GPU" width="228" height="223" style="float: right;" />Amended agreement includes $50 million in additional purchase commitments</h3> <p><strong>AMD bumped up its purchase commitments with GlobalFoundries</strong> in 2014 by about $50 million. Under terms of the amended Wafer Supply Agreement (WSA), AMD expects to pay $1.2 billion in all this year, though what's interesting is that the deal is no longer limited to traditional CPUs and APUs; it now includes GPUs and semi-custom game console chips, such as those found in the Xbox 360 and PlayStation 4.</p> <p>AMD has leaned on Taiwan Semiconductor Manufacturing Company (TSMC) to produce graphics chips, including more than 10 million Xbox One and PS4 chips. With console sales expected to keep climbing, AMD essentially ensured that a shortage of parts won't become an issue.</p> <p>"The successful close of our amended wafer supply agreement with GlobalFoundries demonstrates the continued commitment from our two companies to strengthen our business relationship as long-term strategic partners, and GlobalFoundries’ ability to execute in alignment with our product roadmap," <a href="http://www.amd.com/en-us/press-releases/Pages/amd-amends-wafer-2014apr1.aspx" target="_blank">said Rory Read</a>, president and chief executive officer, AMD. 
"This latest step in AMD’s continued transformation plays a critical role in our goals for 2014."</p> <p>While specific figures weren't disclosed, AMD and GlobalFoundries did establish fixed pricing and other commitments as part of the amended agreement.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/amd_tweaks_wafer_agreement_globalfoundries_include_gpus_2014#comments amd apu cpu globalfoundries gpu Hardware manufacturing wafers News Wed, 02 Apr 2014 16:13:14 +0000 Paul Lilly 27551 at http://www.maximumpc.com Here's What AMD's Dual-GPU Radeon R9 295 X2 Graphics Card Could Look Like http://www.maximumpc.com/heres_what_amds_dual-gpu_radeon_r9_295_x2_graphics_card_could_look <!--paging_filter--><h3><img src="/files/u69/amd_radeon_r9_295_x2_chiphell.jpg" alt="AMD Radeon R9 295 X2" title="AMD Radeon R9 295 X2" width="228" height="153" style="float: right;" />AMD's dual-GPU card might rock a hybrid cooling solution</h3> <p>It's good practice to take Internet rumors with a healthy dose of skepticism, especially on April Fool's Day. Got it? Good, now we can continue with what are claimed to be the <strong>first photos of AMD's rumored Radeon R9 295 X2</strong>, a dual-GPU graphics card that some have <a href="http://www.maximumpc.com/rumors_500w_dual-gpu_radeon_r9_295_graphics_card_hit_web">surmised</a> could carry a massive 500W TDP. If that's true, a gnarly cooling solution would be mandatory, and certainly that's what these early photos show.</p> <p>Chinese language website <a href="http://www.chiphell.com/thread-1002731-1-1.html" target="_blank"><em>Chip Hell</em> posted a couple photos</a> of what the author says is AMD's upcoming card, and to keep the two GPUs chilly, it's shown with a dual-slot hybrid cooler. It essentially combines a self-contained liquid cooler with a traditional air cooling shroud to give the card a double dose of cooling potential.</p> <p>It's a funky looking design and one that's sure to draw criticism if AMD goes forward with a setup like this, especially since Nvidia hasn't needed to resort to anything this extravagant. At the same time, it's believed that each GPU carries a 250W TDP, so with two of those underneath the hood, it wouldn't be shocking if AMD used a hybrid cooling solution.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/heres_what_amds_dual-gpu_radeon_r9_295_x2_graphics_card_could_look#comments amd Build a PC gpu graphics card Hardware radeon r9 295 Video Card News Tue, 01 Apr 2014 16:20:37 +0000 Paul Lilly 27546 at http://www.maximumpc.com GPU Doubletake: Nvidia Launches Dual-GPU GeForce Titan Z for $2,999 http://www.maximumpc.com/gpu_doubletake_nvidia_launches_dual-gpu_geforce_titan_z_2999 <!--paging_filter--><h3><img src="/files/u69/titan_z.jpg" alt="Nvidia GeForce Titan Z" title="Nvidia GeForce Titan Z" width="228" height="146" style="float: right;" />This dual-GPU monster wields 5,760 CUDA cores and 12GB of memory</h3> <p>If you were hoping Nvidia would unleash a beast at the GPU Technology Conference (GTC), you certainly won't be disappointed. 
<p>The big question here is whether potential customers will be willing to drop $2,999 on the GeForce Titan Z. As a point of reference, the single-GPU Titan Black streets for around $1,000 to $1,100. The advantage of the Titan Z is that it's a single card solution, though the luxury of running one card versus two carries a hefty premium.</p> <p>"If you’re in desperate need of a supercomputer that you need to fit under your desk, we have just the card for you," Nvidia CEO Jen-Hsun Huang stated in a <a href="http://blogs.nvidia.com/blog/2014/03/25/titan-z/" target="_blank">blog post</a>.</p> <p><img src="/files/u69/titan_z_specs.jpg" alt="GeForce Titan Z Specs" title="GeForce Titan Z Specs" width="620" height="421" /></p> <p>According to Nvidia, the Titan Z is engineered for next-generation 5K and multi-monitor gaming. In addition, the GPU maker says the card runs cool and quiet, thanks to low-profile components and ducted baseplate channels that reduce turbulence.</p> <p>Is your mind blown yet? If not, we can take care of that -- picture two of these cards running in tandem. BAM.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/gpu_doubletake_nvidia_launches_dual-gpu_geforce_titan_z_2999#comments Build a PC geforce titan z gpu graphics Hardware News Tue, 25 Mar 2014 17:25:35 +0000 Paul Lilly 27508 at http://www.maximumpc.com Nvidia Details Legacy Support Plans for DirectX 10 Generation Graphics Cards http://www.maximumpc.com/nvidia_details_legacy_support_plans_directx_10_generation_graphics_cards_2014 <!--paging_filter--><h3><img src="/files/u69/bfg_9800_gtx.png" alt="BFG 9800 GTX" title="BFG 9800 GTX" width="228" height="205" style="float: right;" />It's time to think about upgrading that GeForce 9800 GTX card</h3> <p>Just as Microsoft is getting ready to end support for Windows XP next month, Nvidia also has an end in sight for its own legacy products, though it's not coming up quite as quickly. <strong>When Nvidia gets around to releasing its GeForce 343 drivers, support will officially end for all DirectX 10 generation graphics cards</strong>, freeing the GPU maker to focus solely on Fermi, Kepler, and Maxwell products.</p> <p>This doesn't mean you have to rush out and buy a new graphics card, as Nvidia will continue to support your old hardware up through its upcoming GeForce 340 drivers. However, that will be the end of the line for such products.</p> <p>"The Release 340 drivers will continue to support these products until April 1, 2016, and the Nvidia support team will continue to address driver issues for these products in driver branches up to and including Release 340. However, future driver enhancements and optimizations in driver releases after Release 340 will not support these products," <a href="http://nvidia.custhelp.com/app/answers/detail/a_id/3473" target="_blank">Nvidia states</a> on a new support page.</p>
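<p>Boiled down, the policy is a driver-branch cutoff. A tiny sketch of the rule as we read it -- an illustration of the branch numbers and dates above, not an Nvidia tool:</p> <pre><code>
# Nvidia's stated policy for DirectX 10-era GeForce parts, as numbers:
# branches through Release 340 keep bug-fix support until April 1, 2016;
# Release 343 and later drop these GPUs entirely.
LAST_SUPPORTED_BRANCH = 340

def dx10_era_status(driver_branch):
    if driver_branch <= LAST_SUPPORTED_BRANCH:
        return "supported -- driver fixes until April 1, 2016"
    return "not supported -- Fermi, Kepler, and Maxwell only"

print(dx10_era_status(340))  # supported -- driver fixes until April 1, 2016
print(dx10_era_status(343))  # not supported -- Fermi, Kepler, and Maxwell only
</code></pre>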
<p>Lots of GPUs will be affected by this, including all GeForce 8 and 9 Series desktop parts, and GeForce 7, 8, and 9 Series notebook GPUs, to name just a few. This also affects Nvidia's professional line, such as the Quadro FX 5800 and others.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_details_legacy_support_plans_directx_10_generation_graphics_cards_2014#comments Build a PC directx 10 Drivers DX10 gpu graphics card Hardware nvidia Video Card News Thu, 13 Mar 2014 16:44:03 +0000 Paul Lilly 27435 at http://www.maximumpc.com Nvidia Envisions Longer Lasting Gaming Laptops with 800M Series GPUs http://www.maximumpc.com/nvidia_envisions_longer_lasting_gaming_laptops_800m_series_gpus <!--paging_filter--><h3><img src="/files/u69/geforce_gtx_880m.jpg" alt="GeForce GTX 880M" title="GeForce GTX 880M" width="228" height="137" style="float: right;" />A top to bottom GPU refresh</h3> <p><strong>Nvidia today splashed the mobile market with more than half a dozen new GPUs</strong> comprising the company's GeForce GTX 800M Series. This is a top-to-bottom release, meaning the new GPUs range from entry-level (GeForce 830M) all the way up to what Nvidia claims is the fastest mobile graphics chip in the world, the GeForce GTX 880M. The new releases join Nvidia's already available 820M GPU.</p> <p>According to Nvidia, the new GPUs are 30 percent, 40 percent, and even 60 percent (in some cases) faster than its previous generation of mobile GPUs, and none is more burly than the 880M. Nvidia's flagship part is based on Kepler (not Maxwell) and features 1,536 CUDA cores and 128 TMUs. It has a core clockspeed of 954MHz and supports up to 4GB of GDDR5 memory clocked at 5,000MHz (effective) on a 256-bit memory bus.</p> <p>This release isn't solely focused on speed, however, as Nvidia points out that its Battery Boost technology is one of several new feature additions. This one in particular is supposed to be able to deliver up to twice the gaming battery life when enabled.</p> <p>"Here’s how it works: instead of your notebook pushing every component to its max, Battery Boost targets a user defined frame rate, such as 30 FPS. The driver level governor takes over from there, and operates all your system components, including CPU, GPU, and memory at peak efficiency, while maintaining a smooth, playable experience," Nvidia explains in a <a href="http://blogs.nvidia.com/blog/2014/03/12/new-geforce-800m-gpus/" target="_blank">blog post</a>.</p>
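<p>The core idea is a frame-rate cap with power management behind it. Here's a minimal sketch of just the capping half in Python -- a toy loop for illustration, not Nvidia's driver code, which additionally reclocks the CPU, GPU, and memory:</p> <pre><code>
import time

TARGET_FPS = 30                  # user-defined cap, per Nvidia's example
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available per frame

def run_frame():
    """Stand-in for simulating, rendering, and presenting one frame."""
    time.sleep(0.005)  # pretend the frame took 5ms of real work

for _ in range(300):  # ten seconds of capped "gameplay"
    start = time.perf_counter()
    run_frame()
    # Sleep off whatever is left of the budget. A driver-level governor
    # goes further: instead of racing ahead and idling, it lowers clocks
    # so each frame finishes near its deadline while drawing less power.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    time.sleep(max(0.0, leftover))
</code></pre>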
<p>More details on the GeForce 830M, 840M, GTX 850M, GTX 860M, GTX 870M, and GTX 880M are available on <a href="http://www.geforce.com/hardware/notebook-gpus" target="_blank">Nvidia's website</a>.</p> <p><em>Follow Paul on <a href="https://plus.google.com/+PaulLilly?rel=author" target="_blank">Google+</a>, <a href="https://twitter.com/#!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="http://www.facebook.com/Paul.B.Lilly" target="_blank">Facebook</a></em></p> http://www.maximumpc.com/nvidia_envisions_longer_lasting_gaming_laptops_800m_series_gpus#comments 800m Gaming geforce gpu graphics Hardware laptops mobile notebooks nvidia News Wed, 12 Mar 2014 16:56:06 +0000 Paul Lilly 27426 at http://www.maximumpc.com