<h2>OCZ Vertex 460 240GB Review</h2> <h3>Rumors of its death were greatly exaggerated</h3> <p>The last time we heard from OCZ was back before the end of 2013, when the company was in the grips of bankruptcy and nobody was sure what its future held. Fast forward to March 2014, and things are looking rather good for the formerly beleaguered company, much to everyone’s surprise. Rather than simply dissolve and fade away like we had feared, the company has been acquired by storage behemoth Toshiba, and is now operating as an independent subsidiary.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/vertex460_lrg_small_0.jpg"><img src="/files/u152332/vertex460_lrg_small.jpg" alt="OCZ’s new drive has a more subdued, corporate look to it, thanks to a takeover by “the man.”" title="OCZ Vertex 460 240GB" width="620" height="449" /></a></p> <p style="text-align: center;"><strong>OCZ’s new drive has a more subdued, corporate look to it, thanks to a takeover by “the man.”</strong></p> <p>The best news is OCZ’s NAND-acquisition troubles are seemingly a thing of the past, as Toshiba is one of the world’s largest manufacturers of NAND. So, it is no surprise that the first drive we’re seeing from the new venture is essentially a reborn Vector drive, only with Toshiba NAND flash. Dubbed the Vertex 460, this “new” drive blends the company’s proprietary Barefoot 3 controller found on its high-end Vector drives with Toshiba’s 19nm MLC NAND flash, so it’s ditching the Micron NAND it used previously. The result is basically a slight watering-down of its Vector 150 drive in order to make it more affordable and consumer-friendly. It also needed to bring its Barefoot 3 controller over to its mainstream line of Vertex-branded drives, so this drive accomplishes that feat, as well.</p> <p>In many ways, the Vertex 460 is very similar to the company’s recent Vector 150 drive, the only difference being the Vector has a five-year warranty and a higher overall endurance rating to reflect its use of binned NAND flash. The Vertex 460 is no slouch, though, and is rated to handle up to 20GB of NAND writes per day for three years. The drive also utilizes over-provisioning, so 12 percent of the drive is reserved for NAND management by the Barefoot 3 controller. Though you lose some capacity, you gain longer endurance and better performance, so it’s a worthwhile trade-off. The Vertex 460 also offers hardware encryption support, which is very uncommon for a mainstream drive, and though we’d never use it, it’s nice to have options. Otherwise, its specs are par for the course in that it’s a 7mm drive and is available in 120GB, 240GB, and 480GB flavors. It’s also bundled with a 3.5-inch bay adapter as well as a copy of Acronis True Image, which is appreciated.</p> <p>When we strapped the Vertex to our test bench, we saw results that were consistently impressive. In every test, the Vertex 460 was very close to the fastest drives in its class, and in all scenarios it’s very close to saturating the SATA bus, so it’s not really possible for it to be any faster. It had no problem handling small queue depths of four commands in ATTO, and held its own with a 32 queue depth in Iometer, too. It was a minute slower than the Samsung 840 EVO in our Sony Vegas test, which writes a 20GB uncompressed AVI file to the drive, but also much faster than the Crucial M500 in the same test.</p>
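<p>Before we get to the verdict, it’s worth putting two of those spec-sheet numbers in perspective: the 20GB-per-day write rating and the 12 percent over-provisioning figure. The sketch below shows the back-of-the-envelope math we’re using; the 256GiB raw-NAND figure is our own assumption about how OCZ arrives at 240GB of usable space, not a published OCZ spec.</p> <pre><code># Back-of-the-envelope math for the Vertex 460 240GB's endurance and
# over-provisioning claims. Assumption (ours, not OCZ's documentation):
# 256GiB of raw NAND sits behind the 240GB of user-visible capacity.

GB = 1000**3    # drive makers count usable capacity in decimal gigabytes
GiB = 1024**3   # raw NAND is typically counted in binary gibibytes

raw_nand = 256 * GiB
usable = 240 * GB

reserved = 1 - usable / raw_nand        # fraction held back for the controller
rated_writes_tb = 20 * 365 * 3 / 1000   # 20GB per day, every day, for three years

print(f"Reserved for over-provisioning: {reserved:.1%}")             # roughly 12.7%
print(f"Total rated host writes:        {rated_writes_tb:.1f} TB")   # about 21.9TB
</code></pre> <p>In other words, even a fairly heavy desktop workload is unlikely to bump into that write budget before the warranty runs out.</p>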
<p>Overall, there were no weak points whatsoever in its performance, but it is not faster than the Samsung 840 EVO, and its OCZ Toolbox software utility is extremely rudimentary compared to the Samsung app. Though the Vertex 460 is an overall very solid drive, it doesn’t exceed our expectations in any particular category. In other words, it’s a great SSD, but not quite Kick Ass.</p> <p><strong>$190,</strong> <a href="http://ocz.com/">www.ocz.com</a></p> <h2>Nvidia Shield Tablet Review</h2> <h3>Updated: Now with video review!&nbsp;</h3> <p>Despite its problems, we actually liked <a title="Nvidia Shield review" href="http://www.maximumpc.com/nvidia_shield_review_2013" target="_blank">Nvidia’s original Shield Android gaming handheld</a>. Our biggest issue with it was that it was bulky and heavy. With rumors swirling around about a Shield 2, we were hoping to see a slimmer, lighter design. So consider us initially disappointed when we learned that the next iteration of Shield would just be yet another Android tablet. Yawn, right? The fact of the matter is that the Shield Tablet may be playing in an oversaturated market, but it’s still great at what it sets out to be.</p> <p><iframe src="//www.youtube.com/embed/dGigsxi9-K4" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>We've updated our review to include the video review above.</strong></p> <p>At eight inches, the Shield Tablet features a gorgeous 1920x1200 display, which shares the same resolution as Google’s flagship <a title="nexus 7 review" href="http://www.maximumpc.com/google_nexus_7_review_2013" target="_blank">Nexus 7</a> tablet. At 13.1 ounces, the Shield Tablet is about three ounces heavier than the Nexus 7 but still a lot lighter than the original’s 1 lb. 4.7 ounces.&nbsp;</p> <p>Part of the weight increase with the Shield Tablet over the Nexus 7 is due to the extra inch that you’re getting from the screen, but also because the Shield Tablet is passively cooled and has an extra thermal shield built inside to dissipate heat. It’s a little heavier than we like, but isn’t likely to cause any wrist problems. On the back of the Shield is an anti-slip surface and a 5MP camera, and on the front of the tablet is a front-facing 5MP camera and two front-facing speakers. While the speakers are not going to blow away dedicated Bluetooth speakers, they sound excellent for a tablet. In addition to the speakers, the Shield Tablet has a 3.5mm headphone jack up at the top. Other ports include Micro USB, Mini HDMI out, and a MicroSD card slot capable of taking up to 128GB cards. Buttons on the Shield include a volume rocker and a power button, which we found to be a little small and shallow for our liking.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_exploded_view_black_bckgr.jpg" alt="Nvidia Shield Tablet guts" title="Nvidia Shield Tablet guts" width="620" height="349" /></p> <p style="text-align: center;"><strong>The guts of the Nvidia Shield Tablet.</strong></p> <p>All of this is running on the latest version of Android KitKat (4.4). Nvidia says that it will update the tablet to Android L within a few weeks of Google’s official release.
If Nvidia’s original Shield is any indication of how well the company keeps up with OS updates, you can expect to get the latest version of Android a couple of weeks, if not a month or so, after release. Regardless, the Shield Tablet is running a pretty stock version of Android to begin with, the main difference being that Nvidia has pre-loaded the tablet with its Shield Hub, which is a 10-foot UI used to purchase, download, and launch games.</p> <p>Arguably, the real star of the tablet is Nvidia’s new Tegra K1 mobile superchip. The 2.2GHz quad-core A15 SOC features Nvidia’s Kepler GPU architecture and 192 CUDA cores along with 2GB of low-power DDR3. K1 supports many of the graphical features commonplace in GeForce graphics cards, including tessellation, HDR lighting, global illumination, subsurface scattering, and more.</p> <p>In our performance benchmarks, the K1 killed it. Up until now, the original Shield’s actively cooled Tegra 4 was arguably one of the most, if not <em>the</em> most, powerful Android SOCs on the market, but the K1 slaughters it across the board. In the Antutu and GeekBench benchmarks, we saw modest gains of 12 percent to 23 percent in Shield vs. Shield Tablet action. But in Passmark and GFX Bench’s Trex test, we saw nearly a 50 percent spread, and in 3DMark’s mobile Icestorm Unlimited test, we saw an astounding 90 percent advantage for the Shield Tablet. This is incredible when you consider that the tablet has no fans and a two-watt TDP. Compared to the second-gen Nexus 7, the Shield Tablet benchmarks anywhere from 77 percent to 250 percent faster. This SOC is smoking fast.</p> <p>In terms of battery life, Nvidia claims you’ll get 10 hours watching/surfing the web and about five hours of gaming from its 19.75 Wh battery. This is up 3.75 Wh from Google’s Nexus 7 equivalent, and from our experiential tests, we found those figures to be fairly accurate, if not a best-case scenario. It will pretty much last you all day, but you'll still want to let it sip juice every night.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_war_thunder.jpg" alt="Shield Tablet review" title="Shield Tablet review" width="620" height="343" /></p> <p style="text-align: center;"><strong>The new wireless controller uses Wi-Fi Direct instead of Bluetooth for lower latency.</strong></p> <p>Of course, if you’re going to game with it, you’re going to need Nvidia’s new wireless Shield Controller. Sold separately for $60, the 11.2-ounce Shield Controller maintains the same button layout as the original Shield controller, but feels a lot lighter and is more comfortable to hold. While most Android game controllers operate over Bluetooth, Nvidia opted to go with Wi-Fi Direct, stating that it offers 2x faster response time and more bandwidth. The extra bandwidth allows you to plug a 3.5mm headphone into the controller and also allows you to link up to four controllers to the device, which is an appreciated feature when you hook up the tablet to your HDTV via the Shield Tablet’s <a title="shield console mode" href="http://www.maximumpc.com/nvidia_sweetens_shield_console_android_442_kitkat_price_drop_199_through_april" target="_blank">Console Mode</a>. Other unique features of the controller include capacitive-touch buttons for Android’s home, back, and play buttons. There’s also a big green Nvidia button that launches Shield Hub. The controller also has a small, triangle-shaped clickable touch pad which allows you to navigate your tablet from afar.
One quibble with it is that we wish the trackpad was more square, to at least mimic the dimensions of the tablet; the triangle shape was a little awkward to interface with. Another problem that we initially had with the controller was that the + volume button stopped working after a while. We contacted Nvidia about this and the company sent us a new unit, which remedied the issue. One noticeable feature missing from the controller is rumble support. Nvidia said this was omitted on the original Shield to keep the weight down; its omission is a little more glaring this time around, however, since there's no screen attached to the device.</p> <p>The controller isn’t the only accessory that you’ll need to purchase separately if you want to tap into the full Shield Tablet experience. To effectively game with the tablet, you’ll need the Shield Tablet cover, which also acts as a stand. Like most tablets, a magnet in the cover shuts off the Shield Tablet when closed, but otherwise setting up the cover and getting it to act as a stand is initially pretty confusing. The cover currently only comes in black, and while we’re generally not big on marketing aesthetics, it would be nice to have an Nvidia green option to give the whole look a little more pop. We actually think the cover should just be thrown in gratis, especially considering that the cheapest 16GB model costs $300. On the upside though, you do get Nvidia’s new passive DirectStylus 2 that stows away nicely in the body of the Shield Tablet. Nvidia has pre-installed note-writing software and its own Nvidia Dabbler painting program. The nice thing about Dabbler is that it leverages the K1’s GPU acceleration so that you can virtually paint and blend colors in real time. There’s also a realistic mode where the “paint” slowly drips down the virtual canvas like it would in real life.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_trine2_0.jpg" alt="Shield tablet review" title="Shield tablet review" width="620" height="404" /></p> <p style="text-align: center;"><strong>The Shield Controller is a lot lighter and less blocky than the original Shield Portable.</strong></p> <p>But that’s probably not why you’re interested in the Shield Tablet. This device is first and foremost a gaming tablet and even comes with a free Android copy of Trine 2. Trine 2 was originally a PC game and it’s made a great transition to the Shield Tablet. While the game was never known to be a polygon pusher, it looks just as good as it ever did on its x86 debut.&nbsp;</p> <p>With gaming as the primary driver for Shield Tablet, you may wonder why Nvidia didn’t bundle its new controller. The company likely learned from Microsoft’s mistake with Kinect and the Xbox One: Gamers don’t like to spend money and getting the price as low as possible was likely on Nvidia’s mind. Of course, not everyone may even want a controller, with the general lack of support for them in games. Nvidia says there are now around 400 Android titles that support its controller, but that’s only a small percentage of Android games and the straight truth is that the overwhelming majority of these games are garbage.&nbsp;</p> <p>Nvidia is making a push for Android gaming, however. The company worked with Valve to port over Half Life 2 and Portal to the Shield and they look surprisingly fantastic and are easily the two prettiest games on Android at the moment. 
Whether Android will ever become a legitimate platform for hardcore gaming is anyone’s guess, but at least the Shield Tablet will net you a great front seat if that time ever comes.</p> <p>Luckily, you won’t have to rely solely on the Google Play store to get your gaming fix. Emulators run just as well here as they did on the original Shield, and this iteration of Shield is also compatible with Gamestream, which is Nvidia’s streaming technology that allows you to stream games from your PC to your Shield. Gamestream, in theory, lets you play your controller-enabled PC games on a Shield.</p> <p>At this point, Nvidia says Gamestream supports more than 100 games such as Batman: Arkham Origins and Titanfall from EA’s Origin and Valve’s Steam service. The problem, though, is that there are hundreds more games on Steam and Origin that support controllers—but not the Shield Tablet’s controller. For example, Final Fantasy VII, a game that we couldn’t get to work with the original Shield, still isn't supported even though it works with an Xbox controller on the PC. When Gamestream does work, however, it’s relatively lag-free and kind of wonderful. The one caveat here is that you’ll have to get a 5GHz dual-band router to effectively get it working.&nbsp;</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/rh7fWdQT2eE" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Nvidia Shield Video demo.</strong></p> <p>Would we buy the Shield Tablet if we owned the original Shield (now renamed the Shield Portable)? Probably not. If we were looking for a new tablet and top-notch gaming performance was on the checklist, the Shield Tablet is easily the top contender today. We’d take it over the second-gen Nexus 7 in a heartbeat. While we understand why Nvidia decided to separate the cover and controller to keep the prices down and avoid the Kinect factor, we think a bundled package with a small price break as an alternative would have been nice. All things considered though, consider us surprised. The Shield Tablet is pretty dang cool.&nbsp;</p> <p><strong>$300</strong></p> <p><em><strong>Update:</strong> The original article incorrectly labeled the Shield Portable benchmarks with the Nexus 7 figures. The issue has been resolved and both benchmark charts are listed below.&nbsp;</em></p> <h2>Best Cheap Graphics Card</h2> <h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers.
Four of these cards are from AMD, and Nvidia launched two models courtesy of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts or so, but that number fluctuates up and down according to spec. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. During 2013, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. The goal with its former Kepler architecture was to have better performance per watt compared to the previous architecture, named Fermi, and it succeeded, but it's taken that same philosophy even further with Maxwell, which had as its goal to be twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units.</strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process Nvidia used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled.
In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per block. Therefore, it's reduced the number of cores per block by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p> <h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speeds however, running at 5,400MHz. The board sports 2GB of GDDR5 memory, and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. 
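<p>Since the GTX 750 Ti is the full version of this first Maxwell part, Gigabyte's card is a handy place to make the scheduling change described above concrete. Here's a minimal sketch of the core-count arithmetic; the SM counts are our own inference from the published CUDA-core totals, not a breakdown Nvidia quotes in this context.</p> <pre><code># Kepler vs. Maxwell core organization, using the counts cited above.
# Assumption (ours): SM counts are inferred as total cores / cores-per-SM.

KEPLER_CORES_PER_SMX = 192       # one block of control logic per 192 cores
MAXWELL_CORES_PER_CLUSTER = 32   # Maxwell schedules cores in 32-core clusters
MAXWELL_CLUSTERS_PER_SM = 4
MAXWELL_CORES_PER_SM = MAXWELL_CORES_PER_CLUSTER * MAXWELL_CLUSTERS_PER_SM  # 128

for card, cores in (("GTX 750", 512), ("GTX 750 Ti", 640)):
    sm_count = cores // MAXWELL_CORES_PER_SM
    print(f"{card}: {cores} CUDA cores = {sm_count} SMs of {MAXWELL_CORES_PER_SM}, "
          f"each run as {MAXWELL_CLUSTERS_PER_SM} x {MAXWELL_CORES_PER_CLUSTER}-core clusters "
          f"(Kepler's SMX put {KEPLER_CORES_PER_SMX} cores under one scheduler)")
</code></pre> <p>None of that changes how Gigabyte's board actually behaved on our bench, of course.</p>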
Still, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160(Street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance. The only downside is the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes to dual 100mm fans to keep it cool, as well. It also includes a switch that lets you toggle between booting from an older BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the miniscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office. As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. 
This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. " title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. </strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test, which was a blistering turn in Call of Duty: Ghosts where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. The Dual-X cooler also kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong> Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including support for TrueAudio, XDMA Crossfire, Mantle (as in, it worked at launch), and it has the ability to drive up to three displays —all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. 
Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings. The fact that it supports all the latest technology from AMD is what bolsters this card's credentials, though. And the fact that it can run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790. While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120 <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <h4> <hr />MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but when looking at the specs of the Nvidia GTX 750, it is clearly outmatched. One other major problem, at least for those of us with big monitors, is we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. 
We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90 <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM, however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely. It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, and 34 fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100" we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250.
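<p>Since both of the cheapest Radeons here hang their performance on the same 128-bit memory bus, it's worth spelling out what that means for raw bandwidth. The quick sketch below derives peak figures from the memory clocks in the spec table later in this feature, assuming the usual 4x data-rate multiplier for GDDR5; these are our own derived numbers, not figures quoted by the board makers.</p> <pre><code># Peak memory bandwidth derived from the spec-table memory clocks.
# Assumption (ours): GDDR5 transfers data at 4x the listed memory clock.

cards = {
    # name: (memory clock in MHz, bus width in bits)
    "PowerColor R7 250X":  (1125, 128),
    "Gigabyte GTX 750 Ti": (1350, 128),
    "Sapphire R7 265":     (1400, 256),
}

for name, (mem_clock_mhz, bus_bits) in cards.items():
    data_rate_gbps = mem_clock_mhz * 4 / 1000       # per-pin data rate in Gbps
    bandwidth_gbs = data_rate_gbps * bus_bits / 8   # total GB/s across the bus
    print(f"{name}: {bandwidth_gbs:.1f} GB/s")      # 72.0, 86.4, and 179.2 GB/s
</code></pre> <p>That 256-bit bus is a big part of why the R7 265 pulls ahead at 1080p, while the 128-bit cards have far less headroom to work with.</p>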
We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD. That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it, do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. Mantle is a lower-level API that allows developers to optimize a game specifically targeted at AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX. 
<strong>&nbsp;</strong></p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, but their dominance is not in question. The most basic reason is the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster in performing these operations than their Nvidia equivalent. In fact, the mining craze has pushed the demand for these cards is so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI Geforce GTX 750 Gaming</td> <td>GigaByte GeForce GTX 750 Ti </td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120 </td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curaco</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td> <td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40<strong><br /></strong></td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td> <td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,000MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 
billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td> <td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI-S, VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td> </tr> </tbody> </table> <p><em>Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div> <h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn’t going to help it. 
However, if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs." title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84 fps using DirectX to 98 fps in Mantle.</p> <p>Overall, Mantle is legit, but it’s kind of like PhysX or TressFX in that it’s nice to have when it’s supported, and does provide a boost, but it isn’t something we’d count on being available in most games.</p> <h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don’t usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we’d have to admit that given these cards’ price points, we had low expectations but thought they’d all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child’s play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for “sweet gaming” has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it’s the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here’s some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia’s trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there’s no clear winner. 
Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The bottom rung of cards, which consists of the R7 250(X) cards, were not playable at 1080p at max settings, so avoid them. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man’s land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Storm</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30<br /></strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40<br /></strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. 
All games are run at 1920x1080 with no AA except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> <h2>Xidax M6 Mining Rig Review</h2> <h3>A gaming rig that pays for itself</h3> <p>Exotic car paint, multiple GPUs, and custom-built chassis be damned, boutique PC builder <a title="xidax" href="http://www.maximumpc.com/tags/Xidax" target="_blank">Xidax</a> thinks it has the sexiest sales pitch on the planet with its <strong>M6 Mining Rig</strong>: It pays for itself! Now, we can’t say this PC is basically “free” because it ain’t that, but Xidax says by using the box’s spare GPU cycles to mine for crypto-currency, this baby would be paid off in about four months. To be honest, it’s not something we’ve ever considered, as we’ve seen gaming rigs, and we’ve seen coining rigs, but never in the same box. It seems like a solid idea though, as the system can game during the day, then mine at night to help cover its cost.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/xidax_guts13979_small_0.jpg"><img src="/files/u152332/xidax_guts13979_small.jpg" alt="The Xidax M6 Mining Rig comes set up with everything you need to start mining crypto-currency almost right out of the box." title="Xidax M6 Mining Rig" width="620" height="676" /></a></p> <p style="text-align: center;"><strong>The Xidax M6 Mining Rig comes set up with everything you need to start mining crypto-currency almost right out of the box.</strong></p> <p>The system’s specs include a 3.4GHz Core i5-4670K with 16GB of RAM, a Corsair RM 850 PSU, a closed-loop liquid cooler, a 250GB Samsung 840 EVO SSD, a 1TB WD Black, and a pair of Sapphire Radeon R9 290X cards. In application performance, it’s pretty pedestrian with its stock-clocked Core i5-4670K. Why not something more badass? Xidax says it weighed hardware choices carefully because the pricier the hardware, the longer it takes to pay off with crypto-coins. The Radeons are a wise choice, as they offer about twice the performance of Nvidia’s fastest GPUs in mining applications. Gaming is also quite excellent (obviously, for a two-card system), and its mining performance is impressive at 1.7 to 1.8 kilohashes per second. (Hashes of the kilo/mega/giga variety are the units of measurement for mining productivity.)</p> <p>Xidax ships the PC ready to start mining operations almost right out of the box, which is normally a daunting task. It also includes a Concierge (or should we say coincierge) service that has a Xidax rep remotely connect to the rig and do a final tune on the box for maximum mining performance. On this particular machine, it came ready to mine Dogecoin and was forecast to make about $21.60 a day, or $670 a month, on a 24/7 schedule—including electricity costs.</p> <p>What’s the catch? There are a few. First, it’s loud when mining. In fact, it’s so loud that you won’t be able to stand being in the same room with it. Second, you can’t do anything with it while it’s mining because all GPU resources are pegged to the max. Third, crypto-currency can be volatile. Bitcoin saw its value see-saw from $130 to $1,242 and then back to $455 and $900 in just four months.</p>
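<p>Volatility aside, the payback claim itself is easy to sanity-check. Here's a minimal sketch that takes Xidax's own $21.60-per-day forecast and the $3,000 sticker price at face value; the big assumption, and it's a shaky one, is that the daily figure holds steady around the clock.</p> <pre><code># Rough payback math for the M6 Mining Rig, using Xidax's own forecast.
# Assumption (ours): the $21.60/day estimate stays constant, 24/7, which
# coin-price swings and rising mining difficulty make unlikely in practice.

system_price = 3000.00
daily_income = 21.60    # Xidax's forecast, already net of electricity costs

days_to_payoff = system_price / daily_income
print(f"Days to pay off:   {days_to_payoff:.0f}")          # about 139 days
print(f"Months to pay off: {days_to_payoff / 30.4:.1f}")   # roughly 4.6 months
</code></pre> <p>That lands in the same ballpark as the four-month figure Xidax quotes, provided nothing in the math moves, which, as the price swings above show, it almost certainly will.</p>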
It could all go kaput in a few months, or who knows—the government might even step in and ruin the fun.</p> <p>Considering its performance outside of mining, the M6 Mining Rig is pricey at $3,000. However, the price includes a lifetime warranty on parts and service except for the GPUs. Those carry a five-year warranty, which is still surprisingly good, considering that board vendors are already making noises that they don’t want to eat the cost of dead boards killed by mining. Xidax says it will cover them, though. And—again—it pays for itself, right?</p> <p>That’s ultimately the appeal of the M6 Gaming Rig, but it has to be carefully considered by potential buyers. After all, anything that sounds too good to be true usually is, but then again, it is a powerful gaming PC that could theoretically pay for itself in a few months. And even if the market blew up, at least you’d still have a formidable gaming PC rather than just standing there with your RAM sticks in one hand. And if it works out, whoa baby, you just got a PC for free! –</p> <p><strong>$3,000,</strong> <a href="http://www.xidax.com/">www.xidax.com</a></p> <p><img src="/files/u154082/xidax_benchmarks.png" alt="xidax benchmarks" title="xidax benchmarks" width="620" height="277" /></p> http://www.maximumpc.com/xidax_m6_mining_rig_review_2014#comments april issues 2014 bitcoin dogecoin Hardware maximum pc Review xidax m6 mining computer Reviews Systems Wed, 06 Aug 2014 16:42:51 +0000 Gordon Mah Ung 28234 at http://www.maximumpc.com Intel 730 Series SSD 480GB Review http://www.maximumpc.com/intel_730_series_ssd_480gb_review <!--paging_filter--><h3>An overclocked enterprise SSD, priced accordingly</h3> <p><a title="intel" href="http://www.maximumpc.com/tags/Intel_0" target="_blank">Intel</a> has largely been absent from the high-end SSD market for many years, which has been a real head-scratcher, considering the original X-25M’s dominance back in 2009. That all changes this month with the release of its all-new <strong>730 series SSD</strong>. It springs from the loins of its data center SSDs, which use validated NAND and Intel’s enterprise-level controller technology. To emphasize this heritage, Intel isn’t bragging about the drive’s overall speed, but instead notes the drive is rated to handle up to 70GB of writes per day, which is higher than any other SSD on the market by a huge margin. It features capacitors to protect data being written in case of a power outage, which is an unusual but not unprecedented feature on a consumer SSD. Intel also backs the drive with a five-year warranty.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/ww_13_18_small_0.jpg"><img src="/files/u152332/ww_13_18_small.jpg" alt="Intel’s new flagship SSD is validated for a whopping 70GB of writes per day." title="Intel 730 Series SSD 480GB" width="620" height="437" /></a></p> <p style="text-align: center;"><strong>Intel’s new flagship SSD is validated for a whopping 70GB of writes per day.</strong></p> <p>To create the 730 Series, Intel has basically taken the NAND flash and controller from its data center–oriented S3700 SSD and bumped up the clock and interface speeds. If you recall the “SSD overclocking” demo Intel held at Pax last year, this is the result, though Intel decided against letting consumers overclock the drive. Instead, it did the overclocking at the factory so that the drives could be validated at those speeds. 
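</p> <p>Those enterprise roots show most clearly in the endurance rating. A quick bit of arithmetic (ours, not a figure from Intel’s spec sheet) puts that 70GB-per-day rating in perspective over the five-year warranty:</p> <pre>
# What a 70GB-per-day endurance rating works out to over a five-year warranty.
# The 20GB/day-for-three-years comparison is a typical mainstream-drive rating,
# used here as a ballpark rather than a spec for any particular model.

GB_PER_DAY = 70
WARRANTY_YEARS = 5

total_tb = GB_PER_DAY * 365 * WARRANTY_YEARS / 1000
print(f"{total_tb:.0f} TB of host writes over the warranty")               # ~128 TB

mainstream_tb = 20 * 365 * 3 / 1000
print(f"vs. roughly {mainstream_tb:.0f} TB for a typical consumer drive")  # ~22 TB
</pre> <p>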
To drive home the point that this is an SSD made for enthusiasts, Intel has even adorned it with a sweet-looking Skulltrail badge.</p> <p>The drive is a 7mm unit, so it will fit inside an ultrabook, but is available only in 240GB and 480GB capacities. It’s odd that it’s not available in 750GB or higher capacities, but our guess is Intel is afraid of the sky-high sticker price that such a drive would require; the two capacities it’s offering are priced very high at $250 and $490, respectively. The drive features Intel’s 20nm MLC NAND and its own third-generation controller. Like most other SSD makers in the business, Intel has ditched SandForce. One interesting note is that since this is an enterprise drive, it essentially doesn’t have a “low-power state,” so it’s not intended for mobile usage. Also, it consumes 5W under load, which is double the consumption of even a 7,200rpm mobile hard drive.</p> <p>When we strapped the 730 Series drive to our test bench, we saw results that were a bit slower overall than we expected. It topped the charts in AS SSD, which measures read and write speeds of incompressible data, but the Intel drive was only a smidge faster than most, and not by enough to make it stand out, as they are all very fast. It was a bit slower than average in straight-line sequential read speeds, topping out at 468MB/s for reads and 491MB/s for writes. While this is still plenty fast, it’s a bit short of the 550MB/s Intel claims the drive is capable of, which would totally saturate the SATA 6Gb/s interface.</p> <p>It was also oddly slow in the ATTO benchmark, which has a queue depth of four and is a “best case scenario” for most drives. It scored just 373MB/s for 64KB-read speeds, compared to 524MB/s for the Samsung 840 Pro. We ran the test several times to verify, so it’s not an aberration. It placed mid-pack in PCMark Vantage, but was slower than its competition in our real-world Sony Vegas test, where we write a 20GB uncompressed AVI file to the drive.</p> <p>Overall, this drive is a bit of a conundrum. We have no doubt it’s reliable, as Intel has always been strong in that regard and this drive is full of safety-oriented features. But is it more reliable than a Samsung 840 Pro for the average consumer? We doubt it, and therefore the drive’s extra-high price tag doesn’t make much sense. If Intel realizes it’s no longer the only game in town and adjusts the price a bit, it’ll be a much more competitive drive, but as it stands, we must give it a so-so verdict of 8.</p> <p><strong>$490,</strong> <a href="http://www.intel.sg/content/www/xa/en/homepage.html">www.intel.com</a></p> http://www.maximumpc.com/intel_730_series_ssd_480gb_review#comments Hardware Intel 730 Series SSD 480GB maximum pc May issues 2014 solid state drive Reviews SSD Wed, 06 Aug 2014 16:36:43 +0000 Josh Norem 28289 at http://www.maximumpc.com Gigabyte Radeon R9 290X OC Review http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review <!--paging_filter--><h3>As good as it gets, if you can find one to buy</h3> <p>Aftermarket Radeon R9 290X GPUs are beginning to make the rounds, and this month we had a WindForce-cooled behemoth from <a title="gigabyte" href="http://www.maximumpc.com/tags/Gigabyte" target="_blank">Gigabyte</a> strutting its stuff in the lab. 
Unlike last month’s <a title="sapphire tri x r9 290x" href="http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review" target="_blank">Sapphire Tri-X R9 290X</a>, this board features a custom PCB in addition to the custom cooler, whereas the Sapphire slapped a huge cooler onto the reference design circuit board. Theoretically, this could allow for higher overclocks on the Gigabyte due to better-quality components, but more on that later.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/windforce14052_small_0.jpg"><img src="/files/u152332/windforce14052_small.jpg" alt="Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable." title="Gigabyte Radeon R9 290X OC" width="620" height="476" /></a></p> <p style="text-align: center;"><strong>Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable.</strong></p> <p>This is the overclocked version of the card, so it clocks up to 1,040MHz under load, which is a mere 40MHz over stock. These boards always have conservative overclocks out of the box, though, and that is by no means the final clock speed for this card. We’ve covered its WindForce cooler in past reviews, so we won’t go into all the details, but it’s a three-fan cooler that only takes up two PCIe slots and uses six heat pipes with inclined heatsinks to better dissipate the heat. It’s good for 450W of heat dispersal, according to Gigabyte, and since the R9 290X is roughly a 300W card (AMD has never given a TDP for this particular model for some reason), the WindForce cooler should be more than up to the job.</p> <p>Like all Radeon R9 290X boards, this sucker is big and long, measuring 11.5 inches. Gigabyte recommends you use at least a 600W power supply with it, and it sports two dual-link DVI ports for 2560x1600 gaming, as well as HDMI 1.4 and DisplayPort 1.2a if you want to run 4K. The card comes bundled with a free set of headphones. It used to include a free copy of Battlefield 4, but the company told us it was no longer offering the game bundle because it had run out of coupons. The MSRP of the board is $620, but some stores had it for $599 while others marked it up to $700.</p> <p>Once we had this Windy Bad Boy in the lab, we were very curious to compare it to the Sapphire Tri-X R9 290X we tested last month. Since both cards feature enormous aftermarket coolers, have the exact same specs and clocks, and are roughly the same price, we weren’t surprised to find that they performed identically for the most part.</p> <p>If you look at the benchmark chart, in every test the two cards are almost exactly the same—the only exception being Metro, but since that’s a PhysX game, AMD cards can get a bit wonky sometimes. In every other test, the two cards are within a few frames per second of each other, making them interchangeable. Both cards also run in the mid–70 C zone under load, which is 20 C cooler than the reference design. We were able to overclock both cards to just a smidge over 1,100MHz, as well.</p> <p>“Okay,” you are saying to yourself. “I’m ready to buy!” Well, that’s where we run into a small problem. Gigabyte’s MSRP for this card is $620—the same as the Sapphire Tri-X card—but at press time, the cheapest we could find it for was $700 on Newegg. 
We can’t ding Gigabyte for Newegg’s pricing, but it’s a real shame these R9 290X cards are so damned expensive.</p> <p><strong>$620,</strong> <a href="http://www.gigabyte.us/">www.gigabyte.us</a></p> http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review#comments Air Cooling amd april issues 2014 Gigabyte Radeon R9 290X OC gpu graphics card Hardware maximum pc Review Reviews Tue, 05 Aug 2014 19:52:42 +0000 Josh Norem 28227 at http://www.maximumpc.com NZXT H440 Review http://www.maximumpc.com/nzxt_h440_review <!--paging_filter--><h3>Remarkably clean, and limited, too</h3> <p>We love the fact that <a title="nzxt" href="http://www.maximumpc.com/tags/nzxt" target="_blank">NZXT</a> bills this semi-silent-themed case as a “hassle-free experience.” We wonder if the company was using the same case that we were, because we encountered quite a bit of hassle building a standard configuration into this smaller-than-usual chassis.</p> <p>For starters, the case itself ships with no printed manual—at least, ours didn’t. We only hope that’s an oversight with our early review unit instead of a standard feature of the chassis itself, because there are definitely some features of the <a title="h440" href="http://www.maximumpc.com/nzxt_h440_case_ditches_optical_drive_bays_cleaner_look" target="_blank">H440</a> that warrant a bit of instruction, especially for neophyte builders.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/h440_blk_main_24x32in_small_0.jpg"><img src="/files/u152332/h440_blk_main_24x32in_small.jpg" alt="The H440 is the first case we’ve tested that doesn’t have 5.25-inch drive bays. " title="NZXT H440" width="620" height="578" /></a></p> <p style="text-align: center;"><strong>The H440 is the first case we’ve tested that doesn’t have 5.25-inch drive bays. </strong></p> <p>Case in point: There are absolutely zero 5.25-inch bays to be found on the H440, which is a good thing to know before you start attempting to pry off the H440’s front (Dremel in hand). We know, we know, the optical drive is dead, long live the optical drive—but is it too soon? To be honest, there’s an upstart contingent here at Maximum PC who think it’s a plus, while some cranky old farts think it’s a minus. Additionally, installing the power supply might evoke a bout of head-scratching at first, as there’s seemingly no way to just stuff it into the chassis thanks to how it’s been compartmentalized on the case’s bottom. This does build on the case’s motto of “remarkably clean,” though, by hiding your messy PSU cabling.</p> <p>This leads us into one of our major gripes with this chassis: There’s a lot of screwing. We pretty much pulled out the thumbscrews in the case’s side, which are supposedly designed to not do that. Beyond that, you have to unscrew a panel to slide the power supply in, you have to unscrew the standard PCI slot covers for any devices you want to install, and—most frustratingly—you have to first unscrew the case’s steel drive trays (up to six total) just for the privilege of being able to screw in your hard drive. Clean, yes. Toolless, no.</p> <p>The case feels a bit small on the inside, but it adequately supported our standard test setup (including an Nvidia GTX 480 video card) without any cramming or wedging. 
We like how the case’s three rubberized cable-routing holes fit perfectly with a standard video card setup—when using the top-most PCI Express x16 slot on our ATX motherboard, our video card didn’t block any of the much-needed routing holes.</p> <p>That said, cable routing is a bit of a challenge in the H440. There’s already not that much room between the rear of the motherboard tray and the case’s side panel. Amplifying the claustrophobia is a layer of soundproofing foam adhered to the side panel. We love that NZXT cares so much about our ears, but it makes for a less-than-pleasant smashing of cables against the case’s side (especially since there’s only one provided hole for power-supply cables to route through otherwise). Cable-management options feel more constrained by this case than others we’ve tested.</p> <p>The foam surrounding the case’s insides has quite a bit of work in store for it, too. No fewer than four of NZXT’s next-gen case fans grace the inside of the chassis: three 12cm fans on the front and one 14cm fan on the back. When we fired up the system with no components inside it, the soundproof-themed case was a bit audible. A full system only adds to the din, and while we appreciate NZXT’s efforts toward keeping the volume dial at a three instead of an eleven, it seems to be a bit of a lost cause.</p> <p>NZXT seems to think this case is perfect for liquid cooling. For some all-in-one setups, sure; for customized loops, you’re going to be in for something of a tubing nightmare. Best of luck!</p> <p><strong>$120,</strong> <a href="http://www.nzxt.com/">www.nzxt.com</a></p> http://www.maximumpc.com/nzxt_h440_review#comments april issues 2014 Hardware maximum pc Review Cases Reviews Tue, 05 Aug 2014 19:46:55 +0000 David Murphy 28236 at http://www.maximumpc.com Cooler Master Nepton 280L Review http://www.maximumpc.com/cooler_master_nepton_280l_review <!--paging_filter--><h3>Not quite god, but still Herculean</h3> <p>In the world of CPUs, closed-loop liquid coolers (CLCs) seem to be standard-issue for enthusiasts these days. They give you higher overclocking headroom than even the most expensive and beefy air coolers, and they can operate more quietly. However, we haven’t seen many with radiators as large as 280mm—just the <a title="kraken x60" href="http://www.maximumpc.com/nzxt_kraken_x60_review_2013" target="_blank">NZXT Kraken X60</a> and the <a title="Corsair h110 review" href="http://www.maximumpc.com/corsair_h110_review" target="_blank">Corsair H110</a> come to mind—so we were eager to run the <strong>Cooler Master Nepton 280L</strong> through its paces.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/coolermaster14036_small_0.jpg"><img src="/files/u152332/coolermaster14036_small.jpg" alt="The Nepton features a massive 280mm radiator." title="Cooler Master Nepton 280L" width="620" height="559" /></a></p> <p style="text-align: center;"><strong>The Nepton features a massive 280mm radiator.</strong></p> <p>Many hardware vendors have chosen to license their CLC designs from Asetek, whose patents go back quite a way. With the 280L, however, Cooler Master decided to roll its own with a custom pump. It’s definitely larger than usual. The tubes coming out of it are sleeved with a material called FEP, which is similar to Teflon and designed to slow the rate of evaporation inside the loop. That’s an important factor for a cooler that’s not designed to be refilled. Cooler Master is also using its own JetFlo fans, designed for high static pressure. 
This feature is needed to penetrate into the fins of the radiator, and the 140mm version of the JetFlo is making its debut here (the radiator can also fit 120mm fans).</p> <p>Pump installation is pretty straightforward. First, we screwed in a small bracket to each side of the cold plate. Then, since our LGA 2011 motherboard has an integrated CPU backplate, we just attached four bundled screws to that, set the cold plate on top the CPU, and added four fasteners to fix the cold plate’s brackets to the four screws in the backplate. The fasteners only take a flat-bladed screwdriver, oddly, but they went in smoothly. The radiator screws also come in two sets of eight, differing only a couple of millimeters in length, so it took a minute to separate each type. It would have been better had they been clearly differentiated.</p> <p>The pump, plugged into our board’s secondary CPU fan header, operated at a steady 6,300rpm, which is unusually high. We checked with Cooler Master, and the company agreed. We used a Zalman Fanmate to manually tune it down to 5,000rpm, at which point the pump noise didn’t stand out. Despite a loss of 1,300rpm, temps only went up about 1 C during our load test, indicating that the additional speed offered minimal improvement anyway. We then plugged the pump into a chassis fan header without the Fanmate, and it leveled off at 4,500rpm.</p> <p>Of course, there are caveats. A large percentage of cases will not accommodate a 280mm radiator; either the dimensions are too small or the fan mounts are not sized for it. This is not the radiator’s fault, though, so we can’t really deduct points for it. It’s just something that you have to be aware of. Also, the thick FEP tubes are not especially flexible. The radiator screws have unusually open heads, requiring an uncommonly large bit to avoid stripping. Lastly, the pump is too loud without some fiddling.</p> <p>These are fairly minor issues that all have workarounds, though. Considering the Nepton’s top-tier cooling performance, reasonably low noise levels, and ease of installation, its quirks don’t stick out in the end. Its load temperatures were notably lower than anything else we’ve tested and may allow you to squeeze another couple-hundred MHz out of an overclock. The Nepton is an indisputable upgrade from Cooler Master’s older Seidon series.</p> <p><strong>$125 (street),</strong> <a href="http://us.coolermaster.com/">www.coolermaster-usa.com</a></p> http://www.maximumpc.com/cooler_master_nepton_280l_review#comments Air Cooling april issues 2014 clc closed loop cooler Cooler Master Nepton 280L cpu Hardware maximum pc water cooling Reviews Tue, 05 Aug 2014 19:35:45 +0000 Tom McNamara 28218 at http://www.maximumpc.com Seagate 1TB Hybrid vs. WD Black2 Dual Drive http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014 <!--paging_filter--><h3>Seagate 1TB Hybrid vs. WD Black2 Dual Drive</h3> <p>Every mobile user who is limited to just one storage bay wants the best of both worlds: SSD speeds with HDD capacities. Both Seagate and WD have a one-drive solution to this problem, with Seagate offering a hybrid 1TB hard drive with an SSD cache for SSD-esque performance, and WD offering a no-compromise 2.5-inch drive with both an SSD and an HDD. These drives are arch rivals, so it’s time to settle the score.</p> <h4>ROUND 1: Specs and Package</h4> <p>The WD Black2 Dual Drive is two separate drives, with a 120GB SSD riding shotgun alongside a two-platter 1TB 5,400rpm hard drive. 
Both drives share a single SATA 6Gb/s interface and split the bandwidth of the channel between them, with the SSD rated to deliver 350MB/s read speeds and 140MB/s write speeds. The drive comes with a SATA-to-USB adapter and includes a five-year warranty. The Seagate SSHD uses a simpler design and features a 1TB 5,400rpm hard drive with an 8GB sliver of NAND flash attached to it, along with software that helps move frequently accessed data from the platters to the NAND memory for faster retrieval. It includes a three-year warranty and is otherwise a somewhat typical drive aimed at the consumer market, not hardcore speed freaks. Both drives include free cloning software, but since the WD includes two physical drives, a USB adapter, and a longer warranty, it gets the nod.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wd_endeavor_quarter_left_higres_smal_0.jpg"><img src="/files/u152332/wd_endeavor_quarter_left_higres_smal.jpg" alt="WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. " title="WD Black2" width="620" height="620" /></a></p> <p style="text-align: center;"><strong>WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. </strong></p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 2: Durability</h4> <p>This category is somewhat of a toss-up, as the WD Black2’s overall reliability is degraded somewhat by the fact that it has a spinning volume attached to it, giving it the same robustness of the Seagate SSHD. There’s also the issue of the WD Black using the slightly antiquated JMicron controller. We don’t have any reliability data on that controller in particular, but we are always more concerned about the SSD controller you-know-whating the bed than the memory, which is rated to last for decades, even under heavy write scenarios. Both drives also use two-platter designs, so neither one is more or less prone to damage than the other. In the end, we’ll have to go with the Seagate SSHD as being more durable, simply because you only have to worry about one drive working instead of two.&nbsp;</p> <p><strong>Winner: Seagate SSHD</strong></p> <h4>ROUND 3: Performance</h4> <p>Seagate is very clear about the performance of its hybrid drives, stating that they “boot and perform like an SSD,” but it never says they’re faster. It also claims the drive is “up to five times faster than a hard drive,” which seems like a bit of a stretch. It’s difficult to actually benchmark a caching drive because it won’t show on standard sequential read tests, and it gets killed by SSDs in access time tests. That said, we did see boot and PCMark Vantage scores improve significantly over time. Our boot time dropped by more than half, going from 2:27 to 1:07 after several boots, and our PCMark Vantage score shot up from 6,000 to 19,000. Still, these times are much slower than what we got with the WD SSD, which booted in 45 seconds (the system had three dozen programs installed), and hit 33,000 in PCMark Vantage.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 4: Cloning Package</h4> <p>Both drives include free software to help you clone your old drive and, in an odd twist, both companies use Acronis software to get ’er done. Seagate’s software is called DiscWizard, and works on OSes as old as Windows 98 and Mac OS 10.x. WD throws in a copy of Acronis True Image, though it only works with WD drives attached via the included USB-to-SATA adapter. 
We tested both software packages and found them to be nearly identical, as both let us clone our existing drive and boot from it after one pass, which can be tricky at times. Therefore, we call the software package a tie since they both perform well and use Acronis. However, WD’s $300 bundle includes a USB-to-SATA adapter that makes the cloning process totally painless. Seagate makes you forage for a cable on your own, which tips the scales in WD’s favor.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 5: Ease of Use</h4> <p>This round has a crystal-clear winner, and that’s the Seagate SSHD. That’s because the Seagate drive is dead-simple to use and behaves exactly like a hard drive at all times. You can plug it into any PC, Mac, or Linux machine and it is recognized with no hassle. The WD drive, on the other hand, only works on Windows PCs because it requires special software to “unlock” the 1TB hard drive partition. For us, that’s obviously not a problem, but we know it’s enraged some Linux aficionados. Also, the WD drive only has a 120GB SSD. So, if you are moving to it from an HDD, you will likely have to reinstall your OS and programs, then move all your data to the HDD portion of the drive. The Seagate drive is big enough that you would just need to clone your old drive to it.</p> <p><strong>Winner: Seagate SSHD</strong></p> <p style="text-align: center;"><strong><a class="thickbox" href="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small_0.jpg"><img src="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small.jpg" alt="Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. " title="Seagate SSHD" width="620" height="639" /><br /></a></strong></p> <p style="text-align: center;"><strong>Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. </strong></p> <h3 style="text-align: left;">And the Winner Is…</h3> <p style="text-align: left;">This verdict is actually quite simple. If you’re a mainstream user, the Seagate SSHD is clearly the superior option, as it is fast enough, has more than enough capacity for most notebook tasks, and costs about one-third of the WD Black2. But this is Maximum PC, so we don’t mind paying more for a superior product, and that’s the <strong>WD Black2 Dual Drive</strong>. It delivers both speed and capacity and is a better high-performance package, plain and simple.</p> <p style="text-align: left;"><span style="font-style: italic;">Note: This article originally appeared in the April 2014 issue of the magazine.</span></p> http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014#comments Hard Drive Hardware HDD Review Seagate 1TB Hybrid ssd WD Black2 Backup Drives Hard Drives Reviews SSD Features Thu, 31 Jul 2014 19:27:45 +0000 Josh Norem 28103 at http://www.maximumpc.com MSI Radeon R9 270 Gaming OC Review http://www.maximumpc.com/msi_radeon_r9_270_gaming_oc_review <!--paging_filter--><h3>No surprises here, just a solid 1080p card</h3> <p><a title="msi" href="http://www.maximumpc.com/tags/msi" target="_blank">MSI</a> is offering two flavors of its midrange Radeon R9 270 GPU, formerly known as the <a title="7870 GHz" href="http://www.maximumpc.com/tags/radeon_hd_7870_ghz_edition" target="_blank">Radeon HD 7870 GHz edition</a>. There is a standard model and one with an “X” after its name. 
The difference between the two is the X model has slightly higher core and boost clocks, but otherwise the two cards are the same and are both based on AMD’s Pitcairn GCN core, which is a 28nm part that debuted in 2013.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r9_270x_gaming_2gd5v303_3d1_small_0.jpg"><img src="/files/u152332/r9_270x_gaming_2gd5v303_3d1_small.jpg" alt="Don’t bother with the R9 270X—the non-X version shown here is just fine. " title="MSI Radeon R9 270 Gaming OC" width="620" height="487" /></a></p> <p style="text-align: center;"><strong>Don’t bother with the R9 270X—the non-X version shown here is just fine. </strong></p> <p>The card in front of you is the MSI R9 270 Gaming model, which is a stock R9 270 with a mild overclock, hence the word “Gaming” in its moniker. It has an MSRP of $180, while the X version is roughly $20 more, though street prices are higher due to the mining craze and short supply. For those who are prone to guffawing at a card that is merely rebadged and price-dropped, this is par for the course and actually good news for gamers. That’s because both Nvidia and AMD refine their manufacturing processes over time, so by the time a GPU gets a rebadge, it’s often able to run at higher clocks with better efficiency for a much lower price. The bottom line is that this card once had a $350 price tag and now costs less than $200, so there’s very little to complain about.</p> <p>To rehash the specs, this is a card with a base clock of 900MHz and a boost clock of 975MHz, which is 50MHz higher than the reference board. It has 2GB of GDDR5 memory that runs at 5.6GHz, and 1,280 stream processors. Since this is not new silicon, the card does not offer support for TrueAudio, but as it’s a Graphics Core Next (GCN) card, it does support AMD’s new Mantle API (at press time, BF4 was not optimized for Mantle with the R9 270, but AMD said it’s “being investigated”). As a midrange GPU, the R9 270 has a low-ish TDP of 150w, and therefore requires only a single six-pin PCIe connector for power—an advantage over the 270X, which requires two six-pin connectors. Interestingly, the R9 270 doesn’t have a direct competitor from Nvidia since it costs just a bit over $200, so it sits right in between the $250 GTX 760 and the $150 GTX 650 Ti (the Ti Boost is out of stock everywhere, but costs around $175). The GTX 660 is about the same price, but that card is ancient, so we compared it to the more-expensive GTX 760.</p> <p>Overall, we had a pleasant testing experience with the MSI R9 270 card. It was quiet and cool—never getting hotter than <br />60 C—and was totally stable. It ran the grueling new Star Swarm demo over a weekend with nary a hiccup, and we were also able to overclock it to 1,140MHz boost clock, which netted a 10 percent bump in performance. Basically, we found its performance exactly in line with its price, in that it was a bit slower than the more-expensive GTX 760 in all the games we test aside from Tomb Raider, which is an AMD game.</p> <p>In the end, there’s nothing wrong with the MSI R9 270 Gaming OC and we have no problem recommending it. However, we’d still go with the GTX 760 just because it is quite a bit faster in many games, and only costs $30 more. 
If Mantle support is important to you, though, feel free to pull the trigger.</p> <p><strong>$220 (street),</strong> <a href="http://www.msi.com/index.php">www.msi.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the April 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/msi_radeon_r9_270_gaming_oc_review#comments april issues 2014 graphics card Hardware maximum pc msi radeon r9 270 oc Review videocard Reviews Videocards Wed, 30 Jul 2014 22:39:42 +0000 Josh Norem 28096 at http://www.maximumpc.com D-Link DGL-5500 Review http://www.maximumpc.com/d-link_dgl-5500_review <!--paging_filter--><h3>A router built specifically for gamers</h3> <p>When it comes to PC parts and accessories, all roads eventually lead to gamers. Intel and AMD both sell unlocked processors so gamers can more easily overclock their rigs for a few extra frames per second; pro gamer Johnathan “Fatal1ty” Wendel has endorsed everything from motherboards to power supplies; there’s gaming RAM; and of course, a whole assortment of accessories designed to give you an edge when smoking your friends on the virtual battlefield. Up until now, one of the few items missing from the list was an 802.11ac wireless router.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dgl-5500_front_small_0.jpg"><img src="/files/u152332/dgl-5500_front_small.jpg" alt="The new Mac Pro stole its design from this router—true story. " title="D-Link DGL-5500" width="583" height="1200" /></a></p> <p style="text-align: center;"><strong>The new Mac Pro stole its design from this router—true story. </strong></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dgl-5500_back_small_0.jpg"><img src="/files/u152332/dgl-5500_back_small.jpg" title="D-Link DGL-5500" width="584" height="1200" /></a></p> <p>D-Link gets credit for tying up that loose end with the DGL-5500, a dual-band AC1300 wireless router built specifically for gamers. What makes the DGL-5500 different from all the other 802.11ac models, including D-Link’s own DIR-868L (reviewed in our February issue), is the inclusion of Qualcomm’s StreamBoost technology.</p> <p>Whereas the majority of modern routers rely on simple quality of service (QoS) rules to prioritize network packets, StreamBoost examines what applications are running and how much actual bandwidth each one needs. It also manages latency because a laggy gamer is a dead gamer. The question is, does it work as advertised?</p> <p>For the most part, StreamBoost lives up to the hype. We consistently saw lower pings in online games when connected to the DGL-5500 versus our zero-point router, the Asus RT-AC66U. External factors beyond our control also affect ping, so it’s hard to offer an apples-to-apples comparison, but to give one example, our ping averaged around 42ms in Battlefield 4 when using Asus’s router. When switching to D-Link’s model and turning on StreamBoost, our pings hovered around 19ms. 
After firing up Netflix on a second PC and initiating file downloads on two other systems, the ping stayed around 22–24ms—that’s impressive.</p> <p>In our evaluation of D-Link’s DIR-868L, we said the fugly web-based interface could use a major overhaul, and that’s what we got with the DGL-5500. It’s much better looking than before and far less complicated to navigate, though it’s also painfully slow when switching between menus. The UI is also heavily biased toward StreamBoost—if you disable the feature, you lose access to the My Network map, which provides a graphical view of all active devices and how much bandwidth each one is consuming.</p> <p>The DGL-5500 outpaced our zero point router in 802.11n mode on the 2.4GHz band in our three indoor tests. It also did better at picking out uncluttered channels on its own—we use inSSIDer ($20, www.inssider.com) to identify the best channel(s) for testing. However, the RT-AC66U boasts better range and faster transfers in 802.11ac mode on the 5GHz band. It’s worth pointing out the DGL-5500 lacks beamforming, which concentrates the wireless signal at connected devices for longer range and better speeds.</p> <p>There are other shortcomings, as well—you can’t configure Guest networks, the single USB 2.0 port doesn’t support printer sharing, and the combined speed of both channels is capped at AC1300 rather than AC1750 as it is with D-Link’s DIR-868L. While StreamBoost is a step forward, the router is a step backward in other areas. Note to D-Link: Gamers care about this stuff, too.</p> <p><strong>$140 [street],</strong> <a href="http://www.d-link.com/">www.d-link.com</a></p> http://www.maximumpc.com/d-link_dgl-5500_review#comments ac wireless april issues 2014 Gaming Hardware hd media router 2000 Review Reviews Wed, 30 Jul 2014 22:22:18 +0000 Paul Lilly 28097 at http://www.maximumpc.com Small PC Computing http://www.maximumpc.com/small_PCs_2014 <!--paging_filter--><h3>We tour the burgeoning world of wee PCs</h3> <p>In case you haven’t noticed, the PC is getting smaller. But it’s not getting smaller in the way the PC fatalists see it. If anything, enthusiast PCs have gotten larger. Witness Corsair’s 900D, Cooler Master’s Cosmos SE, and Digital Storm’s Aventum II.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/opener_13639_small_0.jpg"><img src="/files/u152332/opener_13639_small.jpg" alt="Yes, the Haswell Nuc is actually this small." width="620" height="446" /></a></p> <p style="text-align: center;"><strong>Yes, the Haswell Nuc is actually this small.</strong></p> <p>The truth isn’t that the PC is getting smaller and thus going away; the truth is that for enthusiasts, there’s interest in gigantic PCs, small micro-towers, and now—Intel hopes—ultra-compact form factor (UCFF) PCs no larger than a book. All of which serve unique purposes, and thereby highlight the PCs unmatched versatility.</p> <p>UCFF PCs as a category aren’t new, of course. They’ve been around for years, but their performance has always been fairly underwhelming and they’ve always consisted of specialty hardware, to be embedded into an ATM or smart soda machine.</p> <p>But now that these compact computers are more capable than ever, readily available, and easily built, there’s no telling what new and interesting applications will spring forth. 
Is Intel actually onto something big with its new Next Unit of Computing (NUC) initiative?</p> <h3>Next Unit of What?</h3> <p><strong>Intel’s push to make the desktop smaller</strong></p> <p>Trying to figure out the actions of the world’s largest chip company can be confounding to consumers who don’t fully appreciate Intel’s size-13 footprint on the PC industry and its ability to single-handedly change the game.</p> <p>Sometimes when Intel sees a niche it thinks needs to be filled, it tries to jump-start it from scratch. The company tried and failed, for example, with its Common Building Block program that was meant to create a DIY-laptop world with standardized power bricks, hard drives, optical drives, LCD panels, keyboards, and battery packs. While CBB never took off, many of the fruits of that effort are still with us.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/page3art_small_0.jpg"><img src="/files/u152332/page3art_small.jpg" alt="Intel is even offering a limited-edition customized Dragon NUC. " width="620" height="454" /></a></p> <p style="text-align: center;"><strong>Intel is even offering a limited-edition customized Dragon NUC. </strong></p> <p>Now, Intel is attempting to both create and fill a niche again with its Next Unit of Computing, or NUC (rhymes with “luck”), a new ultra-compact form factor that the company hopes will push performance computing into unheard-of places.</p> <p>Unlike the CBB program, which was totally reliant on the participation of parts makers and laptop builders, NUCs are actually built and sold by Intel itself. In a nutshell, NUCs are simply 4x4-inch computers packing as much power as possible.</p> <p>From what we can tell, Intel’s actions aren’t intended to drive others out of the market. In fact, Intel seems to be trying to invite others into the NUC game. Thus far, Gigabyte has jumped in with its NUC-style Brix boxes that are proving to be fairly innovative. There are other smaller and lesser-known brands and embedded-PC vendors in there, as well.</p> <p>Unlike Thin ITX, NUC-style boxes aren’t designed around industry-standard specs. The only things common between the NUC and Brix, for example, are the footprint, the power brick, and the other mobile components they accommodate. You won’t, for example, be able to swap a motherboard from a Brix into a NUC because these PCs are generally customized to the chassis they’re in.</p> <h4>Challenges to NUC</h4> <p>One of the challenges NUC and its ilk share is the limited board space. At 4x4 inches, jamming in features has meant adding more layers to the motherboard. While typical ATX motherboards feature six- or even four-layer PCBs, NUCs’ are 10-layer. Adding layers isn’t cheap, either. For example, in a 10-layer ATX motherboard—which you might see with a dual-proc board, where additional layers are needed to run all the traces of both processors—the PCB itself costs about $90.</p> <p>The path going forward for NUCs isn’t to blow them up in size, either. Rather than making them, say, 5x5 inches or more in the future, Intel says it’s more interested in getting a 65-watt TDP processor to work reliably in a package of NUC’s current size. Of course, adding a hotter CPU means more cooling and a bigger and more power-hungry power brick, too.</p> <h4>NUC Sales</h4> <p>So, are NUC and NUC-style devices resonating with consumers? Intel didn’t give us exact sales figures, but it says it has seen healthy demand, with quarter-on-quarter growth from 30–50 percent. 
Interestingly, Intel says that even after it offered a lower-cost Celeron version using the Sandy Bridge microarchitecture, the demand has mostly been at the high end, with consumers actually preferring the initial Ivy Bridge Core i5 version.</p> <p>That’s another reason Intel thinks that NUCs aren’t actually hurting the desktop. In fact, Intel believes the demand for a lot of performance, albeit in a tiny package, will reinvigorate the desktop, as people seek to put a PC in places they never could before.</p> <hr /> <p>&nbsp;</p> <h3>Intel NUC D54250WYK</h3> <p><strong>Haswell comes to the NUC</strong></p> <p>The original Intel NUC DC3217BY we saw in late 2012 was an odd duck. The case was maroon and black, and while it showcased Intel’s newfangled Thunderbolt connectivity, there was no Ethernet, USB 3.0, or analog audio out.</p> <p>Intel cited limited board space as the reason for the port selection on that model (to be fair, Intel did offer a dual-HDMI version with gigabit Ethernet and a single USB 3.0 port) and soldiered on despite the skepticism over the device. That’s good news because the latest NUC leaves few questions unanswered.</p> <p>The newest Haswell NUC D54250WYK shares the same footprint as the original NUC but sits about an eighth of an inch shorter. Rather than the Core i3-3217U in the original NUC, the top-end Haswell NUC features a 1.3GHz Core i5-4250U that will Turbo Boost up to 2.6GHz. There’s no lack of ports on this unit, either. The Haswell NUC includes a Mini DisplayPort, Mini HDMI, gigabit Ethernet, four USB 3.0 ports, analog audio out, and an infrared port.</p> <p>Internally, there’s a pair of DDR3 SO-DIMM slots and stacked Mini PCIe slots that let you install an mSATA drive and wireless card. The original NUC had overheating issues that caused some of the mSATA drives to error out. Intel has apparently addressed this by tweaking the fan and adding a thermal pad that rests on the mSATA drive. The shell in all NUCs is prewired for Wi-Fi. The motherboard in this NUC also features a SATA 6Gb/s port and a port for SATA power. Intel apparently plans to use the same board in a future NUC that will be tall enough to support cheaper and far larger notebook hard drives. The motherboard itself is an Intel design and features a beautiful UEFI as well as the QS87 chipset.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/untitled-13221_small_0.jpg"><img src="/files/u152332/untitled-13221_small.jpg" alt="After a puzzling first effort, Intel offers nearly all you could ask for in its NUC follow-up." width="620" height="522" /></a></p> <p style="text-align: center;"><strong>After a puzzling first effort, Intel offers nearly all you could ask for in its NUC follow-up.</strong></p> <p>Performance isn’t a primary driver of people who run these mini PCs, but we decided to see how this Haswell NUC stacked up against the original NUC. That unit features a 1.8GHz Core i3-3217U CPU on the Ivy Bridge microarchitecture. 
Both NUCs are dual-core Hyper-Threaded parts, so the only real performance difference is due to the Turbo Boost of the Haswell and the newer microarchitecture. As expected, the Core i5 gives the original Ivy Bridge a pasting in CPU-related tasks. In graphics, it’s closer between the HD4000 and HD5000, but the Haswell part generally is in front. Oddly, the Ivy Bridge NUC comes out on top in 3DMark Ice Storm, which tests basic graphics performance, but falls back in 3DMark Cloud Gate. Neither NUC is suited for serious gaming, but in the 10-year-old Counter Strike: Source graphics stress test, both gave acceptable frame rates at 1080p.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">SPECIFICATIONS/Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td><strong>Haswell NUC</strong></td> <td><strong>Ivy Bridge NUC</strong></td> </tr> <tr> <td class="item">Model</td> <td class="item-dark">D54250WYK</td> <td>DC3217IYE</td> </tr> <tr> <td>CPU</td> <td>1.3GHz Core i5-4250U</td> <td>1.8GHz Core i3-3217U</td> </tr> <tr> <td class="item">Graphics</td> <td class="item-dark">HD5000</td> <td>HD4000</td> </tr> <tr> <td>Ports</td> <td>Mini HDMI 1.4a, DisplayPort 1.2, 4x USB 3.0, gigabit Ethernet, analog audio out, IrDA, Kensington lock port</td> <td>2x HDMI 1.4a, 3x USB 2.0, gigabit Ethernet, Kensington lock port</td> </tr> <tr> <td>Stitch.EFx (sec)</td> <td><strong>1,747</strong></td> <td>2,453</td> </tr> <tr> <td>ProShow Producer (sec)</td> <td><strong>2,567</strong></td> <td>3,729</td> </tr> <tr> <td>3DMark Cloud Gate</td> <td><strong>3,958</strong></td> <td>3,409</td> </tr> <tr> <td>3DMark Ice Storm</td> <td>32,157</td> <td><strong>35,969</strong></td> </tr> <tr> <td>Counter Strike Source (fps)</td> <td><strong>63.23</strong></td> <td>52.4</td> </tr> <tr> <td>Google Octane 2.0</td> <td><strong>17,832</strong></td> <td>10,643</td> </tr> <tr> <td>Power Consumption Idle (watts)</td> <td><strong>5</strong></td> <td>8</td> </tr> <tr> <td>Power Consumption Load (watts)</td> <td><strong>24</strong></td> <td>35</td> </tr> <tr> <td>Power Consumption YouTube 1080p (watts)</td> <td>19</td> <td><strong>14.5</strong></td> </tr> </tbody> </table> <p><em>Best scores are bolded. <br /></em></p> </div> </div> </div> </div> </div> <p>We measured power consumption of both NUCs using the same power load and the same power brick (both were outfitted with similar parts, too). On idle, the Haswell unit drank about 5 watts versus the 8 watts of the Ivy Bridge unit. We also tried a worst-case scenario with Prime95 and Furmark running simultaneously. The Haswell used 24 watts to the Ivy Bridge’s 35W. While watching a 1080p video on YouTube, the Ivy Bridge unit used but 14.5 watts, interestingly, while the Haswell NUC used 19 watts.</p> <p>The Haswell NUC is likely the fastest NUC available today, as no one has figured out how to shoehorn a quad-core into the unit. But it’s not cheap. We found the unit on the street for about $375. Before you balk, remember that you’re getting a kit that includes the CPU and PSU. 
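</p> <p>One more way to slice the numbers above is performance per watt. Dividing each NUC’s 3DMark Cloud Gate score by its load power draw from the table (our arithmetic, not anything Intel publishes) shows how much further the Haswell part stretches every watt:</p> <pre>
# Performance-per-watt comparison, using the Cloud Gate scores and load power
# figures from the benchmark table above.

nucs = {
    # name: (3DMark Cloud Gate score, load power in watts)
    "Haswell NUC D54250WYK": (3958, 24),
    "Ivy Bridge NUC DC3217IYE": (3409, 35),
}

for name, (score, watts) in nucs.items():
    print(f"{name}: {score / watts:.0f} Cloud Gate points per watt")
# Haswell NUC D54250WYK: 165 Cloud Gate points per watt
# Ivy Bridge NUC DC3217IYE: 97 Cloud Gate points per watt
</pre> <p>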
Yes, you can get a cheaper system by going larger—but if you want small and fast, this is the best yet.</p> <p><strong>Intel NUC D54250WYK</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$375, <a href="http://www.intel.com/ " target="_blank">www.intel.com</a></strong></p> <h3>Gigabyte Brix Projector GB-BXPi3-4010</h3> <p>Intel’s goal with the NUC initiative was to create a new category of computing. What that category would be or how it would be used, the company didn’t really know when it started.</p> <p>While Gigabyte has several NUC-style clones, dubbed the “Brix” line, the one that really captured our interest is the Brix Projector. Yup, a UCFF PC with a DLP pico projector and 1.5-watt speaker integrated into it. The projector isn’t super bright, but it outputs a decent 75 ANSI-rated lumens. That means you won’t be using it outdoors in the daylight or in a very bright room, but it’s far better than the first 15-lumen pico projectors of yesteryear. It offers enough light that Gigabyte rates the device as being capable of projecting on a screen up to 85 inches. Resolution is also average at 864x480, or WVGA res, but that’s pretty standard for most pico projectors that are still actually “pico.” We’ll also note that lower resolutions are actually quite passable for media projection. Gigabyte even had the foresight to integrate a standard tripod mount into the base of the PC, too.</p> <p>Inside the Brix Projector you’ll find a pair of DDR3 SO-DIMM slots, and the same stacked layout to take mSATA and Wi-Fi cards as in Intel NUCs. 
External ports are also generous, with four USB 3.0, gigabit Ethernet, a Mini DisplayPort 1.2, full-size HDMI 1.4a, an analog jack that pulls double duty as optical SPDIF output, and a Mini HDMI-in port should you want to use the unit as a projector from another device.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">SPECIFICATIONS/Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td><strong>Brix Projector</strong></td> <td><strong>Ivy Bridge NUC</strong></td> </tr> <tr> <td class="item">Model</td> <td class="item-dark">GB-BXPi3-4010</td> <td>DC3217IYE</td> </tr> <tr> <td>CPU</td> <td>1.7GHz Core i3-4010U</td> <td>1.8GHz Core i3-3217U</td> </tr> <tr> <td class="item">Graphics</td> <td class="item-dark">HD4400</td> <td>HD4000</td> </tr> <tr> <td>Ports</td> <td>HDMI 1.4a, Mini HDMI in, Mini DisplayPort 1.2, gigabit Ethernet, 4x USB 3.0, analog audio, S/PDIF</td> <td>2x HDMI 1.4a, 3x USB 2.0, gigabit Ethernet, Kensington lock port</td> </tr> <tr> <td>Stitch.EFx (sec)</td> <td><strong>2,441</strong></td> <td>2,453</td> </tr> <tr> <td>ProShow Producer (sec)</td> <td><strong>3,564</strong></td> <td>3,729</td> </tr> <tr> <td>3DMark Cloud Gate</td> <td><strong>3,667</strong></td> <td>3,409</td> </tr> <tr> <td>3DMark Ice Storm</td> <td>26,475</td> <td><strong>35,969</strong></td> </tr> <tr> <td>Counter Strike Source (fps)</td> <td><strong>53.29</strong></td> <td>52.4</td> </tr> <tr> <td>Google Octane 2.0</td> <td><strong>11,624</strong></td> <td>10,643</td> </tr> <tr> <td>Power Consumption Idle (watts)</td> <td><strong>7.5</strong></td> <td>8</td> </tr> <tr> <td>Power Consumption Load (watts)</td> <td><strong>24</strong></td> <td>35</td> </tr> <tr> <td>Power Consumption YouTube 1080p (watts)</td> <td>19</td> <td><strong>14.5</strong></td> </tr> </tbody> </table> <p><em>Best scores are bolded. <br /></em></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/untitled-13223_small_1.jpg"><img src="/files/u152332/untitled-13223_small_0.jpg" alt="Yup. There’s indeed a projector integrated into this PC that’s no bigger than a Wendy’s Baconator." width="620" height="502" /></a></p> <p style="text-align: center;"><strong>Yup. There’s indeed a projector integrated into this PC that’s no bigger than a Wendy’s Baconator.</strong></p> <p>The CPU in the model we reviewed is a Haswell 1.7GHz Core i3-4010U with HD4400 graphics. Again, extreme performance isn’t a key metric for people looking at this class of device, but we were still interested to see how it does against the Ivy Bridge Intel NUC DC3217IYE. Remember, both the Intel Ivy Bridge NUC and the Brix have Turbo Boost disabled at the factory. Despite the Ivy Bridge NUC having a 100MHz advantage, the Brix Projector was slightly faster in some tests. In other tests, though, both were dead even. Clearly, if you really need the performance in a UCFF, pony up for a Core i5 part.</p> <p>In general, power consumption on idle was slightly higher (using an external monitor) with the IB NUC; under our CPU- and GPU-heavy loads and simply playing a 1080p YouTube video, the Brix was on par with the Haswell Intel NUC.</p> <p>Using the Brix Projector is a hoot. 
The graphics signal, you should know, is passed internally, so there’s no hooptie external pass-through cable. You can actually run both the projector and an external monitor simultaneously.</p> <p>Overall, it’s a slick little unit. The question is, what would a normal person need it for? The answer is, most of us wouldn’t need it. But don’t take that to be a negative. There are certainly specialized applications for it, such as media installations, commercial applications, or even an ad-hoc mini-theater setup for the kids. Again, it’s not everybody’s cup of tea, but the fact that you can get a “real” computer with a 75-lumen projector is pretty mind-boggling.</p> <p><strong>Gigabyte Brix Projector</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$600, <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <hr /> <p>&nbsp;</p> <h3>Gigabyte Brix Pro</h3> <p><strong>Faster than a tower. Really</strong></p> <p>In the land of ARM and off-brand x86 parts, the dual-core Core i3 is king. After all, when we talk about the “high-performance” needs of UCFF users, the performance of a Haswell-based CPU or even an Ivy Bridge part is like going back in time and landing a P-51 Mustang next to the Wright brothers after they just touched down at Kitty Hawk.</p> <p>Following that same analogy, you can think of Gigabyte’s blisteringly fast Brix Pro as an X-Wing fighter making a fly-by, wagging its wings, and then flipping the bird before making the jump to light speed. We’re not kidding, either. The Brix Pro is simply the fastest NUC-style UCFF we’ve ever tested. We actually watched it outpace our full-tower, six-core 3.2GHz Core i7-3930K that’s clocked full-time at 3.9GHz.</p> <p>The secret is Gigabyte’s ability to magically integrate a full-on Core i7-4770R in the Brix Pro. The Core i7-4770R “Crystalwell” is no mere Haswell part. Its main claim to fame is 128MB of super-fast embedded DRAM on the CPU package that acts as a gigantic L4 cache (a Core i7-4770K’s L3 cache is 8MB). This cache greatly increases bandwidth for graphics operations and puts it on par with GeForce GT 650M discrete graphics. Since it acts as L4 cache, it can greatly aid some application workloads, too. And no, you can’t buy the chip on its own; it’s only available soldered to motherboards. 
Oh, and it’s a full-on desktop-class quad-core Hyper-Threaded i7 chip that’ll hit 3.9GHz on Turbo.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">SPECIFICATIONS/Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td><strong>Brix Pro</strong></td> <td><strong>Ivy Bridge NUC</strong></td> </tr> <tr> <td class="item">Model</td> <td class="item-dark">GB-BXi7-4770R</td> <td>DC3217IYE</td> </tr> <tr> <td>CPU</td> <td>3.2GHz Core i7-4770R</td> <td>1.8GHz Core i3-3217U</td> </tr> <tr> <td class="item">Graphics</td> <td class="item-dark">Iris Pro 5200</td> <td>HD4000</td> </tr> <tr> <td>Ports</td> <td>HDMI 1.4a, DisplayPort 1.2, 4x USB 3.0, gigabit Ethernet, Kensington lock port</td> <td>2x HDMI 1.4a, 3x USB 2.0, gigabit Ethernet, Kensington lock port</td> </tr> <tr> <td>Stitch.EFx (sec)</td> <td><strong>867</strong></td> <td>2,453</td> </tr> <tr> <td>ProShow Producer (sec)</td> <td><strong>1,410</strong></td> <td>3,729</td> </tr> <tr> <td>3DMark Cloud Gate</td> <td><strong>10,406</strong></td> <td>3,409</td> </tr> <tr> <td>3DMark Ice Storm</td> <td><strong>68,195</strong></td> <td>35,969</td> </tr> <tr> <td>Counter Strike Source (fps)</td> <td><strong>149.43</strong></td> <td>52.4</td> </tr> <tr> <td>Google Octane 2.0</td> <td><strong>26,893</strong></td> <td>10,643</td> </tr> <tr> <td>Power Consumption Idle (watts)</td> <td>8</td> <td>8</td> </tr> <tr> <td>Power Consumption Load (watts)</td> <td>87</td> <td><strong>35</strong></td> </tr> <tr> <td>Power Consumption YouTube 1080p (watts)</td> <td>20</td> <td><strong>14.5</strong></td> </tr> </tbody> </table> <p><em>Best scores are bolded. <br /></em></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/untitled-13228_small_0.jpg"><img src="/files/u152332/untitled-13228_small.jpg" alt="The Brix Pro packs in more performance per cubic inch than any system we’ve ever tested." width="620" height="481" /></a></p> <p style="text-align: center;"><strong>The Brix Pro packs in more performance per cubic inch than any system we’ve ever tested.</strong></p> <p>Physically, the Pro is about 2.5 inches tall, making it about half an inch taller than the Intel Haswell NUC on page 51. That height, though, gives the Brix Pro the capability to mount a 9.5mm 2.5-inch notebook drive. The motherboard still has an mSATA slot, so you can run an SSD as well as one of the upcoming 2TB 9.5mm hard drives.</p> <p>Like other NUC-style machines, besides the mSATA slot, you’ll also find a mini PCIe slot that Gigabyte has already populated with an 802.11ac Wi-Fi card, as well as two SO-DIMM slots. There’s a single integrated power and SATA connector for the 2.5-inch drive, as well.</p> <p>On the performance tip, as we said, the Brix Pro smokes all other NUCs. That’s not a surprise, as it’s a quad-core part going up against dual-core parts. And we don’t mean a wisp of smoke—it’s a full four-alarm smoke-out with the Brix Pro offering 200 percent performance increases over the Ivy Bridge NUC and from 82–163 percent increases over the Haswell NUC. This desktop Haswell-R part is so fast, it slightly outpaced our desktop zero-point system in ProShow Producer 5 and was slower by just 4 percent in Stitch.EFx 2.0 runs. Yes. Faster than a six-core overclocked machine that’s 30 times bigger. 
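</p> <p>If you want to see where those percentages come from, here’s the arithmetic applied to three of the scores in the table above (lower is better for the timed tests, higher is better for 3DMark):</p> <pre>
# Recomputing the Brix Pro's gains over the Haswell NUC from the table above.

def pct_faster(brix, other, lower_is_better):
    """Percent improvement of the Brix Pro over the other system."""
    ratio = other / brix if lower_is_better else brix / other
    return (ratio - 1) * 100

scores = {
    # test: (Brix Pro, Haswell NUC, lower_is_better)
    "Stitch.EFx (sec)": (867, 1747, True),
    "ProShow Producer (sec)": (1410, 2567, True),
    "3DMark Cloud Gate": (10406, 3958, False),
}

for test, (brix, nuc, lower) in scores.items():
    print(f"{test}: {pct_faster(brix, nuc, lower):.0f}% faster")
# Stitch.EFx: ~101%, ProShow: ~82%, Cloud Gate: ~163% -- ProShow and Cloud Gate
# are the low and high ends of the 82-163 percent spread quoted above.
</pre> <p>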
<p>Granted, the tower will eat it in heavily multithreaded tasks and gaming, but the fact that a machine smaller than a retail CPU box can be faster than a mid-tower machine is incredible.</p> <p>There’s a cost, though. When you’re hammering it with a heavy workload, it gets a little whiny. It’s not horrible, but you will hear the fan under very heavy loads. It also drinks more. The CPU has a TDP rating of 65 watts, and under extreme CPU and GPU loads we saw at-the-wall power usage hit nearly 90 watts. Most of the time, though, power consumption is quite reasonable. The last issue is cost. This bare-bones kit will set you back $650. Much of that is the CPU ($400), but either way, we know there’s a price for miniaturization. At least with the Brix Pro, you’re getting a hell of a lot of performance.</p> <p><strong>Gigabyte GB-BXi7-4770R</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$650, <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h3>DIY NUC</h3> <p><strong>You can roll your own NUC—but should you?</strong></p> <p>To a DIYer, “building” a NUC is a bit of an insult. You basically buy a NUC or Brix, slot in two SO-DIMMs, a Wi-Fi card, and an mSATA drive, and install the OS. If you posted such a “build” on YouTube, the hazelnut gallery would come out of the woodwork to rip you a new one in the comment section.</p> <p>All is not lost, however, for true wrenchers who want to actually build a UCFF PC from scratch, so-called kits be damned. We just wonder whether it makes much sense, because at this point there are a lot of barriers to entry to building your own.</p> <p>The first issue is getting a chassis. Intel has told us it sees the base NUC and NUC-style machines purely as custom computing options, not as a DIY platform. While Mini-ITX and Thin Mini-ITX (more on that on page 56) feature standard I/O shields like their bigger siblings, ATX and microATX, NUC doesn’t have any standardized cutout for system I/O. That means any chassis would have to be built to take one of the multiple NUC motherboard port arrangements currently available. So don’t just buy a NUC motherboard and a NUC chassis without making sure they match. Most vendors will specify which NUC motherboard the chassis will fit.</p> <p>To experience what it would be like to build our own NUC, we ran with a Silverstone PT14 chassis. This aluminum chassis comes with an I/O shield for either the dual-HDMI-port Ivy Bridge boards or the Thunderbolt version.
Our PT14 is the dual-HDMI version.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">DIY NUC-style</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td></td> </tr> <tr> <td class="item">Silverstone Petit PT14 chassis</td> <td class="item-dark">$40</td> </tr> <tr> <td>Intel D33217GKE mobo/CPU</td> <td>$310</td> </tr> <tr> <td class="item">19V power brick</td> <td class="item-dark">$16</td> </tr> <tr> <td>Wi-Fi antennas</td> <td>$10</td> </tr> <tr> <td>Windows 8 OEM OS</td> <td>$99</td> </tr> <tr> <td>Adata 8GB DDR3/1333 RAM</td> <td>$65</td> </tr> <tr> <td>120GB Crucial mSATA drive</td> <td>$108</td> </tr> <tr> <td>Intel 802.11ac Wi-Fi card</td> <td>$34</td> </tr> <tr> <td><strong>Total</strong></td> <td>$682</td> </tr> </tbody> </table> </div> </div> </div> </div> </div> <p>The next issue is securing the NUC motherboard. Intel isn’t fully committed to supporting a DIY ecosystem, so rather than selling individual boards, it’s selling 10-packs of motherboards intended for system builders or integrators. In a bit of a wink, wink, nod, nod, though, some of the bulk packs of motherboards are broken up and sold to end users. This, of course, raises questions about warranty support, but according to LogicSupply.com (a popular vendor of embedded systems that seems to stock most of the esoteric NUC parts), the warranty for the boards is covered directly by Intel even if they’re purchased stand-alone, so it seems Intel will stand behind them.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">BARE-BONES INTEL NUC</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td></td> </tr> <tr> <td class="item">Intel DC3217IYE</td> <td class="item-dark">$255</td> </tr> <tr> <td>Windows 8 OEM OS</td> <td>$99</td> </tr> <tr> <td class="item">Adata 8GB DDR3/1333 RAM</td> <td class="item-dark">$65</td> </tr> <tr> <td>120GB Crucial mSATA drive</td> <td>$108</td> </tr> <tr> <td>Intel 802.11ac Wi-Fi card</td> <td>$34</td> </tr> <tr> <td><strong>Total</strong></td> <td>$561</td> </tr> </tbody> </table> </div> </div> </div> </div> </div> <p>The board we went with was an Intel D33217GKE “Golden Lake” motherboard. It comes with an integrated heatsink and fan, which won’t work here, as the PT14 instead cools the CPU with an integrated heat pipe that mates directly to the chassis. Since the CPU is 17 watts, it’s possible to dissipate much of the heat through the chassis itself. To remove the stock Intel cooler, we first took out the two visible screws holding down the fan, then removed the three screws holding down the heatsink and gently lifted it from the board. The PT14 does have a single fan, which is set to exhaust air out the bottom of the chassis.</p> <p>From there it’s as simple as screwing the motherboard to the top of the chassis, populating the RAM, Wi-Fi card, and mSATA drive, installing the power button, and you’re done. All told, it took us about 15 minutes to roll our own NUC going at a leisurely pace so as not to forever lose the screws.</p>
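<p>To make the price math easy to check, here’s a quick tally of the two bills of materials above. It’s just an illustrative Python snippet using the prices as listed in the tables; street pricing will obviously drift over time.</p> <pre>
# Illustrative tally of the two parts lists above (prices as listed, in dollars).
diy_from_scratch = {
    "Silverstone Petit PT14 chassis": 40,
    "Intel D33217GKE mobo/CPU": 310,
    "19V power brick": 16,
    "Wi-Fi antennas": 10,
    "Windows 8 OEM OS": 99,
    "Adata 8GB DDR3/1333 RAM": 65,
    "120GB Crucial mSATA drive": 108,
    "Intel 802.11ac Wi-Fi card": 34,
}

barebones_kit = {
    "Intel DC3217IYE bare-bones kit": 255,
    "Windows 8 OEM OS": 99,
    "Adata 8GB DDR3/1333 RAM": 65,
    "120GB Crucial mSATA drive": 108,
    "Intel 802.11ac Wi-Fi card": 34,
}

diy_total = sum(diy_from_scratch.values())   # 682
kit_total = sum(barebones_kit.values())      # 561
print(f"DIY from scratch: ${diy_total}")
print(f"Bare-bones kit:   ${kit_total}")
print(f"DIY premium:      ${diy_total - kit_total}")  # 121
</pre> <p>That $121 premium comes from paying $310 for a board that Intel effectively bundles into a $255 kit along with the chassis, power brick, and antennas.</p>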
<p>We’ll note that the Wi-Fi antennas didn’t come with our 802.11ac card (they typically don’t), so you’ll have to secure a pair of rubber duckies with internal cables yourself (just Bing “rubber Wi-Fi antenna with internal cable” and check the image results; a pair is typically under $10).</p> <p>Before you’re done, though, you’ll also need to buy the 19-volt power brick. Intel actually sells them on its NUC parts page for $15, or you can typically find them at retailers for around $16.</p> <p>There, you’re done. You’ve just built your first Next Unit of Computing. It wasn’t difficult and it’s kind of fun. But does it make sense?</p> <p>No, not at all. Not once you run the numbers. The parts to build your own NUC from scratch cost about $682 (including $99 for the OS). If you had instead bought a bare-bones NUC kit and added the same 802.11ac card, mSATA drive, and RAM from the DIY package, you would spend $561. Ouch. And that’s without having to search through Uncle Jim’s used computer store for a pair of rubber duck Wi-Fi antennas and finding someone who actually sells NUC chassis. From a fiscal point of view, it makes no sense whatsoever. Even our standard edict that building your own box gives you control over the parts, fan placement, and appearance doesn’t really apply, because, honestly, is there that much of a difference?</p> <p>Again, Intel says it’s not sure where it’s going with NUC as a DIY proposition, and that’s apparent to us, because the real kick in the gut here is the motherboard. A NUC bare-bones kit with motherboard, power brick, chassis, and internal Wi-Fi antennas is $255 on the street. The best price we could find for the NUC motherboard alone was $310. Perhaps if Intel decides to make the price of the NUC boards more reasonable, the DIY angle will make more sense, but today, it’s a waste of scratch no matter how you cut it.</p> <h3>Parts of a Whole</h3> <p><strong>The essential components of a DIY NUC</strong></p> <p style="text-align: center;"><img src="/files/u152332/part_shots-13234_small.jpg" alt="The Silverstone PT14 NUC chassis dissipates heat using a heat pipe with a fan blowing air out the bottom." width="620" height="413" /></p> <p style="text-align: center;"><strong>The Silverstone PT14 NUC chassis dissipates heat using a heat pipe with a fan blowing air out the bottom.</strong></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/part_shots-13238_small_0.jpg"><img src="/files/u152332/part_shots-13238_small.jpg" alt="This Intel D33217GKE NUC motherboard isn’t packaged for consumers, but you can still buy one with apparent warranty support from Intel." width="620" height="541" /></a></p> <p style="text-align: center;"><strong>This Intel D33217GKE NUC motherboard isn’t packaged for consumers, but you can still buy one with apparent warranty support from Intel.</strong></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/part_shots-13236_small_0.jpg"><img src="/files/u152332/part_shots-13236_small.jpg" alt="With NUC, you’ll want higher-clocked modules and a dual-channel config if you care at all about 3D performance." width="620" height="510" /></a></p> <p style="text-align: center;"><strong>With NUC, you’ll want higher-clocked modules and a dual-channel config if you care at all about 3D performance.</strong></p> <p style="text-align: center;"><strong><img src="/files/u152332/part_shots-13237_small.jpg" alt="Like most NUCs, our DIY takes an mSATA drive.
Newer units, however, will take 2.5-inch drives at the cost of space." width="620" height="450" /></strong></p> <p style="text-align: center;"><strong>Like most NUCs, our DIY takes an mSATA drive. Newer units, however, will take 2.5-inch drives at the cost of space.</strong></p> <p style="text-align: center;"><strong><a class="thickbox" href="/files/u152332/part_shots-13232_small_0.jpg"><img src="/files/u152332/part_shots-13232_small.jpg" alt="The NUC and Brix units all share the same basic 65-watt power supply." width="620" height="725" /></a></strong></p> <p style="text-align: center;"><strong>The NUC and Brix units all share the same basic 65-watt power supply.<br /></strong></p> <p style="text-align: center;">&nbsp;</p> </div> </div> </div> </div> </div> </div> </div> </div> </div> </div> http://www.maximumpc.com/small_PCs_2014#comments feature Gigabyte Brix Pro Gigabyte Brix Projector GB-BXPi3-4010 Hardware intel March issues 2014 nuc pc small pc Features Mon, 28 Jul 2014 23:01:57 +0000 Gordon Mah Ung 28032 at http://www.maximumpc.com AVADirect Mini Cube Gaming PC Review http://www.maximumpc.com/avadirect_mini_cube_gaming_pc_review_2014 <!--paging_filter--><h3>Just call it ‘The Fridge’</h3> <p>Naming a PC isn’t an easy task. It’s hard enough when you’re talking about your personal PC (Betsy, Svetlana, or Jabba work well), but when you’re a company selling a new model, Marketing 101 says the name should imbue magic and convince consumers to pony up.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/ava_13708_small_0.jpg"><img src="/files/u152332/ava_13708_small.jpg" alt="Though capable and reasonably priced, this medium form factor is eclipsed by smaller, faster, and cheaper machines." title="AVADirect Mini Cube Gaming PC" width="620" height="582" /></a></p> <p style="text-align: center;"><strong>Though capable and reasonably priced, this medium form factor is eclipsed by smaller, faster, and cheaper machines.</strong></p> <p>We’re guessing AVADirect didn’t take that class, as its new custom mini-cube gaming PC is apparently named Mini Cube Gaming PC. The truth is, AVADirect probably doesn’t give a damn about the name because frankly, who cares? Maybe “5S” or “S IV” works on some people, but on a custom PC where you pick out the parts yourself, it’s probably far less pressing.</p> <p>Around the office, we’ve taken to calling this handsome SFF machine “The Fridge,” not necessarily because of its size, but because of its Frigidaire-like aesthetic. Sure, it would have been cool if the optical drive shared the same brushed-aluminum surface, but it still matches the black accents elsewhere on the case. While there’s no question that this is a small form factor rig, compared to the micro-towers we’ve seen lately, it’s pretty big. It’s more than double the width of the Falcon Northwest Tiki, and while slightly shorter than the CyberPower Hadron we reviewed in February, it’s about three inches wider than that machine.</p> <p>That size increase gives it more capability. While most micro-towers use SFX or 1U PSUs, The Fridge uses a standard 760W Seasonic ATX PSU. 
Inside, you’ll also find a liquid-cooled Core i7-4770K overclocked to 4.2GHz, 16GB of Kingston DDR3/1600, two Kingston 120GB HyperX SSDs in RAID 0, a 2TB WD HDD, an MSI Z87 Mini-ITX board, and an Asus GeForce GTX 780 card.</p> <p>Against our zero-point system, the AVADirect represents well in the non-heavily multithreaded tasks but, not surprisingly, it gets left behind in all other tests by the ZP’s six-core Core i7-3930K part clicking along at 3.8GHz. That includes gaming tests, but not by the margin you would expect from the zero-point’s GeForce GTX 690.</p> <p>The more important question is how The Fridge compares with the SFF/micro-tower crowd. Not too shabby. The bad mutha of the group continues to be Falcon Northwest’s Tiki, with its Haswell part overclocked to 4.7GHz and a GeForce Titan. Indeed, the Tiki still stands as the fastest micro-tower we’ve ever tested, and the fact that it’s held onto that title well into the new year demonstrates how aggressively Falcon went for broke with this model. Of course, that aggression comes at a price, with the Tiki hitting the $4,400 mark. At $2,583, AVADirect can pull the old, “You can buy our system, play all of your games, and still have enough money to buy two of the upcoming cheap 4K panels” routine.</p> <p>Normally, that routine would sway us, because like most folks, we can see sacrificing a little performance for a new monitor, keyboard, mouse, and new suit and shoes, too. But then there’s CyberPower PC’s Hadron Hydro 300, which costs $300 less than the AVADirect. It almost mirrors the parts in the AVADirect except for the HDD. The Hadron also packs custom liquid-cooling for its CPU and GPU, which, while the chassis gets a tad warm, helps the rig run extremely quietly and gives it a slight performance edge. The AVADirect box is louder and under heavy loads emits a low-frequency large-fan buzz.</p> <p>That leaves the AVADirect in a tough spot. It’s slower than the Tiki and more expensive than the Hadron. Yes, it’s got an off-the-shelf PSU, but we’re not sure that’s worth the sacrifice in size. Yes, it’s a striking-looking case with its brushed-steel/aluminum finish, but maybe the sun is just finally starting to set on the medium form factor.</p> <p><strong>$2,584,</strong> <a href="http://www.avadirect.com/">www.avadirect.com</a></p> <p><em>Note: This article was originally featured in our March issue of the magazine.</em></p> http://www.maximumpc.com/avadirect_mini_cube_gaming_pc_review_2014#comments AVADirect Mini Cube Hardware March issues 2014 maximum pc Reviews Systems Thu, 24 Jul 2014 22:11:59 +0000 Gordon Mah Ung 28059 at http://www.maximumpc.com Sapphire Tri-X Radeon R9 290X Review http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review <!--paging_filter--><h3>A real gem of a GPU</h3> <p>For those who haven’t kept up with current events: Late last year AMD launched its all-new Hawaii GPUs, starting with its flagship Radeon R9 290X that featured a blower-type cooler designed by AMD. In testing, it ran hotter than any GPU we’ve ever tested, hitting 94 C at full load, which is about 20 C higher than normal. AMD assured everyone this was no problemo, and that the board was designed to run those temps until the meerkats came home. It was stable at 94 C, but the GPU throttled performance at those temps. 
The stock fan was also a bit loud at max revs, so though the card offered kick-ass performance, it was clearly being held back by the reference cooler.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_13650_small_0.jpg"><img src="/files/u152332/sapphire_13650_small.jpg" alt="The Tri-X throws off AMD’s meh cooler." title="Sapphire Tri-X Radeon R9 290X" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The Tri-X throws off AMD’s meh cooler.</strong></p> <p>Therefore, we all eagerly awaited the arrival of cards with aftermarket coolers, and this month we received the first aftermarket Radeon R9 290X—the massive triple-fan Tri-X model from Sapphire; and we must say, all of our Radeon prayers have been answered by this card.</p> <p>Not only does it run totally cool and quiet at all times, but because it runs so chilly it has plenty of room to overclock, making it a card that addresses every single one of our complaints about the reference design from AMD. There is one caveat: price. The Sapphire card is $50 more expensive than the reference card at $600, but you are obviously getting quite a bit of additional horsepower for your ducats.</p> <p>When we first fired it up, we were amazed to see it hit 1,040MHz under load, and stay there throughout testing. Even more surprising were the temps we were seeing. Since the reference card hits 94 C all day long, this is obviously a really hot GPU, but the Sapphire Tri-X cooler was holding it down at a chilly 75 C. The card was whisper-quiet too, which was also a pleasant surprise given the noise level of the reference cooler. We were also able to overclock it to 1,113MHz, which is a turnaround in that we could not overclock the reference board at all since it throttles at stock settings.</p> <p><strong>$600,</strong> <a href="http://www.sapphiretech.com/landing.aspx?lid=1">www.sapphiretech.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the March 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review#comments Air Cooling amd gpu graphics card Hardware March issues 2014 maximun pc Review Sapphire Tri-X Radeon R9 290X Reviews Thu, 24 Jul 2014 22:09:13 +0000 Josh Norem 28024 at http://www.maximumpc.com Toshiba Qosmio X75 Review http://www.maximumpc.com/toshiba_qosmio_x75_review_2014 <!--paging_filter--><h3>Lots of graphical horsepower at a reasonable price</h3> <p>It’s been a while since we reviewed a Toshiba gaming notebook, so we couldn’t wait to get our hands on the company’s new Qosmio X75. Unlike iBuypower’s super-slim and portable 17-inch Battalion M1771 gaming notebook we reviewed last issue, the Qosmio X75 puts power ahead of portability.</p> <p>With a body measuring 16.5x10.7x1.7 inches and weighing more than seven pounds, the X75 is definitely in desktop-replacement territory. The chassis is clad in black textured aluminum, with lots of red accenting, such as the shiny red trim around the body and the trackpad, the red LED keyboard backlighting, and the glowing red Qosmio logo on the lid. 
It all serves to add a bit of flash to an otherwise subtle aesthetic.</p> <p style="text-align: center;"><img src="/files/u152332/toshiba_laptop13691_small.jpg" alt="Go with 8GB of RAM and forego the Blu-ray drive to save $300." title="Toshiba Qosmio X75" width="620" height="413" /></p> <p style="text-align: center;"><strong>Go with 8GB of RAM and forego the Blu-ray drive to save $300.</strong></p> <p>A couple of things we don’t like: the 4-pin power connector necessitates precise orientation of the plug, and we’re also not crazy about the exhaust fan’s location on the right edge, which could mean warm wrists for right-handers during heavy play sessions. While it never got uncomfortably hot, we would have preferred a rear exhaust.</p> <p>On the bright side, the Qosmio’s display is one of the best TN panels we’ve seen, with fantastic viewing angles and a vibrant 1080p glossy finish that didn’t suffer from the usual glare problem. We also had no qualms with the laptop’s quad Harman/Kardon speakers, which sounded clear and powerful. As a matter of fact, we can confidently say that these are some of the best laptop speakers we’ve heard.</p> <p>The trackpad is similarly praiseworthy. While we normally harp on trackpads that don’t feature two dedicated buttons, the Qosmio’s uniform expanse is easy to use, with horizontal grooves above the left and right mouse clickers providing a suitable substitute for separate buttons. In addition, the trackpad is ample at 4.5x3.2 inches, highly responsive, and supports multitouch gestures. The keyboard is equally competent, although we do wish the arrow keys were full-size as opposed to half-size.</p> <p>Inside the chassis, the Qosmio sports a quad-core 2.4GHz Core i7-4700MQ CPU, a GeForce GTX 770M, and 16GB of memory. For storage, it has a 256GB mSATA SSD coupled with a 1TB hard drive. The laptop has a 47Wh 8-cell battery.</p> <p>When it was time to perform, Toshiba’s laptop killed it in the gaming department, but was average everywhere else. We had never reviewed a gaming laptop with a 770M before, and found that it had no issues kicking the crap out of the more mobile-oriented 765M GPU in our Alienware 14 zero-point rig, thanks in no small part to its 3GB of GDDR5 memory. We’re talking performance advantages of 17–66 percent in the gaming tests. The Qosmio couldn’t quite keep up with our zero-point in our CPU-intensive benchmarks, however, losing by roughly 3–8 percent. While those aren’t huge losses, it’s still a little disappointing given that both laptops use the same Intel processor. We suspect that Toshiba is throttling the CPU to avoid thermal issues. Thankfully, the laptop never got hot, so we didn’t hear much fan noise.</p> <p>The laptop’s biggest failing actually came by way of battery life, which isn’t a big surprise from a machine of this size. In our video rundown test, the Qosmio lasted two hours and 20 minutes. If you’re interested in getting a laptop this large, you’re most likely going to use it as a desktop replacement, so battery life isn’t really an issue. And while its CPU performance is a little disappointing, the Qosmio X75 offers a lot of performance as a gaming laptop for a very fair price. While our build cost $1,800, foregoing a Blu-ray drive and reducing the memory to 8GB of RAM (which is more than enough for gaming) could save $275, bringing the total to a little over $1,500.
When you also consider the fact that you can easily pop open the bottom of the laptop for swapping out RAM and storage (without voiding the warranty), the Qosmio X75 turns out to be a great deal for enthusiasts, particularly gamers.</p> <p><strong>$1,800,</strong> <a href="http://www.toshiba.com/tai/">www.toshiba.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the March 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/toshiba_qosmio_x75_review_2014#comments Business Notebooks Hardware March issues 2014 maximum pc Review Toshiba Qosmio X75 Reviews Notebooks Thu, 24 Jul 2014 22:04:19 +0000 Jimmy Thang 28010 at http://www.maximumpc.com