Hardware http://www.maximumpc.com/taxonomy/term/41/ en PAX East 2014: Cooler Master Unveils NovaTouch TKL Mechanical Keyboard [Video] http://www.maximumpc.com/pax_east_2014_cooler_master_unveils_novatouch_tkl_mechanical_keyboard_video <!--paging_filter--><h3><img src="/files/u166440/novatouch.jpg" alt="NovaTouch TKL" title="NovaTouch TKL" width="200" height="100" style="float: right;" />New feature dampens the sound of typing</h3> <p><strong>Cooler Master has unveiled the NovaTouch TKL mechanical keyboard at PAX East</strong>. Maximum PC’s Jimmy Thang was able to see the new keyboard, which features a silicon-based injection around the mechanical key switches.&nbsp;</p> <p><iframe src="//www.youtube.com/embed/gjIH4dKNg2s" width="620" height="315" frameborder="0"></iframe></p> <p>The silicon-based injection helps absorb the friction and shock when two mechanical pieces rub against each other. Sound is kept to a minimum as well when a user is typing. As for the feel of the switch, the rep says it is similar to Cherry MX Brown key switches.</p> <p>One other interesting feature is that the key stems are backwards compatible with Cherry MX key caps, a major plus for keyboard enthusiasts who like to swap in their own caps.&nbsp;</p> <p>The NovaTouch TKL is expected to come out sometime in the third quarter of 2014. 
No set price has been determined.&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/pax_east_2014_cooler_master_unveils_novatouch_tkl_mechanical_keyboard_video#comments Cooler Master Cooler Master NovaTouch Hardware mechanical keyboard NovaTouch NovaTouch TKL Mon, 14 Apr 2014 02:35:52 +0000 Sean D Knight and Jimmy Thang 27623 at http://www.maximumpc.com PAX East 2014: Logitech Showcases Its G502 Proteus Core Gaming Mouse [Video] http://www.maximumpc.com/pax_east_2014_logitech_showcases_its_g502_proteus_core_gaming_mouse_video <!--paging_filter--><h3><img src="/files/u166440/logitech_g502.jpg" alt="Logitech G502" title="Logitech G502" width="200" height="153" style="float: right;" />A mouse with a 12,000 DPI sensor</h3> <p>If you have been looking for a mouse that will let you shoot the wings off of a fly, then <strong>Logitech’s G502 Proteus Core gaming mouse</strong> might be the one for you. Maximum PC’s Jimmy Thang got to see the&nbsp;<a title="MPC Logitech G502" href="http://www.maximumpc.com/logitech_g502_proteus_core_gaming_mouse_packs_customizable_12000_dpi_sensor" target="_blank"><span style="color: #ff0000;">Proteus Core</span></a>, which features a 12,000 DPI sensor, up close and personal at PAX East.</p> <p>&nbsp;</p> <p><iframe src="//www.youtube.com/embed/lK4ueindDmQ" width="620" height="315" frameborder="0"></iframe></p> <p>Of course, if 12,000 DPI is too much for a user to handle, then the sensor can be adjusted as low as 200. 
Aside from the high DPI, the Proteus Core has 11 programmable buttons, comes with five 3.6g weights, can be tuned for various surfaces, and features a dual-mode hyper-fast scroll wheel.&nbsp;</p> <p>The Logitech G502 Proteus Core Tunable Gaming Mouse will be out sometime this month for $79.99 in the U.S. and Europe.</p> <p>So who would want to try and shoot the wings off of a fly at 500 feet with that mouse?&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/pax_east_2014_logitech_showcases_its_g502_proteus_core_gaming_mouse_video#comments gaming mouse Hardware logitech Logitech G502 Logitech G502 Proteus Core Proteus Core Sun, 13 Apr 2014 18:29:33 +0000 Sean D Knight and Jimmy Thang 27618 at http://www.maximumpc.com PAX East 2014: HyperX Showcases Gaming Headset and RAM [Video] http://www.maximumpc.com/pax_east_2014_hyperx_showcases_gaming_headset_and_ram_video <!--paging_filter--><h3><img src="/files/u166440/hyperx_fury.jpg" alt="HyperX Fury" title="HyperX Fury" width="200" height="198" style="float: right;" />Meet the HyperX Cloud gaming headset and Fury RAM</h3> <p><strong>HyperX is showing off its HyperX Cloud headset at PAX East</strong>. A division of Kingston, HyperX has added quite a few interesting features to this headset, which Maximum PC’s Jimmy Thang was able to learn about.&nbsp;</p> <p>For example, the headset comes with two interchangeable earcups. One set is made of leather and the other is a red velour earcup that changes the sound profile of the device. 
HyperX also designed the Cloud to be an on-the-go headset: it comes with a detachable microphone and adapters so that it can be used with desktops, notebooks, mobile phones, and Sony’s PlayStation 4.&nbsp;</p> <p><iframe src="//www.youtube.com/embed/bgh0RmGvmfQ" width="620" height="315" frameborder="0"></iframe></p> <p>Retail price for the HyperX Cloud headset is $99.99, and it is available for pre-order.</p> <p>In addition to the headset, Jimmy learned a little about <strong>HyperX’s Fury RAM</strong>, which features automatic overclocking up to 1866MHz, requires 1.5V, and comes in 8GB or 16GB modules.&nbsp;</p> <p>The HyperX Fury is available for purchase at $84.99 or $159.99, depending on whether you purchase it individually or as a kit.&nbsp;</p> <p><span style="font-style: italic;">Follow Sean on&nbsp;</span><a style="font-style: italic;" title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a><span style="font-style: italic;">, </span><a style="font-style: italic;" title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a><span style="font-style: italic;">, and </span><a style="font-style: italic;" title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></p> http://www.maximumpc.com/pax_east_2014_hyperx_showcases_gaming_headset_and_ram_video#comments Fury RAM Hardware hyperX HyperX Cloud headset HyperX Fury ram Headphones Sun, 13 Apr 2014 00:51:40 +0000 Sean D Knight and Jimmy Thang 27617 at http://www.maximumpc.com PAX East 2014: Intel Booth Tour [Video] http://www.maximumpc.com/pax_east_2014_intel_booth_tour_video <!--paging_filter--><h3><img src="/files/u166440/auros_x7.jpg" alt="Aorus X7" title="Aorus X7" width="200" height="123" style="float: right;" />A 4K gaming laptop and the Aorus X7 gaming 
notebook</h3> <p>With 4K monitors on the shelves, it is only a matter of time before 4K laptops start to reach into the wallets of tech enthusiasts. Maximum PC’s Jimmy Thang got to see an <strong>Alienware 18-inch laptop running in 4K</strong> during his Intel booth tour at PAX East.</p> <p>Hardly any information was provided on the laptop’s specs, though it was revealed that it has an Intel Core i7-4940MX Extreme Edition processor. The rep was also eager to point out that, while the stock processor speed is 3.1GHz, it has been overclocked to 5.2GHz.&nbsp;</p> <p>To show off its abilities, Batman: Arkham Origins was being used to demo the impressive piece of hardware in 4K, although Jimmy noted that the framerate was choppy.</p> <p>No price or release date was provided.</p> <p><iframe src="//www.youtube.com/embed/4q75oB-wrow" width="620" height="315" frameborder="0"></iframe></p> <p>&nbsp;</p> <p>During the tour Jimmy was also able to see the <strong>Aorus X7 gaming notebook</strong>. The 17.3-inch notebook features dual Nvidia GeForce GTX 765M cards and an Intel Core i7-4700HQ processor encased within a metal chassis. For storage, the notebook comes with one 1TB HDD and two 128GB SSDs.&nbsp;</p> <p>The Aorus X7 is 1.9 inches thick and weighs 6.39lbs. 
To keep it cool, it has five cooling pipes, four vents, and two fans, all positioned at the rear along with the GPUs so that the user’s palms and fingers don’t get hot during use.&nbsp;</p> <p>Those wishing to purchase the Aorus X7 can do so for the starting price of $2,099.</p> <p><iframe src="//www.youtube.com/embed/OYpCtwGxmsU" width="620" height="315" frameborder="0"></iframe></p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/pax_east_2014_intel_booth_tour_video#comments 4K gaming 4K gaming laptop Aorus X7 Auros X7 gaming notebook dual sli cards gaming notebook Hardware intel News Sat, 12 Apr 2014 23:56:11 +0000 Sean D Knight and Jimmy Thang 27616 at http://www.maximumpc.com AMD Unleashes the Dual-GPU Radeon R9 295X2 http://www.maximumpc.com/amd_unleashes_dual-gpu_radeon_r9_295x2 <!--paging_filter--><h3>AMD's $1,500, 500W TDP monster GPU arrives</h3> <p><img src="http://www.maximumpc.com/files/u302/radeon_r9fan_watercooler_product_shot_birdseye_5in300dpi_4c_small.jpg" alt="R9 295X2" title="R9 295X2" width="250" height="189" style="float: right;" />Today AMD is pulling the wraps, or the briefcase as it were, off its new flagship GPU - the massively badass Radeon R9 295X2. 
Packing two fully-loaded R9 290X GPUs, this air-and-water cooled $1,500 card is the new "fastest single card" on the planet, and sets new records for both benchmark performance and sticker price.</p> <h3>Project Hydra</h3> <p>When we first heard whispers about a dual-Hawaii card coming out of AMD's Skunk Works, we figured a few things had to happen to make this card a reality. We thought perhaps it would tone down its R9 290X core a bit to keep temperatures somewhat below "thermonuclear," because just one R9 290X GPU needs a cooler the size of Montana to keep it from getting so hot it begins to throttle. Since one of these GPUs runs hotter than Russell Crowe, we figured if AMD had the cojones to put two of them on a PCB it would either need to be liquid-cooled, or severely underclocked so as to not overwhelm whatever massive cooler it had designed. As it turns out, we were sort of wrong, and sort of right, and we couldn't have asked for anything more with the final card we now know as the Radeon R9 295X2.</p> <p>Instead of making compromises, lowering clock speeds, or both, AMD said "F that" and went all-in, shoving two <em>overclocked</em> R9 290X GPUs into a massive 12-inch shroud that is cooled by both liquid and air, then slapping the highest MSRP we've ever seen on a retail GPU in our hardware-watching lives, at least until the GTX Titan Z arrives at some point in the future.</p> <p><img src="/files/u302/radeon_r9fan.jpg" width="650" height="490" /></p> <p style="text-align: center;"><em>The Radeon R9 295X2 is over 12 inches long. Insert "that's what she said" joke here.</em></p> <h3>GPU Specs</h3> <p>Since we've already covered the Radeon R9 290X in the past, this rundown of the specs will be quick and dirty. Okay, so you take one R9 290X GPU, then take another one, and put them on the same card. There you have it! 
The only difference between these GPUs and stand-alone R9 290X GPUs is that a stand-alone card would hit a boost clock of 1,000MHz if given enough thermal headroom, whereas the GPUs on the R9 295X2 are designed to hit a slightly faster 1,018MHz. Not only that, but due to the increased cooling performance made possible by the Asetek-designed apparatus, you can actually overclock these GPUs as well, which was not possible on a reference R9 290X. Otherwise, specs are exactly double compared to the R9 290X, so there are 5,632 Stream Processors, 12.4 billion transistors, 8GB of RAM total, dual 512-bit memory buses, a 500W TDP, and 11.5 Tflops of compute performance. If you're the type who doesn't like reading, and wants to look at a chart, we feel you. Here is a spec chart provided by AMD:</p> <p>&nbsp;</p> <p style="text-align: center;"><img src="/files/u302/r9295x2_specs.jpg" alt="R9 295X2 Specs" title="R9 295X2 Specs" width="557" height="538" /></p> <p style="text-align: center;">&nbsp;</p> <h3 style="text-align: left;">Hybrid Cooling</h3> <p>Since the Hawaii core at the heart of the R9 295X2 runs hotter than the surface of the sun, AMD had to enlist the expertise of Asetek to build a custom closed-loop liquid cooling mechanism to keep the GPUs colder than Gwyneth Paltrow's heart. Each GPU gets its own water block to dissipate heat, with liquid entering the system via one tube, swishing around a bit, then squirting over to the second GPU via a connecting tube underneath the shroud. Once it makes its rounds in the second water block, it is sent back to the radiator where it's cooled by a 120mm fan. Here's a picture of the whole shebang:</p> <p>&nbsp;</p> <p style="text-align: center;"><img src="/files/u302/amdrad_r9_watercooler.jpg" width="650" height="434" /></p> <p style="text-align: center;"><em>The Asetek cooler is maintenance-free and uses Kool-Aid inside. 
Not really.</em></p> <p style="text-align: center;">&nbsp;</p> <h3 style="text-align: left;">Compared to the Titan Z</h3> <p style="text-align: left;">The Radeon R9 295X2 is a natural competitor to the GTX Titan Z, if only because both of them sport two of each company's current flagship GPUs: AMD with its R9 290X and Nvidia with its Titan Black. Since the Radeon card costs half the price of a Titan Z, they will exist in separate worlds, with the Radeon strictly for gaming and mining, and with the Titan Z for gamers/developers. Also, we still have not seen official specs for the Titan Z, and Nvidia doesn't have it listed on its website, so some of this comparison is pure speculation. That said, let's speculate via this handy chart comparing the two cards:</p> <p style="text-align: left;">&nbsp;</p> <p style="text-align: center;"><img src="/files/u302/r9295x2_titanz.jpg" width="339" height="430" /></p> <p style="text-align: center;">*The Titan Z's compute ability is unknown, so this is a guess based on 2X Titan Black.</p> <h3 style="text-align: left;">Two Interesting Tidbits</h3> <p style="text-align: left;">Before we get to the benchmarks, there are two unique attributes of this card we want to point out. The first is that the R9 295X2 has a glowing red logo on its side and a red LED-lit center fan, a first for an AMD GPU. Those who have been green with envy (heh) over Nvidia's glowing GeForce logo will surely appreciate this addition. AMD says it was added as part of the card's "no compromise" design. A second part of that design philosophy extends to the dual eight-pin power connectors, which must each provide 28A of power to the card. This means you can't just run one cable with two eight-pin PCIe connectors on it to the GPU, so you'll essentially need an SLI/CrossFire capable PSU to run this bad boy. 
It is, after all, a CrossFire GPU.</p> <p style="text-align: left;">&nbsp;</p> <p style="text-align: center;"><img src="/files/u302/fan.jpg" width="400" height="383" /></p> <p style="text-align: center;"><em>AMD brings the bling with a glowing logo and LED fan.</em></p> <p style="text-align: left;">Ok, enough jibber jabber. Hit the next page for benchmarks and our final thoughts.</p> <hr /> <h3 style="text-align: left;">Benchmarks</h3> <p style="text-align: left;">We tested the R9 295X2 on our standard GPU test bench, which is a high-end machine running an Asus Rampage IV Extreme motherboard, Intel Core i7-3960X Extreme Edition CPU, 16GB of DDR3/1600 memory, a Thermaltake ToughPower 1,050W PSU, and Windows 8 Enterprise. We did not have two GTX 780 Ti cards to use for testing, so we compared it to dual R9 290X cards in CrossFire running at 4K resolution to get things started.&nbsp;</p> <p style="text-align: left;">&nbsp;</p> <p style="text-align: center;"><strong>3840x2160 Benchmarks</strong></p> <p style="text-align: center;"><img src="/files/u302/crossfire_comparison.jpg" width="341" height="422" /></p> <p style="text-align: center;"><em>Best scores are bolded</em></p> <p style="text-align: left;">Overall, there's not much surprise here, except for the fact that this is the first single-card GPU we've tested that is actually playable at 4K resolution. Also, what you can't experience by looking at this benchmark chart is how loud the R9 290X cards are when run under load in tandem. They make some noise, whereas the R9 295X2 is very, very quiet. 
There is still a tiny bit of fan noise under load, but it's night-and-day compared to a stock R9 290X.</p> <p style="text-align: left;">Next we compared the R9 295X2 to the GTX 780 Ti, also at 3840x2160.</p> <p style="text-align: left;">&nbsp;</p> <p style="text-align: center;"><strong>3840x2160 Benchmarks</strong></p> <p style="text-align: center;"><img src="/files/u302/780ti-comparison.jpg" width="342" height="393" /></p> <p style="text-align: center;"><em>Best scores are bolded</em></p> <p style="text-align: left;">Compared to the single-GPU competition, well, there is no competition. The R9 295X2 lays the smack down, plain and simple, which is to be expected given its numerous advantages.</p> <p style="text-align: left;">Next up, the R9 295X2 versus <strong>GTX 780 SLI</strong>.</p> <p style="text-align: center;"><strong>3840x2160 Benchmarks</strong></p> <p style="text-align: center;"><em><img src="/files/u302/sli_4k.jpg" width="343" height="436" /></em></p> <p style="text-align: center;"><em>Best scores are bolded</em></p> <p style="text-align: left;">The Radeon R9 295X2 is still holding its own against two GTX 780 GPUs. It's a shame we don't have a second GTX 780 Ti, though, because two of them would most likely eat the Radeon's lunch, for less money out the door too. 
Of course, you have two cards and a lot more heat and noise, but that's the price you pay for extreme performance with this particular config.</p> <p style="text-align: left;">Finally, let's have a look at the Radeon R9 295X2 versus the GTX 780 Ti at <strong>2560x1600 with 4XAA enabled</strong>.</p> <p style="text-align: center;">&nbsp;</p> <p style="text-align: center;"><strong>2560x1600 4XAA Benchmarks</strong></p> <p style="text-align: center;"><img src="/files/u302/2560_benches1.jpg" width="340" height="377" /></p> <p style="text-align: center;"><em>Best scores are bolded</em></p> <p style="text-align: left;">Another smackdown - what a surprise.</p> <h3 style="text-align: left;">Final Thoughts</h3> <p style="text-align: left;">It's not often in the GPU game that we have such a one-sided battle, but we certainly do have just that with the powerful R9 295X2. This card kicks all kinds of ass, no doubt about it. It's easily the fastest single-card GPU we've ever tested, and by a healthy margin too. Of course, we don't have dual GTX 780 Ti cards to test it against, so that's unfortunate. Regardless, that would still not change the Radeon's "single card champion" status, which it now claims, unquestionably. Not only is it fast, but it's very quiet and cool too, which are words we never thought we'd say about a fire-breathing Hawaii card. AMD has certainly done its homework on this one, and it delivers on all promises.</p> <p style="text-align: left;">Without getting too hyperbolic, in many ways this is essentially the perfect GPU. It offers record-breaking performance, only takes up two slots, is cool and quiet, and it overclocks. Of course, the one chink in its armor is its $1,500 price tag, which seems insanely high in a market where $1,000 used to be the upper echelon. However, compared to the $3,000 GTX Titan Z, the Radeon is actually a bargain, which is another sentence we never thought we'd write, but here we are. 
Naturally, Nvidia won't take too kindly to this type of aggression. All it has to do now is release a gamer-oriented dual-GPU card such as the mythical GTX 790, hopefully with two GTX 780 Ti cores onboard, price it at $1,500, and it'll be game, set, match Nvidia. Right after the R9 290X came out and stole the GTX 780's thunder, Nvidia pounced immediately with the GTX 780 Ti to reclaim the "fastest single card" crown, so we expect it to respond to the R9 295X2, and to respond with vigor.</p> http://www.maximumpc.com/amd_unleashes_dual-gpu_radeon_r9_295x2#comments Hardware project hydra r9 295x2 radeon Gaming Videocards Tue, 08 Apr 2014 13:20:20 +0000 josh norem 27585 at http://www.maximumpc.com 8 Products That Were Ahead of Their Time http://www.maximumpc.com/8_products_were_ahead_their_time_2014 <!--paging_filter--><h3><img src="/files/u154082/virtual-boy-set.jpg" width="250" height="179" style="float: right;" />Virtual Boy, Microsoft tablet, Dell phablet, and more</h3> <p>We all love innovation...when it works. Unfortunately, when it comes to technological breakthroughs, timing is everything.&nbsp;For instance, did you know that Microsoft showed off its tablet PC way back in 2000? And who could forget Nintendo’s attempt at a VR headset with its Virtual Boy in 1995?</p> <p>Many of those gadgets were simply ahead of their time. To give credit where credit's due, we thought we would round up eight of these long-gone pioneers.&nbsp;</p> <p>Did you own any of these devices? 
Let us know in the comments below!</p> http://www.maximumpc.com/8_products_were_ahead_their_time_2014#comments 2013 ahead of their time Hardware innovation maximum pc microsoft tablet oculus rift sony phablet tech gadgets virtual boy The List Thu, 27 Mar 2014 21:51:37 +0000 Jimmy Thang and Clark Crisp 27422 at http://www.maximumpc.com Unique and Cool Computer Cases http://www.maximumpc.com/cool_computer_cases_2014 <!--paging_filter--><h3><a class="thickbox" style="font-size: 10px; text-align: center;" href="/files/u152332/puget_small_0.jpg"></a><a class="thickbox" style="font-weight: normal;" href="/files/u152332/fang3_small_0.jpg"><img src="/files/u152332/fang3_small.jpg" title="Cyber Power Fang III" width="200" height="266" style="float: right;" /></a></h3> <h3>What's it like to build in three of the most unusual cases on the market?</h3> <p>A generation ago, computer cases were typically beige pizza box–shaped things that resided under beige CRT monitors. You wrangled floppy disks in and out of them and pressed the power button at times, but they weren't conversation pieces or personal statements. We don't know exactly when the shift to case fanciness occurred. It evolved gradually, like facial hair or Nicolas Cage. And in the last few years, we've seen some pretty exotic enclosures come to the home desktop, in various degrees of affordability and physical dimensions. You may wonder what it's like to build inside one of these strange containers; we certainly did. 
To find out, we had three distinctly different unconventional cases delivered to our Lab: the Cooler Master HAF Stacker 935, the In Win D-Frame, and the Corsair Carbide Air 540, all pictured here.</p> <p>With the help of our trusty intern Sam Ward, we built complete systems inside each of these enclosures, and we document the experience in the following pages of this article.</p> <p><em>Note: At the time this article originally debuted in the magazine, we built up our HAF Stacker 935 using a prototype unit. As a result, the <a title="stacker case" href="http://www.newegg.com/Product/Product.aspx?Item=N82E16811119290" target="_blank">retail version</a> may have some slight differences.</em></p> <h4>Cooler Master HAF Stacker 935</h4> <p><strong>The Voltron of PC building</strong></p> <p>With our Stacker 935 fully assembled, the black mega-tower looked like something a super-villain would use in his or her secret lair. It measured over three feet tall, for Pete's sake, nine inches wide, and nearly two feet long. The Stacker 935 consists of one small ITX unit about 9 inches tall, and one large unit about 19 inches tall, with an MSRP of $170, for a total height of about 28 inches. You can't buy the big one by itself, but you can grab as many of the small ones as you like, and then stack them according to your needs. The small unit is called the Stacker 915 and retails for $70. Our combo of choice, pictured for your amusement, is a 935 and a 915, which Cooler Master sells as the "945." For comparison, a Corsair 900D is about 27 inches tall, and the inside of a Cooler Master Cosmos II is about 22 inches tall (the external handles add some extra height).</p> <p>In this pre-production unit, the cases attach to each other with a sliding mechanism, which you lock into place with a set of provided screws. 
Each compartment also comes with detachable feet and a detachable top panel, so you have flexibility in your stacking arrangement.</p> <p>The 915 is limited to ITX motherboards, but it's pretty wide open otherwise; the "F" variant gets its PSU in the front, and the "R" variant has one in the rear. But while the 915R won't give your CPU room for more than a stock air cooler, you can fit a water-cooling pump pretty easily, and a radiator and fan would mount on the side panel. The 915 also takes up to three drive cages, each containing three 3.5-inch drives, for a total of nine. Each of those nine slots can also take two SSDs if you have an adapter kit. If you limit your video card length to eight inches or less, you can fit an ITX motherboard and two cages.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/1st_image_small_0.jpg"><img src="/files/u152332/1st_image_small.jpg" alt="While you’re free to arrange your Stacker cases in any order, Cooler Master recommends that you not add more than two smaller 915 cases to the basic 935 setup, lest the tower tip over." title="Cooler Master HAF Stacker 935" width="620" height="567" /></a></p> <p style="text-align: center;"><strong>While you’re free to arrange your Stacker cases in any order, Cooler Master recommends that you not add more than two smaller 915 cases to the basic 935 setup, lest the tower tip over.</strong></p> <p>For system cooling, each side panel will accommodate a 360mm or 280mm radiator. That's probably overkill for the ITX system itself, but a custom liquid-cooling loop for a system located in the mid-tower can take advantage of this additional space. Cooler Master recommends quick-disconnect couplings for such a setup, so that the two cases can easily detach from each other even after the loop is installed. The stock 915 comes with a front 92mm fan, and the mid-tower ships with dual 120mm intake fans and one 140mm rear exhaust fan. 
The mid-tower's intake fans can fit inside the front bezel, to maximize room inside the case.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/2nd_image_small_0.jpg"><img src="/files/u152332/2nd_image_small.jpg" alt="You can swap drive cages freely between the big case and the small cases; they attach with four standard screws." title="Cooler Master HAF Stacker 935" width="620" height="690" /></a></p> <p style="text-align: center;"><strong>You can swap drive cages freely between the big case and the small cases; they attach with four standard screws.</strong></p> <p>Other than sheer cable length, the systems were not difficult to put together. The cases are all roomy with plenty of cable management space, and the drive cages can be moved around to multiple spots.</p> <p><strong>Cooler Master HAF Stacker 935</strong></p> <p><strong>$170,</strong> <a href="http://www.coolermaster-usa.com/">www.coolermaster-usa.com</a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/main_image_small_11.jpg"><img src="/files/u152332/main_image_small_10.jpg" width="620" height="750" /></a></p> <p><strong>1.</strong> In this config, the top case is a semi-portable HTPC unit powered by an AMD A10-6800K. 
It doesn't share any cabling or devices with the system underneath.</p> <p><strong>2.</strong> The gap between the cases is about an inch tall, so we have plenty of room for this Corsair H100i CPU cooler to blow exhaust out the top of the larger case.</p> <p><strong>3.</strong> The 915 has dust filters on its side panels; they're attached with four screws.</p> <p><strong>4. </strong>The larger case ships with two 120mm intake fans in the front, and one 140mm exhaust in the rear.</p> <p><em>Click the <a title="d-frame page" href="http://www.maximumpc.com/cool_computer_cases_2014?page=0,1" target="_self">next page</a> to read about the In Win D-Frame case.</em></p> <hr /> <p>&nbsp;</p> <h3>In Win D-Frame</h3> <p><strong>If you've got it, flaunt it</strong></p> <p>The industrial-looking In Win D-Frame was probably the most interesting to put together. If you're a fan of K'Nex or Tinkertoys, you'll be right at home attaching the four red pieces to each other and to the silvery mounting plate in back. When you're done, you'll be rewarded with a unique, all-aluminum frame, two tempered glass panels, and a fun 90-degree rotation from the standard direction (so your video cards will be vertical, for example). Despite its aluminum frame, the tempered glass makes the case surprisingly heavy, tipping the scales at 25 pounds before you start installing your hardware.</p> <p>The aluminum is supposed to help with heat dissipation, which itself is aided by a set of fans installed in the bottom of the case, blowing upward. The D-Frame comes with no fans of its own, as people who are paying the $400 price tag usually have a set of their own. We had to settle for a few Scythe Gentle Typhoons. 
It's a hard-knock life.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/win_d-frame_small_0.jpg"><img src="/files/u152332/win_d-frame_small.jpg" width="620" height="551" /></a></p> <p style="text-align: center;"><strong>It may be heavy, but that tempered glass provides an awesome view of your beloved parts. Sloppy builders need not apply.</strong></p> <p>The rotation throws off some sizing estimations, admittedly. Minus the power supply cage in the back, the D-Frame is about 22 inches long and 19 inches tall. The width is a little misleading, due to the frame extensions that support the glass panels; inside, you'll actually have no more than six inches for an air cooler. So, we'd recommend a liquid-cooler instead. We used a Silverstone TD03 here; despite its extra-thick radiator and dual fans, it installed in the rear with plenty of room to spare. If you remove the 3.5-inch drive cage, the D-Frame could easily support a thick 240mm radiator as well. (And without the cage, there are still several storage device mounts built directly into the frame.) Our fan/rad combo hangs off of two circular brackets attached to the frame with a couple of screws. A fourth fan would usually go there if we weren't using liquid cooling, but these brackets have no problem with the heavier load. The power-supply cage is also spacious, holding a Corsair AX1200i—eight inches long—with room to spare.</p> <p>Video card length is basically a non-factor. There's about 13 inches of space in that area, and you can remove the fan bracket to add a couple more. We slapped a trio of Nvidia GeForce GTX 780s in there without breaking a sweat. The close proximity of the power supply cage makes hooking up the cards easy. The power supply can be oriented in two different directions; we chose this orientation because it shows off our pretty braided cables, in a patriotic selection of red, white, and blue. 
When your side panels are made entirely of glass, you can't skimp on the little visual details. Note that there are no other panels shielding the inside of the case, so foreign objects can fall in if you're not careful. On the other hand, this semi-open design makes tinkering a lot easier. You could ditch the panels altogether, if you don’t have any pets or small children around.</p> <p style="text-align: center;"><img src="/files/u152332/win_d_frame_small-.jpg" alt="Despite its appearance of complexity, the D-Frame needs only a standard Phillips screwdriver to assemble." width="620" height="475" /></p> <p style="text-align: center;"><strong>Despite its appearance of complexity, the D-Frame needs only a standard Phillips screwdriver to assemble.</strong></p> <p>Lastly, this case is getting a limited run of 500 units, though that's apparently just for this red version. The orange version gets its own 500-unit run. Each one is stamped with a serial number—ours is 187. The point is, it won't be around forever. Though with this rugged design, the case itself may very well outlive you.</p> <p><strong>In Win D-Frame</strong></p> <p><strong>$400,</strong> <a href="http://www.inwin-style.com/en/">www.inwin-style.com</a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/win_d_frame1_small_0.jpg"><img src="/files/u152332/win_d_frame1_small.jpg" width="620" height="617" /></a></p> <p><strong>1.</strong> You can orient the power supply any way you want, but we recommend positioning its intake fan on the outside, so that it's not competing with the video card(s) for air.</p> <p><strong>2.</strong> For a little variety, we went with an AMD FX-8350 CPU on an Asus Crosshair V Formula-Z motherboard. 
Can't let Intel hog all the fun.</p> <p><strong>3.</strong> There's nearly three inches of space between the back of the motherboard and the side panel, so cable routing is very manageable.</p> <p><strong>4.</strong> Because of the extra length of this case, even a radiator and fan combo as bulky as the Silverstone TD03 fits in the back with room to spare.</p> <p><em>Click the <a title="corsair carbide air 540 page" href="http://www.maximumpc.com/cool_computer_cases_2014?page=0,2" target="_self">next page</a> to read about the Corsair Carbide Air 540 and other interesting chassis.</em></p> <hr /> <p>&nbsp;</p> <h3>Corsair Carbide Air 540</h3> <p><strong>It's hip to be square</strong></p> <p>When you peek inside the Corsair Carbide Air 540, you'll probably notice that it's missing some important stuff. Like, oh, a PSU and 5.25-inch drive bays. Have you gone crazy? Are invisible elves powering this puzzling cube? No and no. All you have to do is flip the case around to view the other compartment, and you'll see the wizard behind the curtain. The 13 inches of total width give ample room for the trick. The net effect is that you can have loads of fans and water cooling for the video card, CPU, and motherboard without sacrificing storage capacity. Two 3.5-inch drive bays sit on the bottom of the left-hand side, complete with built-in SATA connectors so that you never see drive cables on that side of the case. The right-hand side (where the PSU is) gets four 2.5-inch bays for SSDs.</p> <p>You may be concerned by the low number of 3.5-inch bays, but keep in mind that the two 5.25-inch bays in the back compartment can accommodate two such drives, with an adapter kit. We decided to put a Blu-ray drive in there instead. Note, however, that these larger bays are vertical; not all optical drives are compatible with that orientation.</p> <p>Power supplies can be mounted pretty much any way you want, though, so Corsair takes advantage of this. 
The PSU is rotated 90 degrees onto its side to accommodate the relatively tight space, but its cables don't have to be clean here, since there's no window and just empty space between the PSU and the front of the case. This relaxed design makes it a lot easier to just hook everything up and go, rather than taking painstaking steps to zip-tie every cable for maximum cleanliness. Nobody likes their side panel to bulge because they couldn't route everything smoothly, and here, you don't even have to think about it. The power supply can also be virtually any length. The lower right-hand corner of the panel has a grill, so the PSU can pull cooler external air.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/corsair_carbide_small_0.jpg"><img src="/files/u152332/corsair_carbide_small.jpg" alt="The Carbide Air 540’s looks are reminiscent of the cube-shaped boxes found in server rooms, miniaturized for the home desktop." title="Corsair Carbide Air 540" width="620" height="567" /></a></p> <p style="text-align: center;"><strong>The Carbide Air 540’s looks are reminiscent of the cube-shaped boxes found in server rooms, miniaturized for the home desktop.</strong></p> <p>The front of the case will take a 280mm or 360mm radiator—we installed the Corsair H110, which uses a 280mm rad. There was even room to put fans on both sides for “push-pull,” using the two stock 140mm front intake fans. This increases airflow through the rad, so the liquid that returns to the pump contains less heat and can therefore absorb more before it's pumped back to the radiator. We still had room for an Asus DirectCU II GeForce GTX 770, which is 10.7 inches long, but it was tight—a card with PCI Express power connectors on the end instead of the side would not fit in this config.</p> <p>Meanwhile, you could add two 140mm fans in the top, though a push-pull setup didn't have quite enough space. The rear has a 140mm exhaust fan pre-installed. 
You can place a 140mm closed-loop radiator on top of it without obstructing anything—a feature usually only seen in full-tower cases. The rear will also take a 120mm fan, and the top will take two 120mm units, or one 240mm radiator.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/one_more_small_0.jpg"><img src="/files/u152332/one_more_small.jpg" alt="The left-hand &quot;chamber&quot; is eight inches wide, leaving five inches on the other side for the power supply and 5.25-inch drive bays." title="Corsair Carbide Air 540" width="620" height="630" /></a></p> <p style="text-align: center;"><strong>The left-hand "chamber" is eight inches wide, leaving five inches on the other side for the power supply and 5.25-inch drive bays.</strong></p> <p>Back to the PSU side, there's room between it and the front of the case for a liquid-cooling reservoir and pump, but it will require some cable tidiness, especially if you're using two or more video cards. The rear of the case does not have pre-cut holes to route tubing to an external rad or reservoir, so your loop will need to be completely internal, unless you're prepared for a little DIY.</p> <p><strong>Corsair Carbide Air 540</strong></p> <p><strong>$140,</strong> <a href="http://www.corsair.com/en">www.corsair.com</a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/main_image_small2_0.jpg"><img src="/files/u152332/main_image_small2.jpg" width="620" height="710" /></a></p> <p><strong>1.</strong> Since the 5.25-inch drive bay is in a hidden chamber, you can wire up a fan controller back there without revealing much of the cable routing.</p> <p><strong>2.</strong> Our distinctively orange stunt motherboard for this rig is the Gigabyte GA-Z87X-OC, a quad-SLI board that retails for around $200.</p> <p><strong>3.</strong> The Air 540 has built-in motherboard standoffs with a central "guide" post, which shaves installation time down even further.</p> <p><strong>4.</strong> 
Because the power supply is laid on its side, it shouldn't need extension cables to reach the 8-pin CPU power connector at the top of the motherboard.</p> http://www.maximumpc.com/cool_computer_cases_2014#comments 2013 cool computer cases cooler master stacker corsair carbide air 540 fishtank case Hardware Lian Li maximum pc unique unusual win d-frame Features Mon, 17 Mar 2014 21:28:07 +0000 Tom McNamara 27368 at http://www.maximumpc.com New 14-inch Razer Blade Comes with Highest Resolution Display http://www.maximumpc.com/new_14-inch_razer_blade_comes_highest_resolution_display_2014 <!--paging_filter--><h3><img src="/files/u166440/rzrbladet14_01.jpg" alt="Razer Blade 2014" title="Razer Blade 2014" width="200" height="113" style="float: right;" />An impressive display to accommodate all that power</h3> <p><a title="Razer Homepage" href="http://www.razerzone.com/" target="_blank"><span style="color: #ff0000;">Razer</span></a> has revealed the specs and features for its new Razer Blade laptop. The new <strong>14-inch Razer Blade’s biggest feature is the 3200x1800 touchscreen display</strong>, which packs 5.76 megapixels. It is a significant upgrade from <a title="Razer Blade 2013" href="http://www.maximumpc.com/article/news/razer_announces_14-inch_razer_blade_laptop" target="_blank"><span style="color: #ff0000;">last year’s model</span></a>, which had a 1600x900 resolution display. According to Razer, the laptop’s state-of-the-art IGZO IPS display panel results in a 250 percent increase in contrast ratio over last year’s model.&nbsp;</p> <p>The laptop, which is just 0.7 inches thick, is powered by a next-generation Nvidia GeForce GTX 870M GPU with 3GB of dedicated GDDR5 VRAM, backed by 8GB of DDR3L system memory. It also has a fourth-generation Intel Core i7 processor and comes with Windows 8.1 64-bit installed. 
As for storage, the laptop will come with either a 128, 256, or 512GB SSD.&nbsp;</p> <p>Other features include three USB 3.0 ports, a built-in 2.0MP full-HD camera, a Kensington lock interface, and a 150-watt power supply. It weighs 4.47 lbs., measures 13.6x0.70x9.3 inches, and has up to six hours of battery life.</p> <p>The Razer Blade’s starting price is $2,199. Pre-orders are currently being accepted and it is expected to ship in early April.</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/new_14-inch_razer_blade_comes_highest_resolution_display_2014#comments gaming laptop. ultrathin laptop Hardware highest resolution razer razer blade News Thu, 13 Mar 2014 03:44:56 +0000 Sean D Knight 27431 at http://www.maximumpc.com V3 Devastator Review http://www.maximumpc.com/v3_devastator_review_2014 <!--paging_filter--><h3>Pint-size PC packs a punch</h3> <p>How much PC power can one jam into a bread box? (We’ll take a commercial break while the youngsters Google “bread box.”) V3 Gaming tries to answer that question with the latest iteration of its <strong>Devastator</strong> small form factor box. 
Unlike the four micro-towers that we <a title="micro tower" href="http://www.maximumpc.com/micro-towers_review_2013" target="_blank">previously reviewed</a>, the Devastator conforms to a boxier silhouette, using a slick new Silverstone SG10 case.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/v3_devastator_small_1.jpg"><img src="/files/u152332/v3_devastator_small_0.jpg" alt="The Devastator fits in a pair of GeForce GTX 770 cards along with a new Haswell chip." title="V3 Devastator" width="620" height="643" /></a></p> <p style="text-align: center;"><strong>The Devastator fits in a pair of GeForce GTX 770 cards along with a new Haswell chip.</strong></p> <p>The SG10 is slightly taller than a traditional SFF box, so it can accommodate a microATX motherboard. Lest you wonder why you’d need a microATX in an SFF, we have a simple answer: multiple video cards—more on that later.</p> <p>The Devastator seems like it was configured with best-bang-for-the-buck in mind. Inside, you’ll find a pair of GeForce GTX 770 cards slotted into the Asus Gryphon Z87 board, 16GB of Corsair DDR3/1600, and a Core i5-4670K clocked up to 4.4GHz. For storage, V3 Gaming didn’t skimp on the primary drive—something a lot of vendors do these days—and outfitted the Devastator with a 256GB Plextor M5 Pro SSD. Bulk storage, however, is a bit paltry, consisting of a 1TB Toshiba desktop HDD. These days, it’s pretty hard to justify a 1TB drive on a build that costs more than $800.</p> <p>To measure the Devastator’s performance, we looked to three points of comparison. First was our aging Sandy Bridge-E zero-point test bed, where it was a give-and-take contest. In thread-light tasks, the Devastator’s higher-clocked Haswell is on par and even sometimes faster than the six-core SNB-E in our zero-point. Flip to, say, Premiere Pro CS6 or x264 HD 5.01 encoding, however, and even the elderly Sandy Bridge-E cores hammer the hell out of the less-threaded Haswell. 
In gaming, the tide turns yet again, with the once-mighty GeForce GTX 690 in our zero-point being trounced by the Devastator’s pair of GeForce GTX 770 cards. Boo hoo.</p> <p>But that’s not the whole story on micro-towers. We also pitted the Devastator against the $4,433 Falcon Northwest Tiki, and the latter’s 4.7GHz Core i7-4770K simply dominated in the compute-heavy tasks. But, again, the Tiki’s Titan can’t compete with the dual GeForce GTX 770s—the Devastator is faster in 3DMark 11 by 22 percent. The gap closes in 3DMark Fire Strike Extreme, where the Devastator is but 10 percent faster.</p> <p>The Tiki is an extreme example, though. For the most part, simple physics and the ability of a bigger box to better dissipate heat and hold more hardware put standard desktops at an advantage. For example, the $2,000 CyberPower Zeus Evo Thunder 3000 SE that we reviewed in our September 2013 issue aces the Devastator in compute tasks, thanks to its Core i7-4770K, and comes out slightly faster in gaming since it also has a pair of 770s, while coming in $500 cheaper. Of course, it’s also much bigger.</p> <p>Still, the Devastator is a good blend of size and performance, and is fairly priced to boot, given that small form factor boxes typically carry a premium.</p> <p><strong>$2,500,</strong> <a href="http://www.v3gamingpc.com/">www.v3gamingpc.com</a></p> http://www.maximumpc.com/v3_devastator_review_2014#comments 2013 computer Hardware Holiday issues 2013 maximum pc Review silverstone case V3 Devastator Reviews Systems Mon, 03 Mar 2014 22:54:29 +0000 Gordon Mah Ung 27322 at http://www.maximumpc.com Asus Radeon R9 280X DC2 TOP Review http://www.maximumpc.com/asus_radeon_r9_280x_dc2_top_review <!--paging_filter--><h3>The new 1080p king</h3> <p>At any given time, we have one GPU in our inventory that holds the title of “loudest card in the office.” The current title-holder is the PowerColor Radeon HD 7970 Vortex, which sounds like a jet engine. 
That’s just how the Radeon 7970 GHz cards are; their boosted clock speeds drum up a lot of heat, making them much louder than their Nvidia counterparts. Given this pedigree, imagine our surprise when we fired up the <strong>Asus Radeon R9 280X</strong>, which rocks the exact same Tahiti XT chip used in the 7970 GE boards. As we leaned in close to our test bed expecting to hear that oh-so-familiar fan noise, we were greeted instead with a barely audible whirring sound. It’s truly miraculous what AMD and Asus have done with this formerly unruly chip, making it whisper-quiet and also surprisingly affordable at $310, which is roughly half what it used to cost.</p> <p style="text-align: center;"><a class="img-float-right" href="/files/u152332/asus-radeon-r9-280x-directcu-ii-top-1000x716_small_0.jpg"><img src="/files/u152332/asus-radeon-r9-280x-directcu-ii-top-1000x716_small.jpg" alt="The R9 280X is a heck of a lot more quiet and affordable than the original HD 7970 GE. " title="Asus Radeon R9 280X DC2 TOP" width="620" height="444" /></a></p> <p style="text-align: center;"><strong>The R9 280X is a heck of a lot more quiet and affordable than the original HD 7970 GE. </strong></p> <p>Like the previous Asus boards we’ve reviewed, this is a DirectCU II card, so it has a fancy custom PCB, high-end components for improved stability, longevity, and overclocking, as well as a hulking two-slot cooler we’ve seen before (and loved). This is a 28nm Tahiti card, with 2,048 stream processors, a 384-bit-wide memory bus, and 3GB of GDDR5 memory. This card will be competing with the more expensive Nvidia GTX 770, which costs $400 as we went to press, with no indication that Nvidia will lower its price. Perhaps after reviews of this card appear, Nvidia will rethink that proposition.</p> <p>This is a TOP card, which means it’s overclocked, but not by much at 1,070MHz, only 70MHz higher than stock. 
Asus also has a super-premium version of this card named the Matrix Platinum, which has a three-slot cooler and a much higher price tag. One interesting note is that, unlike the flagship R9 290X cards with their new dies that don’t require a CrossFire bridge, this card still requires a bridge in multicard configs. Thankfully, Asus threw a bridge connector into the box along with a driver CD. The card measures 11 inches long and includes two DVI connectors, one HDMI, and one DisplayPort connector.</p> <p>When we ran the R9 280X through our gauntlet of PC benchmarks, two things immediately surprised us. The first was just how quiet the card was, as it is barely audible at any time, even under heavy load. The second was that it was trading blows with the GTX 770, which costs $90 more. Sure, the GK104 and Tahiti chips have always been comparable, so this is expected, we suppose, but given this card’s low pricing (by comparison), it was hard to wrap our heads around the fact that it’s punching above its weight class. It also handily spanked the $250 GTX 760, giving it the best price-to-performance ratio in its price segment.</p> <p>In the end, this is the go-to card for ultra settings at 1080p, no question. It smokes the more expensive GTX 770 and also spanks the GTX 760, as it should. If the performance delta isn’t enough to sway you, there’s word that the Never Settle Forever game bundle will be coming to the 200-series cards soon, too, making this card almost irresistible. The only fly in the ointment is the Asus GPU Tweak software, which looks and feels antiquated, and is difficult to examine at a glance. 
Thankfully, third-party options are available, making this only a minor blemish on an otherwise perfect GPU.&nbsp;</p> <p><strong>$310,</strong> <a href="http://www.asus.com/">www.asus.com </a></p> http://www.maximumpc.com/asus_radeon_r9_280x_dc2_top_review#comments 2013 Asus Radeon R9 280X DC2 TOP gpu graphics card Hardware Holiday issues 2013 maximum pc Review Video Card Reviews Videocards Thu, 27 Feb 2014 21:46:28 +0000 Josh Norem 27311 at http://www.maximumpc.com Gateway One ZX4970-UR22 Review http://www.maximumpc.com/gateway_one_zx4970-ur22_review <!--paging_filter--><h3>A budget-friendly AiO for dad</h3> <p>While we love powerful super-rigs that can cut through benchmarks like a hot knife through buttah, not everyone can afford an $8,000 PC. This is where a budget-friendly all-in-one computer such as the <strong>Gateway One ZX4970</strong> comes into play. At a mere $530, it certainly presents an interesting value proposition, but is it actually a good deal or a waste of dough?</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/gateway_aio_stock_image_small_0.jpg"><img src="/files/u152332/gateway_aio_stock_image_small.jpg" alt="A button allows you to toggle the light behind the Gateway logo on and off. Fancy!" title="Gateway One ZX4970-UR22" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>A button allows you to toggle the light behind the Gateway logo on and off. Fancy!</strong></p> <p>The first thing you’ll notice about the ZX4970 is its 21.5-inch screen. It’s not small, but it is dwarfed by most other AiOs on the market, which generally come in 23- and 27-inch form factors. Furthermore, the display’s TN panel offers subpar viewing angles, besides being a bit dim. But where the ZX4970 really falls short is in its omission of a touchscreen, which is a shame given the presence of the touch-friendly Windows 8 OS. 
On the upside, we don’t have much beef with the integrated 2.5-watt speakers beneath the monitor—they deliver decent volume, though they obviously can’t match a dedicated 2.1 setup.</p> <p>On the left side of the screen, the ZX4970 features two USB 3.0 ports, an SD card reader, and both a headphone and mic jack. To the right of the monitor are a button that lets you switch the AiO’s HDMI port from in to out (or vice versa) and a DVD burner. The HDMI port itself resides behind the monitor, along with four USB 2.0 ports and an Ethernet jack. It’s not an exorbitant number of ports, but it covers most common needs. Cables can be routed through a cutout on the stand in the back. The stand allows you to bend the monitor back roughly 20 degrees, which could be useful from a standing position if not for the fact that the screen doesn’t support touch and the included full-size keyboard and mouse are wired, so you’re essentially tethered to your desk anyhow.</p> <p>If you’re hoping to play the latest PC games or put the machine through heavy compute tasks, the ZX4970 is not for you. While the AiO features a respectably hefty (for this price, that is) 1TB hard drive, the rest of the ZX4970’s parts are pretty bare-bones. The unit is running a dual-core Ivy Bridge–based Intel Pentium G2030 clocked at 3GHz, has 4GB of DDR3/1600, and lacks discrete graphics. Compared to our Asus ET2300 zero-point AiO, which features a quad-core processor, twice the amount of RAM, and a GeForce GT 630M GPU, Gateway’s offering faced a whole lot of pain in our benchmarks. It performed roughly 20–30 percent slower in our ProShow Producer 5 and Stitch.Efx CPU tests, and was left in the dust in the x264 HD 5.0 benchmark, which thrives on cores. Our ZP AiO is by no means a tank, but compared to Gateway’s ZX4970, it was like an M4 Sherman facing off with a Volkswagen microbus full of hippies. 
And as far as graphics go, high-end integrated graphics are on the cusp of matching low-end mobile GPUs, but the ZX4970 uses a meager Pentium integrated-graphics solution, so it found itself roughly 60–70 percent slower than the ZP’s GeForce GT 630M in both the STALKER: CoP and Metro 2033 benchmarks. In our real-world test, we booted Borderlands 2 and ran everything on low at 1366x768 resolution and got an average frame rate in the mid-teens. No, it’s not pretty for anything beyond casual gaming.</p> <p>While the ZX4970 is dang cheap, it’s an unfortunate example of “you get what you pay for.” It reminds us of the affordable eMachines of yesteryear, in AiO form. Although it may be a decent computer for Aunt Peg, for an enthusiast, we recommend spending a little more to build a much better desktop.</p> <p><strong>$530,</strong> <a href="http://www.gateway.com/worldwide/">www.gateway.com</a></p> http://www.maximumpc.com/gateway_one_zx4970-ur22_review#comments 2013 aio all in one pc Consumer Desktops Gateway One ZX4970-UR22 Hardware Holiday issues 2013 monitor pc Review Reviews Systems Thu, 27 Feb 2014 20:29:39 +0000 Jimmy Thang 27310 at http://www.maximumpc.com Cooler Master V8 GTS Review http://www.maximumpc.com/cooler_master_v8_gts_review_2014 <!--paging_filter--><h3>Not firing on all cylinders</h3> <p>When it comes to keeping your CPU cool under pressure, it’s hard to beat a closed-loop liquid cooler (CLC). They’re on the expensive side, though, so there’s still plenty of room at $50 and below for conventional air cooling. What, then, do we make of an air cooler with an MSRP of $100? It’s gotta be pretty fancy to command that kind of scratch, and the <strong>Cooler Master V8 GTS</strong> sure seems like a contender.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/v8_gts_front01_small_0.jpg"><img src="/files/u152332/v8_gts_front01_small.jpg" alt="The fans have small embossed arrows to indicate airflow." 
title="Cooler Master V8 GTS" width="620" height="618" /></a></p> <p style="text-align: center;"><strong>The fans have small embossed arrows to indicate airflow.</strong></p> <p>As its name suggests, it’s an update of the original V8, which used a single 120mm fan to cool four sets of radiator fins. That version is actually still compatible with recent CPU sockets, but it’s rated for “only” 180 watts of heat dissipation. The GTS version ups the cap to 250 watts, with dual 140mm fans and a vapor chamber. But despite its bulk, it will still play nice with high-profile RAM sticks and large motherboard heatsinks, such as those on the Rampage IV Extreme in our test-bench machine.</p> <p>However, the fans are not designed to be removed, making installation a bit awkward. You know that things are not going to go swimmingly when the bundled items include a proprietary tool for tightening nuts. And sure enough, we had to pull the RAM, video card, and motherboard from the case to get enough clearance to crank this widget. This is also a four-way SLI motherboard, so the first slot is designed for the primary video card. But unfortunately, there was not enough space there to install it. We can just use a different slot in our test system, but you may run into trouble if you’re already using your slots for other devices.</p> <p>You might let these hassles slide if the cooler had the class-leading performance to justify its quirks. Unfortunately, our testing demonstrated that the V8 GTS was good, but not $100 good. The Phanteks TC14PE cools a little more, costs less, and is quieter and a lot easier to install. Cooler Master’s own Hyper 212 Evo air cooler edges out the GTS for nearly a third of the price, albeit at unacceptable noise levels (never mind the CLCs from NZXT and Corsair that cost about the same as the V8 GTS and considerably outperform it).</p> <p>The GTS’s aesthetics may win over some converts, though. 
A silver-and-black theme rarely fails to deliver, and an overall shape evoking a V8 engine block is admittedly pretty nifty. The fans also sport several red LEDs, slotted in the top to make it look a bit like a Decepticon. More LEDs are tucked underneath, so they can’t be seen directly but emit a spooky glow onto the motherboard.</p> <p>But that silvery look comes at a cost: the GTS’s heat pipes and contact surface are aluminum instead of copper, which can make a difference when dealing with this much surface area; copper tends to transfer heat more quickly, but it’s also heavier and more expensive. It’s also disappointing that the fans will be tricky to replace if they break down, get damaged, or aren’t beefy enough for your needs. They use a custom housing similar to a Noctua NF-P14, but with two screw holes removed.</p> <p>The V8 GTS isn’t a complete indictment of performance air coolers, but we’re wondering if the day that closed-loop coolers make them obsolete isn’t fast approaching.</p> <p><strong>$100,</strong> <a href="http://www.coolermaster-usa.com/">www.coolermaster-usa.com</a></p> http://www.maximumpc.com/cooler_master_v8_gts_review_2014#comments air cooler Air Cooling Cooler Master V8 GTS CPU Cooler Hardware maximum pc Review Reviews Thu, 27 Feb 2014 20:22:42 +0000 Tom McNamara 27303 at http://www.maximumpc.com Can You Survive on a Chromebook Alone? http://www.maximumpc.com/can_you_survive_chromebook_alone_2014 <!--paging_filter--><p><img src="/files/u154280/c720_0.png" width="300" height="225" style="float: right;" /></p> <h3>We use nothing but Google's lightweight OS for a week</h3> <p>When Google announced <a title="chrome os" href="http://www.maximumpc.com/tags/chrome_os" target="_blank">Chrome OS</a>, many people scoffed at the viability of a browser-based OS. Today, however, <strong><a title="chromebook" href="http://www.maximumpc.com/tags/Chromebook" target="_blank">Chromebooks</a></strong> are among the most popular inexpensive computing devices. 
The search giant has done a great job of making an OS that is light enough to function on entry-level Atom-based SOCs and even low-powered ARM silicon. With the launch of many new Chromebooks (click here to find out which one we think is the&nbsp;<a style="font-weight: bold;" title="Best Chromebook" href="http://www.maximumpc.com/best_chromebook_2013" target="_blank">best chromebook</a>), we wanted to see if a person could survive for an entire week with a Chromebook, playing games, watching videos, producing documents, and more. Read on to see how the OS fared against Windows in our seven-day challenge.</p> <p><strong>Methodology</strong></p> <p>The premise of our test was simple: use nothing but a Chromebook for seven days straight. We weren’t allowed to touch a PC during that period, so we left our Windows rig sitting around collecting dust. Below you will find different sections about our experiences with our Chromebook. In addition, we fill you in on whether a person can use one as their primary computer.</p> <p>We should mention that the only other Internet-capable device we were allowed to use during our testing period was a smartphone. We did, after all, have to make the occasional phone call or text.</p> <p style="text-align: center;"><img src="/files/u154280/c720.png" alt="Acer C720" title="Acer C720" width="600" /></p> <p style="text-align: center;"><strong>The Acer C720 Chromebook</strong></p> <p><strong>The Hardware:</strong></p> <p>We grabbed <strong>Acer’s C720</strong>, as it’s arguably the best Chromebook for the money, providing us with a dual-core Intel Haswell processor, a 16GB SSD, and 4GB of DDR3 RAM. We thought about using <a title="google pixel" href="http://www.maximumpc.com/google_chromebook_pixel_review_2013" target="_blank">Google’s Pixel</a>, but it’s super expensive at $1,300. The C720 comes very close to the Pixel in performance, and it’s way cheaper at $250. 
Not to mention, it’s 0.7 pounds lighter than the Pixel.</p> <p style="text-align: center;"><img src="/files/u154280/chrome_os_desktop.png" alt="Chrome OS" title="Chrome OS" width="600" /></p> <p style="text-align: center;"><strong>Chrome OS' desktop interface</strong></p> <p><strong>Performance:</strong></p> <p>Using a Chromebook, we found some distinct performance advantages and disadvantages. First off, Chrome OS is insanely fast at booting up, taking about two seconds to reach the desktop and only a few seconds more to get us on the Internet. In case you've been living under a rock and don't know how Chrome OS operates, it is an operating system that is tied to the cloud. This means that in order to properly take advantage of its features, you must be connected online.&nbsp;</p> <p>The battery life was excellent on the C720, as we got around eight and a half hours of run time while producing documents and surfing the web. The C720 was highly portable since it weighs just 2.7 pounds and has a thin profile of 0.7 inches. We also liked its small, sleek form factor, as it easily fit into our bag. With its small size also comes a small keyboard, however, and we found ourselves missing our full-size keyboard with its 10-key number pad. We did like the C720’s multi-touch track pad, as the multi-touch gestures were very responsive, but it’s a bit too small for large fingers. These peripherals vary from Chromebook to Chromebook, however, so these observations won't apply to every model.</p> <p style="text-align: center;"><img src="/files/u154280/capture1.png" width="600" height="315" /></p> <p style="text-align: center;"><strong>A familiar face</strong></p> <p><strong>Browsing the Internet:</strong></p> <p>Our Chromebook browsed the web quickly and efficiently. It handled multiple tabs very well and we didn’t see any slowdown in performance when we had 10 or more tabs open. 
We did, however, run into an issue with Newegg, as some of its links didn’t work properly on our Chromebook. We tried looking at customer reviews on the e-tailer’s website and couldn’t get them to load no matter what we did. We tried shutting down the unit and restarting it, restoring it to factory settings (a.k.a. powerwashing), and disabling our Chrome add-ons, and nothing worked. The biggest weakness of Chrome OS is that not everything supports Chrome, so unlike Windows, you can’t just switch browsers if a website isn’t loading properly.</p> <p style="text-align: center;"><img src="/files/u154280/creating_documents.png" alt="Google Docs" title="Google Docs" width="600" height="337" /></p> <p style="text-align: center;"><strong>Google's Word Processing Application: Google Drive</strong></p> <p><strong>Producing Documents:</strong></p> <p>Google Drive was how we created documents, spreadsheets, and presentations. We liked using the cloud-based suite, but it’s not as fleshed out as Microsoft Office. There’s simply more functionality in Word and PowerPoint, as they offer more customization than Google Docs. PowerPoint, for example, offers more transitions and more options to customize slides than Google Slides. If you just need basic presentations, documents, and spreadsheets, however, Google Drive can do most of what Microsoft’s Office can do for free.&nbsp;</p> <p>One of the biggest advantages Google Drive has over Microsoft Office is its sharing function; we liked how easily we could share our documents with the service. Another strong feature of document sharing in Google Docs is that multiple people can edit the same document at the same time. As noted by one of our readers, Hellabrad, you can also edit and share documents with other Office users with Microsoft's free web client. Finally, Google Docs is constantly and conveniently AutoSaving, something the desktop version of Word only does at set intervals. 
By default, the Microsoft Word desktop application AutoSaves every 10 minutes, and this setting can be changed to AutoSave every minute (Hellabrad).&nbsp;</p> <p><em>Click the next page to read about gaming, picture-editing and more with a Chromebook.</em></p> <hr /> <p>&nbsp;</p> <p style="text-align: center;"><img src="/files/u154280/watching_videos.png" alt="Watching Videos" title="Watching Videos" width="600" height="337" /></p> <p style="text-align: center;"><strong>Streaming Amazon Prime Instant Video on a Chromebook</strong></p> <p><strong>Watching Videos:</strong></p> <p>Chrome OS has Adobe Flash Player baked right into its browser, so we had no problems watching movies and TV shows on Amazon Instant Video, Netflix, and Hulu Plus. The picture quality was a clear 720p on our 11-inch display, which didn’t look that bad because the pixel density was fairly high on our relatively small screen.</p> <p><strong>Anti-Virus:</strong></p> <p>There are no third-party AV programs you can download for Chrome OS at the moment. We see this as a problem, because we would love to see Norton, Kaspersky, Trend Micro, and other AV developers making Chrome apps to help protect the OS. AV suites may come along if the OS gains further adoption, but for now you’re only protected by Google.</p> <p>The search giant claims that you’ll never get a virus on its Chrome OS, but Apple said the same thing a few years ago with OS X and that didn’t turn out to be the case. As a matter of fact, in the past few years Apple users have seen malware invade their laptops and all-in-ones like never before. 
We suspect that Chrome OS won’t be immune to these problems either.</p> <h3 style="text-align: center;"><img src="/files/u154280/photo_editing.png" alt="Pixlr Photo Editor" title="Pixlr Photo Editor" width="600" /></h3> <p style="text-align: center;"><strong>Editing photos using Pixlr on a Chromebook</strong></p> <p><strong>Editing Photos/Videos:</strong></p> <p>We initially thought that we could use Adobe’s Creative Cloud on our Chromebook, but we were wrong, as Chrome OS does not support the online suite. If you need Photoshop, Illustrator, or InDesign, you’ll need a Windows PC to use these multimedia-editing apps.&nbsp;</p> <p>The built-in photo editor in Chrome OS is very limited, but luckily there’s a free Chrome app called <a title="pixlr" href="http://pixlr.com/" target="_blank">Pixlr</a> that can satisfy your photo-editing needs in a pinch. Pixlr gives you a variety of tools, including an eraser, smudge tool, selection tool, stamp tool, paint bucket, and red-eye reduction. Pixlr also lets you adjust your image’s size and create layers for those who like to stack effects when editing their photos. It’s not a Photoshop replacement, but at least you can lightly touch up photos.</p> <p>From what we know, there’s no way to edit videos on a Chromebook (other than the simple <a title="youtube video editor" href="http://www.youtube.com/editor" target="_blank">YouTube video editor</a>, that is), so again you’ll need a good old X86 PC for this task. 
If Adobe did start supporting Chromebooks, we could see them as cheap multimedia machines, but until that time comes, Chrome OS users are limited to editing photos.</p> <p style="text-align: center;"><img src="/files/u154280/gaming_cos.png" alt="Gaming in Chrome OS" title="Gaming in Chrome OS" width="600" /></p> <p style="text-align: center;"><strong>Playing Bastion on a Chromebook</strong></p> <p><strong>Playing Games:</strong></p> <p>As mentioned before, Chrome OS supports Adobe Flash, meaning that Flash games can be played on the OS. <a title="armor games" href="http://armorgames.com/" target="_blank">Armor Games</a>, a website that provides tons of free Flash games, ran well, but we did see occasional hiccups in our frame rate after a few minutes of play.&nbsp;</p> <p>There are a few indie titles available on Chrome OS, including Bastion and Flow. Bastion was a performance hog and pushed our tiny Chromebook to its limits, as the unit’s fan was blaring right when we started up the game. Flow, on the other hand, ran well and didn’t bring our Chromebook to its knees like Bastion did.&nbsp;</p> <p>We did miss Steam and Origin, too (only because of BF4, naturally), and we found Chrome OS doesn’t have any compelling Flash titles to keep PC gamers satisfied. Flash games make for fun 5–10-minute coffee-break diversions, but they don’t quench our hardcore-gaming thirst.</p> <p style="text-align: center;"><img src="/files/u154280/content_management.png" alt="Content Management" title="Content Management" width="600" /></p> <p><strong>Managing Content:</strong></p> <p>We didn’t like the lack of content management Chrome OS provides. There are no folders for Music, Documents, or Pictures like in Windows. All of your files are automatically put in your download folder, grouped from newest to oldest. We thought it was odd we couldn’t put any of these files onto our desktop. 
Not to mention, all this glorious content is stored on a “massive” 16GB SSD. It’s not all bad, as you can at least natively zip and unzip files in the OS with a right-click, which is a two-finger tap in Chrome OS.&nbsp;</p> <p>We thought it was strange that we couldn’t upload our music to Google Music using our Chromebook. Chrome OS doesn’t support this, which is just weird because you would think Google would support its own ecosystem. Simply put, there’s a huge lack of content-management features, and it’s something Google definitely needs to change if it wants greater penetration in the laptop market.</p> <p><strong>Conclusion:</strong></p> <p>While the Chromebook is very fast and functional, it lacks power-user apps like Photoshop and triple-A gaming titles. We see the device as great for college students looking for a computer they can get 8–9 hours out of while taking notes and browsing the web. Chrome OS can also stream the major video services, as we watched Amazon Prime Instant Video, Hulu, and Netflix with no problems. You’re ultimately getting a document, web-browsing, and streaming machine.&nbsp;</p> <p>There are more and more hybrid Windows 8.1 devices sporting X86 Intel Atom processors with fast 32GB or 64GB SSDs. These inexpensive Windows machines should challenge Chromebooks in the upcoming months and will make Chrome OS devices harder and harder to sell. We’ve already seen some $350–$400 tablet-laptops, like the Asus T100, which gives users Windows 8.1 in a portable form factor with a battery life that is comparable to the C720. 
We’d personally stick with an X86 Windows PC because it does a lot more than Chrome OS, giving us access to a never-ending abundance of apps and tools that Google’s browser OS just can’t rival at the moment.</p> http://www.maximumpc.com/can_you_survive_chromebook_alone_2014#comments Acer C720 chrome os chromebook cloud Gmail Google google docs google drive Hardware laptop netbook notebook online word processing Editor Blogs Mon, 24 Feb 2014 22:45:11 +0000 Chris Zele 27081 at http://www.maximumpc.com MSI N780 Lightning Review http://www.maximumpc.com/msi_n780_lightning_review <!--paging_filter--><h3>Too exotic (and expensive) for mere mortals</h3> <p>Back in October, we took a look at the MSI GTX 770 Lightning, which was a bit like a hot rod that had been given a little too much go-go juice. It was fast, and provided a plethora of performance options for horsepower junkies, but it was simply unstable, even at stock clocks. Undaunted, MSI followed it up by sending us an even bigger, badder board in the same series, the <strong>GTX 780 Lightning</strong>. Like the other Lightning cards, this is the cream of the crop from MSI in terms of board design, cooling, features, and clock speeds. In other words, if you are looking for the fastest non-Titan board MSI offers, this is it. Unfortunately for MSI, though this board was quite stable overall, we didn’t see enough of a performance advantage over other GTX 780 cards to justify its outrageous $750 sticker price.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/gtx_780_lightningv296_3d_small_0.jpg"><img src="/files/u152332/gtx_780_lightningv296_3d_small.jpg" alt="MSI includes a separate utility just for the card’s fans, letting you control the outer ones separate from the inner fan." 
title="MSI N780 Lightning" width="620" height="492" /></a></p> <p style="text-align: center;"><strong>MSI includes a separate utility just for the card’s fans, letting you control the outer ones separate from the inner fan.</strong></p> <p>To its credit, MSI has made this card pretty damned awesome and worthy of the Lightning moniker by infusing it with all kinds of badassery. For starters, it has a Lightning logo that changes color (green, yellow, or red) depending on GPU load. Twin rows of blue LEDs flicker on the top of the backplate, showing GPU activity, and there’s also a GPU Reactor PCB on top of the card with blue LEDs, which supposedly helps overclocking by allowing up to 300 percent more power to surge into the card. The reactor is easily removable though, in case it causes clearance issues. The card also uses twin BIOS chips for overclockers, and a redonk three-fan setup dubbed Tri-Frozr with PWM and its own separate fan-control software. Of course, it has “military-class” everything, including a custom PCB with 16-phase power, and hardware leads for directly monitoring voltages straight off the card.</p> <p>To test the card, we spent about a week overclocking it so we could take it to its maximum level of performance. We ended up with a power-target setting of 109 percent, a GPU offset of 135MHz, and a small memory overclock of 220MHz. This gave us a boost clock that cycled between 1,254MHz and 1,267MHz, which was stable. Whenever it ran at 1,280MHz for any period of time, it would hard lock, so this is as high as we could take it. Overall, that’s an excellent result, but no better than what we achieved with the less expensive Asus and EVGA boards. Under load, the 780 Lightning ran at 76 C and stayed very quiet, which is excellent but nothing unusual for these high-end boards.</p> <p>Looking at the benchmark chart, you can see why we’re puzzled by this card’s price tag. 
It performed exactly the same in our testing as the other top-tier GTX 780 boards, yet costs $90 more than the EVGA card and $40 more than the Asus board. Now, if you’re looking to do competition-level overclocking, we imagine the Lightning is the board you want, but for people who just want an air-cooled GPU that is quiet and overclocks well, it’s tough to recommend this board given its exorbitant price tag.</p> <p><strong>&nbsp;$750,</strong> <a href="http://www.msi.com/language/">www.msi.com</a></p> http://www.maximumpc.com/msi_n780_lightning_review#comments 2013 December issues 203 graphics card Hardware maximum pc MSI N780 Lightning Review December 2013 Reviews Videocards Tue, 11 Feb 2014 09:20:06 +0000 Josh Norem 27235 at http://www.maximumpc.com PC Performance Tested http://www.maximumpc.com/pc_performance_tested_2014 <!--paging_filter--><h3><a class="thickbox" style="font-size: 10px; text-align: center;" href="/files/u152332/nvidia_geforce_gtx_780-top_small_0.jpg"><img src="/files/u152332/nvidia_geforce_gtx_780-top_small.jpg" alt="Nvidia’s new GK110-based GTX 780 takes on two ankle-biter GTX 660 Ti GPUs." title="Nvidia’s new GK110" width="250" height="225" style="float: right;" /></a>With our lab coats donned, our test benches primed, and our benchmarks at the ready, we look for answers to nine of the most burning performance-related questions</h3> <p>If there’s one thing that defines the Maximum PC ethos, it’s an obsession with Lab-testing. What better way to discern a product’s performance capabilities, or judge the value of an upgrade, or simply settle a heated office debate? This month, we focus our obsession on several of the major questions on the minds of enthusiasts. Is liquid cooling always more effective than air? Should serious gamers demand PCIe 3.0? When it comes to RAM, are higher clocks better? On the surface, the answers might seem obvious. But, as far as we’re concerned, nothing is for certain until it’s put to the test. 
We’re talking tests that isolate a subsystem and measure results using real-world workloads. Indeed, we not only want to know if a particular technology or piece of hardware is truly superior, but also by how much. After all, we’re spending our hard-earned skrilla on this gear, so we want our purchases to make real-world sense. Over the next several pages, we put some of the most pressing PC-related questions to the test. If you’re ready for the answers, read on.</p> <h4>Core i5-4670K vs. Core i5-3570K vs. FX-8350</h4> <p>People like to read about the $1,000 high-end parts, but the vast majority of enthusiasts don’t buy at that price range. In fact, they don’t even buy the $320 chips. No, the sweet spot for many budget enthusiasts is around $220. To find out which chip is the fastest midrange part, we ran Intel’s new <a title="4670k" href="http://ark.intel.com/products/75048/" target="_blank">Haswell Core i5-4670K</a> against the current-champ <a title="i5 3570K" href="http://ark.intel.com/products/65520" target="_blank">Core i5-3570K</a> as well as AMD’s <a title="vishera fx-8350" href="http://www.maximumpc.com/article/features/vishera_review" target="_blank">Vishera FX-8350</a>.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/fx_small_0.jpg"><img src="/files/u152332/fx_small.jpg" alt="AMD’s FX-8350 has two cores up on the competition, but does that matter?" width="620" height="607" /></a></p> <p style="text-align: center;"><strong>AMD’s FX-8350 has two cores up on the competition, but does that matter?</strong></p> <p><strong>The Test:</strong> For our test, we socketed the Core i5-4670K into an Asus Z87 Deluxe with 16GB of DDR3/1600, an OCZ Vertex 3, a GeForce GTX 580 card, and Windows 8. For the Core i5-3570K, we used the same hardware in an Asus P8Z77-V Premium board, and the FX-8350 was tested in an Asus CrossHair V Formula board. 
We ran the same set of benchmarks that we used in our original review of the FX-8350, published in the Holiday 2012 issue.</p> <p><strong>The Results:</strong> First, the most important factor in the budget category is the price. As we wrote this, the street price of the Core i5-4670K was $240, the older Core i5-3570K was in the $220 range, and AMD’s FX-8350 went for $200. The 4670K is definitely on the outer edge of the budget sweet spot, while the AMD is a bit cheaper.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/haswell_small_5.jpg"><img src="/files/u152332/haswell_small_4.jpg" alt="Intel’s Haswell Core i5-4670K slots right into the high end of the midrange." title="Haswell" width="620" height="620" /></a></p> <p style="text-align: center;"><strong>Intel’s Haswell Core i5-4670K slots right into the high end of the midrange.</strong></p> <p>One thing that’s not in dispute is the performance edge the new Haswell i5 part has. It stepped away from its Ivy Bridge sibling in every test we ran by respectable margins. And while the FX-8350 pulled close enough to the Core i5-3570K to go home with some multithreaded victories in its pocket, it was definitely kept humble by Haswell. The Core i5-4670K plain-and-simply trashed the FX-8350 in the vast majority of the tests that can’t push all eight of its cores. Even worse, in the multithreaded tests where the FX-8350 squeezed past the Ivy Bridge Core i5-3570K, Haswell either handily beat or tied the chip with twice its cores.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/ivybridge_small_0.jpg"><img src="/files/u152332/ivybridge_small.jpg" alt="The Core i5-3570K was great in its day, but it needs more than that to stay on top." 
title="Core i5-3570K" width="620" height="622" /></a></p> <p style="text-align: center;"><strong>The Core i5-3570K was great in its day, but it needs more than that to stay on top.</strong></p> <p>Even folks concerned with bang-for-the-buck will find the Core i5-4670K makes a compelling argument. Yes, it’s 20 percent more expensive than the FX-8350, but in some of our benchmarks, it was easily that much faster or more. In Stitch.Efx 2.0, for example, the Haswell was 80 percent faster than the Vishera. Ouch.</p> <p>So where does this leave us? For first place, we’re proclaiming the Core i5-4670K the midrange king by a margin wider than Louie Anderson. Even the most ardent fanboys wearing green-tinted glasses or sporting an IVB4VR license plate can’t disagree.</p> <p>For second place, however, we’re going to get all controversial and call it for the FX-8350, by a narrow margin. Here’s why: The FX-8350 actually holds up against the Core i5-3570K in a lot of benchmarks, has an edge in multithreaded apps, and its AM3+ socket has a far longer roadmap than LGA1155, which is on the fast track to Palookaville.</p> <p>Granted, Ivy Bridge on LGA1155 is still a great option, especially when bought on a discounted combo deal, but it’s a dead man walking, and our general guidance for those who like to upgrade is to stick to sockets that still have a pulse. Let’s not even mention that LGA1155 is the only one here with a pathetic two SATA 6Gb/s ports. Don’t agree? 
Great, because we have an LGA1156 motherboard and CPU to sell you.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 627px; height: 270px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light">Core i5-4670K</th> <th>Core i5-3570K</th> <th>FX-8350</th> </tr> </thead> <tbody> <tr> <td class="item"><strong>POV Ray 3.7 RC3 (sec)</strong></td> <td class="item-dark"><strong>168.53</strong></td> <td> <p>227.75</p> </td> <td>184.8</td> </tr> <tr> <td><strong>Cinebench 10 Single-Core</strong></td> <td><strong>8,500</strong></td> <td>6,866</td> <td>4,483</td> </tr> <tr> <td class="item"><strong>Cinebench 11.5</strong></td> <td class="item-dark"><strong>6.95<br /></strong></td> <td>6.41</td> <td><strong>6.90</strong></td> </tr> <tr> <td><strong>7Zip 9.20</strong></td> <td>17,898</td> <td>17,504</td> <td><strong>23,728</strong></td> </tr> <tr> <td><strong>Fritz Chess</strong></td> <td><strong>13,305</strong></td> <td>11,468</td> <td>12,506</td> </tr> <tr> <td class="item"><strong>Premiere Pro CS6 (sec)</strong></td> <td class="item-dark"><strong>2,849</strong></td> <td>3,422</td> <td>5,220</td> </tr> <tr> <td class="item"><strong>HandBrake Blu-ray encode&nbsp; (sec)</strong></td> <td class="item-dark"><strong>9,042</strong></td> <td>9,539</td> <td><strong>8,400</strong></td> </tr> <tr> <td><strong>x264 5.01 Pass 1 (fps)</strong></td> <td><strong>66.3<br /></strong></td> <td>57.1</td> <td>61.3</td> </tr> <tr> <td><strong>x264 5.01 Pass 2 (fps)</strong></td> <td><strong>15.8</strong></td> <td>12.7</td> <td><strong>15</strong></td> </tr> <tr> <td><strong>Sandra (GB/s)</strong></td> <td><strong>21.6</strong></td> <td><strong>21.3</strong></td> <td>18.9</td> </tr> <tr> <td><strong>Stitch.Efx 2.0 (sec)</strong></td> <td><strong>836</strong></td> <td>971</td> <td>1,511</td> </tr> <tr> <td><strong>ProShow Producer 
5 (sec)</strong></td> <td><strong>1,275</strong></td> <td>1,463</td> <td>1,695</td> </tr> <tr> <td><strong>STALKER: CoP low-res (fps)</strong></td> <td><strong>173.5</strong></td> <td>167.3</td> <td>132.1</td> </tr> <tr> <td><strong>3DMark 11 Physics</strong></td> <td><strong>7,938</strong></td> <td>7,263</td> <td>7,005</td> </tr> <tr> <td><strong>PC Mark 7 Overall</strong></td> <td><strong>6,428</strong></td> <td>5,582</td> <td>4,408</td> </tr> <tr> <td><strong>PC Mark 7 Storage</strong></td> <td>5,300</td> <td><strong>5,377</strong></td> <td>4,559</td> </tr> <tr> <td><strong>Valve Particle (fps)</strong></td> <td><strong>180</strong></td> <td>155</td> <td>119</td> </tr> <tr> <td><strong>Heaven 3.0 low-res (fps)</strong></td> <td><strong>139.4</strong></td> <td>138.3</td> <td>134.4</td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. Test bed described in the text.</em></p> <h4>Hyper-Threading vs. No Hyper-Threading<em>&nbsp;</em></h4> <p><a title="hyper threading" href="http://www.intel.com/content/www/us/en/architecture-and-technology/hyper-threading/hyper-threading-technology.html" target="_blank">Hyper-Threading</a> came out more than a decade ago with the original 3.06GHz Pentium 4, and was mostly a dud. Few apps were multithreaded and even Windows’ own scheduler didn’t know how to deal with HT, making some apps actually slow down when the feature was enabled. But the tech overcame those early hurdles to grow into a worthwhile feature today. Still, builders are continually faced with choosing between procs with and without HT, so we wanted to know definitively how much it matters. <em>&nbsp;</em></p> <p><strong>The Test:</strong> Since we haven’t actually run numbers on HT in some time, we broke out a Core i7-4770K and ran tests with HT turned on and off. 
We used a variety of benchmarks with differing degrees of threadedness to test the technology’s strengths and weaknesses.</p> <p><strong>The Results:</strong> One look at our results and you can tell HT is well worth it if your applications can use the available threads. We saw benefits of 10–30 percent from HT in some apps. But if your app can’t use the threads, you gain nothing. And in rare instances, it appears to hurt performance slightly—as in Hitman: Absolution when run to stress the CPU rather than the GPU. Our verdict is that you should pay for HT, but only if your chores include 3D modeling, video encoding or transcoding, or other thread-heavy tasks. Gamers who occasionally transcode videos, for example, would get more bang for their buck from a Core i5-4670K.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 627px; height: 270px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light">HT Off</th> <th>HT On</th> </tr> </thead> <tbody> <tr> <td class="item"><strong>PCMark 7 Overall</strong></td> <td class="item-dark">6,308</td> <td> <p><strong>6,348</strong></p> </td> </tr> <tr> <td><strong>Cinebench 11.5</strong></td> <td>6.95</td> <td><strong>8.88</strong></td> </tr> <tr> <td class="item"><strong>Stitch.EFx 2.0 (sec)</strong></td> <td class="item-dark">772</td> <td>772</td> </tr> <tr> <td><strong>ProShow Producer 5.0&nbsp; (sec)</strong></td> <td>1,317</td> <td><strong>1,314</strong></td> </tr> <tr> <td><strong>Premiere Pro CS6 (sec)</strong></td> <td>2,950</td> <td><strong>2,522</strong></td> </tr> <tr> <td class="item"><strong>HandBrake 0.9.9 (sec)</strong></td> <td class="item-dark">1,200</td> <td><strong>1,068</strong></td> </tr> <tr> <td class="item"><strong>3DMark 11 Overall</strong></td> <td class="item-dark">X2,210</td> <td>X2,209</td> </tr> <tr> <td><strong>Valve Particle Test 
(fps)</strong></td> <td>191</td> <td><strong>226</strong></td> </tr> <tr> <td><strong>Hitman: Absolution, low res (fps)</strong></td> <td><strong>92</strong></td> <td>84</td> </tr> <tr> <td><strong>Total War: Shogun 2 CPU Test (fps)</strong></td> <td><strong>42.4</strong></td> <td>41</td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. We used a Core i7-4770K on an Asus Z87 Deluxe, with a Neutron GTX 240 SSD, a GeForce GTX 580, 16GB of DDR3/1600, and 64-bit Windows 8</em></p> <p><em>Click the next page to read about air cooling vs. water cooling</em></p> <h3> <hr /></h3> <h3>Air Cooling vs. Water Cooling<em>&nbsp;</em></h3> <p>There are two main ways to chill your CPU: a heatsink with a fan on it, or a closed-loop liquid cooler (CLC). Unlike a custom loop, you don't need to periodically drain and flush the system or check it for leaks. The "closed" part means that it's sealed and integrated. This integration also reduces manufacturing costs and makes the setup much easier to install. If you want maximum overclocks, custom loops are the best way to go. But it’s a steep climb in cost for a modest improvement beyond what current closed loops can deliver. <em>&nbsp;</em></p> <p>But air coolers are not down for the count. They're still the easiest to install and the cheapest. However, the prices between air and water are so close now that it's worth taking a look at the field to determine what's best for your budget.<em>&nbsp;</em></p> <p><strong>The Test:</strong> To test the two cooling methods, we dropped them into a rig with a hex-core Intel Core i7-3960X overclocked to 4.25GHz on an Asus Rampage IV Extreme motherboard, inside a Corsair 900D. By design, it's kind of a beast and tough to keep cool.</p> <h4>The Budget Class<em>&nbsp;</em></h4> <p><strong>The Results:</strong> At this level, the Cooler Master 212 Evo is legend…ary. 
It runs cool and quiet, it's easy to pop in, it can adapt to a variety of sockets, it's durable, and it costs about 30 bucks. Despite the 3960X's heavy load, the 212 Evo averages about 70 degrees C across all six cores, with a room temperature of about 22 C, or 71.6 F. Things don’t tend to get iffy until 80 C, so there's room to go even higher. Not bad for a cooler with one 120mm fan on it.</p> <p>Entry-level water coolers cost substantially more, unless you're patient enough to wait for a fire sale. They require more materials, more manufacturing, and more complex engineering. The Cooler Master Seidon 120M is a good example of the kind of unit you'll find at this tier. It uses a standard 120mm fan attached to a standard 120mm radiator (or "rad") and currently has a street price of $60. But in our tests, its thermal performance was about the same as, or worse than, the 212 Evo's. In order to meet an aggressive price target, you have to make some compromises. The pump is smaller than average, for example, and the copper block you install on top of the CPU is not as thick. The Seidon was moderately quieter, but we have to give the nod to the 212 Evo when it comes to raw performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/coolermaster_212evo_small_2.jpg"><img src="/files/u152332/coolermaster_212evo_small_1.jpg" alt="The Cooler Master 212 Evo has arguably the best price-performance ratio around." title="Cooler Master 212" width="620" height="715" /></a></p> <p style="text-align: center;"><strong>The Cooler Master 212 Evo has arguably the best price-performance ratio around.</strong></p> <h4>The Performance Class<em>&nbsp;</em></h4> <p><strong>The Results:</strong> While a CLC has trouble scaling its manufacturing costs down to the budget level, there's a lot more headroom when you hit the $100 mark. 
The NZXT Kraken X60 CLC is one of the best examples in this class; its dual 140mm fans and 280mm radiator can unload piles of heat without generating too much noise, and it has a larger pump and apparently larger tubes than the Seidon 120M. Our tests bear out the promise of the X60's design, with its "quiet" setting delivering a relatively chilly 66 C, or about 45 degrees above the ambient room temp.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/nzxt_krakenx60_small_3.jpg"><img src="/files/u152332/nzxt_krakenx60_small_1.jpg" alt="It may not look like much, but the Kraken X60 is the Ferrari of closed-loop coolers." title="Kraken X60" width="620" height="425" /></a></p> <p style="text-align: center;"><strong>It may not look like much, but the Kraken X60 is the Ferrari of closed-loop coolers.</strong></p> <p>Is there any air cooler that can keep up? Well, we grabbed a Phanteks TC14PE, which uses two heatsinks instead of one, dual 140mm fans, and retails at $85–$90. It performed only a little cooler than the 212 Evo, but it did so very quietly, like a ninja. At its quiet setting, it trailed behind the X60 by 5 C. It may not sound like much, but that extra 5 C of headroom means a higher potential overclock. 
So, water wins the high end.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">Seidon 120M Quiet / Performance Mode</span></th> <th><span style="font-family: times new roman,times;">212 Evo<br />Quiet / Performance Mode</span></th> <th><span style="font-family: times new roman,times;">Kraken X60 Quiet / Performance Mode</span></th> <th><span style="font-family: times new roman,times;">TC14PE<br />Quiet / Performance Mode</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>Ambient Air</strong></td> <td class="item-dark">22.1 / 22.2</td> <td> <p>20.5 / 20</p> </td> <td>20.9 / 20.7</td> <td>20 / 19.9</td> </tr> <tr> <td><strong>Idle Temperature</strong></td> <td>38 / 30.7</td> <td>35.5 / 30.5</td> <td>29.7 / 28.8</td> <td>32 / <strong>28.5</strong></td> </tr> <tr> <td class="item"><strong>Load Temperature</strong></td> <td class="item-dark">78.3 / 70.8</td> <td>70 / 67.3</td> <td>66 / <strong>61.8</strong></td> <td>70.3 / 68.6</td> </tr> <tr> <td><strong>Load - Ambient</strong></td> <td>56.2 / 48.6</td> <td>49.5 / 47.3</td> <td>45.1 / <strong>41.1</strong></td> <td>50.3 / 48.7</td> </tr> </tbody> </table> </div> <p><em>All temperatures in degrees Celsius. Best scores bolded.</em></p> <h4>Is High-Bandwidth RAM Worth It?<em>&nbsp;</em></h4> <p>Today, you can get everything from vanilla DDR3/1333 all the way to exotic-as-hell DDR3/3000. The question is: Is it actually worth paying for anything more than the garden-variety RAM? <em>&nbsp;</em></p> <p><strong>The Test:</strong> For our test, we mounted a Core i7-4770K into an Asus Z87 Deluxe board and fitted it with AData modules at DDR3/2400, DDR3/1600, and DDR3/1333. 
We then picked a variety of real-world (and one synthetic) tests to see how the three compared.</p> <p><strong>The Results:</strong> First, let us state that if you’re running integrated graphics and you want better 3D performance, pay for higher-clocked RAM. With discrete graphics, though, the advantage isn’t as clear. We had several apps that saw no benefit from going from 1,333MHz to 2,400MHz. In others, though, we saw a fairly healthy boost, 5–10 percent, by going from standard DDR3/1333 to DDR3/2400. The shocker came in Dirt 3, which we ran in low-quality modes so as not to be bottlenecked by the GPU. At low resolution and low image quality, we saw an astounding 18 percent boost. <em>&nbsp;</em></p> <p>To bring you back down to earth, you should know that cranking the resolution in the game all but erased the difference. To see any actual benefit, we think you’d really need a tri-SLI GeForce GTX 780 setup, and even then we expect that the vast majority of games won’t give you that scaling.<em>&nbsp;</em></p> <p>We think the sweet spot for price/performance is either DDR3/1600 or DDR3/1866.<em><br /></em></p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">DDR3/1333</span></th> <th><span style="font-family: times new roman,times;">DDR3/1600</span></th> <th><span style="font-family: times new roman,times;">DDR3/2400</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>Stitch.Efx 2.0 (sec)</strong></td> <td class="item-dark">776</td> <td> <p>773</p> </td> <td><strong>763</strong></td> </tr> <tr> <td><strong>PhotoMatix HDR (sec)</strong></td> <td>181</td> <td>180</td> <td>180</td> </tr> <tr> <td class="item"><strong>ProShow Producer 5.0 (sec) <br /></strong></td> <td 
class="item-dark">1,370</td> <td>1,337</td> <td><strong>1,302</strong></td> </tr> <tr> <td><strong>HandBrake 0.9.9 (sec)</strong></td> <td>1,142</td> <td>1,077</td> <td><strong>1,037</strong></td> </tr> <tr> <td><strong>3DMark Overall</strong></td> <td>2,211</td> <td>2,214</td> <td><strong>2,215</strong></td> </tr> <tr> <td><strong>Dirt 3 Low Quality (fps)</strong></td> <td>234</td> <td>247.6</td> <td><strong>272.7</strong></td> </tr> <tr> <td><strong>Price for two 4GB DIMMs (USD)</strong></td> <td>$70</td> <td>$73</td> <td>$99</td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. Application tests are in seconds (lower is better); 3DMark and Dirt 3 scores are higher-is-better.</em></p> <p><em>Click the next page to see how two midrange graphics cards stack up against one high-end GPU!</em></p> <h3> <hr /></h3> <h3>One High-End GPU vs. Two Midrange GPUs<em>&nbsp;</em></h3> <p>One of the most common questions we get here at Maximum PC, aside from details about our lifting regimen, is whether to upgrade to a high-end GPU or run two less-expensive cards in SLI or CrossFire. It’s a good question, since high-end GPUs are expensive, and cards that are two rungs below them in the product stack cost about half the price, which naturally raises the question: Are two $300 cards faster than a single $600 card? Before we jump to the tests, note that dual-card setups suffer from a unique set of issues. First is the frame-pacing situation, where the cards are unable to deliver frames evenly, so even though the overall frame rate is high, there is still micro-stutter on the screen. Nvidia and AMD dual-GPU configs both suffer from this, but Nvidia’s SLI has less of a problem than AMD’s CrossFire at this time. Both companies also need to offer drivers to allow games and benchmarks to see both GPUs, but they are equally good at delivering drivers the day games are released, so the days of waiting two weeks for a driver are largely over. 
<em>&nbsp;</em></p> <h4>2x Nvidia <a title="660 Ti" href="http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660ti" target="_blank">GTX 660 Ti</a> vs. <a title="geforce gtx 780" href="http://www.maximumpc.com/article/news/geforce_gtx_780_benchmarks" target="_blank">GTX 780</a><em>&nbsp;</em></h4> <p><strong>The Test:</strong> We considered using two $250 GTX 760 GPUs for this test, but Nvidia doesn't have a $500 GPU to test them against, and since this is Maximum PC, we rounded up one model from the "mainstream" to the $300 GTX 660 Ti. This video card was recently replaced by the GTX 760, causing its price to drop down to a bit below $300, but since that’s its MSRP we are using it for this comparison. We got two of them to go up against the GTX 780, which costs roughly $650, so it's not a totally fair fight, but we figured it's close enough for government work. We ran our standard graphics test suite in both single- and dual-card configurations. <em>&nbsp;</em></p> <p><strong>The Results:</strong> It looks like our test was conclusive—two cards in SLI provide a slightly better gaming experience than a single badass card, taking top marks in seven out of nine tests. And they cost less, to boot. Nvidia’s frame-pacing was virtually without issues, too, so we don’t have any problem recommending Nvidia SLI at this time. It is the superior cost/performance setup as our benchmarks show.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/nvidia_geforce_gtx_780-top_small_0.jpg"><img src="/files/u152332/nvidia_geforce_gtx_780-top_small.jpg" alt="Nvidia’s new GK110-based GTX 780 takes on two ankle-biter GTX 660 Ti GPUs." 
title="Nvidia’s new GK110" width="620" height="559" /></a></p> <p style="text-align: center;"><strong>Nvidia’s new GK110-based GTX 780 takes on two ankle-biter GTX 660 Ti GPUs.</strong></p> <h4>2x <a title="7790" href="http://www.maximumpc.com/best_cheap_graphics_card_2013" target="_blank">Radeon HD 7790</a> vs. <a title="7970" href="http://www.maximumpc.com/article/features/Asus_680_7970" target="_blank">Radeon HD 7970</a>&nbsp;GHz<em></em></h4> <p><strong>The Test:</strong> For our AMD comparison, we took two of the recently released HD 7790 cards, at $150 each, and threw them into the octagon with a $400 GPU, the PowerColor Radeon HD 7970 Vortex II, which isn't technically a "GHz" board, but is clocked at 1,100MHz, so we think it qualifies. We ran our standard graphics test suite in both single- and dual-card configurations.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/reviews-10649_small_0.jpg"><img src="/files/u152332/reviews-10649_small.jpg" alt="Two little knives of the HD 7790 ilk take on the big gun Radeon HD 7970." title="HD 7790" width="620" height="663" /></a></p> <p style="text-align: center;"><strong>Two little knives of the HD 7790 ilk take on the big gun Radeon HD 7970.</strong></p> <p><strong>The Results:</strong> Our AMD tests resulted in a very close battle, with the dual-card setup taking the win by racking up higher scores in six out of nine tests, and the single HD 7970 card taking top spot in the other three tests. But what you can’t see in the chart is that the dual HD 7790 cards were totally silent while the HD 7970 card was loud as hell. Also, AMD has acknowledged the micro-stutter problem with CrossFire, and promises a software fix for it, but unfortunately that fix is going to arrive right as we are going to press on July 31. 
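</p>

<p>Frame pacing is easy to illustrate with per-frame render times: two configurations can report identical average frame rates while one of them alternates short and long frames. A toy sketch with hypothetical frame times (illustrative numbers, not our captured data):</p>

```python
# Two hypothetical frame-time logs in milliseconds. Both average
# 50 fps, but the second alternates short/long frames the way a
# poorly paced dual-GPU setup can.
paced    = [20.0] * 10
stuttery = [10.0, 30.0] * 5

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def worst_frame_ms(times_ms):
    return max(times_ms)
```

<p>An on-screen fps counter scores the two logs identically at 50 fps; it's the worst-case frame time (20ms versus 30ms here) that exposes the stutter.</p>

<p>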
Even without it, gameplay seemed smooth, and the duo is clearly faster, so it gets our vote as the superior solution, at least in this config.<em><br /></em></p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">GTX 660 Ti SLI</span></th> <th><span style="font-family: times new roman,times;">GTX 780</span></th> <th><span style="font-family: times new roman,times;">Radeon HD 7790 CrossFire</span></th> <th><span style="font-family: times new roman,times;">Radeon HD 7970 GHz</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>3DMark Fire Strike</strong></td> <td class="item-dark"><strong>8,858</strong></td> <td> <p>8,482</p> </td> <td><strong>8,842</strong></td> <td>7,329</td> </tr> <tr> <td><strong>Catzilla (Tiger) Beta</strong></td> <td><strong>7,682</strong></td> <td>6,933</td> <td><strong>6,184</strong></td> <td>4,889</td> </tr> <tr> <td class="item"><strong>Unigine Heaven 4.0 (fps)<br /></strong></td> <td class="item-dark">33</td> <td><strong>35<br /></strong></td> <td><strong>30</strong></td> <td>24</td> </tr> <tr> <td><strong>Crysis 3 (fps)</strong></td> <td><strong>26</strong></td> <td>24</td> <td>15</td> <td><strong>17</strong></td> </tr> <tr> <td><strong>Shogun 2 (fps)</strong></td> <td><strong>60</strong></td> <td>48</td> <td><strong>51</strong></td> <td>43</td> </tr> <tr> <td><strong>Far Cry 3 (fps)</strong></td> <td><strong>41</strong></td> <td>35</td> <td>21</td> <td><strong>33</strong></td> </tr> <tr> <td><strong>Metro: Last Light (fps)</strong></td> <td><strong>24</strong></td> <td>22</td> <td>13</td> <td><strong>14</strong></td> </tr> <tr> <td><strong>Tomb Raider (fps)</strong></td> <td>18</td> <td><strong>25</strong></td> 
<td><strong>24</strong></td> <td>20</td> </tr> <tr> <td><strong>Battlefield 3 (fps)</strong></td> <td><strong>56</strong></td> <td>53</td> <td><strong>57</strong></td> <td>41</td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus P9X79 motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 7 Ultimate. All tests, except for the 3DMark tests, are run at 2560x1600 with 4X AA.</em></p> <h3>PCI Express 2.0 vs. PCI Express 3.0<em></em></h3> <p>PCI Express is the specification that governs the amount of bandwidth available between the CPU and the PCI Express slots on your motherboard. We've recently made the jump from version 2.0 to version 3.0, and the PCI Express interface on all late-model video cards is now PCI Express 3.0, causing many frame-rate addicts to question the sanity of placing a PCIe 3.0 GPU into a PCIe 2.0 slot on their motherboard. The reason is that PCIe 3.0 has quite a bit more theoretical bandwidth than PCIe 2.0. Specifically, one PCIe 2.0 lane can transmit 500MB/s in one direction, while a PCIe 3.0 lane can pump up to 985MB/s, almost double the bandwidth. Multiply that by the 16 lanes a video card uses, and the difference is substantial. However, that extra bandwidth only matters if it’s actually needed, which is what we wanted to find out. <em></em></p> <p><strong>The Test:</strong> We plugged an Nvidia GTX Titan into our Asus P9X79 board and ran several of our gaming tests with the top PCI Express x16 slot alternately set to PCIe 3.0 and PCIe 2.0. On this particular board you can switch the setting in the BIOS. <em></em></p> <p><strong>The Results:</strong> We had heard previously that there was very little difference between PCIe 2.0 and PCIe 3.0 on current systems, and our tests back that up. 
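</p>

<p>For reference, the slot-level ceilings implied by those per-lane figures are quick to work out; a minimal sketch:</p>

```python
# One-direction bandwidth per lane (MB/s), using the per-lane
# figures cited in the text: PCIe 2.0 = 500, PCIe 3.0 = 985.
PER_LANE_MBS = {"2.0": 500, "3.0": 985}

def slot_gbs(gen, lanes=16):
    """Theoretical one-direction slot bandwidth in GB/s."""
    return PER_LANE_MBS[gen] * lanes / 1000.0

x16_gen2 = slot_gbs("2.0")  # 8.0 GB/s
x16_gen3 = slot_gbs("3.0")  # 15.76 GB/s
```

<p>That puts a Gen 3.0 x16 slot at roughly 15.8GB/s versus 8GB/s for Gen 2.0, in one direction, on paper.</p>

<p>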
In every single test, Gen 3.0 was faster, but the difference is so small it’s very hard for us to believe that PCIe 2.0 is being saturated by our GPU. It’s also quite possible that one would see more pronounced results using two or more cards, but we wanted to “keep it real” and just use one card. <em></em></p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">GTX Titan PCIe 2.0</span></th> <th><span style="font-family: times new roman,times;">GTX Titan PCIe 3.0</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>3DMark Fire Strike</strong></td> <td class="item-dark">9,363</td> <td> <p><strong>9,892</strong></p> </td> </tr> <tr> <td><strong>Unigine Heaven 4.0 (fps)</strong></td> <td>37</td> <td><strong>40</strong></td> </tr> <tr> <td class="item"><strong>Crysis 3 (fps)<br /></strong></td> <td class="item-dark">31</td> <td><strong>32<br /></strong></td> </tr> <tr> <td><strong>Shogun 2 (fps)</strong></td> <td>60</td> <td><strong>63</strong></td> </tr> <tr> <td><strong>Far Cry 3 (fps)</strong></td> <td>38</td> <td><strong>42</strong></td> </tr> <tr> <td><strong>Metro: Last Light (fps)</strong></td> <td>22</td> <td><strong>25</strong></td> </tr> <tr> <td><strong>Tomb Raider (fps)</strong></td> <td>22</td> <td><strong>25</strong></td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus P9X79 motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 7 Ultimate. All games are run at 2560x1600 with 4X AA except for the 3DMark tests.</em></p> <h3>PCIe x8 vs. 
PCIe x16</h3> <p>PCI Express expansion slots vary in both physical size and the amount of bandwidth they provide. The really long slots are called x16 slots, as they provide 16 lanes of PCIe bandwidth, and that’s where our video cards go, for obvious reasons. Almost all of the top slots in a motherboard (those closest to the CPU) are x16, but sometimes those 16 lanes are divided between two slots, so what might look like a x16 slot is actually a x8 slot. The tricky part is that sometimes the slots below the top slot only offer eight lanes of PCIe bandwidth, and sometimes people need to skip that top slot because their CPU cooler is in the way or water cooling tubes are coming out of a radiator in that location. Or you might be running a dual-card setup, and if you use a x8 slot for one card, it will force the x16 slot to run at x8 speeds. Here’s the question: Since a x16 slot provides 8GB/s of bandwidth in one direction at PCIe 2.0 rates, and a x8 slot pumps 4GB/s, is your performance hobbled by running at x8?</p> <p><strong>The Test:</strong> We wedged a GTX Titan first into a x16 slot and then a x8 slot on our Asus P9X79 motherboard and ran our gaming tests in order to compare the difference.</p> <p><strong>The Results:</strong> We were surprised by these results, which show x16 to be a clear winner. Sure, it seems obvious, but we didn’t think even current GPUs were saturating the x8 interface; apparently they are, so this is an easy win for x16.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/asus_p9x79_small_0.jpg"><img src="/files/u152332/asus_p9x79_small.jpg" alt="The Asus P9X79 offers two x16 slots (blue) and two x8 slots (white)." 
title="Asus P9X79" width="620" height="727" /></a></p> <p style="text-align: center;"><strong>The Asus P9X79 offers two x16 slots (blue) and two x8 slots (white).</strong></p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">GTX Titan PCIe x16</span></th> <th><span style="font-family: times new roman,times;">GTX Titan PCIe x8</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>3DMark Fire Strike</strong></td> <td class="item-dark"><strong>9,471</strong></td> <td> <p>9,426</p> </td> </tr> <tr> <td><strong>Catzilla (Tiger) Beta</strong></td> <td><strong>7,921</strong></td> <td>7,095</td> </tr> <tr> <td class="item"><strong>Unigine Heaven 4.0 (fps)<br /></strong></td> <td class="item-dark"><strong>40</strong></td> <td>36</td> </tr> <tr> <td><strong>Crysis 3 (fps)</strong></td> <td>32</td> <td><strong>37</strong></td> </tr> <tr> <td><strong>Shogun 2 (fps)</strong></td> <td><strong>64</strong></td> <td>56</td> </tr> <tr> <td><strong>Far Cry 3 (fps)</strong></td> <td><strong>43</strong></td> <td>39</td> </tr> <tr> <td><strong>Metro: Last Light (fps)</strong></td> <td><strong>25</strong></td> <td>22</td> </tr> <tr> <td><strong>Tomb Raider (fps)</strong></td> <td><strong>25</strong></td> <td>23</td> </tr> <tr> <td><strong>Battlefield 3 (fps)</strong></td> <td><strong>57</strong></td> <td>50</td> </tr> </tbody> </table> </div> <p><em>Tests performed on an Asus P9X79 Deluxe motherboard. </em></p> <h3>IDE vs. AHCI<em></em></h3> <p>If you go into your BIOS and look at the options for your motherboard’s SATA controller, you usually have three options: IDE, AHCI, and RAID. 
RAID is for when you have more than one drive, so for running just a lone wolf storage device, you have AHCI and IDE. For ages we always just ran IDE, as it worked just fine. But now there’s AHCI too, which stands for Advanced Host Controller Interface, and it supports features IDE doesn’t, such as Native Command Queuing (NCQ) and hot swapping. Some people also claim that AHCI is faster than IDE due to NCQ and the fact that it's newer. Also, for SSD users, IDE does not support the Trim command, so AHCI is critical to an SSD's well-being over time, but is there a speed difference between IDE and AHCI for an SSD? We set out to find out. <em></em></p> <p><strong>The Test:</strong> We enabled IDE on our SATA controller in the BIOS, then installed our OS. Next, we added our Corsair test SSD and ran a suite of storage tests. We then enabled AHCI, reinstalled the OS, re-added the Corsair Neutron test SSD, and re-ran all the tests.<em></em></p> <p><strong>The Results:</strong> We haven’t used IDE in a while, but we assumed it would allow our SSD to run at full speed even if it couldn’t NCQ or hot-swap anything. And we were wrong. Dead wrong. Performance with the SATA controller set to IDE was abysmal, plain and simple. <em></em></p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">Corsair Neutron GTX IDE</span></th> <th><span style="font-family: times new roman,times;">Corsair Neutron GTX AHCI</span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>CrystalDiskMark</strong></td> <td class="item-dark">&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>Avg. Sustained Read (MB/s)</strong></td> <td>224</td> <td><strong>443</strong></td> </tr> <tr> <td class="item"><strong>Avg. 
Sustained Write (MB/s)<br /></strong></td> <td class="item-dark">386</td> <td><strong>479</strong></td> </tr> <tr> <td><strong>AS SSD - Compressed Data</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>Avg. Sustained Read (MB/s)</strong></td> <td>210</td> <td><strong>514</strong></td> </tr> <tr> <td><strong>Avg. Sustained Write (MB/s)</strong></td> <td>386</td> <td><strong>479</strong></td> </tr> <tr> <td><strong>ATTO</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>64KB File Read (MB/s, 4QD)</strong></td> <td>151</td> <td><strong>351</strong></td> </tr> <tr> <td><strong>64KB File Write (MB/s, 4QD)</strong></td> <td>354</td> <td><strong>485</strong></td> </tr> <tr> <td><strong>Iometer</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>4KB Random Write 32QD <br />(IOPS)</strong></td> <td>19,943</td> <td><strong>64,688</strong></td> </tr> <tr> <td><strong>PCMark Vantage x64 </strong></td> <td>6,252</td> <td><strong>41,787</strong></td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. All tests conducted on our hard drive test bench, which consists of a Gigabyte Z77X-UP4 motherboard, Intel Core i5-3470 3.2GHz CPU, 8GB of RAM, Intel 520 Series SSD, and a Cooler Master 450W power supply.</em></p> <p><em>Click the next page to read about SSD RAID vs a single SSD!</em></p> <p><em></em></p> <hr /> <p>&nbsp;</p> <h3>SSD RAID vs. Single SSD</h3> <p>This test is somewhat analogous to the GPU comparison, as most people would assume that two small-capacity SSDs in RAID 0 would be able to outperform a single 256GB SSD. The little SSDs have a performance penalty out of the gate, though, as SSD performance usually improves as capacity increases because the controller is able to grab more data given the higher-capacity NAND wafers—just like higher-density platters increase hard drive performance. 
This is not a universal truth, however, and whether or not performance scales with an SSD’s capacity depends on the drive’s firmware, NAND flash, and other factors, but in general, it’s true that the higher the capacity of a drive, the better its performance. The question then is: Is the performance advantage of the single large drive enough to outpace two little drives in RAID 0?</p> <p>Before we jump into the numbers, we have to say a few things about SSD RAID. The first is that with the advent of SSDs, RAID setups are not quite as common as they were in the HDD days, at least when it comes to what we’re seeing from boutique system builders. The main reason is that it’s really not that necessary since a stand-alone SSD is already extremely fast. Adding more speed to an already-fast equation isn’t a big priority for a lot of home users (this is not necessarily our audience, mind you). Even more importantly, the biggest single issue with SSD RAID is that the operating system is unable to pass the Trim command to the RAID controller in most configurations (Intel 7 and 8 series chipsets excluded), so the OS can’t tell the drive how to keep itself optimized, which can degrade performance of the array in the long run, making the entire operation pointless. Now, it’s true that the drive’s controller will perform “routine garbage collection,” but how that differs from Trim is uncertain, and whether it’s able to manage the drive equally well is also unknown. However, the lack of Trim support on RAID 0 is a scary thing for a lot of people, so it’s one of the reasons SSD RAID often gets avoided. Personally, we’ve never seen it cause any problems, so we are fine with it. We even ran it in our Dream Machine 2013, and it rocked the Labizzle. 
So, even though people will say SSD RAID is bad because there’s no Trim support, we’ve never been able to verify exactly what that “bad” means long-term.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/reviews-10645_small_0.jpg"><img src="/files/u152332/reviews-10645_small.jpg" alt="It’s David and Goliath all over again, as two puny SSDs take on a bigger, badder drive." title="SSDs" width="620" height="398" /></a></p> <p style="text-align: center;"><strong>It’s David and Goliath all over again, as two puny SSDs take on a bigger, badder drive.</strong></p> <p><strong>The Test:</strong> We plugged in two Corsair Neutron SSDs, set the SATA controller to RAID, created our array with a 64K stripe size, and then ran all of our tests off an Intel 520 SSD boot drive. We used the same protocol for the single drive.</p> <p><strong>The Results:</strong> The results of this test show a pretty clear advantage for the RAIDed SSDs, as they were faster in six out of nine tests. That’s not surprising, however, as RAID 0 has always been able to benchmark well. That said, the single 256GB Corsair Neutron drive came damned close to the RAID in several tests, including CrystalDiskMark, ATTO at four queue depth, and AS SSD. It’s not completely an open-and-shut case, though, because the RAID scored poorly in the PCMark Vantage “real-world” benchmark, with just one-third of the score of the single drive. That’s cause for concern, but with these scripted tests it can be tough to tell exactly where things went wrong, since they just run and then spit out a score. Also, the big advantage of RAID is that it boosts sequential-read and -write speeds since you have two drives working in parallel (conversely, you typically won’t see a big boost for the small random writes made by the OS). Yet the SSDs in RAID were actually slower than the single SSD in our Sony Vegas “real-world” 20GB file encode test, which is where they should have had a sizable advantage. 
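</p>

<p>The parallelism behind that sequential boost is plain address arithmetic: RAID 0 deals fixed-size stripes across the member drives round-robin. A sketch for a two-drive array with the 64K stripe size we used (illustrative only, not a driver implementation):</p>

```python
# RAID 0 address math: with a 64KB stripe, consecutive 64KB chunks
# alternate between the member drives.
STRIPE = 64 * 1024

def locate(offset, drives=2):
    """Map an array byte offset to (drive index, offset on that drive)."""
    chunk = offset // STRIPE
    drive = chunk % drives
    drive_offset = (chunk // drives) * STRIPE + offset % STRIPE
    return drive, drive_offset
```

<p>A long sequential transfer touches both drives in alternation (chunk 0 on drive 0, chunk 1 on drive 1, and so on), which is why sequential reads and writes roughly double while small random writes see far less benefit.</p>

<p>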
For now, we’ll say this much: The RAID numbers look good, but more “real-world” investigation is required before we can tell you one is better than the other.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table style="width: 620px; height: 267px;" border="0"> <thead> <tr style="text-align: left;"> <th class="head-empty"> </th> <th class="head-light"><span style="font-family: times new roman,times;">1x Corsair Neutron 256GB</span></th> <th><span style="font-family: times new roman,times;">2x Corsair Neutron 128GB RAID 0 </span></th> </tr> </thead> <tbody> <tr> <td class="item"><strong>CrystalDiskMark</strong></td> <td class="item-dark">&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>Avg. Sustained Read (MB/s)</strong></td> <td>512</td> <td><strong>593</strong></td> </tr> <tr> <td class="item"><strong>Avg. Sustained Write (MB/s)<br /></strong></td> <td class="item-dark">436</td> <td><strong>487</strong></td> </tr> <tr> <td><strong>AS SSD - Compressed Data</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>Avg. Sustained Read (MB/s)</strong></td> <td>506</td> <td><strong>647</strong></td> </tr> <tr> <td><strong>Avg. 
Sustained Write (MB/s)</strong></td> <td>318</td> <td><strong>368</strong></td> </tr> <tr> <td><strong>ATTO</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>64KB File Read (MB/s, 4QD)</strong></td> <td>436</td> <td><strong>934</strong></td> </tr> <tr> <td><strong>64KB File Write (MB/s, 4QD)</strong></td> <td><strong>516</strong></td> <td>501</td> </tr> <tr> <td><strong>Iometer</strong></td> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td><strong>4KB Random Write 32QD <br />(IOPS)</strong></td> <td>70,083</td> <td><strong>88,341</strong></td> </tr> <tr> <td><strong>PCMark Vantage x64 <br /></strong></td> <td><strong>70,083</strong></td> <td>23,431</td> </tr> <tr> <td><strong>Sony Vegas Pro 9 Write (sec)</strong></td> <td><strong>343</strong></td> <td>429</td> </tr> </tbody> </table> </div> <p><em>Best scores are bolded. All tests conducted on our hard-drive test bench, which consists of a Gigabyte Z77X-UP4 motherboard, Intel Core i5-3470 3.2GHz CPU, 8GB of RAM, Intel 520 Series SSD, and a Cooler Master 450W power supply.</em></p> <h3>Benchmarking: Synthetic vs. Real-World<em>&nbsp;</em></h3> <p>There’s a tendency for testers to dismiss “synthetic” benchmarks as having no value whatsoever, but that attitude is misplaced. Synthetics got their bad name in the 1990s, when they were the only game in town for testing hardware. Hardware makers soon started to optimize for them, and on occasion, those actions would actually hurt performance in real games and applications.<em>&nbsp;</em></p> <p>The 1990s are long behind us, though, and benchmarks and the benchmarking community have matured to the point that synthetics can offer very useful metrics when measuring the performance of a single component or system. At the same time, real-world benchmarks aren’t untouchable. If a developer receives funding or engineering support from a hardware maker to optimize a game or app, does that really make it neutral? 
There is the argument that it doesn’t matter because if there’s “cheating” to improve performance, that only benefits the users. Except that it only benefits those using a certain piece of hardware.</p> <p>In the end, it’s probably more important to understand the nuances of each benchmark and how to apply them when testing hardware. SiSoft Sandra, for example, is a popular synthetic benchmark with a slew of tests for various components. We use it for memory bandwidth testing, for which it is invaluable—as long as the results are put in the right context. A doubling of main system memory bandwidth, for example, doesn’t mean you get a doubling of performance in games and apps. Of course, the same caveats apply to real-world benchmarks, too.</p> <h3>Avoid the Benchmarking Pitfalls<em></em></h3> <p>Even seasoned veterans are tripped up by benchmarking pitfalls, so beginners should be especially wary of making mistakes. Here are a few tips to help you on your own testing journey.<em></em></p> <p>Put away your jump-to-conclusions mat. If you set condition A and see a massive boost—or no difference at all when you were expecting one—don’t immediately attribute it to the hardware. Quite often, it’s the tester introducing errors into the test conditions that causes the result. Double-check your settings and re-run your tests and then look for feedback from others who have tested similar hardware to use as sanity-check numbers.<em></em></p> <p>When trying to compare one platform with another (certainly not ideal)—say, a GPU in system A against a GPU in system B—be especially wary of the differences that can result simply from using two different PCs, and try to make them as similar as possible. From drivers to BIOS to CPU and heatsink—everything should match. You may even want to put the same GPU in both systems to make sure the results are consistent.<em></em></p> <p>Use the right benchmark for the hardware. 
Running Cinebench 11.5—a CPU-centric test—to review memory, for example, would be odd. A better fit would be applications that are more memory-bandwidth sensitive, such as encoding, compression, synthetic RAM tests, or gaming.<em></em></p> <p>Be honest. Sometimes, when you shell out for new hardware, you want it to be faster because no one wants to pay through the nose to see no difference. Make sure your own feelings toward the hardware aren’t coloring the results.<em><br /></em></p> http://www.maximumpc.com/pc_performance_tested_2014#comments 2013 air cooling benchmark cpu graphics card Hardware Hardware liquid cooling maximum pc motherboard pc speed test performance ssd tests October 2013 Motherboards Features Mon, 10 Feb 2014 22:46:41 +0000 Maximum PC staff 26909 at http://www.maximumpc.com