Maximum PC - Reviews

Blizzcon 2014: We Check Out the HP Omen Gaming Laptop at Intel's Booth [Video] http://www.maximumpc.com/blizzcon_2014_we_check_out_hp_omen_gaming_laptop_intels_booth_video_2014 <h3><img src="/files/u166440/hp_omen.jpg" alt="HP Omen" title="HP Omen" width="200" height="138" style="float: right;" />Is it what PC gamers are looking for?</h3> <p>It is not all fun and games at Blizzcon 2014 for Maximum PC online managing editor Jimmy Thang. He has been busy recording video of the various vendors on the show floor of the convention in Anaheim, California. Aside from seeing products from <a title="Gigabyte Blizzcon 2014" href="http://www.maximumpc.com/blizzcon_2014_gigabyte_shows_new_brix_gaming_pc_and_top_tier_x99_motherboard_2014" target="_blank"><span style="color: #ff0000;">Gigabyte</span></a>, Jimmy also visited the Intel booth, where he saw the <strong>HP Omen gaming laptop</strong> up close and personal.</p> <p>Jimmy talked to a rep about the HP Omen, which has been designed specifically for gaming and for handling triple-A titles. The 15.6-inch laptop features an IPS touchscreen display with Full HD 1080p resolution, is powered by an Intel Core i7-4710HQ processor, contains 8GB of DDR3 RAM, and has an Nvidia GeForce GTX 860M GPU.</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/FnmHos3yWlY?list=UUdLWXfNqKICJBpE8jVMm6_w" width="600" height="315" frameborder="0"></iframe></p> <p style="text-align: left;">HP’s gaming laptop was revealed <a title="HP Omen" href="http://www.maximumpc.com/hp_sends_thin_and_light_omen_laptop_gaming_community_2014" target="_blank"><span style="color: #ff0000;">last Tuesday</span></a>; it weighs around 4.68 pounds, measures 19.9mm thin, and features a solid aluminum chassis.
The backlit keyboard can be programmed in millions of colors and divided into lighting zones, and the lights can pulsate along with any music the user is listening to or react to the games being played.</p> <p>The HP Omen gaming laptop will have a starting price of $1,499.</p> <p>What do you think of HP’s new laptop?</p> <p><em>Follow Sean on <a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> <p><em>Sean D Knight and Jimmy Thang, Sun, 09 Nov 2014</em></p>

Blizzcon 2014: Gigabyte Shows New Brix Gaming PC and Top-Tier X99 Motherboard [Video] http://www.maximumpc.com/blizzcon_2014_gigabyte_shows_new_brix_gaming_pc_and_top_tier_x99_motherboard_2014 <h3><img src="/files/u166440/x99_ga_motherboard.jpg" alt="X99 Motherboard" title="X99 Motherboard" width="200" height="129" style="float: right;" />It’s not all about the games and cosplay</h3> <p>Blizzcon isn’t just a convention that revolves around all things Blizzard, such as the developer’s recently announced FPS game <a title="Overwatch article" href="http://www.maximumpc.com/blizzard_announces_team-based_shooter_%E2%80%9Coverwatch%E2%80%9D" target="_blank"><span style="color: #ff0000;">Overwatch</span></a>. Vendors and manufacturers, such as <strong>Gigabyte</strong>, are also there to advertise their products. Maximum PC online managing editor Jimmy Thang took the time to visit the Gigabyte booth, where he got to see the new model of the Brix Gaming PC kit and check out the top-tier X99 motherboard.</p> <p>The Brix Gaming PC kit was revealed <a title="Brix Gaming" href="http://www.maximumpc.com/gigabyte_adds_gaming_mini_pc_brix_family311" target="_blank"><span style="color: #ff0000;">back in June</span></a>, and the i5 processor model was released back in September. Jimmy spoke to a Gigabyte representative who showed him the yet-to-be-released Brix Gaming kit with an i7 processor and an Nvidia GTX 760 GPU. Unlike the green i5 model, the i7 version will come in black and is expected to be out in late November or sometime in December.</p> <p>Be sure to watch the video to learn more:</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/5WDmVGTS_KM?list=UUdLWXfNqKICJBpE8jVMm6_w" width="600" height="315" frameborder="0"></iframe></p> <p style="text-align: left;">Jimmy Thang also got to look at the GA-X99 Gaming G1 WIFI motherboard, one of three new X99-chipset mobos that were announced <a title="X99 motherboards" href="http://www.maximumpc.com/take_sneak_peek_three_upcoming_gigabyte_motherboards_haswell-e_2014" target="_blank"><span style="color: #ff0000;">back in August</span></a>. This is the top-tier mobo of the three, and it features a heatsink that lights up and blinks with the beat of your music. The lights will also pulsate on and off, and can of course be turned off if you aren't into that flashing-lights thing.
It also sports the LGA 2011-v3 socket for Haswell-E processors, and the platform is the first to support DDR4 memory.</p> <p>The GA-X99 Gaming G1 WIFI is expected to sell for around $350 and is currently available online.</p> <p>For additional details, check out the video:</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/PsE1B53H1kQ?list=UUdLWXfNqKICJBpE8jVMm6_w" width="600" height="315" frameborder="0"></iframe></p> <p><em>Follow Sean on <a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> <p><em>Sean D Knight and Jimmy Thang, Sun, 09 Nov 2014</em></p>

Best Free Hardware Monitoring Tools http://www.maximumpc.com/best_free_hardware_monitoring_tools_2014 <h3>Apps that keep tabs on your rig’s internals</h3> <p>Keeping an eye on your rig’s temperatures, clock speeds, and hardware is a good way to stay on top of your PC’s health. We always recommend stress-testing a shiny new rig, or checking your hardware if you experience stability issues that occur out of the blue. We’ve gathered up a list of the best free utilities you can use to make sure you have a healthy PC.</p> <p>Know of any other free monitoring tools? Let us know in the comments section below!</p> <p><strong><a title="CPU-Z" href="http://www.cpuid.com/softwares/cpu-z.html" target="_blank">CPU-Z:</a></strong></p> <p style="text-align: center;"><img src="/files/u154280/cpuid_cpu_z.png" alt="CPU-Z" title="CPU-Z" /></p> <p>CPU-Z tells you what’s going on with your CPU by giving you readouts of your Core Speed, Multiplier, Bus Speed, and your different cache levels. It also tells you the make and model of your motherboard and video card, along with your RAM speed and capacity.</p> <p>We recommend this tool if you have a preconfigured system from an OEM like Lenovo, HP, or Dell and need to find out your motherboard’s model number (if it isn’t printed on the board). The tool can also be used to monitor your CPU’s voltage, so it’s overclocker-friendly.</p> <p><strong><a title="GPU-Z" href="http://www.techpowerup.com/downloads/SysInfo/GPU-Z/" target="_blank">GPU-Z:</a></strong></p> <p style="text-align: center;"><img src="/files/u154280/gpu_z.png" alt="GPU-Z" title="GPU-Z" width="560" height="637" /></p> <p>GPU-Z gives you detailed readouts of your GPU’s clock speeds and memory size. You can use this tool to make sure that your video card is running at PCIe 3.0, as some boards run at 2.0 instead of 3.0 by default. Look at the Bus Interface box to check your video card's PCIe configuration.</p> <p><strong><a title="Furmark" href="http://www.ozone3d.net/benchmarks/fur/" target="_blank">Furmark:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/furmark.png" alt="Furmark" title="Furmark" width="600" height="453" /></strong></p> <p>Got GPU problems? 
Furmark is a fantastic tool if you’re getting blue screens during games and want to find out if your video card is the culprit. The utility hits your GPU with a punishing workload designed to max out your video card. You’ll also see a temperature readout, so you can tell if your card is running hot.</p> <p><strong><a title="FRAPS" href="http://www.fraps.com/download.php" target="_blank">FRAPS:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/fraps.png" alt="FRAPS" title="FRAPS" width="600" height="370" /></strong></p> <p>Getting weird frame-rate issues after freshly installing BF4 or Assassin’s Creed IV: Black Flag? FRAPS gives you a readout of your real-time frame rate in-game, so you can see when and where your rig starts to stutter. We like using this utility when a game is running poorly, so we can keep an eye on our frame rate during gameplay. We also use it to capture average frame rates in games that don’t come with built-in benchmarking tools, such as BF4, Far Cry 3, and Crysis 3.</p> <hr /> <p><strong><a title="Core Temp" href=" http://www.alcpu.com/CoreTemp/" target="_blank">Core Temp:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/core_temp.png" alt="Core Temp" title="Core Temp" width="351" height="388" /></strong></p> <p>Unlike other utilities in this round-up of free apps, Core Temp tells you the individual temperature of each of your CPU’s cores. We use this tool to make sure our processor isn’t running too hot. Core Temp also tells you the TDP, voltage, and power consumption of your CPU.</p> <p><strong><a title="AMD Catalyst Control Center" href="http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64" target="_blank">AMD Catalyst Control Center:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/amd_overdrive.png" alt="AMD Catalyst Control Center" title="AMD Catalyst Control Center" width="600" height="573" /></strong></p> <p>AMD video card users can use AMD’s Catalyst Control Center to monitor their video card’s performance. You’ll be able to change your GPU’s core and memory clock speeds using AMD’s Overdrive utility, which is found in the performance tab of AMD’s Catalyst driver. You can also adjust your video card’s fan speed here.</p> <p><strong><a title="Prime 95" href="http://files.extremeoverclocking.com/file.php?f=205" target="_blank">Prime 95:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/prime_95_running.png" alt="Prime 95" title="Prime 95" width="600" height="378" /></strong></p> <p>Prime 95 puts your CPU through its paces by giving it a workload that maxes out your processor’s cores. We suggest using this utility if you’re having blue-screen errors or freezing issues, to make sure your CPU isn’t the offender behind those infuriating messages.</p> <p><strong><a title="3DMark" href=" http://store.steampowered.com/app/223850/" target="_blank">3DMark:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/3dmark_demo.png" alt="3DMark" title="3DMark" width="600" /></strong></p> <p>3DMark is great for benchmarking your system’s overall performance, and the free demo version also shows you how your rig stacks up against other systems with similar hardware. The paid version lets you run the Extreme benchmarks, which run at 1080p instead of the demo’s 720p default.</p>
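<p>If you’d rather crunch a frame-time log than eyeball an overlay, the math for turning frame times into frame rates is simple. Here’s a minimal Python sketch, assuming a list of cumulative per-frame timestamps in milliseconds (the sort of per-frame log FRAPS can record); the sample numbers at the bottom are made up for illustration:</p>
<pre>
# Turn cumulative frame timestamps (in ms) into average and worst-case fps.
def fps_stats(timestamps_ms):
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000 * len(deltas) / (timestamps_ms[-1] - timestamps_ms[0])
    worst_fps = 1000 / max(deltas)  # the single slowest frame
    return avg_fps, worst_fps

# Four frames at ~60fps, then one 50ms stutter: ~40fps average, 20fps worst.
print(fps_stats([0.0, 16.7, 33.4, 50.1, 100.1]))
</pre>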
<p><strong><a title="Rainmeter" href="http://rainmeter.net/" target="_blank">Rainmeter:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/rainmeter.png" alt="Rainmeter" title="Rainmeter" width="600" /></strong></p> <p>Rainmeter is a simple desktop widget that displays your CPU and RAM usage and also tells you how full your hard drives and SSDs are.</p> <p><strong><a title="EVGA Precision X" href=" http://www.evga.com/precision/" target="_blank">EVGA Precision X:</a></strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/evga_precision_x.png" alt="EVGA Precision X" title="EVGA Precision X" width="600" height="471" /></strong></p> <p>Precision X is made by EVGA exclusively for Nvidia video cards. The tool lets you check your GPU clock speed and temperatures, and adjust your fan speeds, too. You can also overclock your GPU with the sliders seen above. The tool displays your GPU's load as well, which we find quite handy.</p> <p><em>Chris Zele, Tue, 21 Oct 2014</em></p>
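<p>If none of these utilities scratches your itch, rolling a bare-bones monitor of your own takes only a few lines. The sketch below is just an illustration and assumes the third-party psutil Python package is installed (pip install psutil); temperature sensors are only exposed on some platforms, so we stick to load and memory here:</p>
<pre>
import psutil

# Print CPU load and RAM usage once a second, five times.
for _ in range(5):
    cpu = psutil.cpu_percent(interval=1)  # averaged over the 1-second window
    mem = psutil.virtual_memory()
    print(f"CPU {cpu:5.1f}%  RAM {mem.percent:5.1f}% "
          f"({mem.used / 2**30:.1f} of {mem.total / 2**30:.1f} GiB)")
</pre>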
Computer Upgrade Guide http://www.maximumpc.com/computer_upgrade_2014 <h3>Avoid the pitfalls and upgrade your computer like a pro</h3> <p>Building a new PC is a relatively easy task—you pick your budget and build around it. It’s not the same with upgrading a computer. No, upgrading an older computer can be as dangerous as dancing Footloose-style through a minefield. Should you really put $500 into this machine, or just buy a new one? Will that new CPU really be faster than your old one in the real world? Are you CPU-limited or GPU-limited?</p> <p>To give you more insight on how to best upgrade a PC that is starting to show its age, follow along as we take three real-world boxes and walk you through the steps and decisions we make as we drag each machine back to the future through smart upgrades. While our upgrade decisions may not be the same ones you would make, we hope we can shed some light on our thought process for each component, and help you answer the eternal question: “What should I upgrade?”</p> <p style="text-align: center;"><img src="/files/u154082/computer_upgrade.jpg" alt="computer upgrade" title="computer upgrade" width="620" height="533" /></p> <h3>Practical PC upgrading advice</h3> <p>There are really two primary reasons to upgrade. The first is because you can—and believe us, we’ve upgraded just because “we could” plenty of times. Second, because you need to. How you define “need to” is very much a personal preference—there’s no way to put a hard number on it. You can’t say, “If I get a 5.11 in BenchMarkMark, I need to upgrade.” No, you need to determine your upgrade needs using everyday metrics like, “I will literally throw this PC through a window if this encode takes any longer,” or “I have literally aged a year watching my PC boot.” And then there’s the oldie: “My K/D in Call of Battlefield 5 is horrible because my graphics card is too slow.”</p> <p>Whether or not any of these pain points apply to you, only you can decide. Also, since this article covers very specific upgrades to certain components, we thought we’d begin with some broad tips that are universally applicable when doing the upgrade dance.</p> <h4>Don’t fix what’s not broken</h4> <p>One of the easiest mistakes to make with any upgrade plan is to upgrade the wrong component. The best example is someone who decides that his or her PC is “slow,” so they need to add RAM and take it from 8GB to 16GB, or even 16GB to 32GB. While there are cases where adding more RAM or higher-clocked RAM will indeed help, the vast majority of applications and games are pretty happy with 8GB. The other classic trap is deciding that a CPU with more cores is needed because the machine is “slow” in games. The truth is, the vast majority of games are coded with no more than four cores in mind. Some newer games, such as Battlefield 4, do run better with Hyper-Threading on a quad-core chip, or with six or more cores (on some maps), but most games simply don’t need that many cores. The lesson here is that there’s a lot of context to every upgrade, so don’t just upgrade your CPU willy-nilly on a hunch. Sometimes, in fact, the biggest upgrade you can make is not to upgrade.</p> <h4>CPU-bound</h4> <p>You often hear the term “CPU-bound,” but not everyone understands the nuances of it. For the most part, you can think of something as CPU-bound when the CPU is causing a performance bottleneck. But what exactly is it about the CPU that is holding you back? Is it core or thread count? Clock speeds, or even microarchitecture efficiency? You’ll need to answer these questions before you make any CPU upgrade. When the term is used in association with gaming, “CPU-bound” usually indicates a drastic mismatch between GPU power and CPU power. This would be evident from, say, running a GeForce Titan in a system with a Pentium 4. Or, say, running a Core i7-4960X with a GeForce 8800 GT. These are extreme cases, but certainly, pairing a GeForce Titan or Radeon 290X with a low-end dual-core CPU means you won’t see as much performance from your GPU as you would with a more efficient quad-core (or better) CPU. That’s because the GPU depends on the CPU to send it tasks. So, in a CPU-bound scenario, the GPU is waiting around twiddling its thumbs most of the time, since the CPU can’t keep up with it.</p> <p style="text-align: center;"><img src="/files/u152332/mpc99.feat_pcupgrade.nehalem_small.jpg" alt="One of the trickier upgrades is the original LGA1366 Core i7 chips. Do you upgrade the chip, overclock it, or just dump it?" width="620" height="605" /></p> <p style="text-align: center;"><strong>One of the trickier upgrades is the original LGA1366 Core i7 chips. Do you upgrade the chip, overclock it, or just dump it?</strong></p> <h4>GPU-bound</h4> <p>The situation can be reversed, too. You can indeed get GPU-bound systems by running older or entry-level graphics with a hopped-up CPU. An example could be a Haswell Core i7-4770K overclocked to 4.5GHz paired with, say, an entry-level GeForce GTX 750. You will certainly get the best frame rate possible out of the GPU, but you probably did not need the overclocked Haswell to do it. You could have kept that entry-level GPU well-fed with instructions using a cheaper Core i5-4670K or AMD FX part. Still, the rule of thumb with a gaming machine is to invest more in the GPU than the CPU. 
If we had to make up a ratio, though, we’d say your CPU can cost half that of your GPU. A $500 GPU would be good with a $250 CPU, and a $300 GPU would probably be OK with a $150–$170 CPU.</p> <h4>You can ignore the GPU sometimes</h4> <p>Keep in mind, this GPU/CPU relationship is in reference to gaming performance. When it comes to application performance, the careful balance between the two doesn’t need to be respected as much, or even at all. For a system that’s primarily made for encoding video, photo editing, or other CPU-intensive tasks, you’ll generally want as fast a CPU as possible on all fronts. That means a CPU with high clocks, an efficient microarchitecture, and as many cores and threads as possible will net you the most performance. In fact, in many cases, you can get away with integrated graphics and ignore discrete graphics completely. We don’t recommend that approach, though, since GPUs are increasingly important for encoding and even photo editing, and you rarely need to spend into the stratosphere to get great performance. Oftentimes, in fact, older cards will work with applications such as Premiere Pro or Photoshop, while the latest may not, due to drivers and app support from Adobe.</p> <hr /> <h3>Core 2 Quad box</h3> <p><strong>A small form factor, light-gaming rig from before SFF was popular</strong></p> <p>This small box has outlived its glory days, but with a modest injection of capital and a few targeted upgrades, we’ll whip it back into shape in no time. It won’t be able to handle 4K gaming, but it’ll be faster than greased lightning and more than capable of 1080p frag-fests.</p> <p>This particular PC could have very easily resided on the desktop of any Maximum PC staffer or reader back in the year 2009. We say that because this is, or was, a pretty Kick Ass machine in its day. It was a bit ahead of its time, thanks to its combination of benchmark-busting horsepower and small, space-saving dimensions. This mini-rig was probably used for light gaming and content creation, with its powerful CPU and mid-tier GPU. 
As far as our business here goes, its diminutive size creates some interesting upgrade challenges.</p> <div class="module orange-module article-module"><strong><span class="module-name">Specifications</span></strong><br /> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Original part</th> <th>Upgrade part</th> <th>Upgrade part cost</th> </tr> </thead> <tbody> <tr> <td>Case/PSU</td> <td>Silverstone SG03 / 500W</td> <td>No change</td> <td> </td> </tr> <tr> <td>CPU</td> <td>Intel Core 2 Quad QX6800</td> <td>No change</td> <td> </td> </tr> <tr> <td>Motherboard</td> <td>Asus P5N7A-VM</td> <td>No change</td> <td> </td> </tr> <tr> <td>Cooling</td> <td>Stock</td> <td>No change</td> <td> </td> </tr> <tr> <td>RAM</td> <td>4GB DDR2/1600 in dual-channel mode</td> <td>No change</td> <td> </td> </tr> <tr> <td>GPU</td> <td>GeForce 9800 GT</td> <td><strong>EVGA GTX 750 Ti</strong></td> <td>$159</td> </tr> <tr> <td>HDD/SSD</td> <td>500GB 7,200rpm WD Caviar</td> <td>240GB OCZ Vertex 460</td> <td>$159</td> </tr> <tr> <td>ODD</td> <td>DVD burner</td> <td>No change</td> <td> </td> </tr> <tr> <td>OS</td> <td>32-bit Windows Vista Ultimate</td> <td>No change</td> <td> </td> </tr> <tr> <td>Misc.</td> <td> </td> <td>USB 3.0 add-in card</td> <td>$12</td> </tr> <tr> <td>Total upgrade cost</td> <td> </td> <td> </td> <td>$330</td> </tr> </tbody> </table> </div> </div> <p>It’s built around a Silverstone SG03 mini-tower, which is much shorter and more compact than the SFF boxes we use nowadays. For example, it can only hold about nine inches of GPU, and it puts the PSU directly above the CPU region, mandating either a stock cooler or a low-profile job. So, either way, overclocking is very much out of the question. Water-cooling is also a non-starter, due to the lack of space for a radiator either behind the CPU area or on the floor of the chassis. In terms of specs, this system isn’t too shabby, as it’s rocking an LGA 775 motherboard with a top-shelf Core 2 Quad “Extreme” CPU and an upper-midrange GPU. We’d say it’s almost the exact equivalent of a $2,000 SFF gaming rig today. The CPU is a 65nm Kentsfield Core 2 Quad Extreme QX6800, which at the time of its launch was ludicrously expensive and, at 2.93GHz, the highest-clocked quad-core CPU available for the Core 2 platform. The CPU is plugged into an Asus P5N7A-VM motherboard, a microATX model that sports an nForce 730i chipset, supports up to 16GB of RAM, and has one PCIe x1 slot in addition to two PCI slots and one x16 PCI Express slot. GPU duties are handled by the venerable GeForce 9800 GT, and it’s also packing 4GB of DDR2 memory, as well as a 500GB 7,200rpm Western Digital hard drive. Its OS is Windows Vista Ultimate 32-bit.</p> <h4>Let’s dig in</h4> <p>The first question that crossed our minds when considering this particular machine’s fate was, “Upgrade certain parts, or go whole-hog with a new motherboard/CPU/RAM?” Sure, this is Maximum PC, and it would be easy to just start over. But that’s not really an upgrade; that’s more like open-heart surgery. Besides, where’s the challenge in that? 
Anyone can put together a new system, so we decided to buckle down, cinch up our wallets, and go part-by-part.</p> <p>Starting with the motherboard, CPU, and RAM, we decided to leave those as they were. For Intel at the time, this CPU was as good as it gets, and the only way to upgrade using the same motherboard and chipset is to move to a Yorkfield quad-core CPU. That’s a risky upgrade, though, for two reasons. First, not all of those 45nm chips worked in Nvidia’s nForce chipset, and second, benchmarks show mostly single-digit percentage performance increases over Kentsfield. So, you’d have to be crazy to attempt this upgrade. We also deemed its 4GB of DDR2 to be satisfactory, since we’re running a 32-bit OS and anything over 4GB can’t be seen by it. If we were running a 64-bit OS, we’d upgrade to 8GB as a baseline amount of memory, though. We’re not happy about the motherboard’s SATA 3Gb/s ports, and the lack of an x2 PCIe slot is a problem, but SATA 3Gb/s is fast enough to handle any late-model hard drive, or an SSD upgrade. Another problem area is its bounty of 12 USB 2.0 ports. We appreciate the high number of ports, but USB 2.0 just plain sucks, so we added a PCIe USB 3.0 adapter, which gave us four SuperSpeed ports on the back of the chassis.</p> <p>One area ripe for upgrade is the GPU, because a GeForce 9800 GT is simply weak sauce these days. It was essentially a rebadge of the 8800 GT, and by the time it arrived in 2009 it was considered the low end of the GeForce family, with two models above it in the product stack: the 9800 GTX and the dual-GPU 9800 GX2. This single-slot GPU was only moderately powered at the time, featuring 112 shader processors clocked at 1,500MHz and 512MB of GDDR3 clocked at 1.5GHz on a 256-bit memory bus. Since this system has limited space and only a single six-pin PCIe connector, we decided to upgrade the GPU to the Sapphire Radeon R7 265, which is our choice for the best $150 GPU. Unfortunately, the AMD card did not get along at all with our Nvidia chipset, so we ditched it in favor of the highly clocked and whisper-quiet EVGA GTX 750 Ti, which costs $159. This will not only deliver DX11 gaming at the highest settings at 1080p, but will also significantly lower the sound profile of the system, since this card is as quiet as a mouse breaking wind.</p> <p>Another must-upgrade part was the 500GB WD hard drive. As we wrote elsewhere, an SSD is a must-have in any modern PC, and we always figured it could make an aging system feel like new again, so this was our chance to try it in the real world. Though we wanted to upgrade to a 120GB Samsung 840 EVO, we couldn’t get our hands on one, so we settled for a larger and admittedly extravagant 240GB OCZ Vertex 460 for $160. We decided to leave the OS as-is. Despite all the smack talk it received, Windows Vista SP2 was just fine.</p> <p style="text-align: center;"><img src="/files/u152332/main_image_3_small.jpg" width="620" height="404" /></p> <h4>Real-World Results</h4> <p>Since we upgraded the GPU and storage subsystem, we’ll start with those results first. With the SSD humming along, our boot time was sliced from 1:27 to 1:00 flat, which is still a bit sluggish but doesn’t tell the whole story. Windows Vista felt instantly “snappy,” thanks to the SSD’s lightning-fast seek times. 
Everything felt fast and responsive, so though we didn’t get a sub-20-second boot time like we thought we would, we still saw a very noticeable improvement in day-to-day use of the machine. For the record, we blame the slow boot time on the motherboard or something with this install of Vista, but this is still an upgrade we’d recommend to anyone in a similar situation. Interestingly, we also saw a boost in one of our encoding benchmarks, which could be due to the disk I/O as well. For example, Stitch.Efx 2.0 dropped from 41 minutes to 36 minutes, which is phenomenal. Stitch.Efx creates in excess of 20,000 files, which will put a drag on a mechanical hard drive.</p> <p>Our gaming performance exploded, though, going from 11fps in Heaven 4.0 to 42fps. In Batman: Arkham Origins, we went from a non-playable 22fps to a smooth 56fps, so anyone who thinks you need a modern CPU for good gaming performance is mistaken (at least for some games); the GPU does most of the heavy lifting in gaming. We also got a major reduction in case temps and noise by going from the hot-and-loud 9800 GT to the silent-and-cool GTX 750 Ti. The old card ran at 83 C under load, while the new one only hit 53 C, and made no noise whatsoever.</p> <h4>No regrets</h4> <p>Since we couldn’t do much with the motherboard/CPU/RAM on this board without starting fresh, we upgraded what we could and achieved Kick Ass real-world results, so this upgrade operation was very successful. Not only does it boot faster and feel ultra-responsive, it’s also ready for at least another year of gaming, thanks to its new GPU. Plus, with USB 3.0 added for storage duties, we can attach our external drives and USB keys and expect modern performance. All in all, this rig has been given a new lease on life for just a couple hundies—not bad for a five-year-old machine.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Pre-upgrade</th> <th>Post-upgrade</th> </tr> </thead> <tbody> <tr> <td>Cinebench R15 64-bit</td> <td>WNR</td> <td>WNR</td> </tr> <tr> <td>ProShow Producer 5.0 (sec)</td> <td>3,060</td> <td>3,334 <strong>(-8%)</strong></td> </tr> <tr> <td>Stitch.Efx (sec)</td> <td>2,481</td> <td>2,166</td> </tr> <tr> <td>Bootracer (sec)</td> <td>90</td> <td>60</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td>22</td> <td>56 <strong>(+155%)</strong></td> </tr> <tr> <td>Heaven 4.0 (fps)</td> <td>11</td> <td>42 <strong>(+282%)</strong></td> </tr> </tbody> </table> </div> <hr />
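<p>A quick note on how to read the percentage gains quoted in these benchmark tables: they’re plain percent changes, with the lower-is-better times (seconds) expressed as a speedup. In Python terms:</p>
<pre>
# Percent gain for higher-is-better scores (fps, Cinebench points).
def gain_higher_is_better(before, after):
    return (after - before) / before * 100

# Percent speedup for lower-is-better results (seconds).
def gain_lower_is_better(before, after):
    return (before / after - 1) * 100

print(round(gain_higher_is_better(22, 56)))   # Batman above: 155 (%)
print(round(gain_higher_is_better(11, 42)))   # Heaven above: 282 (%)
print(round(gain_lower_is_better(37.9, 15)))  # a boot-time speedup: 153 (%)
</pre>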
<h3>Skeleton Rises</h3> <p><strong>Flying the AMD flag</strong></p> <p>Our second rig flies the AMD “Don’t Underclock Me” flag. You know the type. No matter how wide a gap Intel opens up with its latest CPU techno-wonder, this AMD CPU fanboy won’t switch until you pry that AM3 CPU from his cold, dead motherboard. In fact, the bigger the performance gap with Intel, the deeper this fanboy will dig in his heels.</p> <p>The box itself is built around the eye-catching and now discontinued Antec Skeleton open-air chassis. It draws a lot of whistles from case aficionados when they walk by, but truth be told, it’s really not great to work in and not exactly friendly to upgrading. The base machine parts are pretty respectable, though. The mainboard is an Asus Crosshair IV (CHIV) Formula using the AMD 890FX chipset, with a quad-core 3.2GHz Phenom II X4 955 and GeForce GTX 570 graphics. For the record, this machine was not built by us, nor do we know who built it, but the original builder made the typical error of inserting the pair of 2GB DDR3/1066 DIMMs into the same memory channel’s slots, causing the sticks to run in single-channel mode instead of dual-channel. As any salty builder knows, there’s a reason the phrase “RTFM” exists. For storage, the machine packs a single 1TB 7,200rpm hard drive and a DVD burner. Power is handled by an Antec TruePower 750, which is plenty for a rig like this. Cooling is a stock AMD affair with dual heat pipes.</p> <div class="module orange-module article-module"><strong><span class="module-name">Specifications</span></strong><br /> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Original part</th> <th>Upgrade part</th> <th>Upgrade part cost</th> </tr> </thead> <tbody> <tr> <td>Case/PSU</td> <td>Antec Skeleton / TruePower 750</td> <td>No change</td> <td> </td> </tr> <tr> <td>CPU</td> <td>3.2GHz Phenom II X4 955</td> <td>4GHz FX-8350 Black Edition</td> <td>$199</td> </tr> <tr> <td>Motherboard</td> <td>Asus Crosshair IV Formula</td> <td>No change</td> <td> </td> </tr> <tr> <td>Cooling</td> <td>Stock</td> <td>No change</td> <td> </td> </tr> <tr> <td>RAM</td> <td>4GB DDR3/1066 in single-channel mode</td> <td>8GB DDR3/1600 in dual-channel mode</td> <td>$40</td> </tr> <tr> <td>GPU</td> <td>EVGA GeForce GTX 570 HD</td> <td>Asus GTX760-DC2OC-2GD5</td> <td>$259</td> </tr> <tr> <td>HDD/SSD</td> <td>1TB 7,200rpm Hitachi</td> <td>256GB SanDisk Ultra Plus</td> <td>$159</td> </tr> <tr> <td>ODD</td> <td>DVD burner</td> <td>No change</td> <td> </td> </tr> <tr> <td>OS</td> <td>64-bit Windows 7 Pro</td> <td>No change</td> <td> </td> </tr> <tr> <td>Total upgrade cost</td> <td> </td> <td> </td> <td>$657</td> </tr> </tbody> </table> </div> </div> <h4>The easy upgrade path</h4> <p>All in all, it’s not a bad PC, but the most obvious upgrade was storage. It’s been a long time since we used a machine with a hard drive as the primary boot device, and having to experience it once again was simply torture. We’re not saying we don’t love hard drives—it’s great to have 5TB of space so you never have to think about whether you have room to save that ISO or not—just not as the primary boot device. Our first choice for an upgrade was a 256GB SanDisk Ultra Plus SSD for $159. We thought about skimping with the 128GB version, but then figured it’s worth the extra $60 to double the capacity—living on 128GB is difficult in this day and age. The SSD could easily be moved to a new machine, too, as it’s not tied to the platform.</p> <p>The OS is 64-bit Windows 7 Pro, so there’s no need to “upgrade” to Windows 8.1. No, we’d rather put that $119 into the two other areas that need to be touched up. The GPU, again, is the GeForce GTX 570. 
Not a bad card in its day, but since the Skeleton’s current owner does a fair bit of gaming, we decided it was worth it to invest in a GPU upgrade. We considered various options, from the GeForce GTX 770 to a midrange Radeon R9 card, but felt a GeForce GTX 760 was the right fit, considering the system’s specs. It simply felt exorbitant to put a $500 GPU into this rig. Even the GTX 770 at $340 didn’t feel right, but the Asus GTX760-DC2OC-2GD5 gives us all the latest Nvidia technologies, such as ShadowPlay. The card is also dead silent under heavy loads.</p> <p>Our next choice was riskier. We definitely wanted more performance out of the 3.2GHz Phenom II X4 955 and its old “Deneb” cores. The options included adding more cores by going to a 3.3GHz Phenom II X6 1100T Thuban, but all we’d get is two more cores and a marginal increase in clock speed. Since the Thuban and Deneb are so closely related, there would be very little to be gained in microarchitecture upgrades. X6 parts can’t be found new, and they fetch $250 or more on eBay. As any old upgrading salt knows, you need to check the motherboard’s list of supported chips before you plug in. The board has an AM3 socket, but just because it fits doesn’t mean it works, right? Asus’ website indicates it supports the 3.6GHz FX-8150 “Zambezi” using the newer Bulldozer core, but Bulldozer didn’t exactly blow us away when it launched, and those parts are also out of circulation. (Interestingly, the FX-8150 sells for less than the Phenom II X6 chips.) Upgrading the motherboard was simply out of the question, too. Our last option was the most controversial. As we said, you should always check with the motherboard maker first to find out what chips are supported.</p> <p>After that, you should then check to see if some other adventurous user has tried to do it anyway: “Damn the CPU qual list, full upgrade ahead!” To our surprise, yes, several anonymous Internet forums have indeed dropped the 4GHz FX-8350 “Vishera” into their CHIV boards with no reported issues. That FX-8350 is also only $199—cheaper than a used X6 part. We considered overclocking the part, but the Skeleton’s confines make it pretty difficult. It’s so tight that we had issues putting the GeForce GTX 760 in it, so using anything larger than the stock cooler didn’t make sense to us. We’re sure you can find a cooler that fits, but nothing that small would let us overclock by any good measure, so it didn’t seem prudent.</p> <h4 style="text-align: center;"><img src="/files/u152332/main_image_2_small.jpg" width="620" height="401" /></h4> <h4>Was it worth it?</h4> <p>Let’s just say this again if it’s not clear to you: If you are running a hard drive as your boot device, put this magazine down and run to the nearest store to buy an SSD. Yes, hard drives are that slow compared to SSDs. In fact, if we had money for only one upgrade, it would be the SSD, which will make an old, slow machine feel young again. This machine, for example, would boot to the desktop in about 38 seconds. With the SSD, that was cut down to 15 seconds, and general usability was increased by maybe 10 million percent.</p> <p>Our CPU upgrade paid off well, too. AMD’s Vishera FX-8350 offers higher clock speeds and significant improvements in video encoding and transcoding. We saw an 83 percent improvement in encoding performance. The eight cores offer a huge advantage in thread-heavy 3D modeling, as well. 
We didn’t get the greatest improvement with Stitch.Efx 2.0, but the app is heavily single-threaded in its early stages. Still, we saw a 30 percent increase, which is nothing to sneeze at.</p> <p>In gaming, we were actually a bit disappointed with our results, but perhaps we expected too much. We tested using Batman: Arkham Origins at 1080p with every setting maxed out and saw about a 40 percent boost in frame rates. Running Heaven 4.0 at 1080p on max, we also saw about a 42 percent increase in frame rate. Again, good. But for some reason, we expected more.</p> <h4>Regrets, I’ve had a few</h4> <p>PC upgrades can turn into a remorsefest, or an inability to face the fact that you made the wrong choice. With our upgrades, we were generally pleased. While some might question the CPU upgrade (why not just overclock that X4?), we can tell you that no overclock would get you close to the FX-8350 upgrade in overall performance. The SSD upgrade can’t be questioned. Period. End of story. The difference in responsiveness with the SSD over the 1TB HDD is that drastic.</p> <p>When it comes to the GPU upgrade, though, we kind of wonder if we didn’t go far enough. Sure, a 40 percent performance difference is the difference between playable and non-playable frame rates, but we really wanted to hit the solid 50 to 60 percent mark. That may simply be asking too much of a two-generation jump that stops short of the GeForce GTX 570’s spiritual replacement: the GeForce GTX 770. That would actually put us closer to our rule of thumb of spending about half as much on your CPU as on your GPU in a gaming rig, but the machine’s primary purpose isn’t just gaming, it’s also content creation.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong></div> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Pre-upgrade</th> <th>Post-upgrade</th> </tr> </thead> <tbody> <tr> <td>Cinebench R15</td> <td>326</td> <td>641</td> </tr> <tr> <td>ProShow Producer 5.0 (sec)</td> <td>3,276</td> <td>1,794</td> </tr> <tr> <td>Stitch.Efx (sec)</td> <td>1,950</td> <td>1,500</td> </tr> <tr> <td>Bootracer (sec)</td> <td>37.9</td> <td>15 <strong>(+153%)</strong></td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td>58</td> <td>81</td> </tr> <tr> <td>Heaven 4.0 (fps)</td> <td>29.5</td> <td>41.9</td> </tr> </tbody> </table> </div> <hr /> <h3>One Dusty Nehalem</h3> <p><strong>The original Core i7 still has some juice</strong></p> <p>It’s easy to make upgrade choices on an old dog with AGP graphics and a Pentium 4, or even a Core 2 Duo on an obsolete VIA P4M890 motherboard (yes, it exists; look it up). When you get to hardware that’s still reasonably fast and relatively “powerful,” the upgrade choices you have to make can get quite torturous.</p> <p>That’s certainly the case with this PC, which has an interesting assortment of old but not obsolete parts inside the Cooler Master HAF 922 case. We’ve always been fans of the HAF series, and despite being just plain-old steel, the case has some striking lines. It does, however, suffer from a serious case of dust suckage. 
Between the giant fan in front and various other fans, this system was chock-full of the stuff.</p> <p>The CPU is the first-generation Core i7-965 with a base clock of 3.2GHz and a Turbo Boost of 3.46GHz. That may seem like a pretty mild Turbo, but that’s the way it was way back in 2008, when this chip was first released. It’s plugged into an Asus Rampage II Extreme motherboard using the X58 chipset, and running 6GB of DDR3/1600 in triple-channel mode.</p> <p>In graphics, it’s also packing some heat with the three-year-old GeForce GTX 590 card. For those who don’t remember it, the card has two GPU cores that basically equal a pair of GeForce GTX 570 cards in SLI. There was a secondary 1TB drive in the machine, but in the state we got it, it was still booting from its original primary drive—a 300GB Western Digital Raptor 10,000rpm hard drive that was 95 percent stuffed with data. Oh, and the OS is also quite vintage: 64-bit Windows Vista Ultimate.</p> <div class="module orange-module article-module"><strong><span class="module-name">Specifications</span></strong><br /> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Original part</th> <th>Upgrade part</th> <th>Upgrade part cost</th> </tr> </thead> <tbody> <tr> <td>Case/PSU</td> <td>Cooler Master HAF 922 / PC Power and Cooling 910</td> <td>No change</td> <td> </td> </tr> <tr> <td>CPU</td> <td>3.2GHz Core i7-965 Extreme Edition</td> <td>No change</td> <td> </td> </tr> <tr> <td>Motherboard</td> <td>Asus Rampage II Extreme</td> <td>No change</td> <td> </td> </tr> <tr> <td>Cooling</td> <td>Stock</td> <td>Corsair Hydro Series H75</td> <td>$69</td> </tr> <tr> <td>RAM</td> <td>6GB DDR3/1600 in triple-channel mode</td> <td>No change</td> <td> </td> </tr> <tr> <td>GPU</td> <td>GeForce GTX 590</td> <td>No change</td> <td> </td> </tr> <tr> <td>HDD/SSD</td> <td>300GB 10,000rpm WD Raptor, 1TB 7,200rpm Hitachi</td> <td>256GB SanDisk Ultra Plus</td> <td>$159</td> </tr> <tr> <td>ODD</td> <td>Lite-On Blu-ray burner</td> <td>No change</td> <td> </td> </tr> <tr> <td>OS</td> <td>64-bit Windows Vista Ultimate</td> <td>No change</td> <td> </td> </tr> <tr> <td>Total upgrade cost</td> <td> </td> <td> </td> <td>$228</td> </tr> </tbody> </table> </div> </div> <h4>Always Be Upgrading The SSD</h4> <p>Our first upgrade decision was easy—SSD. In its day, the 300GB Raptor was the drive to have for its performance, but running at 95 percent of its capacity, this sucker was beyond slow. Boot time on the well lived-in Vista install was just over two minutes. Yes, a two-minute boot time. By moving to an SSD and demoting the Raptor to secondary storage, the machine would see an immediate benefit in responsiveness. For most people who don’t actually stress the CPU or GPU, an SSD upgrade is a better upgrade than buying a completely new machine. And yes, we fully realize the X58 doesn’t have support for SATA 6Gb/s, but the access time of the SSD, and pretty much constant reads and writes at full bus speed, will still make a huge difference in responsiveness.</p> <p>The real conundrum was the CPU. As we said, this is the original Core i7, a quad-core chip with Hyper-Threading and support for triple-channel RAM. The CPU’s base clock is 3.2GHz. 
It is an unlocked part, but the chip is sporting a stock 130W TDP Intel cooler. Believe it or not, this is actually how some people build their rigs—they buy the unlocked part but don’t overclock until later on, when they need more performance. Well, we’re at that point now, but we knew we weren’t going very far with a stock Intel cooler, so we decided that this was the time to introduce a closed-loop liquid cooler in the form of a Corsair H75. Our intention was to simply overclock and call it a day, but when we saw some of the performance coming out of the AMD Skeleton, we got a little jealous. In two of our tests for this upgrade story, the AMD FX-8350 was eating the once-mighty Nehalem’s lunch. Would overclocking be enough? That got us wondering if maybe we should take LGA1366 to its next logical conclusion: the Core i7-970. The Core i7-970 boasts six cores with Hyper-Threading for a total of 12 threads. It has the same base clock of 3.2GHz and the same Turbo Boost of 3.46GHz, but it uses the newer and faster 32nm “Westmere” cores. Long since discontinued, the chip is easy to find used for about $300, which is about half its original price. This is that conundrum we spoke of—while the Westmere would indeed be faster, especially on thread-heavy tasks such as video encoding and 3D modeling, do we really want to spend $300 on a used CPU? That much money would almost get us a Core i7-4770K, which would offer far more performance in more apps. Of course, we’d have to buy a new board for that, too. In the end, we got cold feet and decided to stick with just an overclock.</p> <h4>Windows Vista Works</h4> <p>There’s a reason Windows Vista was a hated OS when it was released. It was buggy, slow, and drivers for it stunk. For the most part, though, Windows Vista turned into a usable OS once Service Pack 1 was released, and Service Pack 2 made it even better. While we’d never buy Vista over Windows 7 today, it’s actually functional, and the performance difference isn’t as big as many believe it to be, at least on a faster system. The only real shortcoming of Windows Vista here is the lack of TRIM support for the SSD. That means the build would have to have the SSD manually optimized using the drive’s utility, or we’d have to count on its garbage-collection routines. For now, we’d rather put the $119 in the bank toward the next system build with, perhaps, Windows 9.</p> <p style="text-align: center;"><img src="/files/u152332/main_image1_small.jpg" width="620" height="403" /></p> <p>Even more difficult was our choice on the GPU. The GeForce GTX 590 was a top-of-the-line card and sold for $700 in 2011. Obviously, this card was put into the system after the box was initially built, so it has had one previous upgrade. In looking at our upgrade options, our first thought was to go for something crazy—such as a second GTX 590 card. They can be found used for about $300. That would give the machine Quad SLI performance at far less than the cost of a newer top-tier GPU. That fantasy went up in smoke when we realized the PC Power and Cooling Silencer 910 had but two 8-pin GPU power connectors, and we’d need a total of four to run Quad SLI. Buying another expensive PSU just to run Quad SLI didn’t make sense in the grand scheme of things, since the PSU is perfectly functional and even still under warranty. Once the second GTX 590 was ruled out, we considered a GeForce GTX 780 Ti as an option. 
While the 780 Ti is a beast, we came to the realization that the GTX 590 honestly still has plenty of legs left, especially for gaming at 1080p. The 780 Ti is indeed faster by 20 to 50 percent, but we decided not to go that route, as the machine still produces very passable frame rates. In the end, we spent far less upgrading this machine than the other two. But perhaps that makes sense, as its components are much newer and faster than those of the other two boxes.</p> <h4>Post-upgrade performance</h4> <p>With our only upgrades on this box being an overclock and an SSD, we didn’t expect too much—but we were pleasantly surprised. Our mild overclock took the part to 4GHz full-time. That’s 800MHz over the base clock speed. In Cinebench R15, the clock speed increase mapped pretty closely to the performance difference. In both ProShow Producer and Stitch.Efx, though, we saw greater performance than the simple overclock can explain. We attribute the better performance to the SSD. While encoding tasks are typically CPU-bound, disk I/O can make a difference. Stitch.Efx also spits out something on the order of 20,000 files while it creates the gigapixel image. The SSD, of course, made a huge difference in boot times and system responsiveness, even if it wasn’t on a SATA 6Gb/s port.</p> <h4>Regrets</h4> <p>Overall, we were happy with our upgrade choices, with the only gnawing concern being not upgrading the GPU. It just ate us up knowing we could have seen even better frame rates by going to the GTX 780 Ti. But then, we also have $750 in our pocket that can go toward the next big thing.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong><br /> <div class="spec-table orange"> <table border="0"> <thead> <tr> <th> </th> <th>Pre-upgrade</th> <th>Post-upgrade</th> </tr> </thead> <tbody> <tr> <td>Cinebench R15 64-bit</td> <td>515</td> <td>617</td> </tr> <tr> <td>ProShow Producer 5.0 (sec)</td> <td>2,119</td> <td>1,641</td> </tr> <tr> <td>Stitch.Efx (sec)</td> <td>1,446</td> <td>983</td> </tr> <tr> <td>Bootracer (sec)</td> <td>126</td> <td>18 <strong>(+600%)</strong></td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td>86</td> <td>87</td> </tr> <tr> <td>Heaven 4.0 (fps)</td> <td>68.2</td> <td>68.7</td> </tr> </tbody> </table> </div> </div> <hr /> <h3>How to upgrade from Windows XP</h3> <p><strong>It’s game over, man!</strong></p> <p>Stick a fork in it. It’s done. Finito. Windows XP is a stiff. Bereft of life, it rests in peace… on a considerable number of desktops worldwide, much to Microsoft’s chagrin.</p> <p>You’ve read Microsoft’s early-2012 announcement. You’ve seen all the news since then: the warnings, the pleas, the tomes of comments from frustrated users who wish they could just have a fully supported Windows XP until the launch of Windows 20. If you were a holdout, you even got a few pop-ups directly in your operating system from Microsoft itself, imploring you to switch on up to a more powerful (read: supported) version of Windows. So says Microsoft:</p> <p>“If you continue to use Windows XP after support ends, your computer should still work, but it will become five times more vulnerable to security risks and viruses. 
And as more software and hardware manufacturers continue to optimize for more recent versions of Windows, a greater number of programs and devices like cameras and printers won’t work with Windows XP.”</p> <p>There you have it: Keep on keepin’ on with Windows XP and you’ll slowly enter the wild, wild west of computing. We can’t say that your computer is going to be immediately infected once you reach a set time period past what’s been chiseled on the operating system’s tombstone. However, the odds of you suffering an attack that Microsoft has no actual fix for certainly increase. You wouldn’t run a modern operating system without the latest security patches, so why run Windows XP that way?</p> <p style="text-align: center;"><img src="/files/u152332/main_image_4_small.jpg" width="620" height="397" /></p> <p>So, what’s a person to do? Upgrade, obviously. We do warn in advance that if your current Windows XP machine is chock-full of legacy apps (or you’re using more antiquated hardware like, dare we say it, a printer attached to a parallel port), then you might find that upgrading to a newer version of the OS ruins the experience you previously had. For that, we can only suggest taking advantage of the ability of newer versions of Windows to support virtualized Windows XP environments—Windows 7 supports the Virtual PC–based “Windows XP Mode” natively, whereas those on Windows 8 can benefit from freeware like VirtualBox to run a free, Microsoft-hosted download of a virtualized Windows XP.</p> <p>As for what you should upgrade to, and how, we’re recommending that you go with Windows 8—unless you can find Windows 7 for extremely cheap. Microsoft has greatly improved resource use in its flagship OS, in addition to streamlining startup times, adding more personalization, and beefing up security. Windows 8 also has far more time before its end-of-life than Windows 7, even though, yes, you’ll have to deal with the Modern UI a bit when you make your upgrade.</p> <h3>Step-by-Step Upgrade Guide</h3> <p><strong>Anyone can upgrade, but there is a right way and a wrong way</strong></p> <p style="text-align: center;"><strong><img src="/files/u152332/mpc99.feat_pcupgrade.xp_3_small.jpg" alt="The Windows 7 Upgrade Advisor is a bit more useful than the Windows 8 Upgrade Assistant in terms of actionable items that you’ll want to know about. Doesn’t hurt to run both!" width="620" height="457" /></strong></p> <p style="text-align: center;"><strong>The Windows 7 Upgrade Advisor is a bit more useful than the Windows 8 Upgrade Assistant in terms of actionable items that you’ll want to know about. Doesn’t hurt to run both!</strong></p> <p>Will your legacy system even run a modern version of Windows? That’s the first thing you’re going to want to check before you start walking down the XP-to-8 upgrade path. Microsoft has released two different tools to help you out—only one of them works on Windows XP, however. Hit up Microsoft’s site and do a search for “Windows 8 Upgrade Assistant.” Download that, install it on your Windows XP machine, and run the application.</p> <p>After a (hopefully) quick scan of your system, the program will report back the number of apps and devices you’re using that are compatible with Windows 8. In a perfect world, that would be all of them. 
However, the tool will also report back fatal flaws that might prevent you from running Windows 8 on your Windows XP machine to begin with—like, for example, if your older motherboard and CPU don’t support the Windows 8–required Data Execution Prevention.</p> <p>Since Windows 8 is quite a bit removed, generation-wise, from Windows XP, there’s no means by which you can simply run an in-place upgrade that preserves your settings and installed applications. Personal files, yes, but now’s as good a time as any to get your data organized prior to the big jump—no need to have Windows 8 muck things up for you, as it will just create a “windows.old” folder that’s a dump of the “Documents and Settings” folders on your XP system.</p> <p>If you have a spare hard drive lying around, you could always clone your current disk using a freeware app like Clonezilla, install Windows 8 on your old drive, and sort through everything later. If not, then you’re going to want to grab some kind of portable storage—or, barring that, sign up for a cloud-based storage service—and begin the semi-arduous task of poring over your hard drive for all of your important information.</p> <p style="text-align: center;"><img src="/files/u152332/mpc99.feat_pcupgrade.xp_7_small.jpg" alt="The Windows Easy Transfer app, downloadable from Microsoft, helps automate the otherwise manual process of copying your files from your XP machine to portable storage." width="620" height="491" /></p> <p style="text-align: center;"><strong>The Windows Easy Transfer app, downloadable from Microsoft, helps automate the otherwise manual process of copying your files from your XP machine to portable storage.</strong></p> <p>There really isn’t a great tool that can help you out in this regard, except perhaps WinDirStat—and that’s only assuming that you’ve stored chunks of your important data in key areas around your hard drive. If worst comes to worst, you could always back up the entire contents of your “Documents and Settings” folder, just to be safe. It’s unlikely that you’ll have much critical data in Program Files or Windows but, again, it all depends on what you’ve been doing on your PC. Gamers eager to make sure that their precious save files have been preserved can check out the freeware GameSave Manager to back up their progress.</p> <p>As for your apps, you’re going to have to reinstall those. You can, however, simplify this process by using a tool like Ninite to quickly and easily install common apps. CCleaner, when installed on your old XP system, can generate a list of all the apps that you’ve previously installed within the operating system—handy for making a checklist of things you’ll want to reinstall later, we suppose. And finally, an app like Magical Jelly Bean’s Product Key Finder can help you recover old installation keys for apps that you might want to reinstall within Windows 8.</p> <p style="text-align: center;"><img src="/files/u152332/mpc99.feat_pcupgrade.xp_8_small.jpg" width="620" height="452" /></p> <p style="text-align: center;"><strong>Need to know what you’ll need to reinstall in Windows 8? Use CCleaner to make a simple text file of every app you installed on Windows XP, and check off as you go!</strong></p>
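<p>For the bulk copy itself, XP’s built-in xcopy can save a lot of dragging and dropping. Below is a hedged example you could drop into a .bat file; the profile name and the E: drive letter are placeholders you’d swap for your own:</p>
<pre>
:: Copy an entire XP profile to a USB drive. /E includes subfolders
:: (even empty ones), /H grabs hidden files, /C continues past errors,
:: and /Y suppresses overwrite prompts.
xcopy "C:\Documents and Settings\YourName" "E:\xp-backup\" /E /H /C /Y
</pre>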
<p>As for installing Windows 8, we recommend that you purchase and download the ISO version of the operating system and then use Microsoft’s handy Windows 7 USB/DVD Download Tool to dump the contents of that ISO onto a portable flash drive. Your installation process will go much faster, trust us. From there, installing the OS is as easy as inserting your USB storage, resetting your computer, and booting from the flash drive—which might be accessible via some “boot manager” option during your system’s POST, or might be a boot order–related setting that you have to configure within the BIOS itself.</p> <p>Other than that, the installation process is fairly straightforward once Windows 8 gets going. You’ll enter your product key, select a Custom installation, delete or format your drive partitions, install Windows 8 on the new chunk of blank storage, and sit back while the installer chugs away.</p> <p>You might not have the speediest of operating systems once Windows 8 loads, depending on just how long your Windows XP machine has been sitting around, but at least you’ll be a bit more secure! And, hey, now that you have a license key, you can always upgrade your ancient system (or build a new one!) and reinstall.</p> http://www.maximumpc.com/computer_upgrade_2014#comments computer upgrade Hardware Hardware how to June issue 2014 maximum pc Memory News Features Mon, 13 Oct 2014 22:11:21 +0000 Maximum PC staff 28535 at http://www.maximumpc.com Nvidia GeForce GTX 980 Review http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014 <!--paging_filter--><h3><span style="font-size: 1.17em;">4K and SLI tested on Nvidia's high-end Maxwell card</span></h3> <p>Sometimes things don't go according to plan. Both AMD and Nvidia were supposed to have shifted to 20-nanometer parts by now. In theory, that's supposed to get you lower temperatures, higher clock speeds, and quieter operation. Due to circumstances largely out of its control, Nvidia has had to go ahead with a 28nm high-end Maxwell part instead, dubbed GM204. This is not a direct successor to the GTX 780, which has more transistors, texture mapping units, and things like that. The 980 is actually the next step beyond the GTX 680, aka GK104, which was launched in March 2012.</p> <p>Despite that, our testing indicates that the GTX 980 can still be meaningfully faster than the GTX 780 and 780 Ti (and AMD's Radeon R9 290 and 290X, for that matter, though there are a couple of games better optimized for Radeon hardware). When 20nm processes become available sometime next year, we'll probably see the actual successor to the GTX 780. But right now, the GTX 980 is here, and comes in at $549. That seems high at first, but recall that the GTX 680, 580, and 480 all launched at $499. And keep in mind that it's a faster card than the 780 and 780 Ti, which currently cost more. (As we wrote this, AMD announced that it was dropping the base price of the R9 290X from $500 to $450, so that war rages on.) The GTX 970 at $329 may be a better deal, but we have not yet obtained one of those for testing.</p> <p>In other news, Nvidia told us that they were dropping the price of the GTX 760 to $219, and the GTX 780 Ti, 780, and 770 are being officially discontinued.
So if you need a second one of those for SLI, now is a good time.</p> <p>Let's take a look at the specs:</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Generation</td> <td>GM204</td> <td>GM204&nbsp;</td> <td>GK104&nbsp;</td> <td>GK110&nbsp;</td> <td class="item-dark">GK110</td> <td>&nbsp;Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>&nbsp;1126</td> <td>&nbsp;1050</td> <td>&nbsp;1006</td> <td>&nbsp;863</td> <td>876</td> <td>&nbsp;"up to" 1GHz</td> </tr> <tr> <td class="item">Boost Clock (MHz)</td> <td>&nbsp;1216</td> <td>&nbsp;1178</td> <td>&nbsp;1058</td> <td>&nbsp;900</td> <td class="item-dark">928</td> <td>&nbsp;N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>&nbsp;7000</td> <td>&nbsp;7000</td> <td>&nbsp;6000</td> <td>&nbsp;6000</td> <td>7000</td> <td>&nbsp;5000</td> </tr> <tr> <td>VRAM Amount</td> <td>&nbsp;4GB</td> <td>&nbsp;4GB</td> <td>&nbsp;2GB/4GB</td> <td>&nbsp;3GB/6GB</td> <td>3GB</td> <td>&nbsp;4GB</td> </tr> <tr> <td>Bus</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;256-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;384-bit</td> <td>&nbsp;512-bit</td> </tr> <tr> <td>ROPs</td> <td>&nbsp;64</td> <td>&nbsp;64</td> <td>&nbsp;32</td> <td>&nbsp;48</td> <td>48</td> <td>&nbsp;64</td> </tr> <tr> <td>TMUs</td> <td>&nbsp;128</td> <td>&nbsp;104</td> <td>&nbsp;128</td> <td>&nbsp;192</td> <td>240</td> <td>&nbsp;176</td> </tr> <tr> <td>Shaders</td> <td>&nbsp;2048</td> <td>&nbsp;1664</td> <td>&nbsp;1536</td> <td>&nbsp;2304</td> <td>2880</td> <td>&nbsp;2816</td> </tr> <tr> <td>SMs</td> <td>&nbsp;16</td> <td>&nbsp;13</td> <td>&nbsp;8</td> <td>&nbsp;12</td> <td>&nbsp;15</td> <td>&nbsp;N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>&nbsp;165</td> <td>&nbsp;145</td> <td>&nbsp;195</td> <td>&nbsp;250</td> <td>&nbsp;250</td> <td>&nbsp;290</td> </tr> <tr> <td>Launch Price</td> <td>&nbsp;$549</td> <td>&nbsp;$329</td> <td>&nbsp;$499</td> <td>&nbsp;$649</td> <td>&nbsp;$699</td> <td>&nbsp;$549</td> </tr> </tbody> </table> </div> <p>On paper, the 980 and 970 don't look like much of a jump from the 680. In fact, the 980 has only 128 shaders (aka "CUDA cores") per streaming multi-processor (SM). Performance tends to increase with a higher number of shaders per SM, so how did the GTX 980 perform so well in our benches, despite having a worse ratio than all the other cards? Well, Nvidia claims that they've improved the performance of each CUDA core by 40%. Provided that figure is accurate, the GTX 980 effectively has about as many CUDA cores as a 780 Ti. Add the GTX 980's higher clock speeds, and performance should be higher still.</p> <p><img src="/files/u160416/7g0a0209_620.jpg" width="620" height="349" /></p> <p>You probably also noticed the unusually low price for the GTX 970. The GTX 670 launched at $400 in May 2012, and the GTX 570 launched at $350 in December 2010. These two earlier cards also had specs much closer to those of their bigger brothers. For example, the GTX 570 had 480 CUDA cores, while the 580 had 512 cores. This is a difference of just 6.25%, although the memory bus was reduced from 384-bits to 320-bits. In contrast, the 970 gets nearly 20% fewer CUDA cores than the 980, though its memory bus remains unchanged.</p>
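<p>The shader math being tossed around here is easy to sanity-check with a scratchpad. The figures below come straight from the spec table above; only the percentages are computed:</p> <pre>
# Back-of-the-envelope check of the shader-count claims above.
cores_980, cores_780ti = 2048, 2880
cores_970 = 1664
cores_580, cores_570 = 512, 480

# Nvidia's claimed ~40% per-core uplift puts the 980 near 780 Ti territory:
print(cores_980 * 1.4)                      # 2867.2, within a hair of 2880

# GTX 570 vs. 580: a 6.25 percent core cut...
print((cores_580 - cores_570) / cores_580)  # 0.0625
# ...versus the 970's nearly 20 percent cut relative to the 980:
print((cores_980 - cores_970) / cores_980)  # 0.1875
</pre>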
<p>As we said, we haven't gotten a 970 in yet, but based on its specs, we doubt that we can compensate for the missing hardware with overclocking, as we've been able to do in the past with the GTX 670 and 760, and the Radeon R9 290.</p> <p>Nvidia also says that the official boost clock on these new Maxwell cards is not set in stone. We witnessed our cards boosting up to 1,253MHz for extended periods of time (i.e., 20 seconds here, 30 seconds there). When the cards hit their thermal limit of 80 degrees Celsius, they would fall down as low as 1,165MHz, but we never saw them throttle below the official base clock of 1,126MHz. In SLI, we also noted that the upper card would go up to 84 C. According to Nvidia, these cards have an upper boundary of 95 C, at which point they will throttle below the base clock to avoid going up in smoke. We were not inclined to test that theory.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,1" target="_blank">Next Page: Voxels, new anti-aliasing, and VR</a></h4> <hr /> <p>The company also says that its delta color compression algorithms have cut memory bandwidth requirements by about 25 percent on average (it varies from game to game). This extra headroom provides more space for increased frame rates. Since DCC directly affects pixels, this effect should scale with your resolution, becoming increasingly helpful as you crank your res higher.</p> <p>You can also combine these gains with Nvidia's new Multi-Frame Sampled Anti-Aliasing (MFAA). This technique rotates a pixel's sampling points from one frame to the next, so that two of these points can simulate the visual results of four sampling points whose locations remain static. The effect starts to shimmer at about 20FPS, whereupon it's automatically disabled. But when running well, Nvidia claims that it can be 30 percent faster, on average, than the visually equivalent level of Multi-Sample Anti-Aliasing (MSAA). Like TXAA (Temporal Anti-Aliasing), this technique won't be available on AMD cards (or if it is, it will be built by AMD from the ground up and called something else).</p> <p><img src="/files/u160416/7g0a0238_resize.jpg" width="620" height="349" /></p> <p>Unfortunately, MFAA was not available in the version 344.07 beta drivers given to us for testing, but Nvidia said it would be in the driver after this one. This means that the package will not be complete on launch day. Support will trickle down to the older Kepler cards later on. Nvidia hasn't been specific about timelines for specific cards, but it sounded like the 750 and 750 Ti (also technically Maxwell cards) will not be invited to this party.</p> <p>Another major upgrade is Voxel Global Illumination, or VXGI. Nvidia positions this as the next step beyond ambient occlusion. With VXGI, light bounces off of surfaces to illuminate nooks and crannies that would otherwise not be lit realistically, in real time. Ordinarily, light does not bounce around in a 3D game engine like it does in meatspace. It simply hits a surface, illuminates it, and that's the end. Sometimes the lighting effect is just painted onto the texture. So there's a lot more calculation going on with VXGI.</p> <p><img src="/files/u160416/maxwell_die_620.jpg" width="620" height="349" /></p> <p>But Nvidia has not made specific performance claims because the effect is highly scalable.
A developer can choose how many cones of light they want to use, and the degree of bounced-light resolution (you can go for diffused/blurry spots of light, or a reflection that's nearly a mirror image of the bounced surface), and balance the result against a performance target. Since this is something that has to be coded into the game engine, we won't see that effect right away by forcing it in the drivers, like Nvidia users can with ambient occlusion.</p> <p>Next is Dynamic Super Resolution (in the 344.11 drivers released today, so we'll be giving this one a peek soon). This tech combines super-sampling with a custom filter. Super-sampling renders at a higher resolution than your monitor can display and squishes the result down. This is a popular form of anti-aliasing, but the performance hit is pretty steep. The 13-tap Gaussian filter that the card lays on top can further smooth out jaggies. It's a post-process effect that's thankfully very light, and you can also scale DSR down from 3840x2160 to 2560x1440. It's our understanding that this effect is only available to owners of the 980 and 970, at least for now, but we'll be checking on that ASAP.</p> <p>Nvidia is also investing more deeply into VR headsets with an initiative called VR Direct. Their main bullet point is a reduction in average latency from 50ms to 25ms, using a combination of code optimization, MFAA, and another new feature called Auto Asynchronous Warp (AAW). This displays frames at 60fps even when performance drops below that. Since each eye gets an independently rendered scene, your PC effectively needs to maintain 120FPS otherwise, which isn't going to be common with more demanding games. AAW takes care of the difference. However, we haven't had the opportunity to test the GTX 980 with VR-enabled games yet.</p> <p>Speaking of which, Nvidia is also introducing another new feature called Auto Stereo. As its name implies, it forces stereoscopic rendering in games that were not built with VR headsets in mind. We look forward to testing VR Direct at a later date.</p> <p>Lastly, we also noticed that GeForce Experience can now record at 3840x2160; it was previously limited to 2560x1600.</p> <p>Until we get our hands on MFAA and DSR, we have some general benchmarks to tide you over. We tested the GTX 980 in two-way SLI and by itself, at 2560x1600 and 3840x2160.
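</p> <p>Since the review keeps leaning on pixel-count ratios, here's the arithmetic in one place (plain Python, numbers only):</p> <pre>
# Pixel counts behind the resolutions tested.
res = {"1080p": 1920 * 1080, "2560x1600": 2560 * 1600, "4K": 3840 * 2160}
for name, px in res.items():
    print(f"{name}: {px:,} pixels ({px / res['1080p']:.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 2560x1600: 4,096,000 pixels (1.98x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
</pre> <p>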
We compared it to roughly equivalent cards that we've also run in solo and two-way configs.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,2" target="_blank">Next Page: SLI Benchmarks!</a></h4> <hr /> <p>Here's the system that we've been using for all of our recent GPU benchmarks:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Thermaltake Toughpower Grand (1,050 watts)</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 Update 1</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p class="MsoNormal" style="text-align: left;"><span style="text-align: center;">Now, let’s take a look at our results at 2560x1600 with 4xMSAA. For reference, this is twice as many pixels as 1920x1080. So gamers playing at 1080p on a similar PC can expect roughly twice the framerate, if they use the same graphical settings. We customarily use the highest preset provided by the game itself; for example, <em>Hitman: Absolution</em> is benchmarked with the “Ultra” setting. 3DMark runs the Firestrike test at 1080p, however. We also enable TressFX in Tomb Raider, and PhysX in Metro: Last Light.</span></p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>33</strong></td> <td>&nbsp;19</td> <td>25</td> <td class="item-dark">&nbsp;27</td> <td>&nbsp;26</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>46</strong></td> <td>&nbsp;21</td> <td>&nbsp;22</td> <td>&nbsp;32</td> <td>&nbsp;30</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;75</td> <td>&nbsp;51</td> <td>&nbsp;65</td> <td>&nbsp;<strong>78</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;27</td> <td>&nbsp;40</td> <td>&nbsp;45</td> <td>&nbsp;<strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;45</td> <td>&nbsp;30</td> <td>&nbsp;43</td> <td>&nbsp;<strong>48</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;64</td> <td>&nbsp;35</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;34</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>11,490</strong></td> <td>&nbsp;6,719</td> <td>&nbsp;8,482</td> <td>&nbsp;9,976</td> <td>9,837</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong><strong>)</strong></p> <p class="MsoNormal" style="text-align: left;">To synthesize the results into a few sentences, we would say that the 980 is doing very well for its price. It’s not leapfrogging over the 780 and 780 Ti, but Nvidia indicates that it’s not supposed to anyway. 
It dominates the GTX 680, but that card is also two years old and discontinued, so the difference is neither unexpected nor likely to change buying habits. The R9 290X, meanwhile, is hitting $430, while the not-much-slower 290 can be had for as little as $340. And you can pick up a 780 Ti for $560. So the GTX 980's price at launch is going to be a bit of a hurdle for Nvidia.</p> <p class="MsoNormal" style="text-align: left;">Performance in Metro: Last Light has also vastly improved. (We run that benchmark with "Advanced PhysX" enabled, which suggests that Nvidia has made some optimizations there. Further testing is needed.) Loyal Radeon fans will probably not be swayed to switch camps, at least on the basis of pure performance. Hitman in particular does not appear to favor the Green Team.</p> <p class="MsoNormal" style="text-align: left;">We were fortunate enough to obtain a second GTX 980, so we decided to set the pair up in SLI at the same resolution of 2560x1600. Here, the differences are more distinct. We've narrowed the comparison down to the most competitive cards that we have SLI/CF benchmarks for. (Unfortunately, we do not have a second GTX 680 in hand at this time. But judging by its single-card performance, it's very unlikely to suddenly pull ahead.) For this special occasion, we brought in the Radeon R9 295X2, which has two 290X GPUs on one card and has been retailing lately for about a thousand bucks.</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>66</strong></td> <td>&nbsp;45</td> <td>&nbsp;56</td> <td>&nbsp;50</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>70</strong></td> <td>&nbsp;52</td> <td>&nbsp;53</td> <td>&nbsp;48</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;131</td> <td>&nbsp;122</td> <td>&nbsp;<strong>143</strong></td> <td>&nbsp;90</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;77</td> <td>&nbsp;74</td> <td>&nbsp;<strong>79</strong></td> <td>&nbsp;79</td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;80</td> <td>&nbsp;72</td> <td>&nbsp;<strong>87</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;73</td> <td>&nbsp;60</td> <td><strong>&nbsp;77</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>17,490</strong></td> <td>&nbsp;14,336</td> <td>&nbsp;16,830</td> <td>&nbsp;15,656</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p class="MsoNormal" style="text-align: left;">While a solo GTX 980 is already a respectable competitor for the price, its success is more pronounced when we add a second card—as is the gap between it and the 780 Ti. It continues to best the GTX 780, getting us over 60 FPS in each game with all visual effects cranked up. That's an ideal threshold. It also looks like Nvidia's claim of 40 percent improved CUDA core performance may not be happening consistently. Future driver releases should reveal whether this is a matter of software optimization or a limitation in hardware. Or just a random cosmic anomaly.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,3" target="_blank">Next Page: 4K benchmarks and conclusion</a></h4> <hr /> <p class="MsoNormal" style="text-align: left;">So, what happens when we scale up to 3840x2160, also known as "4K"?
Here we have just over twice as many pixels as 2560x1600, and exactly four times as many as 1080p. Can the GTX 980's 256-bit bus really handle this much bandwidth?</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;16</td> <td>&nbsp;8.7*</td> <td>&nbsp;26</td> <td class="item-dark">&nbsp;<strong>28</strong></td> <td>&nbsp;28</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>36</strong></td> <td>&nbsp;12</td> <td>&nbsp;18</td> <td>&nbsp;19</td> <td>&nbsp;18</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;35</td> <td>&nbsp;25</td> <td>&nbsp;33</td> <td>&nbsp;<strong>38</strong></td> <td>&nbsp;38</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;20</td> <td>&nbsp;15</td> <td>&nbsp;20</td> <td>&nbsp;24</td> <td><strong>&nbsp;28</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;19</td> <td>&nbsp;15</td> <td>&nbsp;<strong>30</strong></td> <td><strong>&nbsp;30</strong></td> <td>&nbsp;26</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;19</td> <td>&nbsp;11</td> <td>&nbsp;<strong>23</strong></td> <td><strong>&nbsp;23</strong></td> <td>&nbsp;18</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>*TressFX disabled</p> <p>The 980 is still scaling well, but the 384-bit 780 and 780 Ti are clearly scaling better, as is the 512-bit 290X. (<strong>Update:</strong>&nbsp;We've re-checked our test results for Hitman: Absolution, and the AMD cards weren't doing nearly as well as we originally thought, though they're still the best option for that particular game. The Batman tests have been re-done as well.) We had to disable TressFX when benchmarking the 680, because the test would crash otherwise, and it was operating at less than 1FPS anyway. At 4K, that card basically meets its match, and almost its maker.</p> <p>Here's 4K SLI/Crossfire. All tests are still conducted at 4xMSAA, which is total overkill at 4K, but we want to see just how hard we can push these cards. (Ironically, we have most of the SLI results for the 290X here, but not for 2560x1600.
That's a paddlin'.)</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;33</td> <td>&nbsp;41</td> <td>&nbsp;44</td> <td class="item-dark">&nbsp;52</td> <td>&nbsp;<strong>53</strong></td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;21</td> <td>&nbsp;27</td> <td>&nbsp;29</td> <td>&nbsp;26</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;<strong>68</strong></td> <td>&nbsp;60</td> <td>&nbsp;65</td> <td>&nbsp;67</td> <td>&nbsp;66</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;40</td> <td>&nbsp;44</td> <td><strong>&nbsp;53</strong></td> <td><strong>&nbsp;</strong><strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;39</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;40</td> <td>&nbsp;24</td> <td>&nbsp;19</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;34</td> <td>&nbsp;33</td> <td>&nbsp;<strong>44</strong></td> <td>&nbsp;17</td> <td>&nbsp;34</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>It does appear that the raw memory bandwidth of the 780, 780 Ti, and 290X comes in handy at this resolution, despite the optimizations of Maxwell CUDA cores. That Metro: Last Light score remains pretty interesting. It's the only one we run with PhysX enabled (to balance out using TressFX in Tomb Raider). It really does look like Maxwell is much better at PhysX than any other GPU before it. That tech isn't quite common enough to change the game. But if the difference is as good as our testing indicates, more developers may pick it up.</p> <p>Even a blisteringly fast card can be brought down by high noise levels or prodigious heat. Thankfully, this reference cooler is up to the task. Keep in mind that this card draws up to 165 watts, and its cooler is designed to handle cards that go up to 250W. But even with the fan spinning up to nearly 3,000rpm, it's not unpleasant. With the case side panels on, you can still hear the fan going like crazy, but we didn't find it distracting. These acoustics only happened in SLI, by the way. Without the primary card sucking in hot air from the card right below it, its fan ran much more quietly. The GTX 980's cooling is nothing like the reference design of the Radeon R9 290 or 290X.</p> <p><img src="/files/u160416/key_visual_620.jpg" width="620" height="349" /></p> <p>With a TDP of just 165W, a respectable 650-watt power supply should have no trouble powering two GTX 980s. Meanwhile, the 290-watt R9 290X really needs a nice 850-watt unit to have some breathing room, and even more power would not be unwelcome.</p> <p>Since MFAA and DSR were not available in the driver that was supplied for testing, there's more story for us to tell over the coming weeks. (<strong>Update</strong>: DSR settings are actually in this driver, just not in the location that we were expecting.) And we still need to do some testing with VR. But as it stands right now, the GTX 980 is another impressive showing for Nvidia. Its 4K scaling isn't as good as we'd like, especially since Maxwell is currently the only tech that will have Dynamic Super Resolution. If you want to play at that level, it looks like the 290 and 290X are better choices, price-wise, while the overall performance crown at 4K still belongs to the 780 and 780 Ti.
But considering the price difference between the 980 and the 780, its similar performance is commendable.</p> <p>For 2560x1600 or lower resolutions, the GTX 980 emerges as a compelling option, but we're not convinced that it's over $100 better than a 290X. Then again, you get MFAA, DSR, and VR Direct (and the overall GeForce Experience package, which is a bit slicker than AMD's Gaming Evolved), which might win over some people, particularly Nvidia loyalists who've been waiting for an upgrade from their 680 that's not quite as expensive as the 780 or 780 Ti.</p> <p><a href="http://www.pcgamer.com/2014/09/19/nvidia-gtx-980-tested-sli-4k-and-single-gpu-benchmarks-and-impressions/" target="_blank">Our amigo Wes Fenlon over at PC Gamer has a write-up of his own</a>, so go check it out.</p> http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014#comments 4k 980 GTX benchmarks comparison geforce gpu nvidia performance Review sli Video Card Videocards Fri, 19 Sep 2014 03:04:15 +0000 Tom McNamara 28564 at http://www.maximumpc.com Restore Your Computer to its Glory Days http://www.maximumpc.com/restore_computer_2014 <!--paging_filter--><h3>Give your PC a clean start</h3> <p>If you’re reading this, it’s highly likely that your PC is a fine-tuned piece of 64-bit technology, customized to the hilt and purring like a kitten with a belly full of formula. Yup, she’s a beaut, and attacks your daily tasks like a Belgian police dog going after a fleeing perp. All is well in the world, until one day when you sit down, fire it up, and realize something is different. That extra bit of snap when programs open is missing, and encoding video seems to take longer than it used to. Even downloading files seems to require more patience than you’re accustomed to exhibiting. It’s at this very moment that you silently say to yourself, “What the FRACK???”</p> <p>First things first—calm down, power user. Before you smash your rig with a hammer, pound on the keyboard, and decide to just nuke it from orbit, realize it’s just a temporary slowdown and it happens to everyone, even Maximum PC editors. Over time, PCs get slower; it’s just the nature of the beast. Don’t fret, we’re here to help by showing you how to give your PC a clean start. We’ll show you how to restore your computer to its glory days, if you will. We’ll walk you step-by-step through the cleaning process, showing you what you need to get ’er done, and if you find you can’t resolve the problem, how to properly nuke it from orbit. We’ll also detail—pun intended—physically cleaning your rig. Once you’re finished, your PC will be noticeably perkier and everything will be right as rain. Now, drop the hammer, and let’s get started.</p> <h3>Back it up and kick the tires</h3> <p><strong>The only person to blame for not having a backup is you</strong></p> <p>There are only two kinds of storage devices in this world: those that have already died and those that are going to die. If you’ve already identified that your PC is acting wonky, it’s time to back that mother up. It may seem counterintuitive to run a backup before you do a PC cleanup, but we highly recommend it: If you break something, or something finally gives up the ghost, you’ll be grateful you made a backup before it all went sideways. There are numerous aftermarket tools, but Microsoft has been kind enough to give you a fairly powerful backup and imaging tool in the OS itself. If you’re using Windows 7, just search for Backup, or dig into the Control Panel and look under System and Security.</p>
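<p>If you want a quick, scriptable belt-and-suspenders copy on top of that, a few lines of Python can mirror a user profile to an external drive. This is only a sketch, with placeholder source and destination paths, and a file-level copy is no substitute for a full system image:</p> <pre>
# Minimal one-way mirror of a user profile to an external drive.
# SRC and DST are hypothetical paths -- point them wherever you like.
import shutil
from pathlib import Path

SRC = Path(r"C:\Users\YourName")     # profile to protect
DST = Path(r"E:\backup\YourName")    # external/portable drive

for src_file in SRC.rglob("*"):
    if not src_file.is_file():
        continue
    dst_file = DST / src_file.relative_to(SRC)
    # Skip files that haven't changed since the last run (size + mtime).
    if dst_file.exists():
        s, d = src_file.stat(), dst_file.stat()
        if s.st_size == d.st_size and s.st_mtime <= d.st_mtime:
            continue
    dst_file.parent.mkdir(parents=True, exist_ok=True)
    try:
        shutil.copy2(src_file, dst_file)   # copy2 preserves timestamps
    except OSError:
        print("skipped:", src_file)        # locked or unreadable file
</pre> <p>Still, the built-in tool remains the path of least resistance, and it's the one that can produce a restorable system image.</p>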
<p>If you’re using Windows 8.x, the backup system is the same, although it’s hidden. To find it, go to the Control Panel and search for Windows 7 File Recovery.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/gordon_backup_small_1.jpg"><img src="/files/u152332/gordon_backup_small_0.jpg" alt="The Windows backup and restore program works well enough, and should be run regularly." width="620" height="547" /></a></p> <p style="text-align: center;"><strong>The Windows backup and restore program works well enough, and should be run regularly.</strong></p> <p>If you have multiple drives, you can choose how you want the backup to run, and manually select the other drives in the system for the backup set. You should set an automatic backup as well, and create a system restore disc. Also ensure that you’ve created a system image, should you need to restore the backup to a completely new hard drive.</p> <p>With your backup complete, it’s time to do a basic visual inspection of the internals of the PC for obvious problems, such as fans clogged with so much cat hair and dust that they’re causing the CPU or GPU to overheat and throttle, or data or power cables that have wiggled loose. Typically, loose or unplugged cables result in immediate show-stopping errors and crashes rather than a system slowdown. You’re more likely to find fans clogged with dust and running at low RPMs, or fans that have died outright.</p> <h3>Mash Malware</h3> <p><strong>Don’t always blame malware, except when it’s to blame</strong></p> <p>If there’s a bogeyman of mysterious system slowdowns, it’s malware. In fact, if we had a nickel for every time a relative told us a “virus” was the cause of their slowdown, we’d have 0.0834 of a Bitcoin. With that said, before you get too hip-deep in trying to speedupify a PC, a sweep for malware should be run. We’d also do a cursory examination of the OS for extraneous toolbars or tray items that have been installed. These aren’t truly malware, but they’re still worthy of eradication.</p> <p>We also recommend a full system scan by the system’s real-time AV software (after updating the virus definitions). A secondary sweep using various on-demand tools is also on the to-do list. This would include browser-based file scanners available from all of the popular AV vendors, as well as local tools such as Malwarebytes (www.malwarebytes.org) or SuperAntiSpyware (<a href="http://www.superantispyware.com/">www.superantispyware.com</a>). Running specific rootkit removal tools available from companies such as Malwarebytes and Sophos, among others, can’t hurt. Rootkits are a class of malware designed to thwart normal detection means. Before you get crazy about removing any detections, you should research each one to make sure it isn’t just a false positive. And be advised that many types of malware can’t be removed with a single-click tool. You’ll typically have to dig deep into a multi-page guide to remove many of today’s specialty infections. Obviously, Binging will lead you to most guides, but a great place to start is Bleepingcomputer.com. The site has loads of removal guides and links to useful tools. But again, a word of warning: don’t just start ripping things out of the OS without knowing what you’re removing.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/gordon_rootkit_small_0.jpg"><img src="/files/u152332/gordon_rootkit_small.jpg" alt="A thorough check for malware is recommended before any serious system cleanup."
title="Mash Malware" width="620" height="516" /></a></p> <p style="text-align: center;"><strong>A thorough check for malware is recommended before any serious system cleanup.</strong></p> <h3> <hr />Cruft clearing</h3> <p><strong>Declutter the system files</strong></p> <p>Any PC that you use daily will build up hundreds of gigabytes of file clutter over the months and years that you use it. As most people are rolling large mechanical drives, the clutter has an impact on performance and your ability to pack away even more cute kitten videos downloaded from the Internet.</p> <p>For this step, we’ll start with the low-hanging fruit. Simply open My Computer, right-click your primary drive, and select properties. Click Disk Cleanup and check off the things that are clutter (just about everything is in this panel) and click OK. We did this on a work box and shaved off 5GB in Windows Update files that had been sitting around. While 5GB isn’t much in the day of 4TB drives, many people still run 1TB and smaller drives with every nook, cranny, and sector filled (you know who you are.)</p> <p>The next easy cruft targets are the system restore points automatically created by Windows. Windows typically creates these snapshots of the OS when you install a new driver, OS update, or application. Windows sets a default for these based on the size of the drive it’s installed on, but they typically occupy gigabytes on the drive. To free up space, you can delete all but the latest restore points by clicking the More Options panel from Disk Cleanup, and selecting Clean Up under System Restore and Shadow Copies.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/cruft1_small_0.jpg"><img src="/files/u152332/cruft1_small.jpg" alt="The built-in disk cleanup does a decent first pass at dumping unneeded system clutter." title="Cruft clearing" width="500" height="612" /></a></p> <p style="text-align: center;"><strong>The built-in disk cleanup does a decent first pass at dumping unneeded system clutter.</strong></p> <p>Before you do this, though, think about how the recent stability&nbsp; of your system. If it’s been reliable but slow for the last few months, wiping the previous restore points should be fine. But if the system is being wonky, you may just need to rely on those restore points to get the box back to a point where it’s stable, so we’d recommend keeping the old restore points until you’re sure the box is working. You should also be aware that Windows 7 and Windows Vista used System Protection and Restore Points to occasionally make backup copies of your personal data files through the Volume Shadow Copies service. These older versions may be purged when you do this, but it won’t touch your most recent versions.</p> <p>Yeah, we know, many power users will thumb their nose at System Restore and some will outright switch it off because malware can use it as a place to hide, but the feature can truly be a bacon-saver sometimes.</p> <p>Another easy target to clean out is the default downloads folder. Other than documents, the vast majority of downloaded files can usually be dumped overboard.</p> <h3>Clean the Crap</h3> <p><strong>CCleaner is an easy-to-use, one-stop declogger</strong></p> <p>Originally named Crap Cleaner, this handy application has since been renamed to the more palatable CCleaner, but it still works amazingly well at clearing out the junk from the corners of your OS. 
<p>Available for free from http://bit.ly/MPC_CCleaner, CCleaner is an easy one-stop shop for freeing up space that you might normally miss with the built-in cleaner. As much as we like CCleaner, you shouldn’t expect miracles. We ran it on a three-year-old scungy build of Windows 7 after running the Windows cleaning routine, and CCleaner came up with 18.3GB to clean out—16GB of which had accumulated in the trash bin. One word of warning: By default, CCleaner will wipe out your browser cookies, which might throw you for a loop when you’re forced to sign into web sites that you may have forgotten the passwords for. It’s probably best to exclude browser history and Windows Explorer Recent Documents from the CCleaner clean-out, too, because they don’t net you much space but do make your system more livable.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/ccleaner_small_2.jpg"><img src="/files/u152332/ccleaner_small_1.jpg" alt="CCleaner still does an admirable job of emptying out unneeded files." width="620" height="546" /></a></p> <p style="text-align: center;"><strong>CCleaner still does an admirable job of emptying out unneeded files.</strong></p> <h3>Stop Startups</h3> <p><strong>Giddyap quicker</strong></p> <p>Oddly, many people still define their computing experience by how long it takes to cold-boot their PC. First, we just have to ask: have you tried standby or even hibernate? You know, those handy modes that can have you at the desktop five or 10 seconds after touching the mouse button or keyboard? No? You still prefer to boot from cold, anyway?</p> <p>If your OS install is a year or two old, you will have accumulated enough startup programs to significantly impact hard-drive boot times. The easiest way to remove these programs is to click the Start button and type msconfig. Click on the Startup tab and scroll through the list, looking for things that don’t need to be started at launch. Uncheck them, click Apply, then OK, and reboot.</p> <p>One thing to remember: Windows 7 will optimize boot times automatically. If you reboot, wait five minutes, and repeat that four or five times, boot times should actually improve on their own as Windows 7 decides what it can prioritize.</p> <p>Windows 8.x (yes, haters, step back) actually improves upon boot times as well. Anyone who has used the new OS can attest to its fast boot times. Win8 moves startup optimization to the Task Manager (ctrl-shift-esc). Click on the Startup tab, and Windows 8 will even tell you what’s slowing things down, and give you an estimate of how long it took to boot after the process was handed over to the OS.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msconfig_small_0.jpg"><img src="/files/u152332/msconfig_small.jpg" alt="You can manually deselect programs that start up from msconfig to speedify your boots." width="620" height="414" /></a></p> <p style="text-align: center;"><strong>You can manually deselect programs that start up from msconfig to speedify your boots.</strong></p>
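<p>If you'd like a scripted census of what launches at login before you start unchecking boxes, the classic per-machine and per-user Run keys are easy to read with Python. A read-only sketch; startup folders, services, and scheduled tasks aren't covered here:</p> <pre>
# Read-only census of classic Run-key startup entries.
# Doesn't touch anything; startup folders and services aren't covered.
import winreg

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"
HIVES = [("HKLM", winreg.HKEY_LOCAL_MACHINE),
         ("HKCU", winreg.HKEY_CURRENT_USER)]

for label, hive in HIVES:
    try:
        with winreg.OpenKey(hive, RUN_KEY) as key:
            for i in range(winreg.QueryInfoKey(key)[1]):   # value count
                name, command, _ = winreg.EnumValue(key, i)
                print(f"{label}: {name} -> {command}")
    except OSError:
        pass   # key missing in this hive
</pre>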
<p>Those of us who have moved on to the SSD-based western shores of Valinor live lives fairly well untroubled by slow startups. But those poor souls of Middle-earth still using mechanical drives are the ones who need to concern themselves with startup optimization.</p> <hr /> <h3>Consider an upgrade</h3> <p><strong>Hardware isn’t always the answer, but it usually is</strong></p> <p>The vast majority of the fixes for a slow-running PC in this story are software fixes, but sometimes software isn’t the answer. How will you know the difference? One of the clearest indicators is age. Old PC components do not age like wine. If you’re at your buddy’s house to “take a look at his computer” and that computer is a Pentium 4 or Athlon XP, it’s a lost cause.</p> <p>So, while most newbs you’re trying to help can still benefit from the cleaning tips in this story, the P4/Athlon XP machines aren’t going to sing no matter how much you tune them. Putting money into a hardware upgrade for these old dogs should be carefully weighed: new parts can be difficult to locate, and everything in the box is suspect.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/p4.northwood_small_0.jpg"><img src="/files/u152332/p4.northwood_small.jpg" alt="Unless you’re in the retro computing club, we’d recommend dumping that Pentium 4 box." title="P4" width="620" height="496" /></a></p> <p style="text-align: center;"><strong>Unless you’re in the retro computing club, we’d recommend dumping that Pentium 4 box.</strong></p> <p>It’s not so bad for a Phenom II or Core 2 box. In fact, these machines can be quite workable if the user has realistic expectations. Dropping an SSD into a Phenom II or Core 2 rig would be a game-changer for these old platforms, even if the motherboard doesn’t support the full SATA 6Gb/s speeds. Sometimes a little RAM will even help, if the box was memory-starved to begin with. With a 64-bit OS, 8GB is standard and 4GB is borderline.</p> <p>If gaming needs a boost, dropping in a newer GPU can certainly help. Even rigs that are constrained by low-wattage PSUs now have a modern option in Nvidia’s new Maxwell series, which can run on even 300W PSUs.</p> <p>If the machine is also running that now-abandoned OS, Windows XP, an OS upgrade to Windows 7 or even Windows 8 is advised.</p> <p>Obviously, we don’t recommend $400 in upgrades on a $200 PC, but a $100 upgrade on a box that buys the person another 24 months of use can be a godsend for those on tight budgets. As we said, though, everything at or below the P4/Athlon XP line should be abandoned.</p> <h3>Visualize your drive</h3> <p><strong>Think of WinDirStat as Google Maps for your HDDs</strong></p> <p>You’ve cleaned up the extraneous system files on your machine, but the real junk is the gigabytes of nothingness you’ve collected from repeatedly dumping that 32GB memory card onto the hard drive because you were afraid to delete something you might need later. Six months later, those same unkempt files are bogging down your system and freeloading on your dime. When space gets tight, we turn to WinDirStat (<a href="http://www.windirstat.info">www.windirstat.info</a>).</p> <p>In the past, when drives were smaller and your file-hoarding was limited to a mere 500GB or so, you could rely on the good old-fashioned search-and-destroy technique: browsing through Windows Explorer for old photos, games, and files that you simply don’t use anymore. With 3TB and even 4TB drives packed with god knows what, that technique isn’t effective anymore.</p>
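<p>If you just want the top offenders without installing anything, a few lines of Python can rank the biggest top-level folders on a drive. A crude sketch: it groups by top-level folder only, skips files it can't read, and the drive letter is a placeholder:</p> <pre>
# Poor man's WinDirStat: rank top-level folders on a drive by total size.
# Crude by design: one level of grouping, unreadable files skipped.
import os
from pathlib import Path

ROOT = Path("C:/")   # placeholder target drive

def dir_size(path):
    total = 0
    for dirpath, _, filenames in os.walk(path, onerror=lambda e: None):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass   # locked or vanished file
    return total

sizes = sorted(((dir_size(p), p) for p in ROOT.iterdir() if p.is_dir()),
               reverse=True)
for size, path in sizes[:10]:
    print(f"{size / 1024**3:8.2f} GB  {path}")
</pre> <p>For the full, clickable picture, though, a dedicated visualizer is the way to go.</p>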
<p>Enter Windows Directory Statistics, or WinDirStat, which helps you visualize and locate the files on your drives that can be slated for termination. WinDirStat is an extremely lightweight (less than a megabyte) open-source program that scans your hard drive to provide you with three sets of information: a directory list, a tree map, and a file-extensions list. The tree map—easily the most attractive feature in the program—represents every file on your hard drive as a colored rectangle. Also handy is the extension list, which gives you totals broken down by file extension.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/windirstat_small_2.jpg"><img src="/files/u152332/windirstat_small_1.jpg" alt="We dig the simple and effective representation of our hard drives from WinDirStat." width="620" height="349" /></a></p> <p style="text-align: center;"><strong>We dig the simple and effective representation of our hard drives from WinDirStat.</strong></p> <p>The tree map helps you easily see where you have bloat on your drives—the bigger the file, the bigger the rectangle. Hovering over a file displays its name and location, and you can delete files from within the program by selecting a file and pressing the delete key.</p> <h3>Dedupe it</h3> <p><strong>Duplicate often</strong></p> <p>Most people treat hard drives like the attic or garage. Rather than immediately culling extra files, you simply put them in storage to deal with at a later date (the road to hell, good intentions, etc.). No matter that you already put those files in storage just last week—you’ll get around to dumping the duplicate files eventually. While there are many, many deduplication tools available, one good starting place is Auslogics’ free Duplicate File Finder app (<a href="http://www.auslogics.com/en/">www.auslogics.com</a>). It doesn’t have the bells or whistles of apps that analyze audio, photo, and video for duplicates, but it works fairly fast and is a good way to eliminate the obvious duplicate files. On one old Windows 7 box, Duplicate File Finder turned up a good 39GB of dupes that could be tossed. Simply fire up Duplicate File Finder, have it search your drive, and it will give you a list of duplicate files. Under Action, select All Duplicates In Each Group, and it will mark the duplicate files for dumping into a trash can, or for moving into the Rescue Center, where you can recover a file if you realize later on that you made a mistake.</p> <p>The program works well enough, but we wouldn’t wipe out files willy-nilly without first making a separate backup and making sure that the irreplaceable files going away are actually duplicates. DFF will show you the file name, file size, and creation date, which gives most people enough confidence to delete, but the paranoiac in us wants to visually confirm it, too. This same philosophy is probably what brought us to this space issue in the first place. After all, am I sure I really did copy all of the images from the memory card to the computer? Even the ones I took last weekend? I’ll just make another copy... I have plenty of space.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/duplicatefilefinder_small_0.jpg"><img src="/files/u152332/duplicatefilefinder_small.jpg" alt="Duplicate File Finder can quickly, er, find your duplicate files." title="Duplicate File Finder" width="620" height="484" /></a></p> <p style="text-align: center;"><strong>Duplicate File Finder can quickly, er, find your duplicate files.</strong></p>
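<p>If you'd rather script the hunt yourself, hashing is the standard trick: files with identical contents produce identical digests. A minimal sketch (the target folder is a placeholder, and the script only reports; deleting is left to you, after you've eyeballed the matches):</p> <pre>
# Minimal duplicate finder: group files by size + content hash.
# Report-only on purpose -- verify with your own eyes before deleting.
import hashlib
from collections import defaultdict
from pathlib import Path

ROOT = Path(r"D:\photos")   # placeholder folder to sweep

def digest(path, chunk=1024 * 1024):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

groups = defaultdict(list)
for f in ROOT.rglob("*"):
    if f.is_file():
        groups[(f.stat().st_size, digest(f))].append(f)

for paths in groups.values():
    if len(paths) > 1:
        print("Duplicates:", *paths, sep="\n  ")
</pre>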
title="Duplicate File Finder " width="620" height="484" /></a></p> <p style="text-align: center;"><strong>Duplicate File Finder can quickly, er, find your duplicate files.</strong></p> <h3>Optimize your storage</h3> <p><strong>Storage is usually the prime suspect in system slowdowns</strong></p> <p>Before we get started discussing problems with your storage system and how to optimize it, make sure you have done two things: First, that you’ve connected your SSD to a SATA 6Gb/s port on your motherboard (consult your manual), and second, that you’ve enabled AHCI on your SATA controller via the motherboard BIOS. If you’ve already installed Windows and your SATA controller is set to IDE instead of AHCI, hit Google to find the registry hack to fix it. And yes, running in IDE mode rather than AHCI on a modern SSD can indeed rob you of performance.</p> <p>With that out of the way, the first thing to do when you sense your system is slowing down and you see your hard-drive activity LED churning constantly, is enlist the trusty three-finger salute. For the uninitiated, that means pressing ctrl-alt-delete to bring up the Task Manager in Windows. Select the Performance tab to see if anything is spiking or is nearing 100 percent utilization. From there, you can go to the Processes tab to see which process is taking up all those resources. In the screenshot below, we see a staff member’s work PC that suffered daily paralyzation at the hands of a virus scan and several associated processes. The resolution was to kill the processes, then make sure to schedule the virus scans during non-work hours.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/ssd_optimize_small_0.jpg"><img src="/files/u152332/ssd_optimize_small.jpg" alt="Both Samsung and Intel offer free “tuning” software that helps keep your SSD running in tip-top shape. " width="620" height="467" /></a></p> <p style="text-align: center;"><strong>Both Samsung and Intel offer free “tuning” software that helps keep your SSD running in tip-top shape. </strong></p> <p>If everything looks fine in the Task Manager but the system still feels slow, run a few benchmarks to see if the numbers are up to spec. For sequential read and write tests, we recommend Crystal-DiskMark for SSDs and HDTune for Hard drives. Admittedly, none of us use HDDs for our OS anymore—there’s no reason to with SSD prices falling faster than the value of Bitcoin.</p> <p>If you run the benchmarks and find the performance is lacking on your SSD, you have a few options. Your first is to optimize the drive via the Trim command. What this does is send a command to the drive that tells it to run its garbage-collection routine, which means it will erase all the blocks that have been deleted, clearing the way for them to receive fresh writes. If the drive has not been trimmed in a while, data can become fragmented all over the drive, and since blocks of an SSD have to be erased before they are written to (as opposed to a hard drive, where they can just be overwritten at any time), a simple write command can require the controller to delete blocks, move data around, and then perform the write, which can seriously degrade performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wtf_-_copy_small_0.jpg"><img src="/files/u152332/wtf_-_copy_small.jpg" alt="If your system feels like it’s stuck in the mud, the Task Manager will reveal what’s causing the problem. 
" width="620" height="564" /></a></p> <p style="text-align: center;"><strong>If your system feels like it’s stuck in the mud, the Task Manager will reveal what’s causing the problem. </strong></p> <p>In general, if you’re running Windows 7 or newer, you should be fine. However, you can Trim a drive manually on Windows 8: right-click the drive in My Computer, and click Properties, Tools, and then Optimize. If you own a Samsung or Intel SSD, you can download the free Samsung Magician or SSD Toolbox software, respectively, which also let you Trim your drive.</p> <p>&nbsp;</p> <hr /> <p>&nbsp;</p> <h3>HDD “Optimization”</h3> <p><strong>Fast hard drives aren’t</strong></p> <p>If you are running a hard drive and want to optimize it, there’s not a whole lot you can do beyond keeping it defragmented. To make sure it’s “defragged,” right-click the drive, select Properties, Tools, and then Defragmentation. Ideally, you should do this after you’ve done your cleaning of unused junk from the machine. If it’s your boot device, some people like to disable hibernation before a defrag to get a little extra “boost” out of the defrag by eliminating the multi-gigabyte hiberfil.sys file. Frankly, we don’t think it matters much anymore. In our opinion, the concept of a “fast hard drive” is antiquated now, due to SSDs, as is the concept of “optimizing” them. Any gains you make toward keeping a hard drive optimized will be largely unnoticeable in the real world, beyond dumping the useless cruft and running a basic defrag, which the OS will do on its own.</p> <h3>Let’s Get Physical</h3> <p><strong>Knock, knock, house cleaning</strong></p> <p>Unless you live in a HEPA-filtered cleanroom, a desktop PC will eventually need a physical cleanup as well as a digital one. That means opening up the case, which means turning off your rig and unplugging it from the wall. Don’t want to lose a finger in those fan blades. Most case panels are secured with six-sided Phillips screws, sometimes call a “hex” screw. Or they have thumbscrews, which can usually be removed by hand. Once taken out, keep these together in a small container. An empty coffee mug will do in a pinch.</p> <p>If you’ve had this PC for several months, you should see a coating of dust inside. That has to be removed, because it insulates surfaces and clogs up fans, which can lead to overheating. With a can of compressed air, spray short bursts at the dust. Long sprays can freeze the inner workings of the can. And tilting the can may also cause its liquid to spray, which contains a solvent that can damage the contact surface. Ideally, do this dusting outside, because you don’t want all that dust floating around indoors.</p> <p>Case fan filters can also get gnarly. These days, most of them slide out. Spray them with air, or remove them, run them under the tap, and air dry. Fans themselves also get grody. You may need to temporarily remove the CPU fan from the heatsink to clean both items sufficiently. When spraying fans, hold their blades down to prevent them from spinning, otherwise you may damage the motor.</p> <p>A periodic disinfecting wipe or baby wipe can take care of your mouse, but keyboards usually need you to pull their keycaps to really get at the crustiness underneath. A puller tool is best for this. You can order one online from Newegg or Amazon, and regional computer stores like Fry’s and Microcenter usually sell them. Some people run their boards through the dishwasher. 
Don’t use detergent or hot water for that, and give them at least a day to fully dry out.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/babywipes_small_0.jpg"><img src="/files/u152332/babywipes_small.jpg" alt="Gordon agrees, baby wipes work amazingly well for cleaning the surfaces of a dirty desktop or laptop." width="620" height="381" /></a></p> <p style="text-align: center;"><strong>Gordon agrees, baby wipes work amazingly well for cleaning the surfaces of a dirty desktop or laptop.</strong></p> <p>Last but not least, don’t forget to wipe the dust off your monitor’s screen. But don’t use conventional glass cleaner, because it can permanently damage the panel. You can buy screen-cleaning kits from most office supply stores, or you can use a spare microfiber cloth, like the kind made for camera lenses. Pharmacies also stock these. Just gently wipe the screen with it. If you need some liquid to clean the screen, spray your cloth with plain water from a mister. Never spray the screen itself, because the liquid can drip into the panel housing and corrode the components within.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/kitten_p34_small_0.jpg"><img src="/files/u152332/kitten_p34_small.jpg" alt="Tuxie the cat, pointing out a spot Josh missed while cleaning." width="620" height="521" /></a></p> <p style="text-align: center;"><strong>Tuxie the cat, pointing out a spot we missed while cleaning.</strong></p> <h4>An Ounce of Prevention</h4> <p>If you’ve just cleaned out a rig that’s never seen a proper cleaning, you’re probably wondering what you can do to avoid such horrors in the future. Fan filters are obviously one option. If they’re not built into your case, you can get them from sites like Newegg, Amazon, and Frozen CPU. Some have magnets, and you just slap them on; others need to be screwed in. To get the correct sizing, measure one side of the fan frame with a ruler. The most common size is 120mm. A filter’s dense mesh will reduce airflow and increase temps in the case, so there’s a trade-off. Even the best filter will not completely eliminate dust; it will only reduce the number of times per year that you need to clean the insides. Smokers and owners of furry pets will also need to clean more often than usual. Periodically brushing those critters will help reduce buildup.</p> <p>And we don’t know if we have to mention this, but washing your hands a few times over the course of the day will also help prevent unsightly crud from building up on your input devices. This is especially important after a meal or after spending time outdoors. And speaking of food, try to keep it away from your keyboard, which is a crumb magnet and said to be dirtier than a toilet. If your mouse pad has an old-style fabric surface, you may want to consider eliminating it altogether (unless your desk is made of glass), or switching to one made of plastic or metal—materials that can be cleaned quickly and easily.</p> <h3>Nuke it from orbit</h3> <p><strong>Nothing can save LV-426, so when it’s too mangled or infested, just nuke it</strong></p> <p>We won’t bother telling you to back up your data before you send your OS to meet its maker, because that is too obvious. But before you nuke the OS, make sure you have everything you need.</p> <p>What might not be obvious is that because of piracy, a lot of the more expensive software packages require activation, which also requires you to deactivate any serial numbers before you begin your bombing run.
Most professional Adobe packages work this way, so if you’re running Photoshop, Illustrator, or any locally stored creative suite, be sure to open the app, click Help, and then Deactivate. Make sure you’ve done it correctly by firing up the program again to see if it asks you to activate. If it does, you’re good to go; keep in mind that you’ll need Internet access to successfully do this. Also keep in mind that if you deactivate a piece of software, then upgrade your system, the software might think it’s a different computer, which can complicate re-activation.</p> <p>The activation process varies on a program-to-program basis, so use Google if you run into any issues. Microsoft’s Office suites react the same way as the operating system, and any significant change in hardware will trigger a reactivation. The bottom line: If you have a mission-critical application that you absolutely have to have up and running as soon as possible, be sure to know what the re-activation process is before you pull the trigger so there are no surprises. Some apps require you to contact the vendor for a new code before they will run, which is a wonderful thing to learn at midnight on the Friday before a three-day weekend when you need the app that night.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/adobe_small_0.jpg"><img src="/files/u152332/adobe_small.jpg" alt="In order to reinstall certain software, such as Adobe products, you must first deactivate the serial key." width="620" height="444" /></a></p> <p style="text-align: center;"><strong>In order to reinstall certain software, such as Adobe products, you must first deactivate the serial key.</strong></p> <p>There are other apps you should also pay attention to. First up, browser bookmarks. Chrome will let you sync your bookmarks to other machines, but you need to set it up to do so. If you’re into the old-school method, you can also export your bookmarks file as HTML and then re-import it. You’ll want to make sure you have a copy of your iTunes library handy, too, which is located in C:\Users\Username\My Music. Be sure to deauthorize iTunes while you’re at it. You’ll also want to back up your Steam library so that you don’t have to re-download all your games. To do this in Steam, click Steam in the upper left-hand corner, select Backup and Restore Games, then follow the prompts. Alternatively, you can do it manually by copying the entire Steam directory over. For most modern games, you no longer have to worry about save-game files, since they are automatically saved to the “Steam Cloud.”</p> <p>Your final stop on this trail of tears is to make sure you have all the drivers you need for anything connected to your PC. At the very minimum, be sure to have your chipset and LAN drivers, as those always go first, and with an Internet connection you can always download anything else you need care of the helpful SlimDrivers utility.</p>
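<p>It can't hurt to snapshot what's currently installed, either, so you have a checklist to work from after the wipe. A tiny sketch using the driverquery tool that ships with Windows (the output file name is our own invention):</p> <pre>
# Save a snapshot of installed drivers before nuking the OS.
# driverquery ships with Windows; /v adds detail. Filename is arbitrary.
import subprocess

with open("drivers_before_nuke.txt", "w") as f:
    subprocess.run(["driverquery", "/v"], stdout=f)
</pre> <p>That list is a reference, not an installer, so you'll still be grabbing fresh drivers from the vendors. 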
Don’t forget your printer drivers, either, and it doesn’t hurt to download Windows 7 SP1 or Windows 8.1 ahead of time, though Windows Update can also handle that for you.</p> <p>Once you’ve deactivated your software, collected all the serial keys you need, made sure your Steam and iTunes libraries are backed up, saved your browser bookmarks, and have all your drivers, you are ready to proceed. Before you reboot your PC to reinstall, be sure to take a moment to consider all the amazing times it’s given you. Once that’s complete, shut her down, and we’ll see you on the other side.</p> http://www.maximumpc.com/restore_computer_2014#comments Adobe application malware May issues 2014 restore computer Software Office Applications Software Features Mon, 15 Sep 2014 22:49:00 +0000 Maximum PC staff 28340 at http://www.maximumpc.com Best Keyboard http://www.maximumpc.com/article/features/best_keyboard_2013 <!--paging_filter--><h3>UPDATE: We've added six more keyboards to our best keyboard roundup</h3> <p>If you’re a gamer, you can probably identify a few points in time when you realized something important about your control setup that made you better at the game. When you discovered that putting your left hand on WASD gives you more options than putting it on the arrow keys, for instance, or when you realized that your crappy optical mouse was actually holding you back in shooters. These kinds of peripheral epiphanies don’t happen every day, but it might be just about time for you to have a new one. It might be time for you to realize that your keyboard is holding you back.</p> <h3 style="text-align: center;"><img src="http://www.maximumpc.com/files/u152332/keyboard_opener13195_small_1.jpg" alt="best keyboard" title="best keyboard" width="620" height="480" /></h3> <p>We’re giving you some credit here—we’re not talking about making the upgrade from a $6 keyboard you got at the grocery store. No, we’re talking about making the upgrade from a gaming keyboard to an amazing gaming keyboard. Going from entry level or midrange to top-of-the-line.</p> <p>We looked around and picked out some of the <strong>best keyboards</strong> we could find. To compare them, we put them through our usual battery of real-world testing, including gaming and typing, and compared their features and overall feel. Because these keyboards come attached to some pretty heavy price tags, we made sure to give them extra scrutiny. We know that minor inconveniences that might fly on a cheap keyboard become a lot more galling when you’ve paid $150 for the privilege of suffering them, and our verdicts reflect this.</p> <p>Ready to make the upgrade to serious typing hardware? Then let’s go!</p> <h4 style="font-size: 10px;">CMStorm Mech</h4> <p><strong>CMStorm looks to get a handle on the high-end mechanical keyboard market<br /></strong></p> <p>The CMStorm Mech is, first of all, a great-looking keyboard. Most of the top of the keyboard is wrapped in a subtly etched aluminum plate, and the board’s geometric, asymmetrical silhouette is more imaginative than most. The aluminum plate can be removed for easy cleaning, which is a nice feature, but the seven hex screws that make removal possible mar the Mech’s otherwise-excellent aesthetics.</p> <p>Despite the Mech’s metal-clad looks, it’s not the sturdiest keyboard in this roundup. The back side of the board, and particularly the wrist rest, are made of hollow plastic that sometimes flexes and creaks under pressure. It also features a large handle on one side, and a detachable USB cable.
These would be handy features for someone who takes their keyboard on the road frequently, but it’s not otherwise an especially portable keyboard. It would be nice if the handle were removable or retractable, because it adds an extra two or three inches to the Mech’s already substantial width.</p> <p>The software support is simple and easy to use. It allows you to customize the five dedicated macro keys, or to rebind any other key on the board, and includes a flexible macro editor.</p> <p>Actual typing and gaming performance is top-notch and virtually identical to the other mechanical gaming keyboards on the market. Fans of any variety of Cherry MX switch will be able to find a Mech that’s right for them—CMStorm offers the keyboard with Red, Blue, or Brown switches.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-13204_small_0.jpg"><img src="/files/u152332/keyboards-13204_small.jpg" alt="The Mech is a big mechanical keyboard, but isn't quite as sturdy as it looks." title="CMStorm Mech" width="620" height="425" /></a></p> <p style="text-align: center;"><strong>The Mech is a big mechanical keyboard, but isn't quite as sturdy as it looks.</strong></p> <p>In all, the Mech is a solid gaming keyboard, but doesn’t quite live up to its top-of-the-line $160 price tag.</p> <p><strong>CMStorm Mech</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$160,&nbsp;<a href="http://www.cmstorm.com/ " target="_blank">www.cmstorm.com</a></strong></p> <h4 style="font-size: 10px;">Mad Catz STRIKE 3</h4> <p><strong>Is a less-extravagant Strike a better deal?</strong></p> <p>The Strike 3 is the least expensive in Mad Catz’s line of high-end gaming keyboards, but it’s by no means a piece of budget hardware. If the $100 price tag doesn’t convince you of that, seeing the Strike 3 in person will.</p> <p>It’s designed to look like the higher-end Strike boards, which can be split into two parts and rearranged, but this one doesn’t actually come apart. Build quality is good overall, with a removable wrist-rest and a pair of USB passthroughs. The board comes in glossy black, red, and white, and features customizable backlighting.</p> <p>The Strike 3 isn’t mechanical, which weakens the credibility of this $100 keyboard, but Mad Catz hasn’t ignored key quality altogether. The dome switches on the Strike 3 are some of the best we’ve felt, with a crisp actuation that feels almost, but not quite, as good as a mechanical model. They definitely feel better than any of the other non-mechanical boards we tested for this roundup.</p> <p>The Strike 3 features five dedicated macro keys on the right side of the board, and seven macro buttons at the top-left. The left-side buttons, unfortunately, are pretty abysmal. They’re tiny, far away from the home row, and strangely wiggly in their sockets—we found it virtually impossible to hit a particular one without looking.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-13217_small_0.jpg"><img src="/files/u152332/keyboards-13217_small.jpg" alt="The seam down the middle of the Strike 3 is just for show—this keyboard's only one piece." 
title="Mad Catz STRIKE 3" width="620" height="461" /></a></p> <p style="text-align: center;"><strong>The seam down the middle of the Strike 3 is just for show—this keyboard's only one piece.</strong></p> <p>The Strike 3 is a good keyboard, but we would generally recommend a mechanical board if you’re looking to spend this much. If you personally prefer non-mechanical switches, however, this would be an excellent choice.</p> <p><strong>Mad Catz Strike 3</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a title="mad catz" href="http://www.madcatz.com" target="_blank">www.madcatz.com</a></strong></p> <h4 style="font-size: 10px;">Click the next page for more keyboard reviews.</h4> <h4 style="font-size: 10px;"> <hr />SteelSeries Apex</h4> <p><strong>All the keys you could want, and then some</strong></p> <p>Sometimes, more is more. That seems to be the guiding principle behind the SteelSeries Apex keyboard, which comes with about as many keys as we’ve ever seen on a gaming keyboard. In addition to the standard full QWERTY layout with number pad, the Apex includes 10 macro keys and four layer keys down the left side, 12 more macro keys above the function row, and six dedicated media buttons along the right side. Even the arrow pad gets two extra diagonal keys. SteelSeries doesn’t advertise the Apex as an MMO keyboard specifically, but it’s hard to imagine what other application could make use of this abundance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-13209_small_2.jpg"><img src="/files/u152332/keyboards-13209_small_1.jpg" alt="You can prop the Apex up in the back by replacing two of it's rubber feet." title="SteelSeries Apex" width="620" height="448" /></a></p> <p style="text-align: center;"><strong>You can prop the Apex up in the back by replacing two of it's rubber feet.</strong></p> <p>Despite its absurd inventory of keys, the Apex doesn’t feel cluttered at all, and in fact looks quite nice. With its built-in wrist rest the board is pretty enormous, but the low-profile keys and customizable sectioned backlighting keep it looking sleek. The build quality is good, though not quite as hardy as SteelSeries’s mechanical keyboards. The Apex includes a pair of USB passthroughs, and allows for some angle customization with a pair of swappable rear feet.</p> <p>Our only real issue with the Apex is that it doesn’t use mechanical keys, and even compared to other dome-switch keyboards in this roundup, like the Strike 3, the Apex’s keys feel distinctly mushy. If it had better key performance, it would be a strong contender for best keyboard in this price range. 
As it is, we’d recommend it highly to those who prioritize lots of macro keys and great design over maximum key responsiveness.</p> <p><strong>SteelSeries Apex</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$100,&nbsp;<a href="http://steelseries.com/ " target="_blank">www.steelseries.com</a></strong></p> <h3>What We Look for in a Keyboard</h3> <p>When we review a keyboard, we look at it on three levels. The first and most important level is basic user experience—how the board feels when you use it. This includes factors like key quality and responsiveness, layout, and build quality. Ninety-nine percent of the time, the way you use your keyboard comes down to those standard QWERTY keys, so we’ll take a great-feeling keyboard over a flimsy one with a zillion features any day. We would also consider a keyboard without enough anti-ghosting/rollover for gaming usage to have failed on this basic level.</p> <p>Second, we examine the board on the level of practical, value-adding features. These are what make a gaming keyboard different from a more standard keyboard, and include things like macro keys, profiles, USB/audio passthroughs, the ability to rebind any key, and media controls. Of course, there’s no standard rule for what’s “practical” and what’s not, and we take into consideration that, for instance, the first five macro keys add a lot more value to the keyboard than macro keys number 15-20. This is also the level where we consider the keyboard’s software support.</p> <p>Finally, we look at the keyboard’s less-essential features, and what they bring to the table. Here you’ll see us talk about things like backlighting, interchangeable keycaps, and paint jobs. These are frequently surface features, designed more for showing off to other gamers than for your own use.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/81173948_copy_small_1.jpg"><img src="/files/u152332/81173948_copy_small.jpg" width="620" height="412" /></a></p> <p>All of this isn’t to say that we think keyboards should be boring, just that it’s important they have their priorities straight. Awesome backlighting can be a great addition to a gaming keyboard, but boards with tons of bells and whistles built into a crappy or just mediocre foundation are distressingly common.</p> <h4 style="font-size: 10px;">Roccat Ryos Mk Pro</h4> <p><strong>This flashy keyboard is more than just looks</strong></p> <p>Build quality on the Ryos MK Pro is outstanding. It’s all plastic, as far as we can see, but is incredibly weighty and rugged-feeling. The surface is treated with a glossy dot-matrix pattern that gives the Ryos a high-class look without leaving it as vulnerable to fingerprints as a pure-gloss keyboard. Like the last Roccat keyboard we tested, the Ryos has a non-removable integrated wrist rest. It’s comfortable (particularly with the back of the board elevated on sturdy-feeling supports), but makes the keyboard take up an absolutely massive amount of desk space.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-13210_smalll_0.jpg"><img src="/files/u152332/keyboards-13210_smalll.jpg" alt="LEDs in each key in the Roccat MK Pro can light up and blink independently." 
title="Roccat Ryos Mk Pro" width="620" height="451" /></a></p> <p style="text-align: center;"><strong>LEDs in each key in the Roccat MK Pro can light up and blink independently.</strong></p> <p>The software support for the Ryos is fine, though not outstanding. The interface is a little cluttered and at times unresponsive, but it gets the job done, allowing you to customize lighting, macros, and key binding for each profile.</p> <p>A lot of keyboards have backlighting these days, but this is the first one we’ve tested that has completely independent lights behind every key. The color can’t be changed, but you can choose which keys should light up and which shouldn’t for each profile. Better still, the Ryos MK Pro comes with a few special lighting effects, which can cause pressed keys to briefly light up, or even to send out a ripple of light across the whole keyboard. It’s simultaneously the most superfluous and most fun new feature we’ve seen in a keyboard in years.</p> <p>It’s hard to say that the Ryos Mk Pro completely justifies the $170 asking price—that’s quite a bit more money than other very good mechanical keyboards—but it at least comes close.</p> <p><strong>Roccat Ryos MK Pro</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$170,&nbsp;<a href="http://www.roccat.org/ " target="_blank">www.roccat.org</a></strong></p> <h4 style="font-size: 10px;">Click the next page to read about the Gigabyte K7 review and more.</h4> <h4 style="font-size: 10px;"> <hr />Gigabyte Force K7</h4> <p><strong>A budget-friendly board that’s light on features</strong></p> <p>With a $50 MSRP, the Force K7 targets the budget-minded consumer, but still hovers comfortably above the bottom of the barrel. Any keyboard involves compromises, but with the K7, there just might be too many.</p> <p>The K7 advertises “extreme short actuation distance” for its keys, which are built on laptop-style scissor switches. Keyboard feel is a matter of personal preference, of course, but for gaming we’ve never been very fond of scissor switches, which offer almost no tactile feedback. The key layout on the K7 is standard, though it uses the half-width backspace key and double-decker enter key configuration that’s less commonly seen in gaming keyboards and makes touch typing a bit more difficult.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-13214_small_0.jpg"><img src="/files/u152332/keyboards-13214_small.jpg" alt="LEDs in each key in the Roccat MK Pro can light up and blink independently." title="Gigabyte Force K7" width="620" height="454" /></a></p> <p style="text-align: center;"><strong>The Force K7 has a low profile, with laptop-style scissor-switch keys.</strong></p> <p>Build quality on the K7 is generally good—it’s sturdy and feels heavy on the desk. Our review unit did, however, come with an extra 0 key instead of the hyphen key, which raises some questions about quality assurance.</p> <p>If anything, the K7 is notable for its lack of gaming-specific features. It has no macro keys, no profiles, no ability to rebind keys, no USB passthroughs—none of the things that identify a keyboard as made especially for gaming. 
The only extra features the board does include are underwhelming three-color backlighting and a pair of thumbwheels, which can only be used to control volume and backlight intensity.</p> <p>There are no glaring problems with the K7, but without a clear performance advantage, there’s nothing to recommend this board over one of the low-end Logitech or Microsoft keyboards, which are similarly priced and offer a better set of features.</p> <p><strong>Gigabyte Force K7</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$50,&nbsp;<a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4 style="font-size: 10px;">Corsair Raptor K50</h4> <p><strong>The Cadillac of non-mechanical keyboards</strong></p> <p>The Corsair Raptor K50 is a beautifully designed board, borrowing the floating-keys design of the more expensive Vengeance boards, with just a hint of brushed aluminum along the top edge. The look is rounded out with high-quality customizable key lighting that shines through the keycaps, without leaking out around the edges of the keys. Build quality is second-to-none, and as usual, the raised-key design makes it easy to keep crumbs from accumulating under the keycaps.<strong>&nbsp;</strong></p> <p>The K50 is nicely feature-packed, with a USB passthrough, media keys, a large metal volume wheel, and, oh yeah, like a million macro keys. Well, 18, anyway, all in one huge bank at the left, along with dedicated buttons for switching between three macro layers and recording them on the fly. That number might be bordering on the too-many-to-actually-use zone, but some gamers might find a use for them all, and on-the-fly recording is a feature we wish more boards had. The software for the K50 works well, and onboard storage allows you to use your profiles on any computer.&nbsp;<strong>&nbsp;</strong></p> <p style="text-align: center;"><strong><a class="thickbox" href="/files/u152332/keyboards-13212_small_0.jpg"><img src="/files/u152332/keyboards-13212_small.jpg" alt="If you're the kind of gamer who needs an unhealthy number of macro keys, the Raptor K50 is for you." title="Corsair Raptor K50" width="620" height="413" /></a></strong></p> <p style="text-align: center;"><strong>If you're the kind of gamer who needs an unhealthy number of macro keys, the Raptor K50 is for you.<br /></strong></p> <p>We like the K50 a lot, but—at the risk of sounding like a broken record—for most users we wouldn’t recommend a non-mechanical $100 board. 
Our recommendation at this price range would be to get a mechanical board with slightly fewer features, or to jump up an extra $30 and get a similarly feature-packed mechanical board, such as Corsair’s own Vengeance K70 or K90.</p> <p><strong>Corsair Raptor K50</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$100,&nbsp;<a href="http://www.corsair.com/ " target="_blank">www.corsair.com</a></strong></p> <p>Click the next page to read about some of the older mechanical keyboards we've reviewed, such as the Razer Deathstalker Ultimate and more.</p> <hr /> <p>&nbsp;</p> <h4>Razer Deathstalker Ultimate</h4> <p><strong>Fun to look at, less fun to use</strong></p> <p>The Razer Deathstalker is really a thing to behold. The gaming keyboard is thin, sleek, and nicely designed with tri-color glowing keys, but nothing draws your attention like the “Switchblade” user interface, borrowed from the <a title="razer blade" href="http://www.maximumpc.com/razer_blade_review2012" target="_blank">Razer Blade</a> gaming laptop.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wwkeyboards-5227183_small_3.jpg"><img src="/files/u152332/wwkeyboards-5227183_small_2.jpg" alt="Instead of a number pad, the Deathstalker Ultimate features a touchscreen, along with 10 contextual keys." title="Razer Deathstalker Ultimate" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>Instead of a number pad, the Deathstalker Ultimate features a touchscreen, along with 10 contextual keys.</strong></p> <p>The Switchblade UI consists of a responsive multitouch 4.3-inch LCD touchscreen and 10 context-sensitive dynamic keys. The screen can act as a trackpad, or can play host to a number of applications including a web browser, Twitter client, YouTube viewer, and plenty of others, such as game-specific apps for a handful of popular titles. Additionally, the keyboard has plenty of on-the-fly macro keys, and the software suite that manages it is polished and very powerful. In other words, the Razer Deathstalker is clearly the most sophisticated gaming keyboard around. The question is, do the Deathstalker’s technical flourishes justify its massive $250 price tag?</p> <p>At that kind of price, we expect every element of a keyboard to be top-notch; unfortunately, that’s not the case with the <a title="deathstalker" href="http://www.razerzone.com/deathstalker" target="_blank">Razer Deathstalker</a>. The problem is the keyboard itself, which uses widely spaced chiclet-style keys, familiar to anyone who’s used a MacBook or most Ultrabooks. They look nice, but it’s not clear why a large, high-end gaming keyboard would opt to use them over mechanical switches or even rubber-dome membrane keys. The chiclet keys simply don’t feel very good to use—they float around inside their tracks and have minuscule travel when pressed.
They’re not awful, but we’d expect a lot better from a $250 keyboard.</p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">Razer Deathstalker Ultimate</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">Juicy Fruit<br /></span> <p>Super-cool Switchblade UI; good software support.</p> </div> <div class="negative"><span class="header">Chiclets<br /></span> <p>Key quality is subpar for typing and game play; very expensive.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$250, <a href="http://www.razerzone.com " target="_blank">www.razerzone.com</a></strong></p> <h4>S.T.R.I.K.E. 7</h4> <p><strong>Plenty of novel features, but look at that price</strong></p> <p>Probably the most interesting thing about the <a title="strike 7" href="http://www.cyborggaming.com/strike7/" target="_blank">S.T.R.I.K.E. 7</a> is that it’s modular and customizable. When you first take it out of the box, the keyboard is in seven pieces, which can be screwed together in a number of different configurations. One of the pieces is a large touchscreen, which can be affixed to either the left or right side of the keyboard, as can an extra bank of macro keys and the adjustable “active palm rest,” which features a thumb wheel and button. The two halves of the keyboard can be used separately, though both must be connected to the touchscreen, and the kit comes with a set of 16 replacement key caps, so you can make sure your S.T.R.I.K.E. 7 doesn’t look like anyone else’s.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wwkeyboards-5227_small_0.jpg"><img src="/files/u152332/wwkeyboards-5227_small.jpg" alt="The S.T.R.I.K.E. 7 is modular, and can be assembled in several different configurations." title="Cyborg S.T.R.I.K.E. 7" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The S.T.R.I.K.E. 7 is modular, and can be assembled in several different configurations.</strong></p> <p>On the other hand, you probably won’t meet anyone else with a S.T.R.I.K.E. 7, unless you regularly attend LAN parties down at the yacht club. At $300, this is the most expensive keyboard we can remember reviewing, and some of the features just don’t rise to the level of expectations set by the price. The touchscreen, for instance, is resistive and not nearly as responsive as the screen on the Razer Deathstalker Ultimate. And like the Deathstalker, the S.T.R.I.K.E. opts for non-mechanical keys. Though the dome-style membrane keys are better than the Deathstalker’s chiclet keys, we firmly believe that a keyboard that costs three times as much as most of its competition ought to have the best keys available.</p> <p><iframe src="//www.youtube.com/embed/3AbwJON7ECk" width="560" height="315" frameborder="0"></iframe></p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">S.T.R.I.K.E.
7</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">Home Run<br /></span> <p>The most customizable keyboard around; tons of room for macros on keyboard and touchscreen.</p> </div> <div class="negative"><span class="header">Strike Out<br /></span> <p>Super pricey; non-mechanical keyboard feels so-so; touchscreen responsiveness is lacking.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$300, <a href="http://www.madcatz.com" target="_blank">www.madcatz.com</a></strong></p> <h4>Logitech G710+</h4> <p><strong>Logitech brings it back to basics</strong></p> <p>Logitech has finally decided that the recent trend toward mechanical gaming keyboards isn’t a passing fad, and has thrown its own hat into the ring with the G710+. At $150, the <a title="logitech g710+" href="http://gaming.logitech.com/en-us/product/g710plus-mechanical-gaming-keyboard" target="_blank">G710+</a> is one of the company’s most expensive boards, but it forgoes the LCD screens and raft of macro buttons usually found on Logitech’s highest-end products. Instead, the G710+ is a relatively straightforward keyboard built around a sturdy base of mechanical keys.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-5227187_small_1.jpg"><img src="/files/u152332/keyboards-5227187_small_0.jpg" alt="The backlight for the Logitech G710+’s arrow and WASD keys is separate from the rest of the board, so you can make them glow brighter than their surroundings." title="Logitech G710+" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The backlight for the Logitech G710+’s arrow and WASD keys is separate from the rest of the board, so you can make them glow brighter than their surroundings.</strong></p> <p>The G710+ uses Cherry MX Brown switches, which are a sort of compromise between the hyper-sensitive Reds and the tactile (and loud) Blues. They’re a nice middle-ground switch, excellent for both gaming and typing, though not completely ideal for either. Logitech has augmented the Cherry Browns with noise-dampening rings inside each key, for a quieter gaming session. The keys are mounted into a heavy board, with a clean black-and-gray aesthetic with orange accents. When connected via USB, the G710+’s laser-etched keycaps glow white—you can’t change the color, but the brightness is adjustable. In a nice, novel feature, the brightness of the WASD and arrow keys can be adjusted independently, to make them stand out more.</p> <p>Beyond the mechanical keys, the G710+ doesn’t have a lot of flashy features—just a set of macro keys (programmable on-the-fly), some media controls, and a standard-issue software suite with pre-made macro profiles for most modern games. It comes with a removable wrist rest, and includes a single USB pass-through.
In all, it’s a nice, well-constructed keyboard, though its feature set is just a tiny bit smaller than some similarly priced mechanical boards from other brands.</p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">Logitech G710+</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">O.G.<br /></span> <p>Excellent typing and gaming feel; dual-zone lighting; noise-dampened keys.</p> </div> <div class="negative"><span class="header">Oh No<br /></span> <p>On the pricier side; few pass-throughs.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$150, <a href="http://www.logitech.com " target="_blank">www.logitech.com</a></strong></p> <h3>The Art of Cherrypicking</h3> <p>If you’re the pattern-recognizing sort, you may notice that every mechanical keyboard in this roundup uses Cherry MX switches for its key mechanisms. That’s because virtually all mechanical gaming keyboards today use some variety of Cherry MX switch, such as Brown or Blue. The names indicate both the actual color of the switch (pry a keycap up and you’ll be able to tell by sight which switch is underneath), and the switch’s mechanical characteristics, in terms of tactility and resistance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/k60_d_install_small_0.jpg"><img src="/files/u152332/k60_d_install_small.jpg" width="620" height="403" /></a></p> <p>A switch that is highly tactile has a noticeable “bump” that you overcome as you press it down, and tends to make a click noise as it passes that bump. A switch with high resistance requires more force to depress. Here are the four most common varieties of Cherry MX switch:</p> <p>Red: A non-tactile switch with low resistance. The pressing action is smooth, with no bump, and because of its low resistance it is very responsive. Good for action gamers.</p> <p>Black: A non-tactile switch, like the Red, with higher resistance.</p> <p>Blue: A highly tactile switch, with a dramatic (and loud) click. Considered the best switch for typing, but they can be slightly harder to double-tap quickly for gaming.</p> <p>Brown: A middle-ground switch, with a light tactile click and medium resistance. Functions well for both typing and gaming.</p> <p>Click <a title="mechanical keyboard guide" href="http://www.maximumpc.com/mechanical_keyboard_guide_2013" target="_blank">here</a> to read our in-depth mechanical keyboard guide.&nbsp;</p> <p>&nbsp;</p> <hr /> <p>&nbsp;</p> <h4>Corsair Vengeance K90</h4> <p><strong>All the macro keys money can buy</strong></p> <p>The <a title="K90" href="http://www.corsair.com/gaming-peripherals/gaming-keyboards/vengeance-k90-performance-mmo-mechanical-gaming-keyboard.html" target="_blank">Corsair Vengeance K90</a> launched early last year alongside the Vengeance K60. It is, at heart, an expanded version of that board, fitted with a vast bank of customizable macro keys at the far left, and a detachable rubberized wrist rest. The extra functionality is mostly aimed at MMO players, who may have need for the truly staggering number of macro keys—18 keys, arranged into three banks of six, with three profile buttons for a total of 54 programmable actions.
We’re a bit skeptical about the utility of so many macro buttons; as the button count increases, it becomes difficult to remember which key does what and to hit the right one without looking. Still, you should be able to imagine whether you’d put that many buttons to good use.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboards-5227181_1.jpg"><img src="/files/u152332/keyboards-5227181_0.jpg" alt="With the K90, Corsair goes deep on macro keys. Unfortunately, only the main QWERTY keyboard and arrow keys are mechanical." title="Corsair Vengeance K90" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>With the K90, Corsair goes deep on macro keys. Unfortunately, only the main QWERTY keyboard and arrow keys are mechanical.</strong></p> <p>Beyond those extra keys, the K90 features the strong points of the K60, including a rugged all-aluminum body and responsive Cherry MX Red switches. The fantastic-looking low-profile aluminum design is even snazzier in the K90, thanks to blue backlighting that shines through the laser-etched keycaps. One of the strangest and worst features of the K90 is that it uses membrane-style switches for a small subset of the keys on the board (the 18 macro keys, the function keys, as well as the block above the arrow keys), which feel noticeably worse than the mechanical keys that make up the rest of the board. Especially for keys that are meant to be used in the heat of the moment, the transition to non-mechanical keys is very jarring.</p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">Corsair Vengeance K90</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">Macro<br /></span> <p>Tons of macro keys; nice build quality and design; mechanical.</p> </div> <div class="negative"><span class="header">Micro<br /></span> <p>Not all keys are mechanical; giant block of macro keys is difficult to use efficiently.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$130, <a href="http://www.corsair.com " target="_blank">www.corsair.com</a></strong></p> <h4>Rosewill RK-9100 Mechanical Gaming Keyboard</h4> <p><strong>A solid board, low on features</strong></p> <p>Sometimes it’s nice when a company comes along and boils down a product category to just the features that are important. With the <a title="rk-9100" href="http://www.rosewill.com/products/2320/ProductDetail_Overview.htm" target="_blank">RK-9100</a>, Rosewill does just that, offering a solid mechanical gaming keyboard with few flourishes.</p> <p>The RK-9100 is a compact design with no wrist rest and a minimal lip around the outside of the board. It’s heavy, and feels quite sturdy. It uses mechanical keys—once again, Cherry MX switches, though with the RK-9100 you have a choice of the typing-friendly Blue switches, or the in-between Browns.
We tend to prefer the Browns as a nice compromise between gaming and typing, which makes it a bit frustrating that the Brown-switch version of the RK-9100 retails for $130, $20 more than the Blue version.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wwkeyboards-5227185_small_0.jpg"><img src="/files/u152332/wwkeyboards-5227185_small.jpg" alt="The Rosewill RK-9100 isn’t the fanciest-looking keyboard, but it feels great to use." title="Rosewill RK-9100 Mechanical Gaming Keyboard" width="620" height="321" /></a></p> <p style="text-align: center;"><strong>The Rosewill RK-9100 isn’t the fanciest-looking keyboard, but it feels great to use.</strong></p> <p>The keyboard has a nice blue backlight, except for the scroll-, num-, and caps-lock keys, which glow green while active. It’s a good idea, but for some reason the green light is incredibly bright, and angled to shine right into your eyes. It’s distracting, and unfortunately can’t be turned off—we wouldn’t be surprised if most RK-9100 owners end up fixing the problem with electrical tape. That’s the only significant problem we noticed while using Rosewill’s keyboard, but we couldn’t shake the feeling that $130 is a bit too much to ask for this board. The Logitech G710+ features the same MX Brown switches, and with a street price that’s currently only about $10 more than the RK-9100’s, it includes significantly more features that set it apart as a gaming keyboard.</p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">Rosewill RK-9100 Mechanical Gaming Keyboard</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">Rose water<br /></span> <p>No-nonsense design; selection of different Cherry MX switches.</p> </div> <div class="negative"><span class="header">Hose water<br /></span> <p>No macro keys; no software support.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$130, <a href="http://www.rosewill.com " target="_blank">www.rosewill.com</a></strong></p> <h4>Roccat Isku</h4> <p><strong>Membrane plank makes strong impression</strong></p> <p>If you’re not ready to make the jump to a mechanical keyboard, and aren’t interested in touchscreens or scalp massagers or whatever other luxury features are going into the $200-plus planks, your money will go a lot farther. Specifically, it’ll go all the way to the <a title="roccat" href="http://www.roccat.org/Products/Gaming-Keyboards/ROCCAT-Isku/" target="_blank">Roccat Isku</a>, a handsome and feature-rich keyboard from German newcomer Roccat.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wwkeyboards-5227184_small_0.jpg"><img src="/files/u152332/wwkeyboards-5227184_small.jpg" alt="The Isku is thin but takes up a lot of room, thanks to its broad wrist rest and bezel." title="Roccat Isku" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The Isku is thin but takes up a lot of room, thanks to its broad wrist rest and bezel.</strong></p> <p>The Isku is wide and flat, with an oversized wrist rest and a wide bezel all around the board, taking up plenty of desk real estate.
It’s got a grippy textured-plastic frame and recessed contoured keys that make the whole thing seem flatter and lower to the desk than normal. The dome keys are good (as far as they go) with a fairly crisp and responsive activation.</p> <p>Where the Isku really shines is in its expansive set of features. It has eight macro buttons (including three “thumbster” keys under the spacebar), with on-the-fly recording, and profile switching. It gets further mileage out of the bindable keys and macros with an “EasyShift” button where the caps-lock key would normally be, which temporarily switches the functions of all right-hand-accessible keys while held down. There’s a lot to customize, and the included software suite is intuitive and up to the task.</p> <p>Also, the Isku is part of the “Roccat Talk” ecosystem, which allows button presses on the keyboard to affect the behavior of a Roccat gaming mouse, and vice versa. At this price, we’d strongly recommend buying a mechanical board, but if you can’t or don’t want to, the Isku is an excellent choice.</p> <div class="lowdown"> <div class="module orange-module article-module verdict-block"><span class="module-name-header" style="font-size: 14px; border-bottom: 1px solid #000;">Roccat Isku</span><br /> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="product-verdict"> <div class="positive"><span class="header">Roc Solid<br /></span> <p>Feature-rich for the price; EasyShift and on-the-fly macro recording; intuitive software.</p> </div> <div class="negative"><span class="header">Roc Bottom<br /></span> <p>Dome switches can’t match mechanical feel; takes up a lot of desk space.</p> </div> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> </div> </div> </div> <p><strong>$90, <a href="http://www.roccat.org" target="_blank">www.roccat.org</a></strong></p> <h3>A Keyboard for Clean Freaks</h3> <p>One of the keyboards we received while preparing this roundup was the <a title="logitech washable keyboard" href="http://www.logitech.com/en-us/product/washable-keyboard-k310" target="_blank">Logitech Washable Keyboard K310</a>. Somehow it didn’t seem quite fair to pit the $40 K310 against the likes of the Razer Deathstalker in a straight head-to-head, but we couldn’t resist the chance to see if this washable keyboard really works.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/keyboard_before_small_0.jpg"><img src="/files/u152332/keyboard_before_small.jpg" width="620" height="415" /></a></p> <p>The K310 has a standard full-size layout with flat, thick plastic keys. Despite the very plastic-y construction and non-standard keys, the keyboard actually feels pretty decent to use.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dirtykeyboard_small_1.jpg"><img src="/files/u152332/dirtykeyboard_small_0.jpg" width="620" height="415" /></a></p> <p>We don’t actually have a standard testing procedure worked out for washable keyboards, so we improvised. We took a quick trip to the corner store for a bag of Cheetos—bane of all keyboards.
We then used a mortar and pestle to mash them into a fine, delicious powder, and applied it liberally to the keyboard (and surrounding table).</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/washingkeyboard_small_0.jpg"><img src="/files/u152332/washingkeyboard_small.jpg" width="620" height="415" /></a></p> <p>We were originally going to stick the K310 in the dishwasher, but a label on its back specifically warns against doing so. Instead, we gave it a thorough hand-washing in the sink.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/clean_keyboard_small_0.jpg"><img src="/files/u152332/clean_keyboard_small.jpg" width="620" height="347" /></a></p> <p>What’s the verdict? The keyboard looks like new, and works just fine. Not bad!</p> http://www.maximumpc.com/article/features/best_keyboard_2013#comments March 2013 2013 best keyboard Hardware Hardware Logitech G710+ maximum pc Razer Deathstalker Ultimate reviews strike 7 Keyboards Reviews Features Wed, 10 Sep 2014 21:44:05 +0000 Alex Castle 25598 at http://www.maximumpc.com Haswell-E Review http://www.maximumpc.com/haswell-e_review_2014 <!--paging_filter--><h3>UPDATE: We've updated our Haswell-E story to include our video on Haswell-E (X99) motherboards</h3> <p>After three long years of going hungry with quad-cores, red meat is finally back on the menu for enthusiasts. And not just any gamey slab full of gristle with shared cores, either. With its new eight-core Haswell-E CPU, Intel may have served up the most mouth-watering, beautifully seared piece of red meat in a long time.</p> <p><iframe src="//www.youtube.com/embed/aNTMIHr9Ha0" width="620" height="349" frameborder="0"></iframe></p> <p>And it’s a good thing, too, because enthusiasts’ stomachs have been growling. Devil’s Canyon? That puny quad-core was just an appetizer. And that dual-core highly overclockable Pentium K CPU? It’s the mint you grab on your way out of the steak house.</p> <p><iframe src="//www.youtube.com/embed/_h9ggGZHFtU" width="620" height="349" frameborder="0"></iframe></p> <p>No, what enthusiasts have craved ever since Intel’s clock-blocking job on the original Sandy Bridge-E is a true, overclockable enthusiast chip with eight cores. So if you’re ready for a belt-loosening belly full of enthusiast-level prime rib, pass the horseradish, get that damned salad off our table, and read on to see if Intel’s Haswell-E is everything we hoped it would be.&nbsp;</p> <p><strong>Meet the Haswell-E parts</strong></p> <p><span style="color: #ff0000;"><img src="/files/u154082/haswell-e_comparison_chart.png" alt="haswell e comparison chart" title="haswell e comparison chart" width="620" height="241" /></span></p> <p>&nbsp;</p> <p><img src="/files/u154082/lga2011v3socket.jpg" alt="haswell e socket" title="haswell e socket" width="620" height="626" /></p> <p><strong>Despite its name, the LGA2011-v3 socket is not the same as the older LGA2011 socket. Fortunately, the cooling offsets are exactly the same, so almost all older coolers and accessories should work just fine.&nbsp;</strong></p> <p><img src="/files/u154082/lga2011socket1.jpg" alt="lga2011" title="lga2011" width="620" height="556" /></p> <p><strong>Though they look the same, LGA2011’s socket has arms that are actually arranged differently than the new LGA2011-v3 that replaces it.
And no, you can’t drop a newer Haswell-E into this socket and make it work.</strong></p> <h4>Haswell-E</h4> <p><strong>The first consumer Intel eight-core arrives at last</strong></p> <p>Being a card-carrying member of the PC enthusiast class is not an easy path to follow. Sure, you get the most cores and priciest parts, but it also means you get to wait a hell of a long time in between CPU upgrades. And with Intel’s cadence the last few years, it also means you get the leftovers. It’s been that way ever since Intel went with its two-socket strategy with the original LGA1366/LGA1156. Those who picked the big-boy socket and stuck to their guns on pure PC performance always got the shaft.&nbsp;</p> <p>The original Ivy Bridge in the LGA1155 socket, for example, hit the streets in April of 2012. And as a reward, Intel treated the small-socket crowd to the even more efficient Haswell in June of 2013. It wasn’t until September of 2013 that big-boy socket users finally got Ivy Bridge-E for their LGA2011s. But with Haswell already out and tearing up the benchmarks, who the hell cared?</p> <p>Well, that time has come with Haswell-E, Intel’s first replacement for the aging LGA2011 platform since 2011. This time though, Intel isn’t just shuffling new parts into its old stack. For the first time since the original Pentium 4 Extreme Edition, paying the price premium actually nets you more: namely, the company’s first consumer eight-core CPU.</p> <p><strong>Meet the T-Rex of consumer CPUs: The Core i7-5960X</strong></p> <p>We were actually a little leery of Haswell when it first launched last year. It was, after all, a chip seemingly tuned for the increasingly mobile/laptoppy world we were told was our post PC-apocalyptic future. Despite this, we recognized the chip as the CPU to have for new system builders. Clock for clock, its 22nm-process, tri-gate transistors put everything else to shame—even the six-core Core i7-3930K chip in many tasks. So it’s no surprise that when Intel took a quad-core Haswell, put it in the Xerox machine, and hit the 2x copy button, we were ecstatic. Eight cores are decidedly better than six cores or four cores when you need them.&nbsp;</p> <p>The cores don’t come without a cost though, and we don’t mean the usual painful price Intel asks for its highest-end CPUs. It’s no secret that more cores mean more heat, which means lower clock speeds. That’s one of the rationales Intel used with the original six-core Core i7-3960X. Although sold as a six-core, the original Sandy Bridge-E was built using an eight-core die on which Intel had permanently switched off two cores. Intel said it wanted to balance the needs of the many versus the needs of the few—that is, by turning off two of the cores, the part could hit higher clock speeds. Indeed, the Core i7-3960X had a base clock of 3.3GHz and Turbo Boost of 3.9GHz, and most could overclock it to 5GHz. The same chip packaged as a Xeon with all eight cores working—the Xeon E5-2687W—was locked down at 3.1GHz and mostly buzzed along at 3.4GHz.</p> <p>With the new Core i7-5960X—the only eight-core of the bunch—the chip starts at a seemingly pedestrian 3GHz with a Turbo Boost of one core up to 3.5GHz. Those subsonic clock speeds won’t impress against the Core i7-4790K, which starts at 4GHz. You’ll find more on how well Haswell-E performs against Haswell in our performance section, but that’s the price to be paid, apparently, to get a chip with this many cores under the heat spreader.
Regarding thermals, in fact, Intel has increased the TDP rating to 140 watts versus the 130 watts of Ivy Bridge-E and Sandy Bridge-E.&nbsp;</p> <p>If the low clocks annoy you, the good news is the part is fully unlocked, so overclocking is fully sanctioned. For our test units, we had very early hardware and tight deadlines, so we didn’t get very far with our overclocking efforts. Talking with vendors, however, most seem very pleased with the clock speeds they were seeing. One vendor told us overclocks of all cores at 4.5GHz were already obtainable, and newer microcode updates were expected to improve that. With even the vaunted Devil’s Canyon Core i7-4790K topping out at 4.7GHz to 4.8GHz, 4.5GHz is actually a healthy overclock for an eight-core CPU.</p> <p><span style="white-space: pre;"> </span>When you dive down into the actual cores though, much is the same, of course. It’s based on a 22nm process. It has “3D” tri-gate transistors and integrated voltage regulation. Oh, and it’s also the first CPU to feature an integrated DDR4 memory controller.</p> <p><strong>Click the next page to read about DDR4</strong></p> <hr /> <p>&nbsp;</p> <h4>DDR4 details</h4> <p>If you think Haswell-E has been a long wait, just think about DDR3, which has served as main memory in our systems since 2007. Yes, 2007. The only component that has lasted seven years in most enthusiasts’ systems might be the PSU, but even then it’s rare to find anyone still running a 500-watt PSU from 2007 these days.&nbsp;</p> <p><span style="white-space: pre;"> </span>DDR4 has been in gestation seemingly as long, so why the delay? From what we can tell, resistance to yet another new memory standard during a time when people thought the desktop PC and the PC in general were dying has been the root of the delay. It didn’t help that no one wanted to stick their head out first, either. RAM makers didn’t want to begin producing DDR4 in volume until AMD or Intel made chipsets for it, and AMD and Intel didn’t want to support it because of the costs it would add to PCs at a time when people were trying to lower costs. The stalemate finally ends with Haswell-E, which integrates a quad-channel memory controller into its die.</p> <p>Initial launch speeds of DDR4 clock in at DDR4/2133. For those already running DDR3 at 3GHz or higher, a 2,133 data rate is a snooze, but you should realize that anything over 2133 is overclocked RAM. With DDR4, JEDEC (the body that sets RAM standards) already has target data rates of 3200 on the map. RAM vendors we’ve talked to are already shopping DIMMs near that speed.</p> <p>The best part of DDR4 may be its density message, though. For years, consumer DDR3 has topped out at 8GB on a DIMM. With DDR4, we should see 16GB DIMMs almost immediately, and stacking of chips is built into the standard, so it’s possible we’ll see 32GB DIMMs over its lifetime. On a quad-channel, eight-DIMM motherboard, you should expect to be able to build systems with 128GB of RAM using non-ECC DIMMs almost immediately. DDR4 also brings power savings and other improvements, but the main highlights enthusiasts should expect are higher densities and higher clocks. Oh, and higher prices. RAM prices haven’t been fun for anyone of late, but DDR4 will definitely be a premium part for some time.
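</p> <p>To put those data rates in perspective, peak theoretical bandwidth is just the transfer rate times the width of each channel (8 bytes) times the channel count. Here’s a quick back-of-the-envelope sketch in Python; the DDR3 line is there for comparison, and all of these are idealized ceilings rather than real-world numbers:</p> <pre>
# Peak theoretical bandwidth: megatransfers/sec x 8 bytes per channel x channels
def peak_bandwidth_gbs(mt_per_sec, channels=4, bytes_per_channel=8):
    return mt_per_sec * 1e6 * bytes_per_channel * channels / 1e9

print(peak_bandwidth_gbs(2133))      # quad-channel DDR4/2133: ~68.3 GB/s
print(peak_bandwidth_gbs(3200))      # at the JEDEC 3200 target: ~102.4 GB/s
print(peak_bandwidth_gbs(1866, 2))   # dual-channel DDR3/1866: ~29.9 GB/s
</pre> <p>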
As for pricing, we couldn’t even get exact numbers from memory vendors as we were going to press, so we’re bracing for some really bad news.</p> <h4>PCIe lanes: now a feature to be blocked</h4> <p>Over the years, we’ve come to expect Intel to clock-block core counts, clock speeds, Hyper-Threading, and even cache for “market segmentation” purposes. What that means is Intel has to find ways to differentiate one CPU from another. Sometimes that’s by turning off Hyper-Threading (witness Core i5 and Core i7) and sometimes it’s by locking down clock speeds. With Haswell-E though, Intel has gone to new heights with its clock-blocking by actually turning off PCIe lanes on some Haswell-E parts to make them less desirable. At the top end, you have the 3GHz Core i7-5960X with eight cores. In the midrange you have the six-core 3.5GHz Core i7-5930K. And at the “low-end” you have the six-core 3.3GHz Core i7-5820K. The 5930K and the 5820K are virtually the same in specs except for one key difference: The PCIe lanes get blocked. Yes, while the Core i7-5960X and Core i7-5930K get 40 lanes of PCIe 3.0, the Core i7-5820K gets an odd 28 lanes of PCIe 3.0. That means those who had hoped to build “budget” Haswell-E boxes with multiple GPUs may have to think hard and fast about using the lowest-end Haswell-E chip. The good news is that for most people, it won’t matter. Plenty of people run Haswell systems with SLI or CrossFire, and those CPUs are limited to 16 lanes. Boards with PLX switches even support four-way GPU setups.</p> <p>Still, it’s a brain bender to think that when you populate an X99 board with the lowest-end Haswell-E, the PCIe configuration will change. The good news is at least they’ll work, just more slowly. Intel says it worked with board vendors to make sure all the slots will function with the budget Haswell-E part.&nbsp;</p> <p><img src="/files/u154082/mpc_haswell_front-back_1.jpg" alt="haswell e chip" title="haswell e chip" width="620" height="413" /></p> <p><strong>There have been clock-blocking rumors swirling around about Haswell-E being a 12-core Xeon with four cores turned off. That’s not true, and Intel says this die shot proves it.&nbsp;</strong></p> <p><img src="/files/u154082/ivbe.jpg" alt="ivy bridge e" title="ivy bridge e" width="620" height="550" /></p> <p><strong>Ivy Bridge-E’s main advantages over Sandy Bridge-E were a native six-core die and greatly reduced power consumption. Unfortunately, like its Ivy Bridge counterpart, Ivy Bridge-E’s overclocking yields were greatly reduced compared with its predecessor’s, with few chips hitting more than 4.7GHz at best.</strong></p> <p><img src="/files/u154082/snbe.jpg" alt="sandy bridge e" title="sandy bridge e" width="308" height="260" /></p> <p><strong>Sandy Bridge-E and Sandy Bridge will long be remembered for their friendliness to overclocking, and for Intel killing two of Sandy Bridge-E’s working cores Red Wedding–style.</strong></p> <p><strong>Click the next page to read about X99.</strong></p> <hr /> <p>&nbsp;</p> <h4>X99&nbsp;</h4> <p><strong>High-end enthusiasts finally get the chipset they want, sort of</strong></p> <p><img src="/files/u154082/x99blockdiagram.jpg" alt="x99 block diagram" title="x99 block diagram" width="620" height="381" /></p> <p><strong>Intel overcompensated in SATA on X99 but oddly left SATA Express on the cutting-room floor.</strong></p> <p>You know what we won’t miss? The X79 chipset.
No offense to X79 owners, but while the Core i7-4960X can stick around for a few more months, X79 can take its under-spec’ed butt out of our establishment. Think we’re being too harsh? We don’t.</p> <p>X79 has no native USB 3.0 support. And its SATA 6Gb/s ports? Only two. It almost reads like a feature set from the last decade to us. Fortunately, in a move we wholly endorse, Intel has gone hog wild in overcompensating for the weaknesses of X79.&nbsp;</p> <p>X99 has eight USB 2.0 ports and six USB 3.0 ports baked into its Platform Controller Hub (PCH). For SATA 6Gb/s, Intel adds 10 ports to X99. Yes, 10 ports of SATA 6Gb/s. That gazongo number of SATA ports, however, is balanced out by two glaring omissions in X99: no official support for SATA Express or M.2, both of which came with Z97. Intel didn’t say why it left SATA Express and M.2 out of the chipset, but it did say motherboard vendors are free to implement them using techniques they gleaned from doing it on Z97 motherboards. If we had to hazard a guess, we’d say Intel’s conservative nature led it to leave the features off the chipset, as the company is a stickler for testing new interfaces before adding official support. At this point, SATA Express has been a no-show anyway: motherboards with SATA Express became available in May with Z97, yet we still have not seen any native SATA Express drives. We expect most motherboard vendors to simply add it through discrete controllers; even our early board sample had a SATA Express port.&nbsp;</p> <p>One potential weakness of X99 is Intel’s use of DMI 2.0. That offers roughly 2.5GB/s of transfer speed between the CPU and the south bridge or PCH, but with the board hanging 10 SATA devices, USB 3.0, Gigabit Ethernet, and eight PCIe Gen 2.0 lanes off that link, there is the potential for massive congestion—but only in a worst-case scenario. You’d really have to have a boatload of hardware lit up and sending and receiving data at once to cause the DMI 2.0 link to bottleneck. Besides, Intel says, you can just hang the device off the CPU’s plentiful PCIe Gen 3.0 lanes.</p> <p>That does bring up our last point on X99: the PCIe lanes. As we mentioned earlier, there will be some confusion over the PCIe lane configuration on systems with Core i7-5820K parts. With only 28 PCIe lanes available from that one chip, there’s concern that whole slots on the motherboard will be turned off. That won’t happen, Intel says. Instead, if you go with the low-rent ride, you simply lose bandwidth. Take an X99 mobo and plug in the Core i7-5930K and you get two slots at x16 PCIe, and one x8 slot. Remove that CPU and install the Core i7-5820K, and the slots will now be configured as one x16, one x8, and one x4. It’s still more bandwidth than you can get from a normal LGA1150-based Core i7-4770K, but it will be confusing nonetheless. We expect motherboard vendors to sort it out for their customers, though.</p> <p>Haswell-E does bring one more interesting PCIe configuration though: the ability to run five graphics cards in the PCIe slots at x8 speeds. Intel didn’t comment on the reasons for the option, but there are only a few apparent explanations. The first is mining configurations, where miners are already running six GPUs. Mining, however, doesn’t seem to need the bandwidth an x8 slot would provide. The other possibility is a five-way graphics card configuration being planned by Nvidia or AMD. At this point it’s just conjecture, but one thing we know is that X99 is a welcome upgrade.
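</p> <p>If that lane bookkeeping is hard to follow, think of it as a simple budget: every populated slot draws its width from the CPU’s pool of lanes, and the board steps slots down until the total fits. The little Python sketch below is our own toy illustration of that arithmetic, not Intel’s actual allocation logic:</p> <pre>
# Each CPU exposes a fixed pool of PCIe 3.0 lanes; slots draw from it.
LANE_BUDGET = {"Core i7-5960X": 40, "Core i7-5930K": 40, "Core i7-5820K": 28}

def describe(cpu, slot_widths):
    used = sum(slot_widths)
    spare = LANE_BUDGET[cpu] - used   # negative would mean the config can't fit
    print(cpu, slot_widths, "uses", used, "of", LANE_BUDGET[cpu],
          "lanes with", spare, "to spare")

describe("Core i7-5930K", [16, 16, 8])      # two x16 slots plus an x8
describe("Core i7-5820K", [16, 8, 4])       # same board, stepped-down slots
describe("Core i7-5960X", [8, 8, 8, 8, 8])  # the five-way x8 option
</pre> <p>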
<h4>Top Procs Compared</h4> <p><span style="color: #ff0000;"><span style="white-space: pre;"><img src="/files/u154082/top_processors.png" alt="top processors compared" title="top processors compared" width="620" height="344" /></span></span></p> <h4>Core Competency&nbsp;</h4> <p><strong>How many cores do you really need?</strong></p> <p><img src="/files/u154082/haswelletaskamanger.png" alt="haswell task manager" title="haswell task manager" width="620" height="564" /></p> <p><strong>It is indeed a glorious thing to see a task manager with this many threads, but not everyone needs them.</strong></p> <p>Like the great technology philosopher Sir Mix-A-Lot said, we like big cores and we cannot lie. We want as many cores as legally available. But we recognize that not everyone rolls as hard as we do with a posse of threads. With Intel’s first eight-core CPU, consumers can now pick from two cores all the way to eight on the Intel side of the aisle—and then there’s Hyper-Threading to confuse you even more. So, how many cores do you need? We’ll give you the quick-and-dirty lowdown.</p> <p><strong>Two cores</strong></p> <p>Normally, we’d completely skip dual-cores without Hyper-Threading, because those parts tend to be the very bottom end of the pool: Celerons. Our asterisk is the new Intel Pentium G3258 Anniversary Edition, or “Pentium K,” which is a real hoot of a chip. It easily overclocks and is dead cheap. It’s not the fastest in content creation by a long shot, but if we were building an ultra-budget gaming rig and needed to steal from the CPU budget for a faster GPU, we’d recommend this one. Otherwise, we see dual-cores as purely ultra-budget parts today.</p> <p><strong>Two cores with Hyper-Threading</strong></p> <p>For your parents who need a reliable, solid PC without overclocking (you really don’t want to explain how to back down the core voltage in the BIOS to grandma, do you?), the dual-core Core i3 parts fulfill the needs of most people who only do content creation on occasion. Hyper-Threading adds value in multi-threaded and multitasking workloads. You can almost think of these chips with Hyper-Threading as three-core CPUs.&nbsp;</p> <p><strong>Four cores</strong></p> <p>For anyone who does content creation such as video editing, encoding, or even photo editing with newer applications, a quad-core is usually our recommended part. Newer game consoles are also expected to push minimum specs for new games to quad-cores or more, so for most people who carry an Enthusiast badge, a quad-core part is the place to start.</p> <p><strong>Four cores with Hyper-Threading</strong></p> <p>Hyper-Threading got a bad name early on from the Pentium 4 and existing software that actually saw performance go down when it was turned on. Those days are long behind us though, and Hyper-Threading offers a nice performance boost with its virtual cores. How much? A 3.5GHz Core i7 quad-core with Hyper-Threading generally offers the same performance on multi-threaded tasks as a Core i5 running at 4.5GHz. Hyper-Threading helps with content creation, and we’d say that if content creation is 30 percent or less of your time, this is the place to be, and really the best fit for 90 percent of enthusiasts.</p> <p><strong>Six cores with Hyper-Threading</strong></p> <p>Once you pass the quad-core mark, you are moving pixels professionally in video editing, 3D modeling, or other tasks that justify the cost of a six-core chip or more.
We still think that for 90 percent of folks, a four-core CPU is plenty, but if losing time rendering a video costs you money (or you’re just impatient), pay for a six-core or more CPU. How do you decide if you need six or eight cores? Read on.&nbsp;</p> <p><strong>Eight cores with Hyper-Threading</strong></p> <p>We recognize that not everyone needs an eight-core processor. In fact, one way to save cash is to buy the midrange six-core chip instead, but if time is money, an eight-core chip will pay for itself. For example, the eight-core Haswell-E is about 45 percent faster than the four-core Core i7-4790K chip. If your render job is three hours, that cuts it to roughly two hours, which is more time working on other paying projects. The gap gets smaller between the six-core and the eight-core, of course, so it’s very much about how much your time is worth or how short your attention span is. But just to give you an idea, the 3.3GHz Core i7-5960X is about 20 percent faster than the Core i7-4960X running at 4GHz.</p> <p><strong>Click the next page to see how Haswell-E stacks up against Intel's other top CPUs.</strong></p> <hr /> <p>&nbsp;</p> <h4 style="font-size: 10px;">Intel’s Top Guns Compared</h4> <p><img src="/files/u154082/cpus17918.jpg" alt="haswell" title="haswell" width="620" height="413" /></p> <p><strong>The LGA2011-based Core i7-4960X (left) and the LGA2011-v3-based Core i7-5960X (middle) dwarf the Core i7-4790K chip (right). Note the change in the heat spreader between the older 4960X and the 5960X, which now has larger “wings” that make it easier to remove the CPU by hand. The breather hole, which allows for curing of the thermal interface material (solder in this case), has also been moved. Finally, while the chips are the same size, they are keyed differently to prevent you from installing a newer Haswell-E into an older Ivy Bridge-E board.</strong></p> <h4>Benchmarks</h4> <p><strong>Performance junkies, rejoice! Haswell-E knocks it out of the park</strong></p> <p><img src="/files/u154082/x99-gaming_5-rev10.jpg" alt="x99 gigabyte" title="x99 gigabyte" width="620" height="734" /></p> <p><strong>We used a Gigabyte X99 motherboard (without the final heatsinks for the voltage-regulation modules) for our testing.</strong></p> <p>For our testing, we set up three systems, as identical as the platforms allow, with the fastest available CPU for each. Each system used an Nvidia GeForce GTX 780 with the same 340.52 drivers, Corsair 240GB Neutron GTX SSDs, and 64-bit Windows 8.1 Enterprise. Since we’ve had issues with clock speeds varying on cards that physically look the same, we also verified the clock speeds of each GPU manually, and recorded the multiplier, bclock, and speeds the parts run at under single-threaded and multi-threaded loads. So you know, the 3GHz Core i7-5960X would run at 3.5GHz on single-threaded tasks but usually sat at 3.33GHz on multi-threaded tasks. The 3.6GHz Core i7-4960X ran everything at 4GHz, including multi-threaded tasks. The 4GHz Core i7-4790K part sat at 4.4GHz on both single- and multi-threaded loads.</p> <p>For Z97, we used a Gigabyte Z97M-D3H mobo with a Core i7-4790K “Devil’s Canyon” chip aboard. An Asus Sabertooth X79 did duty for our Core i7-4960X “Ivy Bridge-E” chip. Finally, for our Core i7-5960X chip, we obtained an early Gigabyte X99-Gaming 5 motherboard. The board was pretty early, but we feel comfortable with our performance numbers, as Intel has claimed the Core i7-5960X is “45 percent” faster than a quad-core chip, and that’s what we saw in some of our tests.&nbsp;</p>
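<p>A quick note on the math before we dive in: when we say one chip is “X percent faster,” this is the arithmetic we mean. A minimal Python sketch, using made-up numbers rather than actual scores:</p> <pre>
# "Percent faster" as we use it in these pages, sketched in Python.
# For timed benchmarks (lower is better), speedup is old_time / new_time;
# for frame rates (higher is better), it's new_fps / old_fps.
def percent_faster_time(old_secs, new_secs):
    return (old_secs / new_secs - 1.0) * 100.0

def percent_faster_rate(old_fps, new_fps):
    return (new_fps / old_fps - 1.0) * 100.0

print(percent_faster_time(300.0, 207.0))  # hypothetical render: ~45 percent faster
print(percent_faster_rate(40.0, 46.0))    # hypothetical game test: 15 percent faster

# The practical upshot: a chip that's 45 percent faster turns a
# three-hour render into roughly two hours and four minutes.
print(3 * 60 / 1.45)  # ~124 minutes
</pre>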
<p>One thing to note: The RAM capacities were different, but for the tests we run, that has no impact. The Sabertooth X79 had 16GB of DDR3/2133 in quad-channel mode, while the Z97M-D3H had 16GB of DDR3/2133 in dual-channel mode. Finally, the X99-Gaming 5 board had 32GB of Corsair DDR4/2133. All three CPUs will overclock, but we tested at stock speeds to get a good baseline feel.&nbsp;</p> <p>For our benchmarks, we selected from a pile of real-world games, synthetic tests, and real-world applications across a wide gamut of disciplines. Our gaming tests were also run at very low resolutions and low-quality settings to take the graphics card out of the equation. We also acknowledge that people want to know what they can expect from the different CPUs at realistic settings and resolutions, so we also ran all of the games at their highest settings at 1920x1080 resolution, which is still the norm in PC gaming.&nbsp;</p> <p><strong>The results</strong></p> <p>We could get into a multi-sentence analysis and slowly build to our verdict, but in a society where people get impatient at the microwave, we’ll give you the goods up front: Holy Frakking Smokes, this chip is fast! The Core i7-5960X is simply everything high-end enthusiasts have been dreaming about.&nbsp;</p> <p>Just to give you an idea, we’ve been recording scores from $7,000 and $13,000 PCs in our custom Premiere Pro CS6 benchmark for a couple of years now. The fastest we’ve ever seen is the Digital Storm Aventum II that we reviewed in our January 2014 issue. The 3.3GHz Core i7-5960X was faster than the Aventum II’s Core i7-4960X running at 4.7GHz. Again, at stock speeds, the Haswell-E was faster than the fastest Ivy Bridge-E machine we’ve ever seen.</p> <p>It wasn’t just Premiere Pro CS6 where we saw that spread, either. In most of our tests that stress multi-threading, we saw roughly a 45 percent to 50 percent improvement going from the Haswell to the Haswell-E part. The scaling gets tighter when you’re comparing the six-core Core i7-4960X, but it’s still a nice, big number: We generally saw a 20 percent to 25 percent improvement in multi-threaded tasks.&nbsp;</p> <p>That’s not even factoring in the clock differences between the parts. The Core i7-4790K buzzes along at 4.4GHz—1.1GHz faster than the Core i7-5960X in multi-threaded tasks—yet it still got stomped by 45 to 50 percent. The Core i7-4960X had a nearly 700MHz clock advantage over the eight-core chip as well.</p> <p>The whole world isn’t multi-threaded, though. Once we get to workloads that don’t push all eight cores, the higher clock speeds of the other parts predictably take over. ProShow Producer 5.0, for example, has never pushed more than four threads, and we saw the Core i7-5960X lose by 17 percent. The same happened in our custom Stitch.Efx 2.0 benchmark, too. In fact, in general, the Core i7-4790K will be faster thanks to its clock speed advantage. If you overclocked the Core i7-5960X to 4GHz or 4.4GHz on just four cores, the two should be on par in pure performance on light-duty workloads.</p> <p>In gaming, we saw some results from our tests that are a little bewildering to us.
At low-resolution and low-quality settings, where the graphics card was not the bottleneck, the Core i7-4790K had a similar 10 percent to 20 percent advantage. When we ran the same tests at ultra settings and 1080p resolution, though, the Core i7-5960X actually had a slight advantage in some of the runs against the Core i7-4790K chip. We think that may be from the bandwidth advantage the 5960X has. Remember, we ran all of the RAM at 2,133, so it’s not DDR4 vs. DDR3. It’s really quad-channel vs. dual-channel.</p> <p>We actually put a full breakdown of each of the benchmarks and detailed analysis on MaximumPC.com if you really want to nerd out on the performance.</p> <p><strong>What you should buy</strong></p> <p>Let’s say it again: The Core i7-5960X stands as the single fastest CPU we’ve seen to date. It’s simply a monster in multi-threaded tasks, and we think that once you’ve overclocked it, it’ll be as fast as all the others in tasks that aren’t thread-heavy.</p> <p>That, however, doesn’t mean everyone should start saving to buy a $1,000 CPU. No, for most people, the dynamic doesn’t change. For the 80 percent of you who fall into the average Joe or Jane nerd category, a four-core with Hyper-Threading still offers the best bang for the buck. It won’t be as fast as the eight-core, but unless you’re really working your rig for a living, made of money, or hate for your Handbrake encodes to take that extra 25 minutes, you can slum it with the Core i7-4790K chip. You don’t even have to heavily overclock it for the performance to be extremely peppy.</p> <p>For the remaining 20 percent who actually do a lot of encoding, rendering, professional photo editing, or heavy multi-tasking, the Core i7-5960X stands as the must-have CPU. It’s the chip you’ve been waiting for Intel to release. Just know that at purely stock speeds, you do give up performance to the Core i7-4790K part. But again, the good news is that with minor overclocking tweaks, it’ll be the equal or better of the quad-core chip.</p> <p>What’s really nice here is that for the first time, Intel is giving its “Extreme” SKU buyers something truly extra for their $999. Previous Core i7 Extreme parts have always been good overclockers, but a lot of people bypassed them for midrange chips such as the Core i7-4930K, which gave you the same core counts and overclocking to boot. The only true differentiation Extreme CPU buyers got was bragging rights. With Haswell-E, the Extreme buyers are the only ones with eight-core parts.</p> <p>Bang-for-the-buck buyers also get a treat from the six-core Core i7-5820K chip. At $389, it’s slightly more expensive than the chip it replaces—the $323 Core i7-4820K—but the extra price nets you two more cores. Yes, you lose PCIe bandwidth, but most people probably won’t notice the difference. We didn’t have a Core i7-5820K part to test, but based on our testing with the Core i7-5960X, we believe that minor overclocking on the cheap Haswell-E would easily make it the equal of Intel’s previous six-core chips, which could never be had for less than $580.</p> <p>And that, of course, brings us to the last point of discussion: Should you upgrade from your Core i7-4960X part? The easy answer is no. In pure CPU-on-CPU showdowns, the Core i7-4960X is about 20 percent slower in multi-threaded tasks, and in light-duty tasks it’s about the same, thanks to its clock-speed advantage. There are two reasons we might want to toss aside the older chip, though.
The first is X79’s pathetic pair of SATA 6Gb/s ports; frankly, you need more than two on a heavy-duty work machine. The second applies to folks for whom a 20 percent reduction in rendering time is actually worth paying for.&nbsp;</p> <p><strong>Click the next page to check out our Haswell-E benchmarks.</strong></p> <hr /> <h4><span style="font-size: 1.17em;">Haswell-E Benchmarks</span></h4> <p><strong>Haswell-E benchmarks overview</strong></p> <p><span style="font-size: 1.17em;">&nbsp;</span><img src="/files/u154082/haswell_e_benchmarks.png" alt="haswell e benchmarks" title="haswell e benchmarks" width="541" height="968" /></p> <p>&nbsp;</p> <p>&nbsp;</p> <p><strong>Benchmark Breakdown</strong></p> <p>We like to give you the goods on a nice table, but not everyone is familiar with what we use to test and what exactly the numbers mean, so let’s break down some of the more significant results for you.&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p><img src="/files/u154082/cinebenchsinglethreaded.png" alt="cinebench 15 single" title="cinebench 15 single" width="620" height="472" /></p> <p><strong>Cinebench 15 single-threaded performance</strong></p> <p><span style="color: #000000;">We used Maxon’s Cinebench 15 benchmark to see just how fast the trio of chips would run this 3D rendering test. Cinebench 15 lets you run it on all available cores or restrict it to a single core. For this test, we wanted to see how the Core i7-5960X “Haswell-E” would do against the others by measuring a single core. The winner here is the Core i7-4790K “Devil’s Canyon” chip. That’s no surprise—it uses the same microarchitecture as the big-boy Haswell-E, but it has a ton more clock speed at default. The Haswell-E is about 21 percent slower running at 3.5GHz; the Devil’s Canyon part runs about 900MHz faster at 4.4GHz. Remember, at default, the Haswell-E only hits 3.5GHz on single-core loads. Despite its better microarchitecture, the Haswell-E also loses to the Core i7-4960X “Ivy Bridge-E,” though not by much, and that’s with Ivy Bridge-E holding a 500MHz clock speed advantage. Still, the clear winner in single-threaded performance is the higher-clocked Devil’s Canyon chip.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/cinebenchmulti.png" alt="cinebench 15 multi" title="cinebench 15 multi" width="620" height="428" /></span></p> <p><span style="color: #000000;"><strong>Cinebench 15 multi-threaded performance</strong></span></p> <p><span style="color: #000000;">You don’t buy an eight-core CPU and then throw only single-threaded workloads at it, so we took the handcuffs off Cinebench 15 and let it render with all available threads. On the Haswell-E part, that’s 16 threads of fun; on Ivy Bridge-E it’s 12 threads; and on Devil’s Canyon we’re looking at eight threads. The winner by a clear margin is the Haswell-E part. Its performance is an astounding 49 percent faster than the Devil’s Canyon and about 22 percent faster than Ivy Bridge-E. We’ll just have to continue to remind you, too: this is with a severe clock penalty. That 49-percent-faster score is with all eight cores running at 3.3GHz vs. all four of the Devil’s Canyon cores buzzing along at 4.4GHz. That’s an 1,100MHz clock speed advantage. Ivy Bridge-E also has a nice 700MHz clock advantage over Haswell-E. Chalk this up as a big, huge win for Haswell-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p>
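<p>That 49 percent figure is almost exactly what a naive cores-times-clock model predicts, since Haswell-E and Devil’s Canyon share the same microarchitecture. A back-of-the-envelope Python sketch; it deliberately ignores Hyper-Threading, caches, and memory, which is also why it underestimates the gap over Ivy Bridge-E’s older, slower-per-clock cores:</p> <pre>
# Naive multi-threaded throughput model: cores x all-core clock.
# Only roughly valid between chips with the same microarchitecture.
chips = {
    "Core i7-5960X": (8, 3.3),   # cores, all-core GHz under load
    "Core i7-4790K": (4, 4.4),
    "Core i7-4960X": (6, 4.0),
}

def throughput(name):
    cores, ghz = chips[name]
    return cores * ghz

hsw = throughput("Core i7-5960X")   # 26.4 GHz-cores
dvc = throughput("Core i7-4790K")   # 17.6 GHz-cores
print(f"predicted: {(hsw / dvc - 1) * 100:.0f}% faster")  # ~50%; we measured 49%
</pre>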
<p><span style="color: #000000;"><img src="/files/u154082/povray.png" alt="pov-ray" title="pov-ray" width="620" height="491" /></span></p> <p><span style="color: #000000;"><strong>POV-Ray performance</strong></span></p> <p><span style="color: #000000;">We wanted a second opinion on rendering performance, so we ran POV-Ray, a freeware ray tracer with roots that reach back to the Amiga. Again, Haswell-E wins big-time, with a 47 percent performance advantage over Devil’s Canyon and a 25 percent advantage over Ivy Bridge-E. Yeah, and all that stuff we said about the clock speed advantage the quad-core and six-core had, that applies here, too. Blah, blah, blah.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/premierepro.png" alt="premiere pro" title="premiere pro" width="620" height="474" /></span></p> <p><span style="color: #000000;"><strong>Premiere Pro CS6 performance</strong></span></p> <p><span style="color: #000000;">One sanity check (benchmark results Intel produces to let you know what kind of performance to expect) said Haswell-E would outperform quad-core Intel parts by 45 percent in Premiere Pro Creative Cloud when working with 4K content. Our benchmark, however, doesn’t use 4K content yet, so we wondered if our results would be similar. For our test, we render out a 1080p-resolution file using source material shot by us on a Canon EOS 5D Mk II, using multiple timelines and transitions. We restrict it to the CPU rather than using the GPU as well. Our result? The 3.3GHz Haswell-E was about 45 percent faster than the 4.4GHz Devil’s Canyon chip. Bada-bing! The two extra cores also spit out the render about 19 percent faster than the six-core Ivy Bridge-E. That’s fairly consistent performance we’re seeing between the different workload disciplines of 3D rendering and video encoding so far, and again, big, big wins for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/handbrake.png" alt="handbrake" title="handbrake" width="620" height="407" /></span></p> <p><span style="color: #000000;"><strong>Handbrake Encoding performance</strong></span></p> <p><span style="color: #000000;">For our encoding test, we took a 1080p-resolution video file and used Handbrake 0.9.9 to transcode it using the Android tablet profile. Handbrake is very multi-threaded and leverages the CPU for its encoding and transcoding. Our results were still fairly stellar, with the Haswell-E CPU performing about 38 percent faster than the Devil’s Canyon part. Things were uncomfortably close with the Ivy Bridge-E part, though, with the eight-core chip coming in only about 13 percent faster than the six-core chip. Since the Ivy Bridge-E cores are slower than Haswell cores clock-for-clock, we were a bit surprised at how close they were. In the past, we have seen memory bandwidth play a role in encoding, but not necessarily in Handbrake. Interestingly, despite locking all three platforms’ memory at 2,133, the Ivy Bridge-E does provide more bandwidth than the Haswell-E part. One other thing we should mention: Intel’s “sanity check” numbers to let the media know what to expect for Handbrake performance showed a tremendous advantage for the Haswell-E. In those numbers, Haswell-E was 69 percent faster than a Devil’s Canyon chip and 34 percent faster than the Ivy Bridge-E chip. Why the difference? The workload. Intel uses a 4K-resolution file and transcodes it down to 1080p. We haven’t tried it at 4K, but we may, as Intel has provided the 4K-resolution sample files to the media. If true, and we have no reason to doubt it, it’s a good message for those who actually work at Ultra HD resolutions that the eight cores can pay off. Overall, we’re declaring Haswell-E the winner here.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p>
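<p>For the curious, a run like ours boils down to a single command-line transcode you can reproduce at home. Here’s a minimal Python wrapper; the file names are hypothetical, and the preset name and flags are from HandBrake’s 0.9.x CLI as we recall them, so verify against <code>HandBrakeCLI --preset-list</code> on your own install first:</p> <pre>
import subprocess
import time

# Hypothetical file names; "Android Tablet" matches a built-in preset
# name in HandBrake 0.9.9, but check `HandBrakeCLI --preset-list` to be sure.
SRC = "source_1080p.mov"
DST = "transcode_test.mp4"

start = time.time()
subprocess.run(
    ["HandBrakeCLI", "-i", SRC, "-o", DST, "--preset", "Android Tablet"],
    check=True,
)
print(f"Transcode finished in {time.time() - start:.1f} seconds")
</pre>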
<p><span style="color: #000000;"><img src="/files/u154082/x264pass1.png" alt="x264 pass 1" title="x264 pass 1" width="620" height="496" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.01 Pass 1 performance</strong></span></p> <p><span style="color: #000000;">We’ve been using TechArp.com’s X264 HD 5.0.1 benchmark to measure performance on new PCs. The test does two passes using the freeware x264 encoding library, and a higher frame rate is better. The first pass isn’t as core-sensitive; it’s seemingly a little more sensitive to clock speeds and memory bandwidth than to pure core count, so those pay more dividends here. Haswell-E still gives you a nice 36 percent boost over the Devil’s Canyon, but that Ivy Bridge-E chip, despite its older core microarchitecture, is only beaten by 12 percent—too close for comfort. Of course, we’d throw in the usual caveat about the very large clock differences between the chips, but we’ve already said that three times. Oh, and yes, we did actually plagiarize by lifting two sentences from a previous CPU review for our description. That’s OK, we gave ourselves permission.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X, but not by much</span></p> <p><span style="color: #000000;"><img src="/files/u154082/x264pass2.png" alt="x264 pass 2" title="x264 pass 2" width="620" height="499" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.01 Pass 2 performance</strong></span></p> <p><span style="color: #000000;">Pass two of the X264 HD 5.01 benchmark is more sensitive to core and thread counts, and we see the Haswell-E come in with a nice 46 percent performance advantage against the Devil’s Canyon chip. The Ivy Bridge-E, though, still represents well; the Haswell-E chip is “only” 22 percent faster than it. Still, this is a solid win for the Haswell-E chip. We also like that we’re seeing very similar scaling of roughly 45 percent across multiple encoding tests. With Intel saying it’s seeing 69 percent with 4K-resolution content in Handbrake, we’re wondering if the Haswell-E would offer similar scaling if we just moved all of our tests up to 4K.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><strong>Click the next page for even more Haswell-E benchmarks.</strong></p> <hr /> <p>&nbsp;</p> <p><span style="color: #000000;"><img src="/files/u154082/stitch.png" alt="stitch" title="stitch" width="620" height="473" /></span></p> <p><span style="color: #000000;"><strong>Stitch.EFx 2.0 Performance&nbsp;</strong></span></p> <p><span style="color: #000000;">Again, we like to mix up our workloads with tasks that aren’t always multi-threaded enough to take advantage of something like a 12-core Xeon chip. For this test, we shot about 200 images with a Canon EOS 7D using a GigaPan motorized head. That’s roughly 1.9GB in images to make our gigapixel image using Stitch.EFx 2.0. The first third of the render is single-threaded as it stitches together the images. The final third is multi-threaded as it does the blending, perspective correction, and other intensive image processing. It’s a good blend of single-threaded and multi-threaded performance, and we expected the higher-clocked parts to take the lead. No surprise: the Devil’s Canyon’s 4.4GHz clock puts it in front, and the Haswell-E comes in about 14 percent slower with its 1.1GHz clock disadvantage. The clock speed advantage of the 4GHz Ivy Bridge-E also pays dividends, and we see the Haswell-E losing by about 10 percent. The good news? A dual-core Pentium K running at 4.7GHz coughed up a score of 1,029 seconds (not represented on the chart), roughly 22 percent slower than the CPU that costs about 11 times more.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p>
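<p>Stitch.EFx is a textbook Amdahl’s law workload, which is why clock speed wins here. A rough Python sketch; the one-third serial fraction is our eyeball estimate from the phases described above, not a measured figure:</p> <pre>
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores).
# SERIAL is an assumption based on the render's single-threaded stitching phase.
SERIAL = 0.33

def amdahl_speedup(cores):
    return 1.0 / (SERIAL + (1.0 - SERIAL) / cores)

for cores in (2, 4, 6, 8, 16):
    print(f"{cores:2d} cores: {amdahl_speedup(cores):.2f}x speedup")
# Gains flatten quickly past four cores, so a 4.4GHz quad-core
# beats a 3.3GHz eight-core on this kind of job.
</pre>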
<p><span style="color: #000000;"><img src="/files/u154082/7zip.png" alt="7-zip" title="7-zip" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>7-Zip Performance</strong></span></p> <p><span style="color: #000000;">The popular and free zip utility, 7-Zip, has a nifty built-in benchmark that tells you the theoretical file-compression performance of a CPU. You can pick the workload size and the number of threads. For our test, we maxed it out at 16 threads using an 8MB workload. That gives the Haswell-E a familiar advantage in performance—about 45 percent—over the Devil’s Canyon part. Against that Ivy Bridge-E part, though, it’s another uncomfortably close one at 8 percent. Still, a win is a win, even if we have to say that if you have a shiny Core i7-4960X CPU in your system, you’re still doing fine.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/sandra.png" alt="sisoft sandra" title="sisoft sandra" width="620" height="421" /></span></p> <p><span style="color: #000000;"><strong>Sisoft Sandra Memory Bandwidth (GB/s)</strong></span></p> <p>Since this is the first time we’re seeing DDR4 in a desktop part, we wanted to see how it stacked up in benchmarks. But, before you get too excited, remember that we set all three systems to 2133 data rates. The Devil’s Canyon part is dual-channel, and the Ivy Bridge-E and Haswell-E are both quad-channel. With the memory set at 2133, we expected Haswell-E to be on par with the Ivy Bridge-E chip, but oddly, it was slower, putting out about 40GB/s of bandwidth. That’s still more than the 27GB/s the Devil’s Canyon could hit, but we expected it to be closer to double what the Devil’s Canyon was producing, and on par with Ivy Bridge-E. For what it’s worth, we did double-check that we were operating in quad-channel mode, and the clock speeds of our DIMMs. It’s possible this may change as the hardware we see becomes more final. We’ll also note that even at the same clock, DDR4 does suffer a latency penalty over DDR3. Dwelling on that would be missing the point of DDR4, though. The new memory should give us larger modules and hit higher frequencies far more easily, which will nullify that latency issue. Still, the winner is Ivy Bridge-E.</p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p>
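<p>The theoretical ceilings make that result easier to parse. Peak DRAM bandwidth is channels times 8 bytes per transfer times the data rate; a quick Python sketch (real-world scores always land well under these ceilings):</p> <pre>
# Theoretical peak bandwidth: channels * 8 bytes per transfer * MT/s.
def peak_gbs(channels, mts):
    return channels * 8 * mts / 1000.0  # GB/s

print(peak_gbs(2, 2133))  # ~34 GB/s ceiling; dual-channel Devil's Canyon hit ~27
print(peak_gbs(4, 2133))  # ~68 GB/s ceiling; quad-channel Haswell-E hit only ~40
</pre>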
<p><span style="color: #000000;"><img src="/files/u154082/3dmarkgpu.png" alt="3d mark" title="3d mark" width="620" height="457" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Overall Performance</strong></span></p> <p><span style="color: #000000;">Even though 3DMark Firestrike is primarily a graphics benchmark, not having a 3DMark Firestrike score is like not having coffee in the morning. Basically, it’s a tie between all three chips, and 3DMark Firestrike is working exactly as you expect it to: as a GPU benchmark.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/3dmarkphysics.png" alt="3d mark physics" title="3d mark physics" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Physics Performance</strong></span></p> <p><span style="color: #000000;">3DMark does factor CPU performance into its physics test. It’s certainly not weighted for multi-core counts as other tests are, but we see the Haswell-E with a decent 29 percent bump over the Devil’s Canyon chip. Breathing down the neck of the Haswell-E, though, is the Ivy Bridge-E chip. To us, that’s damned near a tie. Overall, the Haswell-E wins, but in gaming tasks—at stock clocks—paying for an 8-core monster is unnecessary except for those running multi-GPU setups.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/valveparticle.png" alt="valve particle" title="valve particle" width="620" height="451" /></span></p> <p><span style="color: #000000;"><strong>Valve Particle Benchmark Performance</strong></span></p> <p><span style="color: #000000;">Valve’s Particle test was originally developed to show off quad-core performance to the world. It uses the company’s own physics magic, so it should give some indication of how well a chip will run that kind of game code. We’ve long suspected the test is happiest with big caches and low RAM latency. That seems to be backed by the numbers, because despite the 1.1GHz advantage the Devil’s Canyon chip has, the Haswell-E is in front to the tune of 15 percent. The Ivy Bridge-E chip, though, with its large cache, lower-latency DDR3, and assloads of memory bandwidth, actually comes out on top by about 3 percent. We’ll again note the Ivy Bridge-E part has a 700MHz advantage, so this is a very nice showing for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtlow.png" alt="dirt showdown low" title="dirt showdown low" width="620" height="438" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown low-resolution performance</strong></span></p> <p><span style="color: #000000;">For our gaming tests, we decided to run the games at 1366x768 resolution and at very low settings to take the graphics card out of the equation. In one way, you can imagine this as what it would look like if you had infinitely powerful graphics cards in your system. As most games are not heavily multi-threaded and are perfectly fine with a quad-core with Hyper-Threading, we fully expected the parts with the highest clock speeds to win all of our low-resolution, low-quality tests. No surprise: the Devil’s Canyon part at 4.4GHz private-schools the 3.3GHz Haswell-E chip.
And, no surprise, the 4GHz Ivy Bridge-E also eats the Haswell-E’s lunch and drinks its milk, too.</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtultra.png" alt="dirt showdown ultra performance" title="dirt showdown ultra performance" width="620" height="475" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown 1080p, ultra performance</strong></span></p> <p><span style="color: #000000;">To make sure we put everything in the right context, we also ran Dirt Showdown at 1920x1080 resolution at Ultra settings. This puts most of the load on the single GeForce GTX 780 we used for our tests. Interestingly, we saw the Haswell-E with a slight edge over the Devil’s Canyon and Ivy Bridge-E parts. We don’t think it’s a very significant difference, but it’s still technically a win for Haswell-E.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanlow.png" alt="hitman low" title="hitman low" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, low quality, low resolution&nbsp;</strong></span></p> <p><span style="color: #000000;">We did the same with Hitman: Absolution, running it at low resolution and its lowest settings. The Haswell-E came in about 12 percent slower than the Devil’s Canyon part and 13 percent slower than the Ivy Bridge-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanultra.png" alt="hitman ultra" title="hitman ultra" width="620" height="479" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, 1080p, ultra quality</strong></span></p> <p><span style="color: #000000;">Again, we tick the settings up to a resolution and quality at which people actually play. Once we do that, the gap closes slightly, with the Haswell-E trailing the Devil’s Canyon by about 8 percent and the Ivy Bridge-E by 9 percent. Still, these are all very playable frame rates, and few could tell the difference.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tombraider.png" alt="tomb raider low" title="tomb raider low" width="620" height="465" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, low quality, low resolution</strong></span></p> <p><span style="color: #000000;">We did the same low-quality, low-resolution trick with Tomb Raider, and while nobody needs to see 500 frames per second, it’s pretty much a wash here.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tomraiderulti.png" alt="tomb raider ultra" title="tomb raider ultra" width="620" height="472" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, 1080p, Ultimate</strong></span></p> <p><span style="color: #000000;">At normal resolutions and settings, we were a little surprised, as the Haswell-E actually had a 15 percent advantage over the Devil’s Canyon CPU. We’re not exactly sure why, as the only real advantages we can see are the memory bandwidth and large caches on the Haswell-E part. We seriously doubt it’s due to the number of CPU cores. The Haswell-E also has a very, very slight lead against the Ivy Bridge-E part, too.
That’s not bad considering the clock penalty it’s running at.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metrolastlight.png" alt="metro last light low" title="metro last light low" width="620" height="503" /></span></p> <p><span style="color: #000000;"><strong>Metro: Last Light, low resolution, low quality</strong></span></p> <p><span style="color: #000000;">In Metro: Last Light at low settings, it’s a wash between all of them.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metroveryhigh.png" alt="metro last light high" title="metro last light high" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Metro: Last Light, 1080p, Very High quality</strong></span></p> <p><span style="color: #000000;">Metro at high-quality settings mirrors Hitman: Absolution, and we think it favors the parts with higher clock speeds. We should also note that none of the chips, even with the $500 graphics card, could run Metro smoothly at 1080p at high-quality settings. That is, of course, unless you consider 30 to 40 fps to be “smooth.” We don’t. Interestingly, the Core i7-4960X was the overall winner.</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><strong>Conclusion:</strong> If you skipped to the very last page to read the conclusion, you’re in the wrong place. You need to go back to page 4 to read our conclusions and what you should buy. And no, we didn’t do this to generate just one more click either, though that would be very clever of us, wouldn’t it?</p> http://www.maximumpc.com/haswell-e_review_2014#comments benchmarks cpu haswell e intel ivy bridge e maximum pc processor Review Specs News Reviews Features Tue, 09 Sep 2014 23:03:30 +0000 Gordon Mah Ung 28431 at http://www.maximumpc.com Maingear Epic Force Video Review http://www.maximumpc.com/maingear_epic_force_video_review_2014 <!--paging_filter--><h3>See what a $12,000 gaming rig looks like</h3> <p>One of the best parts of this job is getting to play with hardware we can’t afford. For this video, Gordon walks you through Maingear’s Epic Force, a tour de force of beautiful plumbing that even Mario would be proud of. The machine, delivered to us before Intel’s epic Core i7-5960X “<a title="haswell e" href="http://www.maximumpc.com/haswell-e_review_2014" target="_blank">Haswell-E</a>” arrived, is built on an overclocked Core i7-4790K “Devil’s Canyon” chip and packs a pair of water-cooled Radeon R9 295X2 graphics cards.</p> <p><iframe src="//www.youtube.com/embed/yNoxJJ70se0" width="620" height="349" frameborder="0"></iframe></p> <p>What do you think of the Maingear Epic Force PC? Let us know in the comments below.</p> http://www.maximumpc.com/maingear_epic_force_video_review_2014#comments big chassis Desktop Hardware maingear epic force maximum pc MPCTV pc Review video Reviews Systems Mon, 08 Sep 2014 21:05:28 +0000 Gordon Mah Ung 28498 at http://www.maximumpc.com Build it: Real-World 4K Gaming Test Bench http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014 <!--paging_filter--><h3>This month, we find out what it takes to run games at 4K, and do so using a sweet open-air test bench</h3> <p>The computer world loves it when specs double from one generation to the next. We’ve gone from 16-bit to 32-bit, and finally 64-bit computing. We had 2GB RAM sticks, then 4GB, then 8GB.
With monitor resolutions, 1920x1080 has been the standard for a while, but we never quite doubled it; 2560x1600 was only a half-step. Now that 4K has arrived, though, 1080p has effectively been doubled exactly, with the panels released so far being 3840x2160. We know it’s not actually 4,000 pixels across, but everyone is still calling it “4K.” And though the resolution is doubled over 1080p in each dimension, that works out to four times the pixels, the equivalent of four 1080p monitors, so it takes a lot of horsepower to play games smoothly. For example, our 2013 Dream Machine used four Nvidia GeForce GTX Titans and a CPU overclocked to 5GHz to handle it. Those cards cost $4,000 altogether though, so it wasn’t a scenario for mere mortals. This month, we wanted to see what 4K gaming is like with more-affordable parts. We also wanted to try a distinctive-looking open test bench from DimasTech. This type of case is perfect for SLI testing, too, since it makes component installation and swapping much quicker.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/beauty_shot_small_29.jpg"><img src="/files/u152332/beauty_shot_small_28.jpg" width="620" height="417" /></a></p> <h3>Triple Threat</h3> <p>Instead of GTX Titans, we’re stepping it down a couple of notches to Nvidia GTX 780s. They provide similar gaming performance, but at half the cost. We’re also using “only” three cards instead of four, so the price difference from Dream Machine to this rig is a whopping $2,500 (even more if you count the fact that the Dream Machine cards were water-cooled). These cards still need a lot of bandwidth, though, so we’re sticking with an Intel LGA 2011 motherboard, this time an Asus X79 Deluxe. It’s feature-packed and can overclock a CPU like nobody’s business. The X79 Deluxe is running Intel’s Core i7-4960X CPU, which has six cores and twelve processing threads. It’s kind of a beast. We’re cooling it with a Cooler Master Glacer 240L water cooler, which comes with a 240mm radiator.</p> <p>We’ll also need a boatload of power, so we grabbed a Corsair AX1200 PSU which, as its name suggests, supplies up to 1,200 watts. It’s also fully modular, meaning that its cables are all detachable. Since we’re only using one storage device in this build, we can keep a lot of spare cables tucked away in a bag, instead of cluttering up the lower tray.</p> <p>All of this is being assembled on a DimasTech Easy V3 test bench, which is a laser-cut steel, hand-welded beauty made in Italy and painted glossy red. It can handle either a 360mm or 280mm radiator as well, and it comes with an articulating arm to move a case fan around to specific areas. It seems like the ultimate open-air test bench, so we’re eager to see what we can do with it.</p> <h4>1. Case Working</h4> <p>The DimasTech Easy V3 comes in separate parts, but the bulk of it is an upper and lower tray. You slide the lower one in and secure it with a bundled set of six aluminum screws. The case’s fasteners come in a handy plastic container with a screw-on lid. Shown in the photo are the two chromed power and reset buttons, which are the last pieces to be attached. They have pre-attached hexagonal washers, which can be a bit tricky to remove. We had to use pliers on one of them. You’ll need to wire them up yourself, but there’s a diagram included.
Then, connect the other end to the motherboard’s front-panel header, which has its own diagram printed on the board.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/a_small_29.jpg"><img src="/files/u152332/a_small_28.jpg" title="Image A" width="620" height="413" /></a></p> <h4>2. Getting Testy</h4> <p>Unfortunately, the Easy V3 does not ship with a 2.5-inch drive bay, nor do standard 3.5-inch to 2.5-inch adapters fit inside the bays. If you want to install a solid-state drive, you need to purchase the correctly sized bay or adapter separately from DimasTech. Since this is an open test bench, which is designed for swapping parts quickly, we chose to just leave the drive unsecured. It has no moving parts, so it doesn’t need to be screwed down or even laid flat to work properly. We also moved the 5.25-inch drive bay from the front to the back, to leave as much room as possible to work with our bundle of PSU cables. The lower tray has a number of pre-drilled holes to customize drive bay placement. Meanwhile, our power supply must be oriented just like this to properly attach to the case’s specified bracket. It’s not bad, though, because this positions the power switch higher up, where it’s less likely to get bumped accidentally.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/b_small_24.jpg"><img src="/files/u152332/b_small_23.jpg" title="Image B" width="620" height="413" /></a></p> <h4>3. Able Cables</h4> <p>The best way to install a modular power supply is to attach your required cables first. This time, we got a kit from Corsair that has individually sleeved wires. It costs $40, and comes in red, white, or blue. Each of these kits is designed to work with a specific Corsair power supply. They look fancier than the stock un-sleeved cables, and the ones for motherboard and CPU power are a lot more flexible than the stock versions. All of the connectors are keyed, so you can’t accidentally plug them into the wrong socket. We used a few black twist ties to gather up the PCI Express cables.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/c_small_27.jpg"><img src="/files/u152332/c_small_26.jpg" title="Image C" width="620" height="413" /></a></p> <h4>4. Taking a Stand(off)</h4> <p>The Easy V3 comes with an unusually tall set of metal motherboard standoffs. These widgets prevent the motherboard from touching the tray below and possibly creating a short circuit. You can screw these in by hand, optionally tightening them up with a pair of pliers. Once those were in, we actually used some thumbscrews bundled with the case to screw the board down on the standoffs. You can use more standard screws, but we had plenty to spare, and we liked the look. The tall standoffs also work nicely with custom liquid-cooling loops, because there is enough clearance to send thick tubing underneath (and we’ve seen lots of photos on the Internet of such setups). For us, it provided enough room to install a right-angle SATA cable and send it through the oval cut-out in the tray and down to the SSD below.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/d_small_23.jpg"><img src="/files/u152332/d_small_22.jpg" title="Image D" width="620" height="413" /></a></p> <p style="text-align: center;">&nbsp;</p> <hr /> <p>&nbsp;</p> <h4>5. Triple Play</h4> <p>This bench has a black bracket that holds your PCIe cards and can be slid parallel to the motherboard to accommodate different board layouts.
It will take up to four two-slot cards, and DimasTech sells a longer 10-slot bracket on its website for workstation boards. We had to use the provided aluminum thumbscrews to secure the cards, since all of the screws we had in The Lab were either too coarsely threaded or not the right diameter, which is unusual. Installing cards is easy, because your view of the board slot is not blocked by a case. The video cards will end up sandwiched right next to each other, though, so you’ll need a tool to release the slot-locking mechanism on two of them (we used a PCI slot cover). The upper two cards can get quite toasty, so we moved the bench’s built-in flexible fan arm right in front of their rear intake area, and we told the motherboard to max out the fan’s RPM. We saw an immediate FPS boost in our tests, because by default these cards will throttle once they get to about 83 C. (If you want to keep an eye on temperatures and clocks while you test, see the monitoring sketch near the end of this article.)</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/e_small_21.jpg"><img src="/files/u152332/e_small_20.jpg" title="Image E" width="620" height="413" /></a></p> <h4>6. Cool Under Pressure</h4> <p>Since the Glacer 240L cooler has integrated tubing that’s relatively short, the orientation pictured was our only option. We could have put the fans on the other side of the radiator, but since performance was already superb, we decided we liked the look of them with the grilles on top. To mount the radiator, we used the bundled screws, which became the right length when we added some rubber gaskets, also included. The radiator actually doesn’t give off much heat, even when the CPU is overclocked and firing on all cylinders, so we didn’t have to worry about the nearby power supply fan pulling in a lot of hot intake. In fact, the CPU never crossed 65 C in all of our benchmarks, even when overclocked to 4.5GHz. We even threw Prime95 at it, and it didn’t break a sweat. Temperatures are also affected by the ambient environment, though. With our open-air layout, heat coming out of the GPUs doesn’t get anywhere near the radiator, and The Lab’s air conditioning helps keep temperatures low, so it’s pretty much an ideal environment, short of being installed in a refrigerator. Your mileage may vary.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/f_small_22.jpg"><img src="/files/u152332/f_small_21.jpg" title="Image F" width="620" height="413" /></a></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/main_image_small_18.jpg"><img src="/files/u152332/main_image_small_17.jpg" title="Main Image" width="620" height="382" /></a></p> <h3>A Golden Triangle</h3> <p>Despite our penchant for extreme performance, we rarely build triple-GPU systems, so we weren’t sure how well they’d handle 4K, but we figured they’d kick ass. Thankfully, they handled UHD quite well. So well, in fact, that we also tested the system with “only” two GTX 780s and still got respectable gaming performance. For example, with two cards, the Bioshock Infinite benchmark reported an average of a little over 60 FPS on its highest settings. In Tomb Raider, we disabled anti-aliasing and TressFX, maxing out all the other settings, and we still averaged 62 FPS. We benchmarked the opening sequence of Assassin’s Creed 4 with AA and PhysX disabled and everything else maxed out, and we averaged 47 FPS. The Metro: Last Light benchmark, however, averaged 25 FPS on max settings, even with PhysX disabled.</p> <p>Unfortunately, we had trouble getting Hitman: Absolution and Metro: Last Light to recognize the third card. This issue is not unheard of, and it made us think: If you stick with two GPUs, you no longer need the PCI Express bandwidth of expensive LGA 2011 CPUs, or their equally expensive motherboards, or a huge power supply. That potentially cuts the cost of this system in half, from around $4,200 to roughly $2,100. You could also save money by going with, say, a Core i7-4930K instead, along with a less expensive LGA 2011 motherboard and a smaller SSD. But it’s still a pretty steep climb in price when going from two cards to three.</p>
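<p>Since throttling was the difference between gaining and losing frames on this bench, it pays to watch GPU temperatures and clocks while you benchmark. Here’s a minimal Python sketch that polls Nvidia’s nvidia-smi tool; the query fields are standard nvidia-smi ones, but confirm them against <code>nvidia-smi --help-query-gpu</code> for your driver version.</p> <pre>
import subprocess
import time

# Polls each GPU's temperature and core clock once per second via nvidia-smi.
# Assumes nvidia-smi is on the PATH; field names per --help-query-gpu.
QUERY = ["nvidia-smi",
         "--query-gpu=index,temperature.gpu,clocks.current.graphics",
         "--format=csv,noheader,nounits"]

while True:
    output = subprocess.check_output(QUERY).decode()
    for line in output.strip().splitlines():
        idx, temp, clock = [field.strip() for field in line.split(",")]
        note = "  ** near the ~83 C throttle point" if int(temp) >= 80 else ""
        print(f"GPU {idx}: {temp} C @ {clock} MHz{note}")
    time.sleep(1)
</pre>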
<p>The test bench itself feels sturdy and looks sweet, but we wish that it accepted standard computer-type screws, and that it came with a 2.5-inch drive bay or could at least fit a standard 3.5-to-2.5 adapter. We’d also recommend getting a second articulating fan arm if you’re liquid-cooling, so that one could provide airflow to the voltage regulators around the CPU, and the other could blow directly on your video cards. With the fan aimed at our cards, we instantly gained another 10 FPS in the Tomb Raider benchmark.</p> <p>The Seagate 600 SSD was nice and speedy, although unzipping compressed files seemed to take longer than usual. The X79 Deluxe motherboard gave us no trouble, and the bundled “Asus AI Suite III” software has lots of fine-grained options for performance tuning and monitoring, and it looks nice. Overall, this build was not only successful but educational, too.</p> <div class="module orange-module article-module"><strong><span class="module-name">Benchmarks</span></strong><br /> <div class="spec-table orange"> <table style="width: 627px; height: 270px;" border="0"> <thead> <tr> <th class="head-empty"> </th> <th class="head-light"> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>ZERO</strong></p> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>POINT</strong></p> </th> <th> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>THIS</strong></p> <p style="font-size: 10px; font-weight: normal; text-align: start;"><strong>BUILD</strong></p> </th> </tr> </thead> <tbody> <tr> <td class="item">Premiere Pro CS6 (sec)</td> <td class="item-dark">2,000</td> <td><span style="text-align: center;">1,694</span></td> </tr> <tr> <td>Stitch.Efx 2.0 (sec)</td> <td>831</td> <td><span style="text-align: center;">707</span></td> </tr> <tr> <td class="item">ProShow Producer 5.0 (sec)</td> <td class="item-dark">1,446</td> <td>1,246</td> </tr> <tr> <td>x264 HD 5.0 (fps)</td> <td>21.1</td> <td>25.6</td> </tr> <tr> <td>Batman: Arkham City (fps)</td> <td>76</td> <td>169</td> </tr> <tr> <td class="item">3DMark11 Extreme</td> <td class="item-dark">5,847</td> <td>12,193</td> </tr> </tbody> </table> </div> </div> <p><span style="font-size: 10px; font-weight: bold;"><em>The zero-point machine compared here consists of a 3.2GHz Core i7-3930K and 16GB of Corsair DDR3/1600 on an Asus P9X79 Deluxe motherboard. It has a GeForce GTX 690, a Corsair Neutron GTX SSD, and 64-bit Windows 7 Professional.</em></span></p> http://www.maximumpc.com/build_it_real-world_4k_gaming_test_bench_2014#comments 4k computer gaming pc geforce Hardware maximum pc May issues 2014 nvidia open Test Bench Features Wed, 03 Sep 2014 19:29:01 +0000 Tom McNamara 28364 at http://www.maximumpc.com