<h3>Nvidia GeForce GTX 980 Review</h3> <h3><span style="font-size: 1.17em;">4K and SLI tested on Nvidia's high-end Maxwell card</span></h3> <p>Sometimes things don't go according to plan. Both AMD and Nvidia were supposed to have shifted to 20-nanometer parts by now. In theory, that shift gets you lower temperatures, higher clock speeds, and quieter operation. Due to circumstances largely out of its control, Nvidia has had to go ahead with a 28nm high-end Maxwell part instead, dubbed GM204. This is not a direct successor to the GTX 780, which has more transistors, texture mapping units, and so on. The 980 is actually the next step beyond the GTX 680, aka GK104, which launched in March 2012.</p> <p>Despite that, our testing indicates that the GTX 980 can still be meaningfully faster than the GTX 780 and 780 Ti (and AMD's Radeon R9 290 and 290X, for that matter, though there are a couple of games better optimized for Radeon hardware). When 20nm processes become available sometime next year, we'll probably see the actual successor to the GTX 780. But right now, the GTX 980 is here, and it comes in at $549. That seems high at first, but recall that the GTX 680, 580, and 480 all launched at $499, and keep in mind that it's a faster card than the 780 and 780 Ti, which currently cost more. (As we wrote this, AMD announced that it was dropping the base price of the R9 290X from $500 to $450, so that war rages on.) The GTX 970 at $329 may be a better deal, but we have not yet obtained one of those for testing.</p> <p>In other news, Nvidia told us that it is dropping the price of the GTX 760 to $219, and that the GTX 780 Ti, 780, and 770 are being officially discontinued.
So if you need a second one of those for SLI, now is a good time.</p> <p>Let's take a look at the specs:</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 970</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td>GPU</td> <td>GM204</td> <td>GM204</td> <td>GK104</td> <td>GK110</td> <td>GK110</td> <td>Hawaii</td> </tr> <tr> <td>Core Clock (MHz)</td> <td>1126</td> <td>1050</td> <td>1006</td> <td>863</td> <td>876</td> <td>"up to" 1GHz</td> </tr> <tr> <td>Boost Clock (MHz)</td> <td>1216</td> <td>1178</td> <td>1058</td> <td>900</td> <td>928</td> <td>N/A</td> </tr> <tr> <td>VRAM Clock (MHz)</td> <td>7000</td> <td>7000</td> <td>6000</td> <td>6000</td> <td>7000</td> <td>5000</td> </tr> <tr> <td>VRAM Amount</td> <td>4GB</td> <td>4GB</td> <td>2GB/4GB</td> <td>3GB/6GB</td> <td>3GB</td> <td>4GB</td> </tr> <tr> <td>Bus</td> <td>256-bit</td> <td>256-bit</td> <td>256-bit</td> <td>384-bit</td> <td>384-bit</td> <td>512-bit</td> </tr> <tr> <td>ROPs</td> <td>64</td> <td>64</td> <td>32</td> <td>48</td> <td>48</td> <td>64</td> </tr> <tr> <td>TMUs</td> <td>128</td> <td>104</td> <td>128</td> <td>192</td> <td>240</td> <td>176</td> </tr> <tr> <td>Shaders</td> <td>2048</td> <td>1664</td> <td>1536</td> <td>2304</td> <td>2880</td> <td>2816</td> </tr> <tr> <td>SMs</td> <td>16</td> <td>13</td> <td>8</td> <td>12</td> <td>15</td> <td>N/A</td> </tr> <tr> <td>TDP (watts)</td> <td>165</td> <td>145</td> <td>195</td> <td>250</td> <td>250</td> <td>290</td> </tr> <tr> <td>Launch Price</td> <td>$549</td> <td>$329</td> <td>$499</td> <td>$649</td> <td>$699</td> <td>$549</td> </tr> </tbody> </table> </div> <p>On paper, the 980 and 970 don't look like much of a jump from the 680. In fact, the 980 has only 128 shaders (aka "CUDA cores") per streaming multiprocessor (SM), down from 192 per SMX on the Kepler cards, and it has fewer shaders in total than the GTX 780 or 780 Ti. So how did the GTX 980 perform so well in our benches? Well, Nvidia claims that it has improved the performance of each CUDA core by 40 percent. Provided that claim is accurate, the GTX 980 effectively has about as many CUDA cores as a 780 Ti. Add the GTX 980's higher clock speeds, and performance should be higher still.</p> <p><img src="/files/u160416/7g0a0209_620.jpg" width="620" height="349" /></p> <p>You probably also noticed the unusually low price for the GTX 970. The GTX 670 launched at $400 in May 2012, and the GTX 570 launched at $350 in December 2010. Those earlier two cards also had specs much closer to their bigger brothers. For example, the GTX 570 had 480 CUDA cores, while the 580 had 512 cores—a difference of just 6.25 percent, although the memory bus was reduced from 384 bits to 320 bits. In contrast, the 970 gets nearly 20 percent fewer CUDA cores than the 980, though its memory bus remains unchanged.
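<p>To make that back-of-the-envelope math concrete, here's a quick Python sketch—our own illustration using the shader counts from the table above, with the 40 percent figure taken from Nvidia's claim rather than anything we've measured:</p> <pre>
# Rough arithmetic behind the "effectively as many CUDA cores as a 780 Ti" claim.
# The 1.4 multiplier is Nvidia's stated per-core improvement for Maxwell; treat it
# as a marketing figure, not a measured constant.
gtx_980_shaders = 2048
gtx_780_ti_shaders = 2880
maxwell_per_core_gain = 1.40  # Nvidia's claim

effective = gtx_980_shaders * maxwell_per_core_gain
print(f"GTX 980 'Kepler-equivalent' shaders: {effective:.0f} vs. {gtx_780_ti_shaders} on the 780 Ti")

# And the GTX 970's shader deficit relative to the GTX 980:
gtx_970_shaders = 1664
deficit = 1 - gtx_970_shaders / gtx_980_shaders
print(f"GTX 970 shader deficit: {deficit:.1%}")  # 18.8%, i.e., "nearly 20 percent"
</pre>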
As we said, we haven't gotten a 970 in yet, but, based on its specs, we doubt that we can compensate with overclocking, as we've been able to do in the past with the GTX 670 and 760, and the Radeon R9 290.</p> <p>Nvidia also says that the official boost clock on these new Maxwell cards is not set in stone. We witnessed our cards boosting up to 1,253MHz for extended periods of time (i.e., 20 seconds here, 30 seconds there). When the cards hit their thermal limit of 80 degrees Celsius, they would fall down as low as 1,165Mhz, but we never saw them throttle below the official base clock of 1,126MHz. In SLI, we also noted that the upper card would go up to 84 C. According to Nvidia, these cards have an upper boundary of 95 C, at which point they will throttle below the base clock to avoid going up in smoke. We were not inclined to test that theory, for now.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,1" target="_blank">Next Page: Voxels, new anti-aliasing, and VR</a></h4> <hr /> <p>The company also says that its delta color compression algorithms have improved bandwidth requirements by about 25 percent on average (it varies from game to game). This extra headroom provides more space for increased frame rates. Since DCC directly affects pixels, this effect should scale with your resolution, becoming increasingly helpful as you crank your res higher.</p> <p>You can also combine these gains with Nvidia’s new Multi-Frame Sampled Anti-Aliasing (MFAA). This technique rotates a pixel’s sampling points from one frame to the next, so that two of these points can simulate the visual results of four sampling points whose locations remain static. The effect starts to shimmer at about 20FPS, whereupon it’s automatically disabled. But when running well, Nvidia claims that it can be 30 percent faster, on average, than the visually equivalent level of Multi-Sample Anti-Aliasing (MSAA). Like TXAA (Temporal Anti-Aliasing), this technique won’t be available on AMD cards (or if it is, it will be built by AMD from the ground up and called something else).</p> <p><img src="/files/u160416/7g0a0238_resize.jpg" width="620" height="349" /></p> <p>Unfortunately, MFAA was not available in the version 344.07 beta drivers given to us for testing, but Nvidia said it would be in the driver after this one. This means that the package will not be complete on launch day. Support will trickle down to the older Kepler cards later on. Nvidia hasn’t been specific about timelines of specific cards, but it sounded like the 750 and 750 Ti (also technically Maxwell cards), will not be invited to this party.</p> <p>Another major upgrade is Voxel Global Illumination, or VXGI. Nvidia positions this as the next step beyond ambient occlusion. With VXGI, light bounces off of surfaces to illuminate nooks and crannies that would otherwise not be lit realistically, in real time. Ordinarily, light does not bounce around in a 3D game engine like it does in meatspace. It simply hits a surface, illuminates it, and that’s the end. Sometimes the lighting effect is just painted onto the texture. So there’s a lot more calculation going on with VXGI.</p> <p><img src="/files/u160416/maxwell_die_620.jpg" width="620" height="349" /></p> <p>But Nvidia has not made specific performance claims because the effect is highly scalable. 
A developer can choose how many cones of light they want to use, and the degree of bounced-light resolution (you can go for diffused/blurry spots of light, or a reflection that's nearly a mirror image of the bounced surface), and they balance this result against a performance target. Since this is something that has to be coded into the game engine, we won't see the effect right away by forcing it in the drivers, the way Nvidia users can with ambient occlusion.</p> <p>Next is Dynamic Super Resolution (in the 344.11 drivers released today, so we'll be giving this one a peek soon). This tech combines super-sampling with a custom filter. Super-sampling renders at a higher resolution than your monitor can display and squishes the result down. This is a popular form of anti-aliasing, but the performance hit is pretty steep. The 13-tap Gaussian filter that the card lays on top can further smooth out jaggies. It's a post-process effect that's thankfully very light, and you can also scale DSR down from 3840x2160 to 2560x1440. It's our understanding that this effect is only available to owners of the 980 and 970, at least for now, but we'll be checking on that ASAP.</p> <p>Nvidia is also investing more deeply into VR headsets with an initiative called VR Direct. The main bullet point is a reduction in average latency from 50ms to 25ms, using a combination of code optimization, MFAA, and another new feature called Auto Asynchronous Warp (AAW). This displays frames at 60fps even when performance drops below that. Since each eye gets an independently rendered scene, your PC effectively needs to maintain 120FPS otherwise, which isn't going to be common with more demanding games. AAW takes care of the difference. However, we haven't had the opportunity to test the GTX 980 with VR-enabled games yet.</p> <p>Speaking of which, Nvidia is also introducing another new feature called Auto Stereo. As its name implies, it forces stereoscopic rendering in games that were not built with VR headsets in mind. We look forward to testing VR Direct at a later date.</p> <p>Lastly, we also noticed that GeForce Experience can now record at 3840x2160; it was previously limited to 2560x1600.</p> <p>Until we get our hands on MFAA and DSR, we have some general benchmarks to tide you over. We tested the GTX 980 in two-way SLI and by itself, at 2560x1600 and 3840x2160.
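<p>Since so much of this comparison comes down to raw pixel counts, here's a short Python sketch (our own arithmetic, purely for illustration) of the resolutions in play, including the 4K-to-1440p downscale that DSR performs:</p> <pre>
# Pixel counts for the resolutions discussed in this review, and how they compare.
resolutions = {
    "1920x1080 (1080p)": 1920 * 1080,
    "2560x1440 (1440p)": 2560 * 1440,
    "2560x1600":         2560 * 1600,
    "3840x2160 (4K)":    3840 * 2160,
}

baseline = resolutions["1920x1080 (1080p)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x 1080p)")

# DSR example: render at 4K, then downscale to a 1440p display.
dsr_ratio = resolutions["3840x2160 (4K)"] / resolutions["2560x1440 (1440p)"]
print(f"DSR 4K-on-1440p renders {dsr_ratio:.2f}x the pixels of native 1440p")  # 2.25x
</pre>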
We compared it to roughly equivalent cards that we've also run in solo and two-way configs.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,2" target="_blank">Next Page: SLI Benchmarks!</a></h4> <hr /> <p>Here's the system that we've been using for all of our recent GPU benchmarks:</p> <div class="spec-table orange" style="text-align: center;"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> </tr> <tr> <td class="item">CPU</td> <td class="item-dark">Intel Core i7-3960X (at stock clock speeds; 3.3GHz base, 3.9GHz turbo)</td> </tr> <tr> <td>CPU Cooler</td> <td>Corsair Hydro Series H100</td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Asus Rampage IV Extreme</td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.Skill Ripjaws X, 2133MHz CL9</td> </tr> <tr> <td>Power Supply</td> <td>Thermaltake Toughpower Grand (1,050 watts)</td> </tr> <tr> <td>SSD</td> <td>1TB Crucial M550</td> </tr> <tr> <td>OS</td> <td>Windows 8.1 Update 1</td> </tr> <tr> <td>Case</td> <td>NZXT Phantom 530&nbsp;</td> </tr> </tbody> </table> </div> <p class="MsoNormal" style="text-align: left;"><span style="text-align: center;">Now, let’s take a look at our results at 2560x1600 with 4xMSAA. For reference, this is twice as many pixels as 1920x1080. So gamers playing at 1080p on a similar PC can expect roughly twice the framerate, if they use the same graphical settings. We customarily use the highest preset provided by the game itself; for example, <em>Hitman: Absolution</em> is benchmarked with the “Ultra” setting. 3DMark runs the Firestrike test at 1080p, however. We also enable TressFX in Tomb Raider, and PhysX in Metro: Last Light.</span></p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>33</strong></td> <td>&nbsp;19</td> <td>25</td> <td class="item-dark">&nbsp;27</td> <td>&nbsp;26</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>46</strong></td> <td>&nbsp;21</td> <td>&nbsp;22</td> <td>&nbsp;32</td> <td>&nbsp;30</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;75</td> <td>&nbsp;51</td> <td>&nbsp;65</td> <td>&nbsp;<strong>78</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;27</td> <td>&nbsp;40</td> <td>&nbsp;45</td> <td>&nbsp;<strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;45</td> <td>&nbsp;30</td> <td>&nbsp;43</td> <td>&nbsp;<strong>48</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;64</td> <td>&nbsp;35</td> <td>&nbsp;<strong>39</strong></td> <td>&nbsp;34</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>11,490</strong></td> <td>&nbsp;6,719</td> <td>&nbsp;8,482</td> <td>&nbsp;9,976</td> <td>9,837</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong><strong>)</strong></p> <p class="MsoNormal" style="text-align: left;">To synthesize the results into a few sentences, we would say that the 980 is doing very well for its price. It’s not leapfrogging over the 780 and 780 Ti, but Nvidia indicates that it’s not supposed to anyway. 
It dominates the GTX 680, but that card is also two years old and discontinued, so the difference is not unexpected or likely to change buying habits. The R9 290X, meanwhile, is hitting $430, while the not-much-slower 290 can be had for as little as $340. And you can pick up a 780 Ti for $560. So the GTX 980's price at launch is going to be a bit of a hurdle for Nvidia.</p> <p class="MsoNormal" style="text-align: left;">Performance in Metro: Last Light has also vastly improved. (We run that benchmark with “Advanced PhysX” enabled, indicating that Nvidia has made some optimizations there. Further testing is needed.) Loyal Radeon fans will probably not be swayed to switch camps, at least on the basis of pure performance. Hitman in particular does not appear to favor the Green Team.</p> <p class="MsoNormal" style="text-align: left;">We were fortunate enough to obtain a second GTX 980, so we decided to set them up in SLI, at the same resolution of 2560x1600. Here, the differences are more distinct. We’ve honed the comparison down to the most competitive cards that we have SLI/CF benchmarks for. (Unfortunately, we do not have a second GTX 680 in hand at this time. But judging by its single-card performance, it's very unlikely to suddenly pull ahead.) For this special occasion, we brought in the Radeon R9 295X2, which has two 290X GPUs on one card and has been retailing lately for about a thousand bucks.</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;<strong>66</strong></td> <td>&nbsp;45</td> <td>&nbsp;56</td> <td>&nbsp;50</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>70</strong></td> <td>&nbsp;52</td> <td>&nbsp;53</td> <td>&nbsp;48</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;131</td> <td>&nbsp;122</td> <td>&nbsp;<strong>143</strong></td> <td>&nbsp;90</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;77</td> <td>&nbsp;74</td> <td>&nbsp;<strong>79</strong></td> <td>&nbsp;79</td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;80</td> <td>&nbsp;72</td> <td>&nbsp;<strong>87</strong></td> <td>&nbsp;41</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;73</td> <td>&nbsp;60</td> <td><strong>&nbsp;77</strong></td> <td>&nbsp;65</td> </tr> <tr> <td>3DMark Firestrike</td> <td>&nbsp;<strong>17,490</strong></td> <td>&nbsp;14,336</td> <td>&nbsp;16,830</td> <td>&nbsp;15,656</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p class="MsoNormal" style="text-align: left;">While a solo 980 GTX is already a respectable competitor for the price, its success is more pronounced when we add a second card—as is the gap between it and the 780 Ti. It still continues to best the GTX 780, getting us over 60 FPS in each game with all visual effects cranked up. That's an ideal threshold. It also looks like Nvidia's claim of 40 percent improved CUDA core performance may not be happening consistently. Future driver releases should reveal if this is a matter of software optimization, or if it's a limitation in hardware. Or just a random cosmic anomaly.</p> <h4 style="text-align: right;"><a href="http://www.maximumpc.com/nvidia_geforce_gtx_980_review2014?page=0,3" target="_blank">Next Page: 4K benchmarks and conclusion</a></h4> <hr /> <p class="MsoNormal" style="text-align: left;">So, what happens when we scale up to 3840x2160, also known as “4K”? 
Here we have almost twice as many pixels as 2560x1600, and four times as many as 1080p. Can the GTX 980’s 256-bit bus really handle this much bandwidth?</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 680</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;16</td> <td>&nbsp;8.7*</td> <td>&nbsp;26</td> <td class="item-dark">&nbsp;<strong>28</strong></td> <td>&nbsp;28</td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>36</strong></td> <td>&nbsp;12</td> <td>&nbsp;18</td> <td>&nbsp;19</td> <td>&nbsp;18</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;35</td> <td>&nbsp;25</td> <td>&nbsp;33</td> <td>&nbsp;<strong>38</strong></td> <td>&nbsp;38</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;20</td> <td>&nbsp;15</td> <td>&nbsp;20</td> <td>&nbsp;24</td> <td><strong>&nbsp;28</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;19</td> <td>&nbsp;15</td> <td>&nbsp;<strong>30</strong></td> <td><strong>&nbsp;30</strong></td> <td>&nbsp;26</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;19</td> <td>&nbsp;11</td> <td>&nbsp;<strong>23</strong></td> <td><strong>&nbsp;23</strong></td> <td>&nbsp;18</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>*TressFX disabled</p> <p>The 980 is still scaling well, but the 384-bit 780 and 780 Ti are clearly scaling better, as is the 512-bit 290X. (<strong>Update:</strong>&nbsp;We've re-checked our test results for Hitman: Absolution, and the AMD cards weren't doing nearly as well as we originally thought, though they're still the best option for that particular game. The Batman tests have been re-done as well.) We had to disable TressFX when benchmarking the 680, because the test would crash otherwise, and it was operating at less than 1FPS anyway. At 4K, that card basically meets its match, and almost its maker.</p> <p>Here's 4K SLI/Crossfire. All tests are still conducted at 4xMSAA, which is total overkill at 4K, but we want to see just how hard we can push these cards. (Ironically, we have most of the SLI results for the 290X here, but not for 2560x1600. 
That's a paddlin'.)</p> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td></td> <td>GTX 980</td> <td>GTX 780</td> <td>GTX 780 Ti</td> <td>R9 290X</td> <td>R9 295X2</td> </tr> <tr> <td class="item">Tomb Raider</td> <td>&nbsp;33</td> <td>&nbsp;41</td> <td>&nbsp;44</td> <td class="item-dark">&nbsp;52</td> <td>&nbsp;<strong>53</strong></td> </tr> <tr> <td>Metro: Last Light</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;21</td> <td>&nbsp;27</td> <td>&nbsp;29</td> <td>&nbsp;26</td> </tr> <tr> <td>Batman: Arkham Origins</td> <td>&nbsp;<strong>68</strong></td> <td>&nbsp;60</td> <td>&nbsp;65</td> <td>&nbsp;67</td> <td>&nbsp;66</td> </tr> <tr> <td>Hitman: Absolution</td> <td>&nbsp;42</td> <td>&nbsp;40</td> <td>&nbsp;44</td> <td><strong>&nbsp;53</strong></td> <td><strong>&nbsp;</strong><strong>50</strong></td> </tr> <tr> <td>Unigine Valley</td> <td>&nbsp;39</td> <td>&nbsp;<strong>43</strong></td> <td>&nbsp;40</td> <td>&nbsp;24</td> <td>&nbsp;19</td> </tr> <tr> <td>Unigine Heaven</td> <td>&nbsp;34</td> <td>&nbsp;33</td> <td>&nbsp;<strong>44</strong></td> <td>&nbsp;17</td> <td>&nbsp;34</td> </tr> </tbody> </table> </div> <p>(best scores <strong>bolded</strong>)</p> <p>It does appear that the raw memory bandwidth of the 780, 780 Ti, and 290X come in handy at this resolution, despite the optimizations of Maxwell CUDA cores. That Metro: Last Light score remains pretty interesting. It's the only one we run with PhysX enabled (to balance out using TressFX in Tomb Raider). It really does look like Maxwell is much better at PhysX than any other GPU before it. That tech isn't quite common enough to change the game. But if the difference is as good as our testing indicates, more developers may pick it up.</p> <p>Even a blisteringly fast card can be brought down by high noise levels or prodigious heat. Thankfully, this reference cooler is up to the task. Keep in mind that this card draws up to 165 watts, and its cooler is designed to handle cards that go up to 250W. But even with the fan spinning up to nearly 3,000rpm, it’s not unpleasant. With the case side panels on, you can still hear the fan going like crazy, but we didn’t find it distracting. These acoustics only happened in SLI, by the way. Without the primary card sucking in hot air from the card right below it, its fan behaved much more quietly. The GTX 980’s cooling is nothing like the reference design of the Radeon R9 290 or 290X.</p> <p><img src="/files/u160416/key_visual_620.jpg" width="620" height="349" /></p> <p>With a TDP of just 165W, a respectable 650-watt power supply should have no trouble powering two 980 GTXs. Meanwhile, the 290-watt R9 290X really needs a nice 850-watt unit to have some breathing room, and even more power would not be unwelcome.</p> <p>Since MFAA and DSR were not available in the driver that was supplied for testing, there’s more story for us to tell over the coming weeks. (<strong>Update</strong>: DSR settings are actually in this driver, just not in the location that we were expecting.) And we still need to do some testing with VR. But as it stands right now, the GTX 980 is another impressive showing for Nvidia. Its 4K scaling isn't as good as we'd like, especially since Maxwell is currently the only tech that will have Dynamic Super Resolution. If you want to play at that level, it looks like the 290 and 290X are better choices, price-wise, while the overall performance crown at 4K still belongs to the 780 and 780 Ti. 
But considering the price difference between the 980 and the 780, its similar performance is commendable.</p> <p>For 2560x1600 or lower resolutions, the GTX 980 emerges as a compelling option, but we're not convinced that it's over $100 better than a 290X. Then again, you have MFAA, DSR, and VR Direct (and the overall GeForce Experience package, which is a bit slicker than AMD's Gaming Evolved), which might win over some people—particularly Nvidia loyalists who've been waiting for an upgrade from their 680 that's not quite as expensive as the 780 or 780 Ti.</p> <p><a href="http://www.pcgamer.com/2014/09/19/nvidia-gtx-980-tested-sli-4k-and-single-gpu-benchmarks-and-impressions/" target="_blank">Our amigo Wes Fenlon over at PC Gamer has a write-up of his own</a>, so go check it out.</p> <h3>Haswell-E Review</h3> <h3>UPDATE: We've updated our Haswell-E story to include our video on Haswell-E (X99) motherboards</h3> <p>After three long years of going hungry with quad-cores, red meat is finally back on the menu for enthusiasts. And not just any gamey slab full of gristle with shared cores, either. With its new eight-core Haswell-E CPU, Intel may have served up the most mouth-watering, beautifully seared piece of red meat in a long time.</p> <p><iframe src="//www.youtube.com/embed/aNTMIHr9Ha0" width="620" height="349" frameborder="0"></iframe></p> <p>And it's a good thing, too, because enthusiasts' stomachs have been growling. Devil's Canyon? That puny quad-core was just an appetizer. And that dual-core, highly overclockable Pentium K CPU? It's the mint you grab on your way out of the steak house.</p> <p><iframe src="//www.youtube.com/embed/_h9ggGZHFtU" width="620" height="349" frameborder="0"></iframe></p> <p>No, what enthusiasts have craved ever since Intel's clock-blocking job on the original Sandy Bridge-E is a true, overclockable enthusiast chip with eight cores. So if you're ready for a belt-loosening belly full of enthusiast-level prime rib, pass the horseradish, get that damned salad off our table, and read on to see if Intel's Haswell-E is everything we hoped it would be.</p> <p><strong>Meet the Haswell-E parts</strong></p> <p><img src="/files/u154082/haswell-e_comparison_chart.png" alt="haswell e comparison chart" title="haswell e comparison chart" width="620" height="241" /></p> <p><img src="/files/u154082/lga2011v3socket.jpg" alt="haswell e socket" title="haswell e socket" width="620" height="626" /></p> <p><strong>Despite its name, the LGA2011-v3 socket is not the same as the older LGA2011 socket. Fortunately, the cooling offsets are exactly the same, so almost all older coolers and accessories should work just fine.</strong></p> <p><img src="/files/u154082/lga2011socket1.jpg" alt="lga2011" title="lga2011" width="620" height="556" /></p> <p><strong>Though they look the same, LGA2011's socket has arms that are arranged differently than those of the new LGA2011-v3 that replaces it.
And no, you can't drop a newer Haswell-E into this socket and make it work.</strong></p> <h4>Haswell-E</h4> <p><strong>The first consumer Intel eight-core arrives at last</strong></p> <p>Being a card-carrying member of the PC enthusiast class is not an easy path to follow. Sure, you get the most cores and the priciest parts, but it also means you get to wait a hell of a long time in between CPU upgrades. And with Intel's cadence the last few years, it also means you get the leftovers. It's been that way ever since Intel went with its two-socket strategy with the original LGA1366/LGA1156. Those who picked the big-boy socket and stuck to their guns on pure PC performance always got the shaft.</p> <p>The original Ivy Bridge in the LGA1155 socket, for example, hit the streets in April of 2012. And as if having the more efficient, faster CPU weren't enough, Intel then handed the small-socket crowd Haswell in June of 2013. It wasn't until September of 2013 that big-boy socket users finally got Ivy Bridge-E for their LGA2011 boards. But with Haswell already out and tearing up the benchmarks, who the hell cared?</p> <p>Well, that time has come with Haswell-E, Intel's first replacement for the aging LGA2011 platform since 2011. This time, though, Intel isn't just shuffling new parts into its old stack. For the first time since the original Pentium 4 Extreme Edition, paying the price premium actually nets you more: namely, the company's first consumer eight-core CPU.</p> <p><strong>Meet the T-Rex of consumer CPUs: The Core i7-5960X</strong></p> <p>We were actually a little leery of Haswell when it first launched last year. It was, after all, a chip seemingly tuned for the increasingly mobile/laptoppy world we were told was our post-PC-apocalyptic future. Despite this, we recognized the chip as the CPU to have for new system builders. Clock for clock, its 22nm process and tri-gate transistors put everything else to shame—even the six-core Core i7-3930K chip in many tasks. So it's no surprise that we'd be ecstatic when Intel took a quad-core Haswell, put it in the Xerox machine, and hit the x2 copy button. Eight cores are decidedly better than six cores or four cores when you need them.</p> <p>The cores don't come without a cost, though, and we don't mean the usual painful price Intel asks for its highest-end CPUs. It's no secret that more cores means more heat, which means lower clock speeds. That's one of the rationales Intel used with the original six-core Core i7-3960X. Although sold as a six-core, the original Sandy Bridge-E was built using an eight-core die on which Intel had permanently switched off two cores. Intel said it wanted to balance the needs of the many versus the needs of the few—that is, by turning off two of the cores, the part could hit higher clock speeds. Indeed, the Core i7-3960X had a base clock of 3.3GHz and a Turbo Boost of 3.9GHz, and most could overclock it to 5GHz. The same chip packaged as a Xeon with all eight cores working—the Xeon E5-2687W—was locked down at 3.1GHz and mostly buzzed along at 3.4GHz.</p> <p>With the new Core i7-5960X—the only eight-core of the bunch—the chip starts at a seemingly pedestrian 3GHz with a Turbo Boost of up to 3.5GHz on one core. Those subsonic clock speeds won't impress against the Core i7-4790K, which starts at 4GHz. You'll find more on how well Haswell-E performs against Haswell in our performance section, but that's the price to be paid, apparently, to get a chip with this many cores under the heat spreader.
Regarding thermals, Intel has in fact increased the TDP rating to 140 watts, versus 130 watts for Ivy Bridge-E and Sandy Bridge-E.</p> <p>If the low clocks annoy you, the good news is that the part is fully unlocked, so overclocking is fair game. For our test units, we had very early hardware and tight deadlines, so we didn't get very far with our overclocking efforts. Talking with vendors, however, most seem very pleased with the clock speeds they were seeing. One vendor told us that all-core overclocks of 4.5GHz were already obtainable, and that newer microcode updates were expected to improve on that. With even the vaunted Devil's Canyon Core i7-4790K topping out at 4.7GHz to 4.8GHz, 4.5GHz is actually a healthy overclock for an eight-core CPU.</p> <p>When you dive down into the actual cores, though, much is the same, of course. It's based on a 22nm process. It has "3D" tri-gate transistors and integrated voltage regulation. Oh, and it's also the first CPU to feature an integrated DDR4 memory controller.</p> <p><strong>Click the next page to read about DDR4</strong></p> <hr /> <h4>DDR4 details</h4> <p>If you think Haswell-E has been a long wait, just think about DDR3, which made its debut as main memory back in 2007. Yes, 2007. The only component that has lasted seven years in most enthusiasts' systems might be the PSU, but it's rare to find anyone still kicking around a 500-watt PSU from 2007 these days.</p> <p>DDR4 has been in gestation seemingly as long, so why the delay? From what we can tell, resistance to yet another new memory standard—during a time when people thought the desktop PC, and the PC in general, were dying—has been the root cause. It didn't help that no one wanted to stick their head out first, either. RAM makers didn't want to begin producing DDR4 in volume until AMD or Intel made chipsets for it, and AMD and Intel didn't want to support it because of the costs it would add to PCs at a time when people were trying to lower costs. The stalemate finally ends with Haswell-E, which integrates a quad-channel DDR4 memory controller into its die.</p> <p>Initial launch speeds of DDR4 clock in at DDR4/2133. For those already running DDR3 at 3GHz or higher, a 2,133 data rate is a snooze, but you should realize that anything over 2133 is overclocked RAM. JEDEC (the body that sets RAM standards) already has target data rates of 3200 on the DDR4 roadmap, and RAM vendors we've talked to are already shopping DIMMs near that speed.</p> <p>The best part of DDR4 may be its density message, though. For years, consumer DDR3 has topped out at 8GB on a DIMM. With DDR4, we should see 16GB DIMMs almost immediately, and stacking of chips is built into the standard, so it's possible we'll see 32GB DIMMs over its lifetime. On a quad-channel, eight-DIMM motherboard, you should expect to be able to build systems with 128GB of RAM using non-ECC DIMMs almost immediately. DDR4 also brings power savings and other improvements, but the main highlights enthusiasts should expect are higher densities and higher clocks. Oh, and higher prices. RAM prices haven't been fun for anyone of late, but DDR4 will definitely be a premium part for some time.
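<p>To put some rough numbers on what the new memory subsystem offers, here's a quick Python sketch of theoretical peak bandwidth and capacity—our own back-of-the-envelope math using the standard transfers-per-second-times-bus-width formula, not vendor-supplied figures:</p> <pre>
# Theoretical peak bandwidth: data rate (MT/s) x 8 bytes per 64-bit channel x channels.
def peak_bandwidth_gbs(data_rate_mts, channels):
    return data_rate_mts * 8 * channels / 1000  # GB/s

print(f"X99, quad-channel DDR4-2133:  {peak_bandwidth_gbs(2133, 4):.1f} GB/s")  # ~68 GB/s
print(f"Z97, dual-channel DDR3-2133:  {peak_bandwidth_gbs(2133, 2):.1f} GB/s")  # ~34 GB/s
print(f"Quad-channel at DDR4-3200:    {peak_bandwidth_gbs(3200, 4):.1f} GB/s")  # ~102 GB/s

# Capacity ceiling with eight DIMM slots and 16GB non-ECC DDR4 DIMMs.
print(f"Max capacity: {8 * 16} GB")  # 128 GB
</pre> <p>Whether that theoretical headroom is worth the early-adopter premium is another question.</p>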
In fact, we couldn’t even get exact pricing from memory vendors as we were going to press, so we’re bracing for some really bad news.</p> <h4>PCIe lanes: now a feature to be blocked</h4> <p>Over the years, we’ve come to expect Intel to clock-block core counts, clock speeds, Hyper-Threading, and even cache for “market segmentation” purposes. What that means is Intel has to find ways to differentiate one CPU from another. Sometimes that’s by turning off Hyper-Threading (witness Core i5 and Core i7) and sometimes its locking down clock speeds. With Haswell-E though, Intel has gone to new heights with its clock-blocking by actually turning off PCIe lanes on some Haswell-E parts to make them less desirable. At the top end, you have the 3GHz Core i7-5960X with eight cores. In the midrange you have the six-core 3.5GHz Core i7-5930K. And at the “low-end” you have the six-core 3.3GHz Core i7-5820K. The 5930K and the 5820K are virtually the same in specs except for one key difference: The PCIe lanes get blocked. Yes, while the Core i7-5960X and Core i7-5930K get 40 lanes of PCIe 3.0, the Core i7-5820K gets an odd 28 lanes of PCIe 3.0. That means those who had hoped to build “budget” Haswell-E boxes with multiple GPUs may have to think hard and fast about using the lowest-end Haswell-E chip. The good news is that for most people, it won’t matter. Plenty of people run Haswell systems with SLI or CrossFire, and those CPUs are limited to 16 lanes. Boards with PLX switches even support four-way GPU setups.</p> <p>Still, it’s a brain bender to think that when you populate an X99 board with the lowest-end Haswell-E, the PCIe configuration will change. The good news is at least they’ll work, just more slowly. Intel says it worked with board vendors to make sure all the slots will function with the budget Haswell-E part.&nbsp;</p> <p><img src="/files/u154082/mpc_haswell_front-back_1.jpg" alt="haswell e chip" title="haswell e chip" width="620" height="413" /></p> <p><strong>There have been clock-blocking rumors swirling around about the Haswell being a 12-core Xeon with four cores turned off. That’s not true and Intel says this die-shot proves it.&nbsp;</strong></p> <p><img src="/files/u154082/ivbe.jpg" alt="ivy bridge e" title="ivy bridge e" width="620" height="550" /></p> <p><strong>Ivy Bridge-E’s main advantage over Sandy Bridge-E was a native six-core die and greatly reduced power consumption. And, unfortunately, like its Ivy Bridge counterpart, overclocking yields on Ivy Bridge-E were greatly reduced over its predecessor, too, with few chips hitting more than 4.7GHz at best.</strong></p> <p><img src="/files/u154082/snbe.jpg" alt="sandy bridge e" title="sandy bridge e" width="308" height="260" /></p> <p><strong>Sandy Bridge-E and Sandy Bridge will long be remembered for its friendliness to overclocking and having two of its working cores killed Red Wedding–style by Intel.</strong></p> <p><strong>Click the next page to read about X99.</strong></p> <hr /> <p>&nbsp;</p> <h4>X99&nbsp;</h4> <p><strong>High-end enthusiasts finally get the chipset they want, sort of</strong></p> <p><img src="/files/u154082/x99blockdiagram.jpg" alt="x99 block diagram" title="x99 block diagram" width="620" height="381" /></p> <p><strong>Intel overcompensated in SATA on X99 but oddly left SATA Express on the cutting-room floor.</strong></p> <p>You know what we won’t miss? The X79 chipset. 
No offense to X79 owners, but while the Core i7-4960X can stick around for a few more months, X79 can take its under-spec'd butt out of our establishment. Think we're being too harsh? We don't.</p> <p>X79 has no native USB 3.0 support. And its SATA 6Gb/s ports? Only two. It almost reads like a feature set from the last decade to us. Fortunately, in a move we wholly endorse, Intel has gone hog wild in over-compensating for the weaknesses of X79.</p> <p>X99 has eight USB 2.0 ports and six USB 3.0 ports baked into its platform controller hub (PCH). For SATA 6Gb/s, Intel adds 10 ports to X99. Yes, 10 ports of SATA 6Gb/s. That gazongo number of SATA ports, however, is balanced out by two glaring omissions in X99: no official support for SATA Express or M.2, both of which came with Z97. Intel didn't say why it left SATA Express and M.2 out of the chipset, but it did say motherboard vendors were free to implement them using techniques they gleaned from doing so on Z97 motherboards. If we had to hazard a guess, we'd say Intel's conservative nature led it to leave the features off the chipset, as the company is a stickler for testing new interfaces before adding official support. At this point, SATA Express has been a no-show anyway. Motherboards with SATA Express became available in May with Z97, yet we still have not seen any native SATA Express drives. We expect most motherboard vendors to simply add it through discrete controllers; even our early board sample had a SATA Express port.</p> <p>One potential weakness of X99 is Intel's use of DMI 2.0. That offers roughly 2.5GB/s of transfer speed between the CPU and the south bridge or PCH, but with the board hanging 10 SATA devices, USB 3.0, Gigabit Ethernet, and eight PCIe Gen 2.0 lanes off that link, there is the potential for massive congestion—but only in a worst-case scenario. You'd really have to have a boatload of hardware lit up and sending and receiving data at once to cause DMI 2.0 to bottleneck. Besides, Intel says, you can just hang the device off the plentiful PCIe Gen 3.0 lanes from the CPU.</p> <p>That does bring up our last point on X99: the PCIe lanes. As we mentioned earlier, there will be some confusion over the PCIe lane configuration on systems with Core i7-5820K parts. With only 28 PCIe lanes available from that chip, there's concern that whole slots on the motherboard will be turned off. That won't happen, Intel says. Instead, if you go with the low-rent ride, you simply lose bandwidth. Take an X99 mobo and plug in the Core i7-5930K and you get two slots at x16 PCIe and one at x8. Remove that CPU and install the Core i7-5820K, and the slots will now be configured as one x16, one x8, and one x4. It's still more bandwidth than you can get from a normal LGA1150-based Core i7-4770K, but it will be confusing nonetheless. We expect motherboard vendors to sort it out for their customers, though.</p> <p>Haswell-E does bring one more interesting PCIe configuration, though: the ability to run five graphics cards in the PCIe slots at x8 speeds. Intel didn't comment on the reasons for the option, but there are only a few apparent ones. The first is mining configurations, where miners are already running six GPUs. Mining, however, doesn't seem to need the bandwidth an x8 slot would provide. The other possibility is a five-way graphics card configuration being planned by Nvidia or AMD. At this point it's just conjecture, but one thing we know is that X99 is a welcome upgrade.
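<p>To make the lane math concrete, here's a small Python sketch of how the two CPU lane budgets divide across graphics slots, per the configurations described above. Actual slot wiring varies from board to board, so treat this as illustrative arithmetic rather than a spec for any particular motherboard:</p> <pre>
# CPU PCIe 3.0 lane budgets and the slot splits described above.
configs = {
    "Core i7-5960X / 5930K (40 lanes)": [16, 16, 8],
    "Core i7-5820K (28 lanes)":         [16, 8, 4],
    "Five-way x8 on a 40-lane part":    [8, 8, 8, 8, 8],
}

for name, slots in configs.items():
    layout = " + ".join(f"x{lanes}" for lanes in slots)
    print(f"{name}: {layout} = {sum(slots)} lanes")
</pre>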
Good riddance X79.&nbsp;</p> <h4>Top Procs Compared</h4> <p><span style="color: #ff0000;"><span style="white-space: pre;"><img src="/files/u154082/top_processors.png" alt="top processors compared" title="top processors compared" width="620" height="344" /></span></span></p> <h4>Core Competency&nbsp;</h4> <p><strong>How many cores do you really need?</strong></p> <p><img src="/files/u154082/haswelletaskamanger.png" alt="haswell task manager" title="haswell task manager" width="620" height="564" /></p> <p><strong>It is indeed a glorious thing to see a task manager with this many threads, but not everyone needs them.</strong></p> <p>Like the great technology philosopher Sir Mix-A-Lot said, we like big cores and we cannot lie. We want as many cores as legally available. But we recognize that not everyone rolls as hard as we do with a posse of threads. With Intel’s first eight-core CPU, consumers can now pick from two cores all the way to eight on the Intel side of the aisle—and then there’s Hyper-Threading to confuse you even more. So, how many cores do you need? We’ll give you the quick-and-dirty lowdown.</p> <p><strong>Two cores</strong></p> <p>Normally, we’d completely skip dual-cores without Hyper-Threading because the parts tend to be the very bottom end of the pool Celerons. Our asterisk is the new Intel Pentium G3258 Anniversary Edition, or “Pentium K,” which is a real hoot of a chip. It easily overclocks and is dead cheap. It’s not the fastest in content creation by a long shot, but if we were building an ultra-budget gaming rig and needed to steal from the CPU budget for a faster GPU, we’d recommend this one. Otherwise, we see dual-cores as purely ultra-budget parts today.</p> <p><strong>Two cores with Hyper-Threading</strong></p> <p>For your parents who need a reliable, solid PC without overclocking (you really don’t want to explain how to back down the core voltage in the BIOS to grandma, do you?), the dual-core Core i3 parts fulfill the needs of most people who only do content creation on occasion. Hyper-Threading adds value in multi-threaded and multi-tasking tasks. You can almost think of these chips with Hyper-Threading as three-core CPUs.&nbsp;</p> <p><strong>Four cores</strong></p> <p>For anyone who does content creation such as video editing, encoding, or even photo editing with newer applications, a quad-core is usually our recommended part. Newer game consoles are also expected to push min specs for newer games to quad-cores or more as well, so for most people who carry an Enthusiast badge, a quad-core part is the place to start.</p> <p><strong>Four cores with Hyper-Threading</strong></p> <p>Hyper-Threading got a bad name early on from the Pentium 4 and existing software that actually saw it reduce performance when turned on. Those days are long behind us though, and Hyper-Threading offers a nice performance boost with its virtual cores. How much? &nbsp;A 3.5GHz Core i7 quad-core with Hyper-Threading generally offers the same performance on multi-threaded tasks as a Core i5 running at 4.5GHz. The Hyper-Threading helps with content creation and we’d say, if content creation is 30 percent or less of your time, this is the place to be and really the best fit for 90 percent of enthusiasts.</p> <p><strong>Six cores with Hyper-Threading</strong></p> <p>Once you pass the quad-core mark, you are moving pixels professionally in video editing, 3D modeling, or other tasks that necessitate the costs of a six-core chip or more. 
We still think that for 90 percent of folks, a four-core CPU is plenty, but if losing time rendering a video costs you money (or you’re just ADD), pay for a six-core or more CPU. How do you decide if you need six or eight cores? Read on.&nbsp;</p> <p><strong>Eight cores with Hyper-Threading</strong></p> <p>We recognize that not everyone needs an eight-core processor. In fact, one way to save cash is to buy the midrange six-core chip instead, but if time is money, an eight-core chip will pay for itself. For example, the eight-core Haswell-E is about 45 percent faster than the four-core Core i7-4790K chip. If your render job is three hours, that’s more time working on other paying projects. The gap gets smaller between the six-core and the eight-core of course, so it’s very much about how much your time is worth or how short your attention span is. But just to give you an idea, the 3.3GHz Core i7-5960X is about 20 percent faster than the Core i7-4960X running at 4GHz.</p> <p><strong>Click the next page to see how Haswell-E stacks up against Intel's other top CPUs.</strong></p> <hr /> <p>&nbsp;</p> <h4 style="font-size: 10px;">Intel’s Top Guns Compared</h4> <p><img src="/files/u154082/cpus17918.jpg" alt="haswell" title="haswell" width="620" height="413" /></p> <p><strong><strong>The LGA2011-based Core i7-4960X (left) and the LGA2011-v3-based Core i7-5960X (middle) dwarf the Core i7-4790K chip (right). Note the change in the heat spreader between the older 4960X and 5960X, which now has larger “wings” that make it easier to remove the CPU by hand. The breather hole, which allows for curing of the thermal interface material (solder in this case), has also been moved. Finally, while the chips are the same size, they are keyed differently to prevent you from installing a newer Haswell-E into an older Ivy Bridge-E board.</strong></strong></p> <h4>Benchmarks</h4> <p><strong>Performance junkies, rejoice! Haswell-E hits it out of the ballpark</strong></p> <p><img src="/files/u154082/x99-gaming_5-rev10.jpg" alt="x99 gigabyte" title="x99 gigabyte" width="620" height="734" /></p> <p><strong>We used a Gigabyte X99 motherboard (without the final heatsinks for the voltage-regulation modules) for our testing.</strong></p> <p>For our testing, we set up three identical systems with the fastest available CPUs for each platform. Each system used an Nvidia GeForce GTX 780 with the same 340.52 drivers, Corsair 240GB Neutron GTX SSDs, and 64-bit Windows 8.1 Enterprise. Since we’ve had issues with clock speeds varying on cards that physically look the same, we also verified the clock speeds of each GPU manually and also recorded the multiplier, bclock, and speeds the parts run at under single-threaded and multi-threaded loads. So you know, the 3GHz Core i7-5960X’s would run at 3.5GHz on single-threaded tasks but usually sat at 3.33GHz on multi-threaded tasks. The 3.6GHz Core i7-4960X ran everything at 4GHz, including multi-threading tasks. The 4GHz Core i7-4790K part sat at 4.4GHz on both single- and multi-threaded loads.</p> <p>For Z97, we used a Gigabyte Z97M-D3H mobo with a Core i7-4790K “Devil’s Canyon” chip aboard. &nbsp;An Asus Sabertooth X79 did the duty for our Core i7-4960X “Ivy Bridge-E” chip. Finally, for our Core i7-5960X chip, we obtained an early Gigabyte X99-Gaming 5 motherboard. 
The board was pretty early but we feel comfortable with our performance numbers as Intel has claimed the Core i7-5960X was “45 percent” faster than a quad-core chip, and that’s what we saw in some of our tests.&nbsp;</p> <p>One thing to note: The RAM capacities were different but in the grand scheme of things and the tests we run, it has no impact. The Sabertooth X79 &nbsp;had 16GB of DDR3/2133 in quad-channel mode, the Z97M-D3H had 16GB of DDR3/2133 in dual-channel mode. Finally, the X99-Gaming 5 board had 32GB of Corsair DDR4/2133. All three CPUs will overclock, but we tested at stock speeds to get a good baseline feel.&nbsp;</p> <p>For our benchmarks, we selected from a pile of real-world games, synthetic tests, as well as real-world applications across a wide gamut of disciplines. Our gaming tests were also run at very low resolutions and low-quality settings to take the graphics card out of the equation. We also acknowledge that people want to know what they can expect from the different CPUs at realistic settings and resolutions, so we also ran all of the games at their highest settings at 1920x1080 resolution, which is still the norm in PC gaming.&nbsp;</p> <p><strong>The results</strong></p> <p>We could get into a multi-sentence analysis of how it did and slowly break out with our verdict but in a society where people get impatient at the microwave, we’ll give you the goods up front: Holy Frakking Smokes, this chip is fast! The Core i7-5960X is simply everything high-end enthusiasts have been dreaming about.&nbsp;</p> <p>Just to give you an idea, we’ve been recording scores from $7,000 and $13,000 PCs in our custom Premiere Pro CS6 benchmark for a couple of years now. The fastest we’ve ever seen is the Digital Storm Aventum II that we reviewed in our January 2014 issue. The 3.3GHz Core i7-5960X was faster than the Aventum II’s Core i7-4960X running at 4.7GHz. Again, at stock speeds, the Haswell-E was faster than the fastest Ivy Bridge-E machine we’ve ever seen.</p> <p>It wasn’t just Premiere Pro CS6 we saw that spread in either. In most of our tests that stress multi-threading, we saw roughly a 45 percent to 50 percent improvement going from the Haswell to the Haswell-E part. The scaling gets tighter when you’re comparing the six-core Core i7-4960X but it’s still a nice, big number. We generally saw a 20 percent to 25 percent improvement in multi-threaded tasks.&nbsp;</p> <p>That’s not even factoring in the clock differences between the parts. The Core i7-4790K buzzes along at 4.4GHz—1.1GHz faster than the Core i7-5960X in multi-threaded tasks—yet it still got stomped by 45 to 50 percent. The Core i7-4960X had a nearly 700MHz clock advantage as well over the eight-core chip.</p> <p>The whole world isn’t multi-threaded, though. Once we get to workloads that don’t push all eight cores, the higher clock speeds of the other parts predictably take over. ProShow Producer 5.0, for example, has never pushed more than four threads and we saw the Core i7-5960X lose by 17 percent. The same happened in our custom Stitch.Efx 2.0 benchmark, too. In fact, in general, the Core i7-4790K will be faster thanks to its clock speed advantage. If you overclocked the Core i7-5960X to 4GHz or 4.4GHz on just four cores, the two should be on par in pure performance on light-duty workloads.</p> <p>In gaming, we saw some results from our tests that are a little bewildering to us. 
At low-resolution and low-quality settings, where the graphics card was not the bottleneck, the Core i7-4790K had the same 10 percent to 20 percent advantage. When we ran the same tests at ultra and 1080p resolution, the Core i7-5960X actually had a slight advantage in some of the runs against the Core i7-4790K chip. We think that may be from the bandwidth advantage the 5960X has. Remember, we ran all of the RAM at 2,133, so it’s not DDR4 vs. DDR3. It’s really quad-channel vs. dual-channel.</p> <p>We actually put a full breakdown of each of the benchmarks and detailed analysis on MaximumPC.com if you really want to nerd out on the performance.</p> <p><strong>What you should buy</strong></p> <p>Let’s say it again: The Core i7-5960X stands as the single fastest CPU we’ve seen to date. It’s simply a monster in performance in multi-threaded tasks and we think once you’ve overclocked it, it’ll be as fast as all the others in tasks that aren’t thread-heavy workloads.</p> <p>That, however, doesn’t mean everyone should start saving to buy a $1,000 CPU. No, for most people, the dynamic doesn’t change. For the 80 percent of you who fall into the average Joe or Jane nerd category, a four-core with Hyper-Threading still offers the best bang for the buck. It won’t be as fast as the eight-core, but unless you’re really working your rig for a living, made of money, or hate for your Handbrake encodes to take that extra 25 minutes, you can slum it with the Core i7-4790K chip. You don’t even have to heavily overclock it for the performance to be extremely peppy.</p> <p>For the remaining 20 percent who actually do a lot of encoding, rendering, professional photo editing, or heavy multi-tasking, the Core i7-5960X stands as the must-have CPU. It’s the chip you’ve been waiting for Intel to release. Just know that at purely stock speeds, you do give up performance to the Core i7-4790K part. But again, the good news is that with minor overclocking tweaks, it’ll be the equal or better of the quad-core chip.</p> <p>What’s really nice here is that for the first time, Intel is giving its “Extreme” SKU something truly extra for the $999 they spend. Previous Core i7 Extreme parts have always been good overclockers, but a lot of people bypassed them for the midrange chips such as the Core i7-4930K, which gave you the same core counts and overclocking to boot. The only true differentiation Extreme CPU buyers got was bragging rights. With Haswell-E, the Extreme buyers are the only ones with eight-core parts.</p> <p>Bang-for-the-buck buyers also get a treat from the six-core Core i7-5820K chip. At $389, it’s slightly more expensive than the chip it replaces—the $323 Core i7-4820K—but the extra price nets you two more cores. Yes, you lose PCIe bandwidth but most people probably won’t notice the difference. We didn’t have a Core i7-5820K part to test, but we &nbsp;believe on our testing with the Core i7-5960X that minor overclocking on the cheap Haswell-E would easily make it the equal of Intel’s previous six-core chips that could never be had for less than $580.</p> <p>And that, of course, brings us to the last point of discussion: Should you upgrade from your Core i7-4960X part? The easy answer is no. In pure CPU-on-CPU &nbsp;showdowns, the Core i7-4960X is about 20 percent slower in multi-threaded tasks, and in light-duty threads it’s about the same, thanks to the clock-speed advantage the Core i7-4960X has. There are two reasons we might want to toss aside the older chip, though. 
The first is X79's pathetic pair of SATA 6Gb/s ports—on a heavy-duty work machine, frankly, you actually need more than that. The second reason would be the folks for whom a 20 percent reduction in rendering time is actually worth paying for.</p> <p><strong>Click the next page to check out our Haswell-E benchmarks.</strong></p> <hr /> <h4><span style="font-size: 1.17em;">Haswell-E Benchmarks</span></h4> <p><strong>Haswell-E benchmarks overview</strong></p> <p><img src="/files/u154082/haswell_e_benchmarks.png" alt="haswell e benchmarks" title="haswell e benchmarks" width="541" height="968" /></p> <p><strong>Benchmark Breakdown</strong></p> <p>We like to give you the goods on a nice table, but not everyone is familiar with what we use to test and what exactly the numbers mean, so let's break down some of the more significant results for you.</p> <p><img src="/files/u154082/cinebenchsinglethreaded.png" alt="cinebench 15 single" title="cinebench 15 single" width="620" height="472" /></p> <p><strong>Cinebench 15 single-threaded performance</strong></p> <p><span style="color: #000000;">We used Maxon's Cinebench 15 benchmark to see just how fast the trio of chips would run this 3D rendering test. Cinebench 15 lets you run the test on all cores or restrict it to a single core. For this test, we wanted to see how the Core i7-5960X "Haswell-E" would do against the others by measuring a single core. The winner here is the Core i7-4790K "Devil's Canyon" chip. That's no surprise—it uses the same microarchitecture as the big-boy Haswell-E, but it has a ton more clock speed out of the box. The Haswell-E is about 21 percent slower running at 3.5GHz, with the Devil's Canyon part running about 900MHz faster at 4.4GHz. Remember, at default settings, the Haswell-E only hits 3.5GHz on single-core loads. Despite its newer microarchitecture, the Haswell-E also loses to the Core i7-4960X "Ivy Bridge-E," though not by much—and that's with Ivy Bridge-E's 500MHz clock speed advantage. Still, the clear winner in single-threaded performance is the higher-clocked Devil's Canyon chip.</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/cinebenchmulti.png" alt="cinebench 15 multi" title="cinebench 15 multi" width="620" height="428" /></span></p> <p><span style="color: #000000;"><strong>Cinebench 15 multi-threaded performance</strong></span></p> <p><span style="color: #000000;">You don't buy an eight-core CPU and then throw only single-threaded workloads at it, so we took the handcuffs off of Cinebench 15 and let it render with all available threads. On the Haswell-E part, that's 16 threads of fun; on Ivy Bridge-E it's 12 threads; and on Devil's Canyon we're looking at eight threads. The winner by a clear margin is the Haswell-E part. It's an astounding 49 percent faster than the Devil's Canyon and about 22 percent faster than Ivy Bridge-E. We'll just have to continue to remind you, too: this is with a severe clock penalty. That 49-percent-faster score is with all eight cores running at 3.3GHz vs. all four of the Devil's Canyon cores buzzing along at 4.4GHz—a 1,100MHz clock speed advantage. Ivy Bridge-E also has a nice 700MHz clock advantage over Haswell-E.
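<p>As a rough sanity check on that result, here's a bit of Python that simply multiplies physical cores by all-core clock speed—our own naive metric that ignores IPC differences, Hyper-Threading, and memory bandwidth, but it lands surprisingly close to the measured gaps:</p> <pre>
# Naive aggregate throughput: physical cores x all-core clock (GHz).
chips = {
    "Core i7-5960X (Haswell-E)":      (8, 3.3),
    "Core i7-4960X (Ivy Bridge-E)":   (6, 4.0),
    "Core i7-4790K (Devil's Canyon)": (4, 4.4),
}

aggregate = {name: cores * ghz for name, (cores, ghz) in chips.items()}
baseline = aggregate["Core i7-4790K (Devil's Canyon)"]

for name, core_ghz in aggregate.items():
    print(f"{name}: {core_ghz:.1f} core-GHz ({core_ghz / baseline - 1:+.0%} vs. 4790K)")
# Haswell-E works out to +50% over the 4790K and +10% over the 4960X on this crude
# metric; the measured Cinebench gaps were +49% and +22%, with the extra margin over
# Ivy Bridge-E coming from Haswell's higher per-clock performance.
</pre>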
Chalk this up as a big, huge win for Haswell-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/povray.png" alt="pov-ray" title="pov-ray" width="620" height="491" /></span></p> <p><span style="color: #000000;"><strong>POV-Ray performance</strong></span></p> <p><span style="color: #000000;">We wanted a second opinion on rendering performance, so we ran POV-Ray, a freeware ray tracer that has roots that reach back to the Amiga. Again, Haswell-E wins big-time with a 47 percent performance advantage over Devil’s Canyon and a 25 percent advantage over Ivy Bridge-E. Yeah, and all that stuff we said about the clock speed advantage the quad-core and six-core had, that applies here, too. Blah, blah, blah.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/premierepro.png" alt="premiere pro" title="premiere pro" width="620" height="474" /></span></p> <p><span style="color: #000000;"><strong>Premiere Pro CS6 performance</strong></span></p> <p><span style="color: #000000;">One sanity check (benchmark results Intel produces to let you know what kind of performance to expect) said Haswell-E would outperform quad-core Intel parts by 45 percent in Premiere Pro Creative Cloud when working with 4K content. Our benchmark, however, doesn’t use 4K content yet, so we wondered if our results would be similar. For our test, we render out a 1080p-resolution file using source material shot by us on a Canon EOS 5D Mk II using multiple timelines and transitions. We restrict it to the CPU rather than using the GPU as well. Our result? The 3.3GHz Haswell-E was about 45 percent faster than the 4.4GHz Devil’s Canyon chip. Bada-bing! The two extra cores also spit out the render about 19 percent faster than the six-core Ivy Bridge-E. That’s fairly consistent performance we’re seeing between the different workload disciplines of 3D rendering and video encoding so far, and again, big, big wins for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/handbrake.png" alt="handbrake" title="handbrake" width="620" height="407" /></span></p> <p><span style="color: #000000;"><strong>Handbrake Encoding performance</strong></span></p> <p><span style="color: #000000;">For our encoding test, we took a 1080p-resolution video file and used Handbrake 0.9.9 to transcode it into a file using the Android tablet profile. Handbrake is very multi-threaded and leverages the CPU for its encoding and transcoding. Our results were still fairly stellar, with Haswell-E CPU performing about 38 percent faster than the Devil’s Canyon part. Things were uncomfortably close with the Ivy Bridge-E part though, with the eight-core chip coming in only about 13 percent faster than the six-core chip. Since the Ivy Bridge-E cores are slower than Haswell cores clock-for-clock, we were a bit surprised at how close they were. In the past, we have seen memory bandwidth play a role in encoding, but not necessarily Handbrake. Interestingly, despite locking all three parts down at 2,133MHz, the Ivy Bridge-E does provide more bandwidth than the Haswell-E part. One other thing we should mention: Intel’s “sanity check” numbers to let the media know what to expect for Handbrake performance showed a tremendous advantage for the Haswell-E. 
In Intel’s numbers, Haswell-E was 69 percent faster than a Devil’s Canyon chip and 34 percent faster than the Ivy Bridge-E chip. Why the difference? The workload. Intel uses a 4K-resolution file and transcodes it down to 1080p. We haven’t tried it at 4K, but we may, as Intel has provided the 4K-resolution sample files to the media. If true, and we have no reason to doubt it, it’s a good message for those who actually work at Ultra HD resolutions that the eight cores can pay off. Overall, we’re declaring Haswell-E the winner here.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/x264pass1.png" alt="x264 pass 1" title="x264 pass 1" width="620" height="496" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.01 Pass 1 performance</strong></span></p> <p><span style="color: #000000;">We’ve been using TechArp.com’s X264 HD 5.0.1 benchmark to measure performance on new PCs. The test does two passes using the freeware x264 encoding library. The first pass is seemingly a little more sensitive to clock speeds and memory bandwidth than to pure core count; a higher frame rate is better. Haswell-E still gives you a nice 36 percent boost over the Devil’s Canyon, but that Ivy Bridge-E chip, despite its older core microarchitecture, is only beaten by 12 percent—too close for comfort. Of course, we’d throw in the usual caveat about the very large clock differences between the chips, but we’ve already said that three times. Oh, and yes, we did actually plagiarize by lifting two sentences from a previous CPU review for our description. That’s OK, we gave ourselves permission.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X but not by much</span></p> <p><span style="color: #000000;"><img src="/files/u154082/x264pass2.png" alt="x264 pass 2" title="x264 pass 2" width="620" height="499" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.01 Pass 2 performance</strong></span></p> <p><span style="color: #000000;">Pass two of the X264 HD 5.01 benchmark is more sensitive to core and thread counts, and we see the Haswell-E come in with a nice 46 percent performance advantage against the Devil’s Canyon chip. The Ivy Bridge-E, though, still represents well. The Haswell-E chip is “only” 22 percent faster than it. Still, this is a solid win for the Haswell-E chip. We also like that we’re seeing very similar scaling of roughly 45 percent in multiple encoding tests. With Intel saying it’s seeing 69 percent in 4K resolution content in Handbrake, we’re wondering if the Haswell-E would offer similar scaling if we just moved all of our tests up to 4K.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><strong>Click the next page for even more Haswell-E benchmarks.</strong></p> <hr /> <p>&nbsp;</p> <p><span style="color: #000000;"><img src="/files/u154082/stitch.png" alt="stitch" title="stitch" width="620" height="473" /></span></p> <p><span style="color: #000000;"><strong>Stitch.EFx 2.0 Performance&nbsp;</strong></span></p> <p><span style="color: #000000;">Again, we like to mix up our workloads with tasks that aren’t always multi-threaded enough to take advantage of something like a 12-core Xeon chip. For this test, we shot about 200 images with a Canon EOS 7D using a GigaPan motorized head. That’s roughly 1.9GB of images to make our gigapixel image using Stitch.EFx 2.0.
The first third of the render is single-threaded as it stitches together the images. The final third is multi-threaded as it does the blending, perspective correction, and other intensive image processing. It’s a good blend of single-threaded and multi-threaded performance, but we expected the higher-clocked parts to take the lead. No surprise, the Devil’s Canyon’s 4.4GHz advantage puts it in front, and the Haswell-E comes in about 14 percent slower with its 1.1GHz clock disadvantage. The clock speed advantage of the 4GHz Ivy Bridge-E also pays dividends, and we see the Haswell-E losing by about 10 percent. The good news? A dual-core Pentium K running at 4.7GHz coughed up a score of 1,029 seconds (not represented on the chart) and is roughly 22 percent slower than the CPU that costs about 11 times more.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/7zip.png" alt="7-zip" title="7-zip" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>7-Zip Performance</strong></span></p> <p><span style="color: #000000;">The popular and free zip utility, 7-Zip, has a nifty built-in benchmark that tells you the theoretical file-compression performance of a CPU. You can pick the workload size and the number of threads. For our test, we maxed it out at 16 threads using an 8MB workload. That gives the Haswell-E a familiar performance advantage—about 45 percent—over the Devil’s Canyon part. Against that Ivy Bridge-E part though, it’s another uncomfortably close one at 8 percent. Still, a win is a win, even if we have to say that if you have a shiny Core i7-4960X CPU in your system, you’re still doing fine.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/sandra.png" alt="sisoft sandra" title="sisoft sandra" width="620" height="421" /></span></p> <p><span style="color: #000000;"><strong>Sisoft Sandra Memory Bandwidth (GB/s)</strong></span></p> <p>Since this is the first time we’re seeing DDR4 in a desktop part, we wanted to see how it stacked up in benchmarks. But, before you get too excited, remember that we set all three systems to 2133 data rates. The Devil’s Canyon part is dual-channel and the Ivy Bridge-E and Haswell-E are both quad-channel. With the memory set at 2133, we expected Haswell-E to be on par with the Ivy Bridge-E chip, but oddly, it was slower, putting out about 40GB/s of bandwidth. It’s still more than the 27GB/s the Devil’s Canyon could hit, but we expected it to be closer to double what the dual-channel Devil’s Canyon was producing. For what it’s worth, we did double-check that we were operating in quad-channel mode and the clock speeds of our DIMMs. It’s possible this may change as the hardware we see becomes more final. We’ll also note that even at the same clock, DDR4 does suffer a latency penalty over DDR3. Dwelling on that would be missing the point of DDR4, though. The new memory should give us larger modules and hit higher frequencies far more easily, too, which will nullify that latency issue.
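As a rough check of our own (textbook peak-bandwidth arithmetic, not SiSoft’s numbers), each 64-bit channel at 2,133MT/s is good for roughly 17GB/s, so:

\[
2 \times 8\,\text{B} \times 2133\,\text{MT/s} \approx 34\,\text{GB/s (dual-channel)}
\qquad
4 \times 8\,\text{B} \times 2133\,\text{MT/s} \approx 68\,\text{GB/s (quad-channel)}
\]

By that math, Devil’s Canyon’s 27GB/s is about 80 percent of its theoretical peak, while Haswell-E’s 40GB/s falls well short of its ceiling, which is why the result surprised us. As for the latency penalty, assuming typical launch-era timings (our assumption; we didn’t measure these exact modules):

\[
t_{CAS} = \frac{CL}{f_{clk}}:\quad \frac{9}{1066\,\text{MHz}} \approx 8.4\,\text{ns (DDR3-2133, CL9)}
\qquad
\frac{15}{1066\,\text{MHz}} \approx 14.1\,\text{ns (DDR4-2133, CL15)}
\]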
Still, the winner is Ivy Bridge-E.</p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/3dmarkgpu.png" alt="3d mark" title="3d mark" width="620" height="457" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Overall Performance</strong></span></p> <p><span style="color: #000000;">Even though 3DMark Firestrike is primarily a graphics benchmark, not having a 3DMark Firestrike score is like not having coffee in the morning. Basically, it’s a tie between all three chips, and 3DMark Firestrike is working exactly as you expect it to: as a GPU benchmark.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/3dmarkphysics.png" alt="3d mark physics" title="3d mark physics" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Physics Performance</strong></span></p> <p><span style="color: #000000;">3DMark does factor in the CPU performance for its physics tests. It’s certainly not weighted for multi-core counts as other tests are, but we see the Haswell-E with a decent 29 percent bump over the Devil’s Canyon chip. But, breathing down the neck of the Haswell-E is the Ivy Bridge-E chip. To us, that’s damned near a tie. Overall, the Haswell-E wins, but in gaming tasks—at stock clocks—paying for an 8-core monster is unnecessary except for those running multi-GPU setups.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/valveparticle.png" alt="valve particle" title="valve particle" width="620" height="451" /></span></p> <p><span style="color: #000000;"><strong>Valve Particle Benchmark Performance</strong></span></p> <p><span style="color: #000000;">Valve’s Particle test was originally developed to show off quad-core performance to the world. It uses the company’s own physics magic, so it should give some indication of how well a chip will run. We’ve long suspected the test is cache and RAM latency happy. That seems to be backed by the numbers because despite the 1.1GHz advantage the Devil’s Canyon chip has, the Haswell-E is in front to the tune of 15 percent. The Ivy Bridge-E chip though, with its large cache, lower latency DDR3, and assloads of memory bandwidth actually comes out on top by about 3 percent. We’ll again note the Ivy Bridge-E part has a 700MHz advantage, so this is a very nice showing for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtlow.png" alt="dirt showdown low" title="dirt showdown low" width="620" height="438" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown low-resolution performance</strong></span></p> <p><span style="color: #000000;">For our gaming tests, we decided to run the games at 1366x768 resolution and at very low settings to take the graphics card out of the equation. In one way, you imagine this as what it would look like if you had infinitely powerful graphics cards in your system. As most games are not multi-threaded and are perfectly fine with a quad-core with Hyper-Threading, we fully expected the parts with the highest clock speeds to win all of our low-resolution, low-quality tests. No surprise, the Devil’s Canyon part at 4.4GHz private schools the 3.3GHz Haswell-E chip. 
And, no surprise, the 4GHz Ivy Bridge-E also eats the Haswell-E’s lunch and drinks its milk, too.</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtultra.png" alt="dirt showdown ultra performance" title="dirt showdown ultra performance" width="620" height="475" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown 1080p, ultra performance</strong></span></p> <p><span style="color: #000000;">To make sure we put everything in the right context, we also ran Dirt Showdown at 1920x1080 resolution at Ultra settings. This puts most of the load on the single GeForce GTX 780 we used for our tests. Interestingly, we saw the Haswell-E with a slight edge over the Devil’s Canyon and Ivy Bridge-E parts. We don’t think it’s a very significant difference, but it’s still technically a win for Haswell-E.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanlow.png" alt="hitman low" title="hitman low" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, low quality, low resolution&nbsp;</strong></span></p> <p><span style="color: #000000;">We did the same with Hitman: Absolution, running it at low resolution and its lowest settings. The Haswell-E came in about 12 percent slower than the Devil’s Canyon part and 13 percent slower than the Ivy Bridge-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanultra.png" alt="hitman ultra" title="hitman ultra" width="620" height="479" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, 1080p, ultra quality</strong></span></p> <p><span style="color: #000000;">Again, we tick the settings up to a resolution and quality at which people actually play. Once we do that, the gap closes slightly, with the Haswell-E trailing the Devil’s Canyon by about 8 percent and the Ivy Bridge-E by 9 percent. Still, these are all very playable frame rates and few could tell the difference.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tombraider.png" alt="tomb raider low" title="tomb raider low" width="620" height="465" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, low quality, low resolution</strong></span></p> <p><span style="color: #000000;">We did the same low-quality, low-resolution trick with Tomb Raider, and while nobody needs to see 500 frames per second, it’s pretty much a wash here.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tomraiderulti.png" alt="tomb raider ultra" title="tomb raider ultra" width="620" height="472" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, 1080p, Ultimate</strong></span></p> <p><span style="color: #000000;">At normal resolutions and settings we were a little surprised, as the Haswell-E actually had a 15 percent advantage over the Devil’s Canyon CPU. We’re not exactly sure why, as the only real advantages we can see are the memory bandwidth and large caches on the Haswell-E part. We seriously doubt it’s due to the number of CPU cores. The Haswell-E also has a very, very slight lead against the Ivy Bridge-E part, too.
That’s not bad considering the clock penalty it’s running at.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metrolastlight.png" alt="metro last light low" title="metro last light low" width="620" height="503" /></span></p> <p><span style="color: #000000;"><strong>Metro Last Light, low resolution, low quality</strong></span></p> <p><span style="color: #000000;">In Metro Last Light, at low settings it’s a wash among all of them.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metroveryhigh.png" alt="metro last light high" title="metro last light high" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Metro Last Light, 1080p, Very High quality</strong></span></p> <p><span style="color: #000000;">Metro at high-quality settings mirrors the Hitman: Absolution results, and we think it favors the parts with higher clock speeds. We should also note that none of the chips could run Metro smoothly at 1080p at high-quality settings with the $500 graphics card. That is, of course, unless you consider 30 to 40 fps to be “smooth.” We don’t. Interestingly, the Core i7-4960X was the overall winner.</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><strong>Conclusion:</strong> If you skipped to the very last page to read the conclusion, you’re in the wrong place. You need to go back to page 4 to read our conclusions and what you should buy. And no, we didn’t do this to generate just one more click either, though that would be very clever of us, wouldn’t it?</p> http://www.maximumpc.com/haswell-e_review_2014#comments benchmarks cpu haswell e intel ivy bridge e maximum pc processor Review Specs News Reviews Features Tue, 09 Sep 2014 23:03:30 +0000 Gordon Mah Ung 28431 at http://www.maximumpc.com Maingear Epic Force Video Review http://www.maximumpc.com/maingear_epic_force_video_review_2014 <!--paging_filter--><h3>See what a $12,000 gaming rig looks like</h3> <p>One of the best parts of this job is getting to play with hardware we can’t afford. For this video, Gordon walks you through Maingear’s Epic Force, which is a tour de force of beautiful plumbing that even Mario would be proud of. The machine, delivered to us before Intel’s epic Core i7-5960X “<a title="haswell e" href="http://www.maximumpc.com/haswell-e_review_2014" target="_blank">Haswell-E</a>” launched, is built on an overclocked Core i7-4790K “Devil’s Canyon” chip and packs a pair of water-cooled Radeon R9 295X2 graphics cards.</p> <p><iframe src="//www.youtube.com/embed/yNoxJJ70se0" width="620" height="349" frameborder="0"></iframe></p> <p>What do you think of the Maingear Epic Force PC? Let us know in the comments below.</p> http://www.maximumpc.com/maingear_epic_force_video_review_2014#comments big chassis Desktop Hardware maingear epic force maximum pc MPCTV pc Review video Reviews Systems Mon, 08 Sep 2014 21:05:28 +0000 Gordon Mah Ung 28498 at http://www.maximumpc.com Dell UltraSharp UP2414Q Review http://www.maximumpc.com/dell_ultrasharp_up2414q_review <!--paging_filter--><h3>The 4K monitor you’ve been waiting for?</h3> <p>Call it 4K. Call it UltraHD. Either way, massive pixel counts are the next big thing. This year’s festival of rampant consumerism at CES in Las Vegas is a case in point. Inevitably, a ton of 4K HDTVs filled the field of view in every direction, but the show also included several 4K and UHD laptops.
Meanwhile, phones with full 1080p grids are becoming commonplace. Likewise, tablets with panels over 1080p, including Google’s 2560x1600-pixel Nexus 10, are now almost routine.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dell_ultrasharp_up2414q_small_0.jpg"><img src="/files/u152332/dell_ultrasharp_up2414q_small.jpg" width="620" height="545" /></a></p> <p>But what of the PC? Sadly, it’s been a bit of a 4K laggard to date. So far, we’ve only reviewed a single 4K PC monitor, the Asus PQ321. It’s absolutely, positively gorgeous, but also punitively priced at around $3,000. So expensive, in other words, that it’s pretty much irrelevant to most PC lovers.</p> <p>That’s actually rather ironic, because of all the devices out there, the PC is nearest to ready-and-able to make the most of 4K resolutions right now. 4K HDTVs, quite frankly, are a gimmick; there’s simply no content to watch on them yet. Super-high-resolution tablets and phones are marginal, too. But not PCs. Ramp up the res and you can immediately enjoy the boost in desktop elbow room, although you may run into scaling and DPI problems with Windows (more on that a bit later). Applications in the video and photo editing spheres certainly benefit from more pixels. Then there’s gaming, which is the biggie for us, though the argument here is more finely balanced.</p> <p>In theory, you can run pretty much any game at full 4K. Most will offer the option to render at the maximum resolution of your graphics subsystem. And render they will. The only snag involves achieving that at playable frame rates. As we explained in our Asus PQ321 review, 4K/UHD is essentially four times the resolution of a 1080p pixel grid, so that’s four times the workload for your GPU to cope with. Cripes. So, it’s into this broader context that we introduce our second-ever 4K PC monitor review.</p> <p>The specimen in question this time is Dell’s new UltraSharp UP2414Q. It sports the same 3840x2160 resolution as the groundbreaking Asus PQ321, but there are two significant differences. The first of these is price; the new Dell can be had for slightly under $1,300—less than half the cost of the Asus. That’s still not exactly cheap for a monitor, but it’s much, much more accessible.</p> <p>The second major change-up involves panel proportions. The Dell spans a mere 24 inches—so that’s $1,300 for a 24-inch monitor. Yikes. Of course, you could argue that it’s resolution and not size that determines desktop real estate, and you’d be right, but some people will still balk at the very notion of paying so much for a panel size that can be had for little more than $120 these days.</p> <p>The UP2414Q’s general metrics are your typical IPS fare, with 178-degree viewing angles for both the horizontal and vertical planes. Likewise, the claimed static contrast of 1,000:1 is very much par for the course, and the UP2414Q’s 8ms quoted response is the same as other cutting-edge IPS panels.</p> <p>Of course, all of that means there are some superior options available by some measures. IPS technology is all the rage, but in truth, TN tech is better for pixel response and VA panels offer far superior contrast. Overall, IPS is still the best compromise—just don’t fall into the trap of assuming it’s universally superior. It ain’t quite that simple.</p> <p>Elsewhere, there’s an LED backlight and brightness rated at 350cd/m2, and a super-fine pixel density of 185PPI. As for inputs, the UP2414Q has one HDMI, one DisplayPort, and one Mini DisplayPort. 
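Before we get to the connectivity caveat that follows, a bit of back-of-the-envelope arithmetic (ours, not Dell’s spec sheet, and it ignores blanking overhead) puts the panel’s demands in perspective:

\[
3840 \times 2160 = 8{,}294{,}400\ \text{pixels} \approx 4 \times (1920 \times 1080)
\]
\[
3840 \times 2160 \times 60\,\text{Hz} \times 24\,\text{bpp} \approx 11.9\,\text{Gb/s} \;>\; 8.16\,\text{Gb/s (HDMI 1.4 data rate)}, \quad <\; 17.28\,\text{Gb/s (DisplayPort 1.2)}
\]

Raw pixel data at 60Hz already exceeds what HDMI 1.4 can carry, which is why 4K over that connection tops out at 30Hz.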
Thanks to the super-high resolution, it’s only the DisplayPort that offers full native operation. The lone HDMI port is limited to HDMI 1.4, and you need HDMI 2.0 for 4K at 60Hz. Finally, there’s a fully adjustable chassis, complete with tilt, rotate, swivel, and height tweakability.</p> <p>What is it actually like to look at? Utterly stunning, is the first impression. Even the epic Asus can’t match the crispness and sharpness that you get from cramming all those pixels into such a relatively small panel.</p> <p>As with super-high DPI phones and tablets, you almost don’t feel like you’re looking at an active display at all. You essentially can’t see the individual pixels—they’re simply too small—which gives the UP2414Q a wonderfully seamless feel.</p> <p>The colors are exquisite, too, though admittedly, no more so than many other high-end IPS screens; they all look spectacular these days. The same goes for the results in our objective image quality test. Gradient rendering, viewing angles, white and black scales—they’re all absolutely immaculate and super sexy&shy;—again, just like other pricey IPS screens.</p> <p>Then, there’s actually using this 4K beauty for multimedia entertainments. Not that there’s much 4K video content to watch, but what there is, by the lords of science, is gorgeous! It more or less ruins standard 1080p HD content for you. Once you’ve seen 4K, there’s almost no going back.</p> <p>The same goes for gaming, except this time round, the narrative is a little bit more complicated and depends on what kind of GPU you’re packing. We decided to take the UP2414Q for a spin courtesy of&nbsp; an Nvidia GeForce GTX 780 Ti, the fastest single graphics card you can buy right now, and it can only just cope with that colossal native resolution at full detail gaming in moderately demanding titles.</p> <p>Speaking of technologies that aren’t ready for 4K and super-high DPI displays, you can add Windows to the list. Even the latest 8.1 build of Windows does a poor job of scaling, and believe us, you really will want to enable some kind of scaling. If you try running the UP2414Q at native resolution, with standard Windows DPI and standard font size settings, everything on the screen looks preposterously tiny. It just isn’t usable. Even If you fiddle around with the fonts and text scaling, you’ll still hit problems. Sure, you can achieve something legible, and we’d even concede that many core elements of the Windows 8.1 desktop interface, including Windows Explorer, scale nicely and look superb. Unfortunately, most third-party apps look, if you’ll pardon the colloquialism, utterly ass. What you get is a blurred, blown-up bitmap that makes everything look soft and fuzzy. The same goes for nearly all web pages and the Steam interface. The harsh truth is that much of the computing world isn’t ready for high-DPI displays, and that becomes all too apparent as soon as you fire up the UP2414Q.</p> <p>Windows 8.1’s Modern UI is properly scalable, and looks crisp and clean for the most part, but it’s probably not the bit of Windows most people will be planning to use predominantly with a monitor that’s not touch-enabled.</p> <p>All of which makes this 24-inch 4K monitor a tricky proposition. It looks absolutely fantastic, but at this stage, it’s probably of more interest to content-creation professionals than PC performance and gaming enthusiasts. 
Instead, it could well be a TN panel that is larger and half the price that makes ultra-HD resolutions a practical, affordable prospect for gaming and other desktop PC applications.</p> <p><strong>$1,300,</strong> <a href="http://www.dell.com/">www.dell.com</a></p> http://www.maximumpc.com/dell_ultrasharp_up2414q_review#comments 4k Dell UltraSharp UP2414Q Hardware maximum pc May issues 2014 monitor panel Review screen Reviews Wed, 03 Sep 2014 18:45:11 +0000 Jeremy Laird 28459 at http://www.maximumpc.com Corsair Hydro H105 Review http://www.maximumpc.com/corsair_hydro_h105_review <!--paging_filter--><h3>The H75’s big brother is not too shabby</h3> <p>Over the past couple of years or so, we gearheads have transitioned from membrane keyboards to mechanical ones; from mechanical hard drives to SSDs; and from air-cooling our CPUs to using closed liquid loops. All favorable moves, though the latter group suffers from a lack of variety. You can get radiators in 120mm, 240mm, and 280mm sizes, but they’re almost all painted plain black with black tubing, although some include the small style concession of a glowing logo on the pump housing. Part of this has to do with just a handful of companies designing coolers for a large number of brands. This plainness may be a drag in a tricked-out rig, but in the case of the Corsair H105, we’ve discovered that a lack of fanciness can be an advantage.</p> <p>Corsair’s H105 radiator is thicker than usual (38mm instead of 27mm), and there’s a silver ring on the top of the pump that can be switched out for a red or blue one. But it’s not reinventing any wheels. Its tubing isn’t thick, and its pump isn’t very large. But you’ll notice how easily it installs in your system. There’s just one basic fan cable for the pump, which you can plug into any header on the motherboard, or directly into the power supply with a Molex adapter. The pump has two speeds: on and off. The fans use PWM control, so they’ll spin up and down smoothly, according to temperature readings. Just attach them to the bundled standard splitter cable, then connect that to the motherboard’s CPU fan header. And there’s no software this time; you just use your motherboard’s fan controls instead.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/corsair_hydro_h105_small_0.jpg"><img src="/files/u152332/corsair_hydro_h105_small.jpg" alt="Since this pump does not offer variable speeds, it can be plugged directly into the power supply for maximum effectiveness." title="Corsair Hydro H105" width="620" height="505" /></a></p> <p style="text-align: center;"><strong>Since this pump does not offer variable speeds, it can be plugged directly into the power supply for maximum effectiveness.</strong></p> <p>Our test bed’s Rampage IV Extreme motherboard has Windows-based software called “Fan Xpert” that intelligently controls fan speeds. We ran our torture test with the H105’s fans set to “Quiet” in Fan Xpert, and got a pretty respectable 70 Celsius. When pushed to “Turbo” mode, the fans spun up to about 2,000rpm and lowered CPU temps to 65C. These aren’t the lowest temperatures we’ve seen, but they’re still pretty respectable, and the H105’s noise levels were surprisingly good. However, we couldn’t get a clear picture of how much the thickness of the radiator compensated for the modest diameter of the tubing and size of the pump. Those two properties seem to give the Cooler Master Glacer 240L and Nepton 280L an edge. 
But at press time, the H105 cost less at most stores than the Glacer (we suspect partly because the Glacer is an expandable system), and the Nepton has a 280mm cooler that doesn’t fit in a lot of cases.</p> <p>If you want a liquid-cooling system with a 240mm radiator, and you don’t care about expandability, then the ease of installation, ease of use, and manageable noise levels of the H105 make it hard to beat for the price. And like all Corsair liquid coolers, it gets a five-year warranty, whereas the competition usually gives you two or three years of coverage. On the other hand, the radiator’s extra 11mm of thickness makes it too large for certain cases. Corsair says that the cooler is compatible with “the vast majority” of chassis, but its list leaves off a number of seemingly workable cases of its own, such as the Carbide 500R and the Graphite 600T. If you can spend more money, there are slightly better coolers out there, but the H105 is a well-rounded package.</p> <p><strong>$120,</strong> <a href="http://www.corsair.com/en">www.corsair.com</a></p> http://www.maximumpc.com/corsair_hydro_h105_review#comments Air Cooling Corsair Hydro H105 cpu Hardware maximum pc May issues 2014 Review water cooler Reviews Wed, 03 Sep 2014 18:39:38 +0000 Tom McNamara 28444 at http://www.maximumpc.com Maingear Pulse 17 Review http://www.maximumpc.com/maingear_pulse_17_review_2014 <!--paging_filter--><h3>A large, light gaming laptop marred by several flaws</h3> <p>Like the <a title="ibuypower battalion" href="http://www.maximumpc.com/ibuypower_battalion_m1771-2_review" target="_blank">iBuypower Battalion</a> laptop we previously reviewed, Maingear’s Pulse 17 is aimed at enthusiasts who want a large gaming laptop but don’t want to kill themselves lugging it around.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maingear_pulse_17_small_0.jpg"><img src="/files/u152332/maingear_pulse_17_small.jpg" alt="You can customize the backlit LED keyboard’s colors to your heart’s content." title="Maingear Pulse" width="620" height="741" /></a></p> <p style="text-align: center;"><strong>You can customize the backlit LED keyboard’s colors to your heart’s content.</strong></p> <p>The 17.3-inch Pulse 17 fits the bill in screen real estate, and at 0.8-inches thick and 6 pounds, it’s in line with the thinnest laptops while maintaining a manageable weight. However, what it saves you in weight, it eats up in footprint, with a 16.5-inch by 11.2-inch spot on your desk. It’s large enough that it won’t fit in most normal-sized laptop backpacks.</p> <p>Though it uses the same chassis as iBuypower’s Battalion, Maingear says that unlike that machine, its Pulse 17 gives customers the option to opt for a wireless AC network card (for an additional $50), the widest array of SSD options (up to two MSATA 256GB SSDs in RAID 0), and a custom lid paint job. You can choose among a wide variety of colors (we opted for the “Alpine White” coat for a clean look), and while that’s certainly appreciated, the paint job is a little rough around the edges of the Main Gear logo on the lid. That cuts two ways, though; a decal would be cleaner, but then it’s a decal, not a custom paint job.</p> <p>Even though that’s certainly a small gripe, the build quality in general seemed a little subpar. The chassis allows quite a bit of flex, making it impossible to pick up the notebook without hearing it creak. In addition, one of its rubber feet fell off, which is annoying and further indication of a substandard build. 
The notebook’s keyboard is similarly lackluster; its keys lack a firm tactile response. We also had an issue with the space bar intermittently failing to register presses. Unfortunately, its ELAN trackpad was unresponsive when it came to two-finger scrolling. Even worse is the gesture used for two-finger scrolling, which is counter to how smartphones maneuver, with no way to change it in settings. We wish we could say that the panel and speakers made up for these shortcomings, but both the TN panel and the speakers were meh.</p> <p>Thankfully, the laptop’s performance is very respectable, especially when you consider its form factor. Inside, you’ll find a quad-core 2.4GHz Core i7-4700HQ CPU, 16GB of DDR3/1600, and the most popular GPU for ultra-thin gaming laptops today: a GeForce GTX 765M.</p> <p>While the Pulse 17’s graphics card score tied our Alienware 14 zero-point laptop (which also uses a GeForce GTX765M), the Pulse 17 was able to outperform it by 5 percent in both our Metro: Last Light and BioShock Infinite benchmarks.</p> <p>What this amounts to in real-world terms is average frame rates in the high 50s playing BioShock Infinite on “medium” settings. It wasn’t quite as impressive CPU-side, though, falling in line with Alienware’s very similar Core i7-4700MQ chip in single-threaded tasks, but faltering 6 percent in our multithreaded x264 test. It also wasn’t quite as energy-efficient as the Alienware 14, but its 6-cell battery did last around 3.5 hours in our video rundown test, which is actually longer than the majority of gaming laptops we’ve reviewed, especially compared to iBuypower’s similarly spec’d Battalion notebook, which couldn’t even make it to 2.5 hours.</p> <p>Where the iBuypower laptop really has the advantage over the Maingear, however, is in price. At $2,400, the Pulse 17 costs a whopping $540 more with very similar specs. Yowza! The added expense likely comes down to the custom paint job that Maingear offers, and the company’s two-year warranty program versus the one-year warranty that iBuypower provides. If you don’t care about those added features, but are still interested in the laptop, we recommend going with iBuypower’s product.</p> <p><strong>$2,400,</strong> <a href="http://www.maingear.com/">www.maingear.com</a></p> http://www.maximumpc.com/maingear_pulse_17_review_2014#comments Business Notebooks Hardware maingear maximum pc May issues 2014 Review Reviews Thu, 28 Aug 2014 00:35:39 +0000 Jimmy Thang 28445 at http://www.maximumpc.com OCZ Vertex 460 240GB Review http://www.maximumpc.com/ocz_vertex_460_240gb_review <!--paging_filter--><h3>Rumors of its death were greatly exaggerated</h3> <p>That last time we heard from OCZ was back before the end of 2013, when the company was in the grips of bankruptcy and nobody was sure what its future held. Fast forward to March 2014, and things are looking rather good for the formerly beleaguered company, much to everyone’s surprise. 
Rather than simply dissolve and fade away like we had feared, the company has been acquired by storage behemoth Toshiba, and is now operating as an independent subsidiary.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/vertex460_lrg_small_0.jpg"><img src="/files/u152332/vertex460_lrg_small.jpg" alt="OCZ’s new drive has a more subdued, corporate look to it, thanks to a takeover by “the man.”" title="OCZ Vertex 460 240GB" width="620" height="449" /></a></p> <p style="text-align: center;"><strong>OCZ’s new drive has a more subdued, corporate look to it, thanks to a takeover by “the man.”</strong></p> <p>The best news is OCZ’s NAND-acquisition troubles are seemingly a thing of the past, as Toshiba is one of the world’s largest manufacturers of NAND. So, it is no surprise that the first drive we’re seeing from the new venture is essentially a reborn Vector drive, only with Toshiba NAND flash. Dubbed the Vertex 460, this “new” drive blends the company’s proprietary Barefoot 3 controller found on its high-end Vector drives with Toshiba’s 19nm MLC NAND flash, so it’s ditching the Micron NAND it used previously. The result is basically a slight watering-down of its Vector 150 drive in order to make it more affordable and consumer-friendly. It also needed to bring its Barefoot 3 controller over to its mainstream line of Vertex-branded drives, so this drive accomplishes that feat, as well.</p> <p>In many ways, the Vertex 460 is very similar to the company’s recent Vector 150 drive, the only difference being the Vector has a five-year warranty and has a higher overall endurance rating to reflect its use of binned NAND flash. The Vertex 460 is no slouch, though, and is rated to handle up to 20GB of NAND writes per day for three years. The drive also utilizes over-provisioning, so 12 percent of the drive is reserved for NAND management by the Barefoot 3 controller. Though you lose some capacity, you gain longer endurance and better performance, so it’s a worthwhile trade-off. The Vertex 460 also offers hardware encryption support, which is very uncommon for a mainstream drive, and though we’d never use it, it’s nice to have options. Otherwise, its specs are par for the course in that it’s a 7mm drive and is available in 120GB, 240GB, and 480GB flavors. It’s also bundled with a 3.5-inch bay adapter as well as a copy of Acronis True Image, which is appreciated.</p> <p>When we strapped the Vertex to our test bench, we saw results that were consistently impressive. In every test, the Vertex 460 was very close to the fastest drives in its class, and in all scenarios it’s very close to saturating the SATA bus, so it’s not really possible for it to be any faster. It had no problem handling small queue depths of four commands in ATTO, and held its own with a 32 queue depth in Iometer, too. It was a minute slower than the Samsung 840 EVO in our Sony Vegas test, which writes a 20GB uncompressed AVI file to the drive, but also much faster than the Crucial M500 in the same test. Overall, there were no weak points whatsoever in its performance, but it is not faster than the Samsung 840 EVO, and its OCZ Toolbox software utility is extremely rudimentary compared to the Samsung app. Though the Vertex 460 is an overall very solid drive, it doesn’t exceed our expectations in any particular category. 
In other words, it’s a great SSD, but not quite Kick Ass.</p> <p><strong>$190,</strong> <a href="http://ocz.com/">www.ocz.com</a></p> http://www.maximumpc.com/ocz_vertex_460_240gb_review#comments Hard Drive Hardware HDD May issues 2014 OCZ Vertex 460 240GB Review solid state drive ssd Reviews Wed, 20 Aug 2014 14:16:12 +0000 Josh Norem 28382 at http://www.maximumpc.com Nvidia Shield Tablet Review http://www.maximumpc.com/nvidia_shield_tablet_review_2014 <!--paging_filter--><h3>Updated: Now with video review!&nbsp;</h3> <p>Despite its problems, we actually liked <a title="Nvidia Shield review" href="http://www.maximumpc.com/nvidia_shield_review_2013" target="_blank">Nvidia’s original Shield Android gaming handheld</a>. Our biggest issue with it was that it was bulky and heavy. With rumors swirling around about a Shield 2, we were hoping to see a slimmer, lighter design. So consider us initially disappointed when we learned that the next iteration of Shield would just be yet another Android tablet. Yawn, right? The fact of the matter is that the Shield Tablet may be playing in an oversaturated market, but it’s still great at what it sets out to be.</p> <p><iframe src="//www.youtube.com/embed/dGigsxi9-K4" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>We've updated our review to include the video review above.</strong></p> <p>At eight inches, the Shield Tablet features a gorgeous 1920x1200 display, which shares the same resolution as Google’s flagship <a title="nexus 7 review" href="http://www.maximumpc.com/google_nexus_7_review_2013" target="_blank">Nexus 7</a> tablet. At 13.1 ounces, the Shield Tablet is about three ounces heavier than the Nexus 7 but still a lot lighter than the original’s 1 lb. 4.7 ounces.&nbsp;</p> <p>Part of the weight increase with the Shield Tablet over the Nexus 7 is due to the extra inch that you’re getting from the screen, but also because the Shield Tablet is passively cooled and has an extra thermal shield built inside to dissipate heat. It’s a little heavier than we like, but isn’t likely to cause any wrist problems. On the back of the Shield is an anti-slip surface and a 5MP camera, and on the front of the tablet is a front-facing 5MP camera and two front-facing speakers. While the speakers are not going to blow away dedicated Bluetooth speakers, they sound excellent for a tablet. In addition to the speakers, the Shield Tablet has a 3.5mm headphone jack up at the top. Other ports include Micro USB, Mini HDMI out, and a MicroSD card slot capable of taking up to 128GB cards. Buttons on the Shield include a volume rocker and a power button, which we found to be a little small and shallow for our liking.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_exploded_view_black_bckgr.jpg" alt="Nvidia Shield Tablet guts" title="Nvidia Shield Tablet guts" width="620" height="349" /></p> <p style="text-align: center;"><strong>The guts of the Nvidia Shield Tablet.</strong></p> <p>All of this is running on the latest version of Android KitKat (4.4). Nvidia says that it will update the tablet to Android L within a few weeks of Google’s official release. If Nvidia’s original Shield is any indication of how well the company keeps up with OS updates, you should be able to get the latest version of Android a couple of weeks, if not a month, after release.
Regardless, the Shield Tablet is running a pretty stock version of Android to begin with, the main difference being that Nvidia has pre-loaded the tablet with its Shield Hub, which is a 10-foot UI used to purchase, download, and launch games.</p> <p>Arguably, the real star of the tablet is Nvidia’s new Tegra K1 mobile superchip. The 2.2GHz quad-core A15 SOC features Nvidia’s Kepler GPU architecture and 192 CUDA cores along with 2GB of low-power DDR3. K1 supports many of the graphical features commonplace in GeForce graphics cards, including tessellation, HDR lighting, global illumination, subsurface scattering, and more.</p> <p>In our performance benchmarks, the K1 killed it. Up until now, the original Shield’s actively cooled Tegra 4 was arguably one of the most, if not <em>the</em> most, powerful Android SOCs on the market, but the K1 slaughters it across the board. In the Antutu and GeekBench benchmarks, we saw modest gains of 12 percent to 23 percent in Shield vs. Shield Tablet action. But in Passmark and GFX Bench’s Trex test, we saw nearly a 50 percent spread, and in 3DMark’s mobile Icestorm Unlimited test, we saw an astounding 90 percent advantage for the Shield Tablet. This is incredible when you consider that the tablet has no fans and a two-watt TDP. Compared to the second-gen Nexus 7, the Shield Tablet benchmarks anywhere from 77 percent to 250 percent faster. This SOC is smoking fast.</p> <p>In terms of battery life, Nvidia claims you’ll get 10 hours watching/surfing the web and about five hours of gaming with its 19.75 Wh battery. This is up 3.75 Wh from Google’s Nexus 7 equivalent, and from our experiential tests, we found those figures to be fairly accurate, if not a best-case scenario. It will pretty much last you all day, but you'll still want to let it sip juice every night.</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_war_thunder.jpg" alt="Shield Tablet review" title="Shield Tablet review" width="620" height="343" /></p> <p style="text-align: center;"><strong>The new wireless controller uses Wi-Fi Direct instead of Bluetooth for lower latency.</strong></p> <p>Of course, if you’re going to game with it, you’re going to need Nvidia’s new wireless Shield Controller. Sold separately for $60, the 11.2-ounce Shield Controller maintains the same button layout as the original Shield controller, but feels a lot lighter and is more comfortable to hold. While most Android game controllers operate over Bluetooth, Nvidia opted to go with Wi-Fi Direct, stating that it offers 2x faster response time and more bandwidth. The extra bandwidth allows you to plug 3.5mm headphones into the controller and also allows you to link up to four controllers to the device, which is an appreciated feature when you hook up the tablet to your HDTV via the Shield Tablet’s <a title="shield console mode" href="http://www.maximumpc.com/nvidia_sweetens_shield_console_android_442_kitkat_price_drop_199_through_april" target="_blank">Console Mode</a>. Other unique features of the controller include capacitive-touch buttons for Android’s home, back, and play functions. There’s also a big green Nvidia button that launches Shield Hub. The controller also has a small, triangle-shaped clickable touch pad, which allows you to navigate your tablet from afar. One quibble with it is that we wish the trackpad were more square, to at least mimic the dimensions of the tablet; the triangle shape was a little awkward to interface with.
Another problem that we initially had with the controller was that the + volume button stopped working after a while. We contacted Nvidia about this and the company sent us a new unit, which remedied the issue. One noticeable feature missing from the controller is rumble support. Nvidia said this was omitted on the original Shield to keep the weight down; its omission is a little more glaring this time around, however, since there's no screen attached to the device.</p> <p>The controller isn’t the only accessory that you’ll need to purchase separately if you want to tap into the full Shield Tablet experience. To effectively game with the tablet, you’ll need the Shield Tablet cover, which also acts as a stand. Like most tablets, a magnet in the cover shuts off the Shield Tablet when closed, but otherwise setting up the cover and getting it to act as a stand is initially pretty confusing. The cover currently only comes in black, and while we’re generally not big on marketing aesthetics, it would be nice to have an Nvidia green option to give the whole look a little more pop. We actually think the cover should just be thrown in gratis, especially considering that the cheapest 16GB model costs $300. On the upside though, you do get Nvidia’s new passive DirectStylus 2 that stows away nicely in the body of the Shield Tablet. Nvidia has pre-installed note-writing software and its own Nvidia Dabbler painting program. The nice thing about Dabbler is that it leverages the K1’s GPU acceleration so that you can virtually paint and blend colors in real time. There’s also a realistic mode where the “paint” slowly drips down the virtual canvas like it would in real life.&nbsp;</p> <p style="text-align: center;"><img src="/files/u154082/shield_tablet_shield_controller_trine2_0.jpg" alt="Shield tablet review" title="Shield tablet review" width="620" height="404" /></p> <p style="text-align: center;"><strong>The Shield Controller is a lot lighter and less blocky than the original Shield Portable.</strong></p> <p>But that’s probably not why you’re interested in the Shield Tablet. This device is first and foremost a gaming tablet and even comes with a free Android copy of Trine 2. Trine 2 was originally a PC game and it’s made a great transition to the Shield Tablet. While the game was never known to be a polygon pusher, it looks just as good as it ever did on its x86 debut.&nbsp;</p> <p>With gaming as the primary driver for Shield Tablet, you may wonder why Nvidia didn’t bundle its new controller. The company likely learned from Microsoft’s mistake with Kinect and the Xbox One: Gamers don’t like to spend money and getting the price as low as possible was likely on Nvidia’s mind. Of course, not everyone may even want a controller, with the general lack of support for them in games. Nvidia says there are now around 400 Android titles that support its controller, but that’s only a small percentage of Android games and the straight truth is that the overwhelming majority of these games are garbage.&nbsp;</p> <p>Nvidia is making a push for Android gaming, however. The company worked with Valve to port over Half Life 2 and Portal to the Shield and they look surprisingly fantastic and are easily the two prettiest games on Android at the moment. 
Whether Android will ever become a legitimate platform for hardcore gaming is anyone’s guess, but at least the Shield Tablet will net you a great front seat if the time ever arises.</p> <p>Luckily, you won’t have to rely solely on the Google Play store to get your gaming fix. Emulators run just as well here as they did on the original Shield and this iteration of Shield is also compatible with Gamestream, which is Nvidia’s streaming technology that allows you to stream games from your PC to your Shield. Gamestream, in theory, lets you play your controller-enabled PC games on a Shield.</p> <p>At this point, Nvidia says Gamestream supports more than 100 games such as Batman: Arkham Origins and Titanfall from EA’s Origin and Valve’s Steam service. The problem, though, is that there are hundreds more games on Steam and Origin that support controllers—but not the Shield Tablet’s controller. For example, Final Fantasy VII, a game that we couldn’t get to work with the original Shield, still isn't supported even though it works with an Xbox controller on the PC. When Gamestream does work, however, it’s relatively lag-free and kind of wonderful. The one caveat here is that you’ll have to get a 5GHz dual-band router to effectively get it working.&nbsp;</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/rh7fWdQT2eE" width="620" height="349" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Nvidia Shield Video demo.</strong></p> <p>Would we buy the Shield Tablet if we owned the original Shield (now renamed the Shield Portable)? Probably not. If we were looking for a new tablet and top-notch gaming performance was on the checklist, the Shield Tablet is easily the top contender today. We’d take it over the second-gen Nexus 7 in a heartbeat. While we understand why Nvidia decided to separate the cover and controller to keep the prices down and avoid the Kinect factor, we think a bundled package with a small price break as an alternative would have been nice. All things considered though, consider us surprised. The Shield Tablet is pretty dang cool.&nbsp;</p> <p><strong>$300</strong></p> <p><em><strong>Update:</strong> The original article incorrectly labled the Shield Portable benchmarks with the Nexus 7 figures. The issue has been resolved and both benchmark charts are listed below.&nbsp;</em></p> http://www.maximumpc.com/nvidia_shield_tablet_review_2014#comments android Google Hardware KitKat maximum pc nvidia portable Review shield tablet wireless controller News Reviews Tablets Mon, 18 Aug 2014 21:36:57 +0000 Jimmy Thang 28263 at http://www.maximumpc.com Xidax M6 Mining Rig Review http://www.maximumpc.com/xidax_m6_mining_rig_review_2014 <!--paging_filter--><h3>A gaming rig that pays for itself</h3> <p>Exotic car paint, multiple GPUs, and custom-built chassis’ be damned, boutique PC builder <a title="xidax" href="http://www.maximumpc.com/tags/Xidax" target="_blank">Xidax</a> thinks it has the sexiest sales pitch on the planet with its <strong>M6 Mining Rig</strong>: It pays for itself! Now, we can’t say this PC is basically “free” because it ain’t that, but Xidax says by using the box’s spare GPU cycles to mine for crypto-currency, this baby would be paid off in about four months. To be honest, it’s not something we’ve ever considered, as we’ve seen gaming rigs, and we’ve seen coining rigs, but never in the same box. 
It seems like a solid idea though, as the system can game during the day, then mine at night to help cover its cost.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/xidax_guts13979_small_0.jpg"><img src="/files/u152332/xidax_guts13979_small.jpg" alt="The Xidax M6 Mining Rig comes set up with everything you need to start mining crypto-currancy almost right out of the box." title="Xidax M6 Mining Rig" width="620" height="676" /></a></p> <p style="text-align: center;"><strong>The Xidax M6 Mining Rig comes set up with everything you need to start mining crypto-currancy almost right out of the box.</strong></p> <p>The system’s specs include a 3.4GHz Core i5-4670K with 16GB of RAM, a Corsair RM 850 PSU, closed-loop liquid cooler, 250GB Samsung 840 EVO SSD, 1TB WD Black, and a pair of Sapphire Radeon R9 290X cards. In application performance, it’s pretty pedestrian with its stock-clocked Core i5-4670K. Why not something more badass? Xidax says it weighed hardware choices carefully because the pricier the hardware, the longer it takes to pay off with crypto-coins. The Radeons are a wise choice, as they offer about twice the performance of Nvidia’s fastest GPUs in mining applications. Gaming is also quite excellent (obviously, for a two-card system), and its mining performance is impressive at 1.7 to 1.8 Kilohashes per second. (Hashes of the kilo/mega/giga variety are the units of measurement for mining productivity.)</p> <p>Xidax ships the PC ready to start mining operations almost right out of the box, which is normally a daunting task. It also includes a Concierge (or should we say coincierge) service that has a Xidax rep remotely connect to the rig and do a final tune on the box for maximum mining performance. On this particular machine, it came ready to mine for Doge Coins and was forecast to make about $21.60 a day, or $670 a month, on a 24/7 schedule—including electricity costs.</p> <p>What’s the catch? There are a few. First, it’s loud when mining. In fact, it’s so loud that you won’t be able to stand being in the same room with it. Second, you can’t do anything with it while it’s mining because all GPU resources are pegged to the max. Third, crypto-currency can be volatile. Bitcoin saw its value see-saw from $130 to $1,242 and then back to $455 and $900 in just four months. It could all go kaput in a few months, or who knows—the government might even step in and ruin the fun.</p> <p>Considering its performance outside of mining, the M6 Mining Rig is pricey at $3,000. However, the price includes a lifetime warranty on parts and service except for the GPUs. Those carry a five-year warranty, which is still surprisingly good, considering that board vendors are already making noises that they don’t want to eat the cost of dead boards killed by mining. Xidax says it will cover them, though. And—again—it pays for itself, right?</p> <p>That’s ultimately the appeal of the M6 Gaming Rig, but it has to be carefully considered by potential buyers. After all, anything that sounds too good to be true usually is, but then again, it is a powerful gaming PC that could theoretically pay for itself in a few months. And even if the market blew up, at least you’d still have a formidable gaming PC rather than just standing there with your RAM sticks in one hand. And if it works out, whoa baby, you just got a PC for free! 
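The payback pitch itself is simple arithmetic; using Xidax’s own $21.60-per-day forecast (which, of course, moves with the coin’s exchange rate and mining difficulty):

\[
\frac{\$3{,}000}{\$21.60/\text{day}} \approx 139\ \text{days} \approx 4.6\ \text{months}
\]

That works out to a bit over four and a half months of 24/7 mining to cover the sticker price, close enough to the roughly-four-month pitch, provided Dogecoin holds its value for that long.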
<p><strong>$3,000,</strong> <a href="http://www.xidax.com/">www.xidax.com</a></p> <p><img src="/files/u154082/xidax_benchmarks.png" alt="xidax benchmarks" title="xidax benchmarks" width="620" height="277" /></p> http://www.maximumpc.com/xidax_m6_mining_rig_review_2014#comments april issues 2014 bitcoin dogecoin Hardware maximum pc Review xidax m6 mining computer Reviews Systems Wed, 06 Aug 2014 16:42:51 +0000 Gordon Mah Ung 28234 at http://www.maximumpc.com Gigabyte Radeon R9 290X OC Review http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review <!--paging_filter--><h3>As good as it gets, if you can find one to buy</h3> <p>Aftermarket Radeon R9 290X GPUs are beginning to make the rounds, and this month we had a WindForce-cooled behemoth from <a title="gigabyte" href="http://www.maximumpc.com/tags/Gigabyte" target="_blank">Gigabyte</a> strutting its stuff in the lab. Unlike last month’s <a title="sapphire tri x r9 290x" href="http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review" target="_blank">Sapphire Tri-X R9 290X</a>, this board features a custom PCB in addition to the custom cooler, whereas the Sapphire slapped a huge cooler onto the reference design circuit board. Theoretically, this could allow for higher overclocks on the Gigabyte due to better-quality components, but more on that later.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/windforce14052_small_0.jpg"><img src="/files/u152332/windforce14052_small.jpg" alt="Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable." title="Gigabyte Radeon R9 290X OC" width="620" height="476" /></a></p> <p style="text-align: center;"><strong>Unlike the reference design, Gigabyte’s R9 290X is cool, quiet, and overclockable.</strong></p> <p>This is the overclocked version of the card, so it clocks up to 1,040MHz under load, which is a mere 40MHz over stock. These boards always have conservative overclocks out of the box, though, and that is by no means the final clock speed for this card. We’ve covered its WindForce cooler in past reviews, so we won’t go into all the details, but it’s a three-fan cooler that only takes up two PCIe slots and uses six heat pipes with inclined heatsinks to better dissipate the heat. It’s good for 450W of heat dispersal, according to Gigabyte, and since the R9 290X is roughly a 300W card (AMD has never given a TDP for this particular model for some reason), the WindForce cooler should be more than up to the job.</p> <p>Like all Radeon R9 290X boards, this sucker is big and long, measuring 11.5 inches. Gigabyte recommends you use at least a 600W power supply with it, and it sports two dual-link DVI ports for 2560x1600 gaming, as well as HDMI 1.4 and DisplayPort 1.2a if you want to run 4K. The card comes bundled with a free set of headphones. It used to include a free copy of Battlefield 4, but the company told us it was no longer offering the game bundle because it had run out of coupons. The MSRP of the board is $620, but some stores had it for $599 while others marked it up to $700.</p> <p>Once we had this Windy Bad Boy in the lab, we were very curious to compare it to the Sapphire Tri-X R9 290X we tested last month.
Since both cards feature enormous aftermarket coolers, have the exact same specs and clocks, and are roughly the same price, we weren’t surprised to find that they performed identically for the most part.</p> <p>If you look at the benchmark chart, in every test the two cards are almost exactly the same—the only exception being Metro, but since that’s a PhysX game, AMD cards can get a bit wonky sometimes. In every other test, the two cards are within a few frames-per-second difference, making them interchangeable. Both cards also run in the mid–70 C zone under load, which is 20 C cooler than the reference design. We were able to overclock both cards to just a smidge over 1,100MHz, as well.</p> <p>“Okay,” you are saying to yourself. “I’m ready to buy!” Well, that’s where we run into a small problem. Gigabyte’s MSRP for this card is $620—the same as the Sapphire Tri-X card—but at press time, the cheapest we could find it for was $700 on Newegg. We can’t ding Gigabyte for Newegg’s pricing, but it’s a real shame these R9 290X cards are so damned expensive.</p> <p><strong>$620,</strong> <a href="http://www.gigabyte.us/">www.gigabyte.us</a></p> http://www.maximumpc.com/gigabyte_radeon_r9_290x_oc_review#comments Air Cooling amd april issues 2014 Gigabyte Radeon R9 290X OC gpu graphics card Hardware maximum pc Review Reviews Tue, 05 Aug 2014 19:52:42 +0000 Josh Norem 28227 at http://www.maximumpc.com NZXT H440 Review http://www.maximumpc.com/nzxt_h440_review <!--paging_filter--><h3>Remarkably clean, and limited, too</h3> <p>We love the fact that <a title="nzxt" href="http://www.maximumpc.com/tags/nzxt" target="_blank">NZXT</a> bills this semi-silent-themed case as a “hassle-free experience.” We wonder if the company was using the same case that we were, because we encountered quite a bit of hassle building a standard configuration into this smaller-than-usual chassis.</p> <p>For starters, the case itself ships with no printed manual—at least, ours didn’t. We only hope that’s an oversight with our early review unit instead of a standard feature of the chassis itself, because there are definitely some features of the <a title="h440" href="http://www.maximumpc.com/nzxt_h440_case_ditches_optical_drive_bays_cleaner_look" target="_blank">H440</a> that warrant a bit of instruction, especially for neophyte builders.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/h440_blk_main_24x32in_small_0.jpg"><img src="/files/u152332/h440_blk_main_24x32in_small.jpg" alt="The H440 is the first case we’ve tested that doesn’t have 5.25-inch drive bays. " title="NZXT H440" width="620" height="578" /></a></p> <p style="text-align: center;"><strong>The H440 is the first case we’ve tested that doesn’t have 5.25-inch drive bays. </strong></p> <p>Case in point: There are absolutely zero 5.25-inch bays to be found on the H440, which is a good thing to know before you start attempting to pry off the H440’s front (Dremel in hand). We know, we know, the optical drive is dead, long live the optical drive—but is it too soon? To be honest, there’s an upstart contingent here at Maximum PC who think it’s a plus, while some cranky old farts think it’s a minus. Additionally, installing the power supply might evoke a bout of head-scratching at first, as there’s seemingly no way to just stuff it into the chassis thanks to how it’s been compartmentalized on the case’s bottom. 
This does build on the case’s motto of “remarkably clean,” though, by hiding your messy PSU cabling.</p> <p>This leads us into one of our major gripes with this chassis: There’s a lot of screwing. We pretty much pulled out the thumbscrews in the case’s side, which are supposedly designed to not do that. Beyond that, you have to unscrew a panel to slide the power supply in, you have to unscrew the standard PCI slot covers for any devices you want to install, and—most frustratingly—you have to first unscrew the case’s steel drive trays (up to six total) just for the privilege of being able to screw in your hard drive. Clean, yes. Toolless, no.</p> <p>The case feels a bit small on the inside, but it adequately supported our standard test setup (including an Nvidia GTX 480 video card) without any cramming or wedging. We like how the case’s three rubberized cable-routing holes fit perfectly with a standard video card setup—when using the top-most PCI Express x16 slot on our ATX motherboard, our video card didn’t block any of the much-needed routing holes.</p> <p>That said, cable routing is a bit of a challenge in the H440. There’s already not that much room between the rear of the motherboard tray and the case’s side panel. Amplifying the claustrophobia is a layer of soundproofing foam adhered to the side panel. We love that NZXT cares so much about our ears, but it makes for a less-than-pleasant smashing of cables against the case’s side (especially since there’s only one provided hole for power-supply cables to route through otherwise). Cable-management options feel more constrained by this case than others we’ve tested.</p> <p>The foam surrounding the case’s insides has quite a bit of work in store for it, too. No fewer than four of NZXT’s next-gen case fans grace the inside of the chassis: three 12cm fans on the front and one 14cm fan on the back. When we fired up the system with no components inside it, the soundproof-themed case was a bit audible. A full system only adds to the din, and while we appreciate NZXT’s efforts toward keeping the volume dial at a three instead of an eleven, it seems to be a bit of a lost cause.</p> <p>NZXT seems to think this case is perfect for liquid cooling. For some all-in-one setups, sure; for customized loops, you’re going to be in for something of a tubing nightmare. Best of luck!</p> <p><strong>$120,</strong> <a href="http://www.nzxt.com/">www.nzxt.com</a></p> http://www.maximumpc.com/nzxt_h440_review#comments april issues 2014 Hardware maximum pc Review Cases Reviews Tue, 05 Aug 2014 19:46:55 +0000 David Murphy 28236 at http://www.maximumpc.com Seagate 1TB Hybrid vs. WD Black2 Dual Drive http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014 <!--paging_filter--><h3>Seagate 1TB Hybrid vs. WD Black2 Dual Drive</h3> <p>Every mobile user who is limited to just one storage bay wants the best of both worlds: SSD speeds with HDD capacities. Both Seagate and WD have a one-drive solution to this problem, with Seagate offering a hybrid 1TB hard drive with an SSD cache for SSD-esque performance, and WD offering a no-compromise 2.5-inch drive with both an SSD and an HDD. These drives are arch rivals, so it’s time to settle the score.</p> <h4>ROUND 1: Specs and Package</h4> <p>The WD Black2 Dual Drive is two separate drives, with a 120GB SSD riding shotgun alongside a two-platter 1TB 5,400rpm hard drive. 
Both drives share a single SATA 6Gb/s interface and split the bandwidth of the channel between them, with the SSD rated to deliver 350MB/s read speeds and 140MB/s write speeds. The drive comes with a SATA-to-USB adapter and includes a five-year warranty. The Seagate SSHD uses a simpler design and features a 1TB 5,400rpm hard drive with an 8GB sliver of NAND flash attached to it, along with software that helps move frequently accessed data from the platters to the NAND memory for faster retrieval. It includes a three-year warranty and is otherwise a somewhat typical drive aimed at the consumer market, not hardcore speed freaks. Both drives include free cloning software, but since the WD includes two physical drives, a USB adapter, and a longer warranty, it gets the nod.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wd_endeavor_quarter_left_higres_smal_0.jpg"><img src="/files/u152332/wd_endeavor_quarter_left_higres_smal.jpg" alt="WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. " title="WD Black2" width="620" height="620" /></a></p> <p style="text-align: center;"><strong>WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. </strong></p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 2: Durability</h4> <p>This category is somewhat of a toss-up, as the WD Black2’s overall reliability is degraded by the fact that it has a spinning volume attached to it, giving it roughly the same robustness as the Seagate SSHD. There’s also the issue of the WD Black2 using the slightly antiquated JMicron controller. We don’t have any reliability data on that controller in particular, but we are always more concerned about the SSD controller you-know-whating the bed than the memory, which is rated to last for decades, even under heavy write scenarios. Both drives also use two-platter designs, so neither one is more or less prone to damage than the other. In the end, we’ll have to go with the Seagate SSHD as being more durable, simply because you only have to worry about one drive working instead of two.&nbsp;</p> <p><strong>Winner: Seagate SSHD</strong></p> <h4>ROUND 3: Performance</h4> <p>Seagate is very clear about the performance of its hybrid drives, stating that they “boot and perform like an SSD,” but it never says they’re faster. It also claims the drive is “up to five times faster than a hard drive,” which seems like a bit of a stretch. It’s difficult to benchmark a caching drive properly because the cache’s benefit doesn’t show up in standard sequential read tests, and the drive gets killed by SSDs in access-time tests. That said, we did see boot and PCMark Vantage scores improve significantly over time. Our boot time dropped by more than half, going from 2:27 to 1:07 after several boots, and our PCMark Vantage score shot up from 6,000 to 19,000. Still, those results are well behind what we got with the WD’s SSD, which booted in 45 seconds (the system had three dozen programs installed), and hit 33,000 in PCMark Vantage.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 4: Cloning Package</h4> <p>Both drives include free software to help you clone your old drive and, in an odd twist, both companies use Acronis software to get ’er done. Seagate’s software is called DiscWizard, and works on OSes as old as Windows 98 and Mac OS 10.x. WD throws in a copy of Acronis True Image, though it only works with WD drives attached via the included USB-to-SATA adapter.
We tested both software packages and found them to be nearly identical, as both let us clone our existing drive and boot from it after one pass, which can be tricky at times. Therefore, we call the software package a tie since they both perform well and use Acronis. However, WD’s $300 bundle includes a USB-to-SATA adapter that makes the cloning process totally painless. Seagate makes you forage for a cable on your own, which tips the scales in WD’s favor.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 5: Ease of Use</h4> <p>This round has a crystal-clear winner, and that’s the Seagate SSHD. That’s because the Seagate drive is dead-simple to use and behaves exactly like a hard drive at all times. You can plug it into any PC, Mac, or Linux machine and it is recognized with no hassle. The WD drive, on the other hand, only works on Windows PCs because it requires special software to “unlock” the 1TB hard drive partition. For us, that’s obviously not a problem, but we know it’s enraged some Linux aficionados. Also, the WD drive only has a 120GB SSD. So, if you are moving to it from an HDD, you will likely have to reinstall your OS and programs, then move all your data to the HDD portion of the drive. The Seagate drive is big enough that you would just need to clone your old drive to it.</p> <p><strong>Winner: Seagate SSHD</strong></p> <p style="text-align: center;"><strong><a class="thickbox" href="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small_0.jpg"><img src="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small.jpg" alt="Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. " title="Seagate SSHD" width="620" height="639" /><br /></a></strong></p> <p style="text-align: center;"><strong>Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. </strong></p> <h3 style="text-align: left;">And the Winner Is…</h3> <p style="text-align: left;">This verdict is actually quite simple. If you’re a mainstream user, the Seagate SSHD is clearly the superior option, as it is fast enough, has more than enough capacity for most notebook tasks, and costs about one-third of the WD Black2. But this is Maximum PC, so we don’t mind paying more for a superior product, and that’s the <strong>WD Black2 Dual Drive</strong>. It delivers both speed and capacity and is a better high-performance package, plain and simple.</p> <p style="text-align: left;"><span style="font-style: italic;">Note: This article originally appeared in the April 2014 issue of the magazine.</span></p> http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014#comments Hard Drive Hardware HDD Review Seagate 1TB Hybrid ssd WD Black2 Backup Drives Hard Drives Reviews SSD Features Thu, 31 Jul 2014 19:27:45 +0000 Josh Norem 28103 at http://www.maximumpc.com MSI Radeon R9 270 Gaming OC Review http://www.maximumpc.com/msi_radeon_r9_270_gaming_oc_review <!--paging_filter--><h3>No surprises here, just a solid 1080p card</h3> <p><a title="msi" href="http://www.maximumpc.com/tags/msi" target="_blank">MSI</a> is offering two flavors of its midrange Radeon R9 270 GPU, formerly known as the <a title="7870 GHz" href="http://www.maximumpc.com/tags/radeon_hd_7870_ghz_edition" target="_blank">Radeon HD 7870 GHz edition</a>. There is a standard model and one with an “X” after its name. 
The difference between the two is that the X model has slightly higher core and boost clocks, but otherwise the two cards are the same and are both based on AMD’s Pitcairn GCN core, which is a 28nm part that debuted in 2012.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r9_270x_gaming_2gd5v303_3d1_small_0.jpg"><img src="/files/u152332/r9_270x_gaming_2gd5v303_3d1_small.jpg" alt="Don’t bother with the R9 270X—the non-X version shown here is just fine. " title="MSI Radeon R9 270 Gaming OC" width="620" height="487" /></a></p> <p style="text-align: center;"><strong>Don’t bother with the R9 270X—the non-X version shown here is just fine. </strong></p> <p>The card in front of you is the MSI R9 270 Gaming model, which is a stock R9 270 with a mild overclock, hence the word “Gaming” in its moniker. It has an MSRP of $180, while the X version is roughly $20 more, though street prices are higher due to the mining craze and short supply. For those who are prone to guffawing at a card that is merely rebadged and price-dropped, this is par for the course and actually good news for gamers. That’s because both Nvidia and AMD refine their manufacturing processes over time, so by the time a GPU gets a rebadge, it’s often able to run at higher clocks with better efficiency for a much lower price. The bottom line is that this card once had a $350 price tag and now costs less than $200, so there’s very little to complain about.</p> <p>To rehash the specs, this is a card with a base clock of 900MHz and a boost clock of 975MHz, which is 50MHz higher than the reference board. It has 2GB of GDDR5 memory that runs at 5.6GHz, and 1,280 stream processors. Since this is not new silicon, the card does not offer support for TrueAudio, but as it’s a Graphics Core Next (GCN) card, it does support AMD’s new Mantle API (at press time, BF4 was not optimized for Mantle with the R9 270, but AMD said it’s “being investigated”). As a midrange GPU, the R9 270 has a low-ish TDP of 150W, and therefore requires only a single six-pin PCIe connector for power—an advantage over the 270X, which requires two six-pin connectors. Interestingly, the R9 270 doesn’t have a direct competitor from Nvidia since it costs just a bit over $200, so it sits right in between the $250 GTX 760 and the $150 GTX 650 Ti (the Ti Boost is out of stock everywhere, but costs around $175). The GTX 660 is about the same price, but that card is ancient, so we compared it to the more-expensive GTX 760.</p> <p>Overall, we had a pleasant testing experience with the MSI R9 270 card. It was quiet and cool—never getting hotter than 60 C—and was totally stable. It ran the grueling new Star Swarm demo over a weekend with nary a hiccup, and we were also able to overclock it to 1,140MHz boost clock, which netted a 10 percent bump in performance. Basically, we found its performance exactly in line with its price, in that it was a bit slower than the more-expensive GTX 760 in all the games we test aside from Tomb Raider, which is an AMD-optimized title.</p> <p>In the end, there’s nothing wrong with the MSI R9 270 Gaming OC and we have no problem recommending it. However, we’d still go with the GTX 760 just because it is quite a bit faster in many games, and only costs $30 more.
If Mantle support is important to you, though, feel free to pull the trigger.</p> <p><strong>$220 (street),</strong> <a href="http://www.msi.com/index.php">www.msi.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the April 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/msi_radeon_r9_270_gaming_oc_review#comments april issues 2014 graphics card Hardware maximum pc msi radeon r9 270 oc Review videocard Reviews Videocards Wed, 30 Jul 2014 22:39:42 +0000 Josh Norem 28096 at http://www.maximumpc.com D-Link DGL-5500 Review http://www.maximumpc.com/d-link_dgl-5500_review <!--paging_filter--><h3>A router built specifically for gamers</h3> <p>When it comes to PC parts and accessories, all roads eventually lead to gamers. Intel and AMD both sell unlocked processors so gamers can more easily overclock their rigs for a few extra frames per second; pro gamer Johnathan “Fatal1ty” Wendel has endorsed everything from motherboards to power supplies; there’s gaming RAM; and of course, a whole assortment of accessories designed to give you an edge when smoking your friends on the virtual battlefield. Up until now, one of the few items missing from the list was an 802.11ac wireless router.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dgl-5500_front_small_0.jpg"><img src="/files/u152332/dgl-5500_front_small.jpg" alt="The new Mac Pro stole its design from this router—true story. " title="D-Link DGL-5500" width="583" height="1200" /></a></p> <p style="text-align: center;"><strong>The new Mac Pro stole its design from this router—true story. </strong></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/dgl-5500_back_small_0.jpg"><img src="/files/u152332/dgl-5500_back_small.jpg" title="D-Link DGL-5500" width="584" height="1200" /></a></p> <p>D-Link gets credit for tying up that loose end with the DGL-5500, a dual-band AC1300 wireless router built specifically for gamers. What makes the DGL-5500 different from all the other 802.11ac models, including D-Link’s own DIR-868L (reviewed in our February issue), is the inclusion of Qualcomm’s StreamBoost technology.</p> <p>Whereas the majority of modern routers rely on simple quality of service (QoS) rules to prioritize network packets, StreamBoost examines what applications are running and how much actual bandwidth each one needs. It also manages latency because a laggy gamer is a dead gamer. The question is, does it work as advertised?</p> <p>For the most part, StreamBoost lives up to the hype. We consistently saw lower pings in online games when connected to the DGL-5500 versus our zero-point router, the Asus RT-AC66U. External factors beyond our control also affect ping, so it’s hard to offer an apples-to-apples comparison, but to give one example, our ping averaged around 42ms in Battlefield 4 when using Asus’s router. When switching to D-Link’s model and turning on StreamBoost, our pings hovered around 19ms. After firing up Netflix on a second PC and initiating file downloads on two other systems, the ping stayed around 22–24ms—that’s impressive.</p>
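<p>If you want to run this kind of latency-under-load check on your own network, you don’t need anything fancier than a few lines of Python. The sketch below is purely illustrative, not the tool we used in the lab; it times TCP handshakes to a host of your choosing (example.com is just a placeholder) as a rough stand-in for ping. Run it once on an idle network, then again while other machines stream and download, and compare the averages.</p> <pre><code># Rough latency-under-load sampler (illustrative sketch, not our lab tool).
# Times TCP handshakes to a host and reports the average and worst round trip.
import socket
import statistics
import time

def tcp_rtt_ms(host, port=443, timeout=2.0):
    """Return the TCP connect round-trip time to host:port, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

def sample(host, count=20, pause=0.5):
    rtts = []
    for _ in range(count):
        try:
            rtts.append(tcp_rtt_ms(host))
        except OSError:
            pass  # a congested or dropped connection simply costs one sample
        time.sleep(pause)
    if rtts:
        print(f"{host}: {statistics.mean(rtts):.1f} ms avg, "
              f"{max(rtts):.1f} ms worst ({len(rtts)} samples)")

if __name__ == "__main__":
    # Swap in a nearby, low-latency host you control for meaningful numbers.
    sample("example.com")
</code></pre>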
<p>In our evaluation of D-Link’s DIR-868L, we said the fugly web-based interface could use a major overhaul, and that’s what we got with the DGL-5500. It’s much better looking than before and far less complicated to navigate, though it’s also painfully slow when switching between menus. The UI is also heavily biased toward StreamBoost—if you disable the feature, you lose access to the My Network map, which provides a graphical view of all active devices and how much bandwidth each one is consuming.</p> <p>The DGL-5500 outpaced our zero-point router in 802.11n mode on the 2.4GHz band in our three indoor tests. It also did better at picking out uncluttered channels on its own—we use inSSIDer ($20, www.inssider.com) to identify the best channel(s) for testing. However, the RT-AC66U boasts better range and faster transfers in 802.11ac mode on the 5GHz band. It’s worth pointing out the DGL-5500 lacks beamforming, which concentrates the wireless signal at connected devices for longer range and better speeds.</p> <p>There are other shortcomings, as well—you can’t configure Guest networks, the single USB 2.0 port doesn’t support printer sharing, and the combined speed of both channels is capped at AC1300 rather than AC1750 as it is with D-Link’s DIR-868L. While StreamBoost is a step forward, the router is a step backward in other areas. Note to D-Link: Gamers care about this stuff, too.</p> <p><strong>$140 [street],</strong> <a href="http://www.d-link.com/">www.d-link.com</a></p> http://www.maximumpc.com/d-link_dgl-5500_review#comments ac wireless april issues 2014 Gaming Hardware hd media router 2000 Review Reviews Wed, 30 Jul 2014 22:22:18 +0000 Paul Lilly 28097 at http://www.maximumpc.com Sapphire Tri-X Radeon R9 290X Review http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review <!--paging_filter--><h3>A real gem of a GPU</h3> <p>For those who haven’t kept up with current events: Late last year AMD launched its all-new Hawaii GPUs, starting with its flagship Radeon R9 290X that featured a blower-type cooler designed by AMD. In testing, it ran hotter than any GPU we’ve ever tested, hitting 94 C at full load, which is about 20 C higher than normal. AMD assured everyone this was no problemo, and that the board was designed to run those temps until the meerkats came home. It was stable at 94 C, but the GPU throttled performance at those temps. The stock fan was also a bit loud at max revs, so though the card offered kick-ass performance, it was clearly being held back by the reference cooler.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_13650_small_0.jpg"><img src="/files/u152332/sapphire_13650_small.jpg" alt="The Tri-X throws off AMD’s meh cooler." title="Sapphire Tri-X Radeon R9 290X" width="620" height="413" /></a></p> <p style="text-align: center;"><strong>The Tri-X throws off AMD’s meh cooler.</strong></p> <p>Therefore, we all eagerly awaited the arrival of cards with aftermarket coolers, and this month we received the first aftermarket Radeon R9 290X—the massive triple-fan Tri-X model from Sapphire; and we must say, all of our Radeon prayers have been answered by this card.</p> <p>Not only does it run totally cool and quiet at all times, but because it runs so chilly it has plenty of room to overclock, making it a card that addresses every single one of our complaints about the reference design from AMD.
There is one caveat: price. The Sapphire card is $50 more expensive than the reference card at $600, but you are obviously getting quite a bit of additional horsepower for your ducats.</p> <p>When we first fired it up, we were amazed to see it hit 1,040MHz under load, and stay there throughout testing. Even more surprising were the temps we were seeing. Since the reference card hits 94 C all day long, this is obviously a really hot GPU, but the Sapphire Tri-X cooler was holding it down at a chilly 75 C. The card was whisper-quiet too, which was also a pleasant surprise given the noise level of the reference cooler. We were also able to overclock it to 1,113MHz, which is a turnaround in that we could not overclock the reference board at all since it throttles at stock settings.</p> <p><strong>$600,</strong> <a href="http://www.sapphiretech.com/landing.aspx?lid=1">www.sapphiretech.com</a></p> <p><span style="font-style: italic;">Note: This review was originally featured in the March 2014 issue of the&nbsp;</span><a style="font-style: italic;" title="maximum pc mag" href="https://w1.buysub.com/pubs/IM/MAX/MAX-subscribe.jsp?cds_page_id=63027&amp;cds_mag_code=MAX&amp;id=1366314265949&amp;lsid=31081444255021801&amp;vid=1&amp;cds_response_key=IHTH31ANN" target="_blank">magazine</a><span style="font-style: italic;">.</span></p> http://www.maximumpc.com/sapphire_tri-x_radeon_r9_290x_review#comments Air Cooling amd gpu graphics card Hardware March issues 2014 maximun pc Review Sapphire Tri-X Radeon R9 290X Reviews Thu, 24 Jul 2014 22:09:13 +0000 Josh Norem 28024 at http://www.maximumpc.com