Features http://www.maximumpc.com/taxonomy/term/31/ en Graphics Porn (August 2014): Cheat Technical Officer Jim2point0 http://www.maximumpc.com/graphics_porn_august_2014_cheat_technical_officer_jim2point0 <!--paging_filter--><h3><img src="/files/u154082/battlefield_4.jpg" alt="battlefield 4" title="battlefield 4" width="250" height="125" style="float: right;" /></h3> <h3><span style="font-weight: normal;">The Cheat Engine whiz of Dead End Thrills opens up his archives for Graphics Porn</span></h3> <p class="p1">We’re mixing things up again this month to showcase another tour de force of the video game screenshot world. James ‘jim2point0’ Snook is a front-end web developer at eBay Enterprise by day and a devoted screenshot aficionado at night. Just like <a href="http://www.maximumpc.com/graphics_porn_july_2014_showcasing_screenshot_artist_k-putt"><span class="s1">K-putt</span></a>, he’s dedicated to showcasing the very best that our favorite games have to offer. Whether that’s a stunning scene or just a particularly awesome ray of light, James is there to grab some spectacular screenshots.&nbsp;</p> <p class="p1">It all started when James stumbled upon some of <a href="http://deadendthrills.com/">Dead End Thrills’ screenshots</a> on Reddit. Now, he’s a diehard screenshot-taker. 4K downsampling and a technical level of control over <a href="http://www.cheatengine.org/"><span class="s1">Cheat Engine</span></a>—a utility used to modify games—mean that James can add free cameras, control over the field of view (FOV), and even time-stop functions to games like Watch Dogs and Tomb Raider. He’s good enough at it that he’s the de facto “Cheat Technical Officer” on the Dead End Thrills forum. His work behind the scenes helps people like K-Putt and Dead End Thrills capture such inspiring screenshots.</p> <p class="p1">James’ love of the technical goes beyond Cheat Engine and screenshots. 
His personal rig is packed to the brim with an Asus Maximus V Extreme motherboard, an Intel Core i7-3770K overclocked to 4.4GHz, two EVGA GTX 780s in SLI, 16GB of 1866MHz Corsair Vengeance RAM, and a QNIX QX2710 2560x1440 monitor.&nbsp;</p> <p class="p1">We’ve got 15 of James’ personal favorites in the gallery below. Check them out, and while you’re at it, follow him on <a href="https://twitter.com/jim2point0" target="_blank"><span class="s1">Twitter</span></a> to keep up with his latest exploits. Visit his <a href="https://www.flickr.com/photos/jim2point0/" target="_blank"><span class="s1">Flickr</span></a> for the complete collection of his screens as well as higher-resolution downloads.&nbsp; Last but not least, check out the <span class="s1">Dead End Thrills forum</span> for game-specific guides on getting total control over your screenshot adventures.</p> <p class="p3"><em>Whether you've been using&nbsp;</em><a href="http://store.steampowered.com/news/5"><span class="s2"><em>Steam's nifty screenshots feature</em></span></a><em>&nbsp;or simply print screening some beautiful wallpaper-worthy game moments, we want to be able to share your captured works of art with the world. If you think you can do better than the pictures submitted below, please email your screenshots to&nbsp;</em><a href="mailto:mpcgraphicsporn@gmail.com"><span class="s2"><em>mpcgraphicsporn@gmail.com</em></span></a><em>&nbsp;so we can show them off. 
Make sure to include the name of the game, a title for the screenshot, and a description of what's happening on-screen.</em></p> http://www.maximumpc.com/graphics_porn_august_2014_cheat_technical_officer_jim2point0#comments Cheat Engine Dark Souls II Dead End Thrills Graphics Porn jim2point0 Tomb Raider watch dogs Witcher Features Thu, 21 Aug 2014 17:10:23 +0000 Ben Kim 28352 at http://www.maximumpc.com Rig of the Month: Parvum Titanfall http://www.maximumpc.com/rig_month_parvum_titanfall_2014 <!--paging_filter--><p><img src="/files/u162579/8a652c4a_r3e2tlo.jpeg" alt="parvum titanfall" title="parvum titanfall" width="250" height="167" style="float: right;" /></p> <h3><span style="font-weight: normal;">An amazing machine that's straight out of Titanfall</span></h3> <p>This month’s <a title="rig of the month" href="http://www.maximumpc.com/rig_month_roundup_2014" target="_blank">Rig of the Month</a> is a bit different. Instead of pulling from reader submissions, we’ve reached out to <a href="http://api.viglink.com/api/click?format=go&amp;jsonp=vglnk_jsonp_14084711173806&amp;key=7777bc3c17029328d03146e0ed767841&amp;libId=d1ae913a-6caa-4ef6-95d0-89833fb7b69c&amp;loc=http%3A%2F%2Fwww.overclock.net%2Fmessages%2Fmessages%2Fview%2Fid%2F2848026%2Fbox%2F7229559&amp;v=1&amp;out=https%3A%2F%2Fm.facebook.com%2F%3Frefsrc%3Dhttps%253A%252F%252Fwww.facebook.com%252F%23!%2Fjameswalt1computerart%3Fref%3Dbookmark&amp;ref=http%3A%2F%2Fwww.overclock.net%2Fmessages&amp;title=Private%20Message%3A%20Maximum%20PC%20Rig%20of%20the%20Month&amp;txt=https%3A%2F%2Fm.facebook.com%2F%3Frefsrc%3Dhttps%253A%252F%252Fwww.facebook.com%252F%23!%2Fjameswalt1computerart%3Fref%3Dbookmark" target="_blank">James Walter</a>, who recently completed his latest build: <a href="http://www.overclock.net/t/1476225/sponsored-parvum-titanfall-completed" target="_blank">Parvum Titanfall</a>. 
Based on the design of the limited-edition <a href="http://www.xbox.com/en-US/xbox-one/accessories/controllers/wireless-controller/titanfall-wireless-controller#fbid=yMhsQVkQSQu" target="_blank">Xbox One Titanfall controller</a>, Parvum Titanfall is a masterclass in clean, crisp PC building.&nbsp;</p> <p>From the moment James saw the orange, white, and black controller, he was immediately inspired to create a matching rig. After finishing a <em>Robocop</em>-inspired, <a href="http://www.overclock.net/t/1426275/build-log-robocop" target="_blank">full-tower build</a>, he set out to find the right platform to work with. The Parvum Systems S2.0 was his final choice because of a proposed partnership: Parvum would provide a custom S2.0 built to James’s specs. With Parvum on board, the project continued with Swiftech, Ensourced Sleeved Cables, Mayhem’s Dyes, and ColdZero.&nbsp;</p> <p>With plenty of hours logged in Titanfall, James decided to create something straight out of the Titanfall universe. A computer—“a sort of Dell of the Titanfall universe”—that could very well have been made by Hammond Robotics. The military theme, serial numbers, and paintwork are the result of James’s initial decision. The custom-painted motherboard armor, radiators, and other parts mesh well with the predominantly white case. James used a Silhouette-brand craft cutter to fabricate all of the extra ornaments—serial numbers, barcodes, and the like. Little details like colored vinyl set into the grooves of the case and the Mayhem’s Aurora 2 Supernova liquid, dyed a deep orange, really complete the build.&nbsp;</p> <p style="text-align: center;"><iframe src="//www.youtube.com/embed/xVUmlFljBvs?rel=0" width="600" height="338" frameborder="0"></iframe></p> <p>As beautiful as the rig is on the outside, the system itself is equally impressive when you factor in the sheer number of water-cooling components. 
There’s an EK CPU block, RAM block, and two GPU blocks as well as a Swiftech pump, a Bitspower reservoir, two EK radiators, and a bunch of fans. The system components include an Intel i5-4670K that sits on an Asus Gryphon Z87, 8GB of Corsair Dominator GT memory, two EVGA GTX 770 Superclockeds, a 250GB Samsung EVO SSD, and a Corsair AX860 power supply.&nbsp;</p> <p>The reception to the finished rig has been so great that James and Parvum will be teaming up again for another game-themed build. Stay tuned to <a href="http://api.viglink.com/api/click?format=go&amp;jsonp=vglnk_jsonp_14084711173806&amp;key=7777bc3c17029328d03146e0ed767841&amp;libId=d1ae913a-6caa-4ef6-95d0-89833fb7b69c&amp;loc=http%3A%2F%2Fwww.overclock.net%2Fmessages%2Fmessages%2Fview%2Fid%2F2848026%2Fbox%2F7229559&amp;v=1&amp;out=https%3A%2F%2Fm.facebook.com%2F%3Frefsrc%3Dhttps%253A%252F%252Fwww.facebook.com%252F%23!%2Fjameswalt1computerart%3Fref%3Dbookmark&amp;ref=http%3A%2F%2Fwww.overclock.net%2Fmessages&amp;title=Private%20Message%3A%20Maximum%20PC%20Rig%20of%20the%20Month&amp;txt=https%3A%2F%2Fm.facebook.com%2F%3Frefsrc%3Dhttps%253A%252F%252Fwww.facebook.com%252F%23!%2Fjameswalt1computerart%3Fref%3Dbookmark" target="_blank">James's Facebook</a>&nbsp;for details on the upcoming Call of Duty: Advanced Warfare build.</p> <div><em>Have a case mod of your own that you would like to submit to our monthly feature? 
Make sure to read the rules/tips&nbsp;<a style="margin: 0px; padding: 0px; border: 0px; outline: 0px; vertical-align: baseline; color: #cc0000; text-decoration: none; background: transparent;" href="http://www.maximumpc.com/rig_month_roundup_2014" target="_blank">here</a>&nbsp;and email us at&nbsp;<a style="margin: 0px; padding: 0px; border: 0px; outline: 0px; vertical-align: baseline; color: #cc0000; text-decoration: none; background: transparent;" href="mailto:mpcrigofthemonth@gmail.com" target="_blank">mpcrigofthemonth@gmail.com</a>&nbsp;with your submissions.</em></div> http://www.maximumpc.com/rig_month_parvum_titanfall_2014#comments James Walter Parvum Titanfall Rig of the Month rig of the month titanfall Xbox One Controller Features Wed, 20 Aug 2014 21:38:52 +0000 Ben Kim 28373 at http://www.maximumpc.com Audacity Crash Course http://www.maximumpc.com/audacity_crash_course_2014 <!--paging_filter--><h3><span style="font-weight: normal;"><img src="/files/u162579/audacity-logo_0.png" alt="Audacity Logo" title="Audacity Logo" width="200" height="200" style="float: right;" />Turn your PC into a music computer with the best free audio editor</span></h3> <p><strong><a href="http://www.maximumpc.com/tags/audacity" target="_blank">Audacity</a></strong>’s been around for a long time—since mid-2000—and for good reason. It’s a relatively lightweight, open-source, and completely free audio editor that can handle pretty much every task you throw at it. Need to edit together a podcast? No problem. Looking to do some simple noise reduction? Looking to turn your PC into a <strong>music computer</strong>? Audacity’s got you covered.</p> <p>Although it’s available for free, it’s not exactly the most intuitive program. The interface isn’t necessarily dated, but it does look pretty spartan alongside programs like Adobe Photoshop and even Microsoft Office. 
Getting up and running with Audacity isn’t hard, but it does take a little know-how.&nbsp;</p> <h3><span style="font-weight: normal;">The Toolbar</span></h3> <p style="text-align: center;"><img src="/files/u162579/toolbar.jpg" alt="Audacity Toolbar" title="Audacity Toolbar" width="600" height="109" /></p> <p style="text-align: center;"><strong>The toolbar of Audacity is home to all of the app’s basic tools.</strong></p> <p>The first thing you’ll want to get familiar with is Audacity’s packed toolbar. Fortunately, every tool is labeled. Hover over a button, slider, or drop-down box, and you should see a text label pop up with the name of the tool. There are a lot of tools, but you really only need a small subset of them for all but the most demanding projects.&nbsp;</p> <p>Make note of the playback controls—play, pause, record, et cetera. They’re essential to all audio editing since you’ll want to constantly review your work as you go along. Next, you’ll want to make sure you’ve got your output and input devices set correctly. Both should be set to your Windows default devices—if they aren’t, make sure you select the correct ones in the dropdown. Once you get your audio into Audacity—we’ll cover that in a second—you can monitor your levels in the output and input level monitors (usually somewhere near the center of the toolbar).&nbsp;</p> <p>You’ll also want to make sure that you’re always aware of which cursor tool is currently selected. The standard Selection Tool is exactly what you’d expect; it’s a cursor that lets you mark your position on a track and highlight specific sections. 
The other essential tool is the Time Shift Tool, which lets you move clips along the timeline.</p> <h3><span style="font-weight: normal;">Getting Audio Into Audacity</span></h3> <p style="text-align: center;"><img src="/files/u162579/import.jpg" alt="Audacity Import" title="Audacity Import" width="600" height="439" /></p> <p style="text-align: center;"><strong>Importing is a cinch.</strong></p> <p>If you’re working with pre-recorded audio, getting it into Audacity is just a matter of jumping into the File menu and selecting Import &gt; Audio—hit Ctrl+Shift+I if you’re feeling fancy. Find your audio files, and they should pop into Audacity as separate tracks.</p> <p>If, on the other hand, you want to record a voiceover or instrumental track directly into Audacity, all you have to do is check to make sure that your input levels are set appropriately (a maxed-out slider is usually fine) and click the record button. Clicking stop will end the recording, whereas clicking pause will let you continue recording on the same track.&nbsp;</p> <h3><span style="font-weight: normal;">Editing Your Audio</span></h3> <p>Now you can get down to the fun part: actually editing your audio. The tools and effects you’ll use will depend on what you’re trying to accomplish, but we’ll run through some basic tasks that most projects will require.</p> <p style="text-align: center;"><img src="/files/u162579/remove_audio_menu.jpg" alt="Audacity Remove Audio" title="Audacity Remove Audio" width="600" height="331" /></p> <p style="text-align: center;"><strong>The Remove Audio dropdown in the Edit menu will be your audio-editing brother-in-arms.&nbsp;</strong></p> <p style="text-align: left;">Most audio editing projects require a fair bit of cutting, splitting, and rearranging sections of a track—or multiple separate tracks. Cutting, splitting, silencing, trimming, and deleting are all handled in the Remove Audio section of the Edit menu. 
The shortcuts are simple and worth learning since these are common tasks in any editing endeavor. Highlight the section of the track you want to manipulate and select the action you want completed. Trimming removes everything but the highlighted area on any continuous piece of audio. Cutting moves the selected clip to your clipboard and shifts the remaining pieces over. A split cut or delete removes the selected audio and preserves the empty space between the two remaining clips.</p> <p style="text-align: center;"><img src="/files/u162579/effects_menu.jpg" alt="Audacity Effects" title="Audacity Effects" width="600" height="390" /></p> <p style="text-align: center;"><strong>Like a kid in a cand...audio effects store?</strong></p> <p>Most of the other things you’d want to do to an audio track are under the Effects menu. Here you can amplify, bass boost, change pitch, fade in and out, and normalize audio. Most of the effects are self-explanatory and work as you’d expect. Some of the commands let you select specific settings when you click on the effect.&nbsp;</p> <p style="text-align: center;"><img src="/files/u162579/bass_boost.jpg" alt="Audacity Bass Boost" title="Audacity Bass Boost" width="321" height="178" /></p> <p style="text-align: center;"><strong>Pump up the bass!</strong></p> <p style="text-align: left;">Bass boost gives you control over Frequency and the amount of Boost. Other effects like Fade In and Fade Out simply alter the audio without any confirmation. Pay attention to the waveform and you’ll see it turn into a gradual fade. The expansive effects menu is one of Audacity’s greatest features. 
It’s the reason the program has been a freeware staple since its release.</p> <h3><span style="font-weight: normal;">Exporting the Finished Product</span></h3> <p style="text-align: center;"><img src="/files/u162579/export_menu.jpg" alt="Audacity Export" title="Audacity Export" width="600" height="429" /></p> <p style="text-align: center;"><strong>Where and how you want it are your choice.</strong></p> <p>Once you’re done editing, you’ll want to get your audio out of Audacity into a format that works for your project. Audacity supports a pretty large number of formats, although exporting as an MP3 requires an external codec. If all you want to do is get your file out as a WAV, FLAC, or any of the other available formats, you just have to go to File &gt; Export and select where you want it to be saved and the format you want it in.</p> <p>MP3 file exports are available after downloading the LAME MP3 encoder. It’s completely free, but can’t be distributed with Audacity directly because of software patents. Head over to the LAME download page and download the “Lame v3.99.3 for Windows.exe” installer. Start up the installer and don’t change the default destination of the program. Once it’s finished, try to export your Audacity project as an MP3 and you should be asked to find “lame_enc.dll”. Go to “C:\Program Files\Lame for Audacity” and select the DLL. 
Your project should export as an MP3 file, and you’re ready to enjoy your finished product in an audio player of your choice.</p> <p>You probably aren’t an audio editing expert yet, but hopefully you’re well on your way to editing out unwanted noise, adding fades to clips, and editing homebrew podcasts with Audacity.</p> <p><em>Follow Ben on <a href="http://twitter.com/benjkim" target="_blank">Twitter</a>&nbsp;and <a href="https://plus.google.com/u/0/+BenKimJ" target="_blank">Google+</a>.</em></p> http://www.maximumpc.com/audacity_crash_course_2014#comments audacity audio editor beginners crash course freeware music computer Software tutorial Features Tue, 19 Aug 2014 23:02:33 +0000 Ben Kim 27534 at http://www.maximumpc.com Build a PC: Recommended Builds (August 2014) http://www.maximumpc.com/build_pc_recommended_builds_august2014 <!--paging_filter--><h3>Budget, baseline, and performance PC builds!</h3> <p>What time is it? It's time to Build a PC with our Blueprints! This month, we've built three rigs at three approximate price points: Budget Gamer, Mid-Grade, and Turbo. That's right, we're mixing things up again. No more rotation of four systems into three slots. For the foreseeable future, there will always be a budget system in our Blueprints section. Yay!</p> <p><em>Prices listed here reflect print time</em>&nbsp;and may not match the ones you find elsewhere online. In addition, Newegg has jumped on board to offer packaged deals for each of the builds below in an attempt to offer a better overall value. To see these bundle prices, click the "Buy or get more info at Newegg" button at the bottom of each build. Feedback is welcome. 
Tell us what you think!</p> <p><em>Note: Some of the prices/links listed below may not show up properly if this page is ad-blocked.</em></p> <h2 style="text-align: center;"><strong>BUDGET GAMER</strong></h2> <div style="text-align: center;"><img src="/files/u160416/210_elite_black.jpg" alt="NZXT Source 210 Elite computer case" title="NZXT Source 210 Elite computer case" width="242" height="300" /></div> <div style="text-align: center;"> <div class="module-content" style="text-align: start;"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td style="text-align: center;" colspan="3"><strong>Ingredients</strong></td> </tr> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> <td><strong>Price</strong></td> </tr> <tr> <td class="item">Case</td> <td class="item-dark">NZXT Source 210 Elite</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16811146078&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$50</a></td> </tr> <tr> <td>PSU</td> <td>Corsair CX500, 500 watts</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16817139027&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$30</a></td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Biostar TA970&nbsp;</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16813138372&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$60</a></td> </tr> <tr> <td>CPU</td> <td>AMD FX-6300 3.5GHz</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16819113286&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$120</a></td> </tr> <tr> <td>CPU Cooler</td> 
<td>Cooler Master Hyper 212 Evo</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16835103099&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$35</a></td> </tr> <tr> <td>GPU</td> <td>Sapphire Dual-X Radeon R7 265</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16814202096&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$163</a></td> </tr> <tr> <td>RAM</td> <td>2x 4GB G.SKILL Ares Series DDR3/1600</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16820231544&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$72</a></td> </tr> <tr> <td>SSD</td> <td>Crucial MX100 128GB</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16820148819&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$80</a></td> </tr> <tr> <td>HDD</td> <td>Seagate Barracuda 1TB&nbsp;</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16822148840&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$65</a></td> </tr> <tr style="text-align: right;"> <td style="text-align: left;"><strong>Total = $675<br /></strong></td> <td style="text-align: right;" colspan="2"><strong>Click here to see the live bundle price:</strong>&nbsp;&nbsp;<a rel="nofollow" href="http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1806345&amp;cm_mmc=BAC-MaximumPC-_-Combo-1806345-Budget-_-MaximumPC_BluePrint-_-NA&amp;nm_mc=ExtBanner" target="_blank"><img src="http://www.maximumpc.com/sites/maximumpc.com/themes/omega_maximumpc/img/newegg.jpg" alt="buy online at newegg" /></a></td> </tr> </tbody> </table> </div> </div> <p 
class="MsoNormal" style="text-align: left;">For the first time in a while, we have reached equilibrium at the budget level. Each part on this list is pretty much the best bang for your buck. You could put a closed-loop liquid cooler (CLC) in here, but the Cooler Master Hyper 212 EVO is too good a value to pass up at this tier. Might as well put the extra cost of a CLC toward something else. If you’re prepared to spend about $700, we’d bump the SSD up to <a href="http://www.maximumpc.com/crucials_mx100_ssd_blazes_trail_550mbs_calls_it_mainstream_performance" target="_blank">a 256GB Crucial MX100</a>, which currently goes for $110. That’ll give gamers a lot more room to install their favorite games on a zippy storage device.</p> <p class="MsoNormal" style="text-align: left;"><strong>Note:</strong><span style="line-height: 15px;">&nbsp;We apparently snagged a few of these items on deep discount at the time that we assembled our list, so the Newegg live price might be a little higher.</span></p> <h2>MID-GRADE</h2> <p><img src="/files/u160416/c70_green.png" alt="Corsair Vengeance C70 computer case" title="Corsair Vengeance C70 computer case" width="228" height="300" /></p> <div class="module-content" style="text-align: start;"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td style="text-align: center;" colspan="3"><strong>Ingredients</strong></td> </tr> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> <td><strong>Price</strong></td> </tr> <tr> <td class="item">Case</td> <td class="item-dark">Corsair Vengeance C70</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16811139013&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$108</a></td> </tr> <tr> <td>PSU</td> <td>Silverstone Strider Gold S Series, 850 watts</td> <td><a 
href="http://www.newegg.com/Product/Product.aspx?Item=N82E16817256100" target="_blank">$100</a></td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Gigabyte GA-Z97X-UD5H</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16813128707&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$175</a></td> </tr> <tr> <td>CPU</td> <td>Intel Core i5-4690K</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16819116899&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$240</a></td> </tr> <tr> <td>Cooler</td> <td>Cooler Master Hyper 212 EVO</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16835103099&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$35</a></td> </tr> <tr> <td>GPU</td> <td>XFX Double D Radeon R9 280X 3GB&nbsp;</td> <td><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678" target="_blank">$250</a></td> </tr> <tr> <td>RAM</td> <td>2x 4GB G.SKILL Ares Series&nbsp;F3-1600C9D-8GAO</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16820231544&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$72</a></td> </tr> <tr> <td>Optical Drive</td> <td>Samsung SH-224DB/BEBE DVD Burner</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16827151266&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$20</a></td> </tr> <tr> <td>SSD</td> <td>Crucial MX100 256GB</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16820148820&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" 
target="_blank">$115</a></td> </tr> <tr> <td>HDD</td> <td>Seagate Barracuda 1TB ST1000DM003</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16822148840&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$65</a></td> </tr> <tr style="text-align: right;"> <td style="text-align: left;"><strong>Total = $1180<br /></strong></td> <td style="text-align: right;" colspan="2"><strong>Click here to see the live bundle price:</strong>&nbsp;&nbsp;<a rel="nofollow" href="http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1806344&amp;cm_mmc=BAC-MaximumPC-_-Combo-1806344-Baseline-_-MaximumPC_BluePrint-_-NA&amp;nm_mc=ExtBanner" target="_blank"><img src="http://www.maximumpc.com/sites/maximumpc.com/themes/omega_maximumpc/img/newegg.jpg" alt="buy online at newegg" /></a></td> </tr> </tbody> </table> </div> </div> <p style="text-align: left;"><span style="line-height: 150%;">The Strider Gold S, a fully modular 850-watt power supply from Silverstone, is reasonably priced, so it replaces the 750-watt semi-modular Seasonic unit we slotted last month. The extra juice better prepares this system for multiple video cards down the road. <a href="http://www.maximumpc.com/tags/devils_canyon" target="_blank">Intel’s Devil’s Canyon</a> Core i5-4690K arrives, replacing the i5-4670K. The new one’s base clock speed is 3.5GHz, 100MHz higher than before, and it will turbo to 3.9GHz. Radeon cards continue to fall in price, and the R9 280X is now within reach; it’s a better value at this tier than a <a href="http://www.maximumpc.com/tags/gtx_760" target="_blank">GeForce GTX 760</a>. But the 250GB <a href="http://www.maximumpc.com/samsung_840_evo_1tb_review" target="_blank">Samsung 840 Evo</a> at $160 is no longer competitively priced, so we’ve replaced it with the 256GB Crucial MX100, which isn’t as fast but is a much better value. 
</span></p> <p style="text-align: left;"><span style="line-height: 150%;"><strong>Note:</strong> We apparently snagged a few of these items on deep discount at the time that we assembled our list, so the Newegg live price might be a little higher.</span></p> <h2>TURBO</h2> <p><img src="/files/u160416/phantom530-1.jpg" alt="NZXT Phantom 530 computer case" title="NZXT Phantom 530 computer case" width="300" height="300" /></p> <div class="module-content" style="text-align: start;"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead></thead> <tbody> <tr> <td style="text-align: center;" colspan="3"><strong>Ingredients</strong></td> </tr> <tr> <td><strong>Part</strong></td> <td><strong>Component</strong></td> <td><strong>Price</strong></td> </tr> <tr> <td class="item">Case</td> <td class="item-dark">NZXT Phantom 530</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16811146107&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$130</a></td> </tr> <tr> <td>PSU</td> <td>Cooler Master Silent Pro M2, 1000 watts</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16817171076&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$180</a></td> </tr> <tr> <td class="item">Mobo</td> <td class="item-dark">Gigabyte GA-Z97X-UD5H</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16813128707&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner">$175</a></td> </tr> <tr> <td>CPU</td> <td>Intel Core i7-4790K</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16819117369&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner">$340</a></td> </tr> <tr> <td>Cooler</td> <td>Corsair Hydro Series 
H100i</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16835181032&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$95</a></td> </tr> <tr> <td>GPU</td> <td>EVGA GeForce GTX 780 3GB 03G-P4-3784-KR</td> <td><a href=" http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16814130951&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$530</a></td> </tr> <tr> <td>RAM</td> <td>4x 4GB G.SKILL Ripjaws F3-12800CL9Q-16GBRL&nbsp;</td> <td><a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16820231315&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$150</a></td> </tr> <tr> <td>Optical Drive</td> <td>LG WH14NS40 Blu-ray Burner</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16827136250&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$70</a></td> </tr> <tr> <td>SSD</td> <td>Samsung 840 Evo 500GB MZ-7TE500BW</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16820147249&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$260</a></td> </tr> <tr> <td>HDD</td> <td>Seagate Barracuda 3TB ST3000DM001</td> <td><a href="http://ad.doubleclick.net/clk;222199927;45833272;q?http://www.newegg.com/Product/Product.aspx?Item=N82E16822148844&amp;cm_mmc=BAC-MaximumPC-_-BOTB-_-NA-_-NA&amp;nm_mc=ExtBanner" target="_blank">$110</a></td> </tr> <tr style="text-align: right;"> <td style="text-align: left;"><strong>TOTAL = $2040<br /></strong></td> <td style="text-align: right;" colspan="2"><strong>Click here to see the live bundle price:</strong>&nbsp;&nbsp;<a rel="nofollow" 
href="http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1806346&amp;cm_mmc=BAC-MaximumPC-_-Combo-1806346-Performance-_-MaximumPC_BluePrint-_-NA&amp;nm_mc=ExtBanner" target="_blank"><img src="http://www.maximumpc.com/sites/maximumpc.com/themes/omega_maximumpc/img/newegg.jpg" alt="buy online at newegg" /></a></td> </tr> </tbody> </table> </div> </div> <p class="MsoNormal" style="text-align: left;">THIS BUILD PREVIOUSLY FEATURED a quad-core Ivy Bridge-E (IVB-E) CPU on the LGA 2011 platform, aka X79. With the zippy Devil’s Canyon CPUs available, we’ve switched to Intel’s Core i7-4790K. It’s a refresh of the company’s newer “Haswell” generation on the less expensive LGA 1150 platform. Since LGA 1150 is limited to 16 PCI Express lanes, whereas X79 has 40, the new mobo and CPU don’t handle three or more video cards nearly as well. But if you stick to “only” two video cards, you’d need a benchmark to see the difference between the two platforms. Like the i5-4690K, it’s a Devil’s Canyon chip, but this one starts at 4GHz and boosts to 4.4GHz. (We also don’t want to recommend an X79 system, since it will be retired within the next few months, in favor of the incompatible <a href="http://www.maximumpc.com/tags/haswell-e" target="_blank">LGA 2011-3, aka X99</a>.)</p> <p class="MsoNormal" style="text-align: left;">We’re also sticking with the GA-Z97X-UD5H motherboard at this higher tier, because its mixture of price, performance, and features is hard to beat. 
We could get a less expensive SSD, but money isn’t as much of a concern at this tier.</p> </div> http://www.maximumpc.com/build_pc_recommended_builds_august2014#comments affordable august 2014 blueprint budget Build a PC cheap computer performance Recommended Builds Features Mon, 18 Aug 2014 22:02:34 +0000 The Maximum PC Staff 28371 at http://www.maximumpc.com 11 Awesome Tips and Tricks to Become a Google Maps Guru http://www.maximumpc.com/11_awesometips_and_tricks_become_google_maps_guru_2014 <!--paging_filter--><p><img src="/files/u69/google_maps_guru.jpg" alt="Google Maps Ninja" title="Google Maps Ninja" width="228" height="152" style="float: right;" /></p> <h3>Never get lost again with Google Maps</h3> <p>Assuming you have an Internet connection and can read this -- and who doesn't these days? -- then there's a strong possibility you're at least a little bit familiar with Google Maps. Maybe you use it to look up driving directions before heading to a concert at the other end of the state, or fire it up to find a gas station when the needle creeps uncomfortably close to E. But did you know you can use Google Maps for suggestions on what to do when you're in a new area? Or zoom in or out with one hand?</p> <p>Google Maps is constantly changing (for the better), with new and enhanced features being added at an almost breakneck pace. It's pretty mature at this point, but if all you're doing is typing in directions, you're missing out on just how slick this piece of software is.</p> <p>The good news is, you've come to the right place. <strong>We've put together a gallery of 11 gnarly tips and tricks that will level up your Google Maps-fu to Guru status</strong>. 
Let's get started!</p> http://www.maximumpc.com/11_awesometips_and_tricks_become_google_maps_guru_2014#comments directions gallery google maps navigation Software tips tricks Features Thu, 14 Aug 2014 22:07:55 +0000 Paul Lilly and Jimmy Thang 28226 at http://www.maximumpc.com 4K Monitors: Everything You Need to Know http://www.maximumpc.com/4k_monitor_2014 <!--paging_filter--><h3>Ultra HD (UHD) is the next-gen PC resolution—here’s why you have to have it</h3> <p>Dream Machine 2013 had some bitchin' hardware, but most of it was available at retail for any well-heeled hardware hound. One part, though, was not available to the unwashed masses: its glorious 4K monitor. You see, 4K was an other-worldly resolution back in mid-2013, simply because it offered four times the resolution of 1080p—at a wallet-busting expense of $3,500.</p> <p>Now, though, 4K is available and relatively affordable, and all modern games support it, making it one hell of an upgrade. Over the next pages, we'll tell you all about 4K, show you what you need to run it at its maximum output, and explore 4K gaming benchmarks, too. But as sweet as it is, it's not for everyone, so read this guide before making the move.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/4k_04_small_0.jpg"><img src="/files/u152332/4k_04_small.jpg" width="620" height="576" /></a></p> <h3>What is 4K?</h3> <p><strong>a slight misnomer, but catchier than ultra hd</strong></p> <p>To put it simply, 4K doubles the standard 1920x1080 resolution both vertically and horizontally to 3840x2160, which is quadruple the pixels. We can already see you folding your arms and scanning the page for a downvote button, saying, “That’s obviously not true 4K. It only sums up to 3,840 pixels.” “True” 4K resolution is a term used in the movie industry. When you hear about movies being shot in 4K, they’re typically shot at 4096x2160 with an aspect ratio of 17:9. 
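If you want to check that math yourself, the pixel counts are easy to tally with a quick Python sketch (the resolution figures come straight from the discussion above):

```python
# Tally the pixel counts behind the "quadruple the pixels" claim.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "UHD 4K (PC/TV)":  (3840, 2160),
    "DCI 4K (cinema)": (4096, 2160),
}

full_hd_pixels = 1920 * 1080  # 2,073,600 pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd_pixels:.2f}x Full HD)")
```

Run it and you'll see UHD lands at exactly 4.00x the pixels of Full HD, while cinema 4K is slightly wider at about 4.27x.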
On the PC side, we generally run with television makers, who have mostly settled on a resolution of 3840x2160, which uses the same 16:9 aspect ratio as 1080p. Despite this being far short of 4,000 pixels horizontally, television and monitor makers have all settled on 4K as the term to push, rather than Ultra HD. In other words, we don’t make up the buzzwords, so hate the game, not the player.</p> <p>In a historical context, 4K is simply the next rest stop along the path of technological progress. Way back in the day, we ran our displays at 640x480, then 800x600, then 1024x768, and then 1280x1024, and so on. As graphics cards became more powerful, we were slowly able to bump up our display resolutions to where they are now, which for a large majority of gamers is 1920x1080, or 1080p. Society has settled on 1080p as the go-to resolution right now, simply because those monitors (and TVs) are very affordable, and you don’t need a $500 graphics card to run today’s games at that resolution.</p> <p>Not everyone is average, though. Many enthusiasts today run 27-inch or 30-inch monitors at much higher resolutions of 2560x1440 or 2560x1600, respectively. That may seem like a step up from 1920x1080, but a GeForce GTX 780 Ti or Radeon R9 290X isn’t even stressed by 2560x1440 gaming. Factor in PCs with multiple GPUs, and you start to wonder why we’ve been stuck with 2560x1600 for more than seven years, as everything else has leapt forward. We won’t question that wisdom, but we do know that it’s time to move forward, and 4K is that next step. Looking ahead, the industry will eventually move to 8K, which quadruples the pixels, and then 12K, and so forth. In fact, some vendors already demonstrated resolutions way beyond 4K at CES 2014, including Nvidia, which ran three 4K panels at 12K using four GTX Titans in SLI. 
For 2014 and beyond, though, 4K is the new aspirational resolution for every hardcore PC gamer.</p> <h4>It’s All About the Pixels Per Inch</h4> <p>You know how every time we pass around the sun once more and it’s a new year, people joke about their “New Year’s Resolution” being some sort of super-high monitor resolution? Well, we do it, too, because as hardware aficionados there’s always room to grow and new boundaries to push. We want our hard drives to get bigger right alongside our displays, so the move into 4K is something we have been looking forward to for more than a year; as resolution scales up, so does the level of detail that is rendered on the screen. The official term for this spec is pixels per inch, or PPI, and it’s a good bellwether for how much detail you can see on your display.</p> <p>To see how PPI varies according to screen size, let’s look at a few examples. First, a 24-inch 1920x1080 display sports roughly 91 pixels per inch. If you bump up the vertical resolution to 1,200 pixels (typical on some 16:10 ratio IPS panels), you get a PPI of 94. If you crank things up a notch to 2560x1440 at 27 inches, the PPI goes to 108, which is a small bump of about 20 percent, and probably not very noticeable. Moving on to 2560x1600 on a 30-inch panel, you actually get lower density, arriving at a PPI of 100. To put this in the context of a mobile device, the Google Nexus 5 smartphone has a 4.95-inch display that runs a resolution of 1920x1080, giving it a crazy-high PPI of 445. The iPad’s 9.7-inch screen delivers a PPI of 264, and the iPhone 5’s PPI is 326 pixels. 
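If you want to sanity-check those figures, PPI is just the length of the screen diagonal in pixels divided by the diagonal in inches. A quick sketch in Python:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: the diagonal measured in pixels over the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The displays quoted in this article:
for label, w, h, d in [
    ("24-inch 1080p", 1920, 1080, 24),
    ("27-inch 1440p", 2560, 1440, 27),
    ("30-inch 1600p", 2560, 1600, 30),
    ("31.5-inch 4K",  3840, 2160, 31.5),
    ("Nexus 5",       1920, 1080, 4.95),
]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")
```

The results match the figures quoted above to within a pixel per inch of rounding.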
Clearly, there’s a lot of room for improvement on the PC, and 4K is a step in the right direction.&nbsp;</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/nvidia-geforce-gtx-4k-batman-arkham-origins-4k-versus-hd_small_0.jpg"><img src="/files/u152332/nvidia-geforce-gtx-4k-batman-arkham-origins-4k-versus-hd_small.jpg" alt="The difference between Full HD 1080p and 4K is huge, and noticeable. " width="620" height="337" /></a></p> <h4>Pixel Peeping</h4> <p>Now, let’s look at the PPI for a few of these fancy new 4K monitors that are coming down the pike. We’ll start with the model we used for <a title="Dream Machine 2013" href="http://www.maximumpc.com/dream_machine_2013" target="_blank">Dream Machine 2013</a>, which is an Asus PQ321Q. With its resolution of 3840x2160 spread across 31.5 luscious inches, its PPI is a decent 140, a noticeable increase from a 2560x1600 display. Not enough for you? Dell has a new 24-inch 4K monitor dubbed the UP2414Q that shrinks your icons for you while retaining their sharpness. Still, it has the highest PPI yet for the desktop panel at a skyscraping 185 pixels. Slightly below the Dell is a new Asus LCD named the PB287Q, which at 28 inches has a modest PPI of 157 pixels. Keep in mind that in some cases, this is a 50 percent increase in the number of pixels per inch, so when you tally that up across the entirety of the display, it equals quite a few more pixels, which results in a lot more detail that is visible even to the naked and semi-clothed eye.</p> <hr /> <p>&nbsp;</p> <h4>Why it’s a big deal</h4> <p>Just like going from a 24-inch 1080p monitor to a 30-inch 1600p monitor is a life-changing experience, so is going to a 32-inch or smaller 4K panel. The level of detail you can see on the screen is surprising, and when you fire up Battlefield 4 for the first time, you’ll most likely be staring at the screen with your mouth open, and not just because the server dropped you again. 
Gaming at 4K looks simply incredible. And unlike television, where there’s a dearth of content, almost all PC games support the resolution, and some developers are even including higher resolution textures now, too.</p> <p>Of course, both Nvidia and AMD are also pushing 4K because you need one hell of a GPU setup to push those pixels around. Since 4K is still extremely new and not quite ready for prime time, the hardware required to run it is in the same embryonic stage, which translates in layman’s terms to “almost there but not quite.”&nbsp; Even a GeForce GTX Titan with its 6GB frame buffer or a Radeon R9 290X and its 4GB frame buffer can barely eclipse 30fps at 4K with all settings maxed. Sure, you can turn down some of the settings and get a flagship GPU to run pretty well at 4K these days, but we’d rather castrate ourselves with a soldering iron than turn down the eye candy. We didn’t spend $3,500 on a monitor, or have our friends die face-down in the muck, to turn down graphics settings, so we’re not budging on that. With all settings turned up, gaming at 4K is truly cutting-edge, and is really the only application that currently stresses today’s crop of high-end GPUs, aside from a multi-monitor setup. Today, getting any single GPU to run 4K at the magical 60fps is not possible. There’s even a telling statement on the Nvidia website: “In order to power games at this resolution (4K) with settings turned up, NVIDIA recommends GTX 780 SLI or better.”</p> <h3>Is 4K 'Retina'?</h3> <p>First off, you have got some balls to compare a glorious 4K display to a marketing term such as "retina display." However, for the sake of argument, we'll humor you. As noted elsewhere, a 4K display can have as many as 185 pixels per inch (PPI), which is almost double what is found in today's 1080p displays. However, the term "Retina" as coined by The Jobs is usually more than 200 PPI for a notebook, and more than 300 for a mobile display. 
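Those PPI thresholds are really statements about viewing distance. Assuming the common one-arcminute visual-acuity rule of thumb (our simplification, not Apple's published math), you can estimate the distance beyond which a given pixel density looks "Retina":

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)  # rough limit of human visual acuity

def retina_distance_in(ppi):
    """Distance (inches) beyond which a single pixel subtends less than one arcminute."""
    return (1 / ppi) / math.tan(ONE_ARCMINUTE)

for ppi_value in (91, 140, 326, 441):
    print(f"{ppi_value} PPI looks 'Retina' beyond about {retina_distance_in(ppi_value):.0f} inches")
```

By this estimate, the iPhone's 326 PPI blurs into "Retina" territory at around 10 to 11 inches, while a 24-inch 1080p desktop panel needs nearly 38 inches of distance, which is why handheld displays chase such high densities.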
You see, a PPI rating's significance all comes down to how close you are to the display. Apple defines a Retina display as having enough pixels that the human retina can't distinguish between them, which is quite easy to pull off at a distance of 16 inches, but much less so at six inches. Therefore, mobile displays, which are held closer to your face, oftentimes have crazy-high PPI ratings. Interestingly, despite being the first to heavily push high-PPI displays, Apple has been out-Retina’d these days. The Samsung Galaxy S4 has a PPI of 441, and the Vivo Xplay sits at an insane 490 PPI. The iPhone, with its rating of 326, actually isn't even in the top ten of high-PPI devices. Still, if you ask the average Joe, he’ll say Retina is better. The bottom line: At a far enough distance, everything is a retina display, because pixels are indistinguishable.</p> <h3>Go Big or Go Home</h3> <p><strong>Full HD, Ultra HD, HD HD—what does it all mean?</strong></p> <h4>HD 720p</h4> <p>A resolution with more than 700 horizontal lines (1280x720) was the original “HD” resolution and was used to sell a zillion television sets the world over. On a 1600x900 20-inch display, you get a reasonable PPI of 92.</p> <h4>Full HD 1080p</h4> <p>After vanilla HD came Full HD, which cranked it up a notch to 1920x1080 resolution. Full HD is just a marketing term, though, as there’s no HD-sanctioning body. Full HD on a 23-inch panel delivers a PPI of 96, so, not much better than HD.</p> <h4>Quad HD</h4> <p>Though usually not referred to by its proper name, Quad HD refers to a panel featuring 2560x1440 resolution, which is four times the pixels of HD. A 27-inch panel at this res features a so-so PPI of 108.</p> <h4>Ultra HD</h4> <p>Often sold as “4K,” Ultra HD is actually 3840x2160. It is four times the resolution of Full HD, and features a PPI of 138 on a 32-inch panel. 
The term refers to how professional film is produced and projected though, so it’s not really a PC term since PC displays are slightly less than 4K at 3840x2160.</p> <h3>4K confusion cleared up</h3> <p><strong>How to pick the right 4K monitor</strong></p> <p>It used to be easy to pick a monitor when your biggest decision was choosing between an IPS or TN panel, and your choice at the high end was either 24 or 30 inches. Today, it isn’t so easy. Besides the thorny question of whether to choose an IPS model for its superior color accuracy and off-axis viewing or going with a speedier TN panel, you now have to factor in very high refresh rates, pixel density, resolution differences, and even such technology as Nvidia’s new G-sync. We can’t pick for you, but we can help you make your decision.</p> <p>As with all things in computing, there is no one-size-fits-all product. How much monitor you need depends on your specific usage. Are you a gamer? A content creator? A multi-tasker? On a budget, or a baller like Carlos Slim?</p> <p>For a professional or advanced amateur editing photos or video, the color accuracy of TN panels still isn’t good enough. Today’s budget 4K panels, such as Dell’s $699 P2815Q or Asus’s $799 PB287Q, both use TN panels, so pixel peepers will want to move along. The Dell P2815Q also features a major flaw in that its refresh rate is limited to 30Hz at its native resolution. For professionals, the only real answer for now is to go big (and expensive) with the Asus 32-inch PQ321Q for $3,000, or go dense with the 24-inch Dell UP2414Q for $1,300. The Asus model uses an IPS panel from Sharp with indium gallium zinc oxide (IGZO) to help it pack the pixels so closely. The Dell is also an IPS panel, but other details of the panel technology have yet to be disclosed. Both will hit 60Hz, but you’d better have a gnarly GPU or two if you want to use these panels for gaming. The Dell’s pixel density is to die for, with 183 pixels per inch. 
That’s about double that of a standard 24-inch 1920x1080 panel. There are PC panels that are denser, but not in a desktop form factor. Remember: You’ll need bionic vision, too, if you intend to use these monitors without scaling cranked up a few notches, because windows and icons will look like miniatures—and we’re not happy with how Windows scales up right now.</p> <p>That brings us to the refresh rate debate. For gamers, 60Hz IPS or TN panels are OK, but if you’ve ever played on a 120Hz panel with a powerful GPU pushing it, you know just how beautifully blur-free they can be. We dare say it, if we gamed more than we edited photos or videos, we’d take a pass on the lowly 60Hz panels. The problem with high-refresh monitors has been their pedestrian resolution of 1920x1080 in 24 inches. There are 120Hz 27-inch monitors as well, but their 1920x1080 resolution gives them a remarkably low pixel density of just 81 PPI. Asus thinks it has the gamer’s ultimate fantasy monitor with its new ROG Swift PG278Q. This 27-inch TN monitor has a respectable resolution of 2560x1440 and a refresh rate of 120Hz. While its pixel density doesn’t approach that of a 4K monitor, the 120Hz refresh may compensate for gaming purposes—for those with hefty GPUs.</p> <p>For those with lesser graphics cards, though, the Asus Swift monitor also boasts Nvidia’s new proprietary G-sync technology. (The Titan, 7-series, and several 6-series are supported in G-sync.) This tech syncs the monitor’s refresh rate to the GPU’s rendering, translating to smoother and sharper images, even if the frame rate dips below 30fps. G-sync, of course, won’t work with AMD cards, but for gamers not hung up on color accuracy or off-axis viewing, the ROG Swift might be the ultimate monitor right now in the green camp. And yes, we know AMD has talked of FreeSync—the free method to sync refresh with GPU rendering. 
It’s just not clear if FreeSync will work with desktop monitors yet, although it is promising on laptops, which are typically fairly low-powered in the graphics department.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/rog-swift-pg278q_right_small_2.jpg"><img src="/files/u152332/rog-swift-pg278q_right_small_0.jpg" alt="The Asus ROG Swift is the first 2560p monitor with a 120Hz refresh rate." title="Asus ROG Swift" width="620" height="603" /></a></p> <p style="text-align: center;"><strong>The Asus ROG Swift is the first 2560p monitor with a 120Hz refresh rate.</strong></p> <p>What about a 120Hz panel that also runs at 4K? That’s coming too, but remember that you’ll need an inordinate amount of graphics grunt to push twice the pixels of a single 4K panel.</p> <hr /> <p>&nbsp;</p> <h3>What you'll need to run 4K</h3> <p><strong>We hope you got a huge tax refund</strong></p> <h4>You'll need new cables</h4> <p>If you're like us and have been running DVI or dual-link DVI for the past—oh, we don't know, forever?—4K requires a different connection, as the Digital Visual interface tops out at 2560x1600 at 60Hz—far short of 4K’s needs. To run 4K resolution, you will need to run either DisplayPort 1.2 or HDMI 2.0. For those on the DisplayPort train, version 1.2 is available today and will let you run 4K at 60Hz when using a Multi-Stream Transport (MST) mode. In MST mode, the graphics card generates several signals, or "streams," which are combined over the DisplayPort cable in order to run the panel at 60Hz. If you were to use a DisplayPort cable and run the panel in Single Stream Transport (SST) mode, you would top out at 30Hz. If you're more interested in running HDMI for some reason, it’s more complicated. A single HDMI 1.4 connection is only able to hit 30Hz at 4K resolution, which is unacceptable. Some posters on our website have said "30Hz is fine for porn and web browsing," but we disagree. 
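That 30Hz ceiling is simple arithmetic. Here's a back-of-the-envelope sketch of the uncompressed video bandwidth, counting active pixels only and ignoring blanking overhead (which pushes the real link requirement somewhat higher), against each spec's usable data rate:

```python
# Uncompressed video bandwidth for active pixels only; real links also carry
# blanking intervals, so the true requirement is somewhat higher.
def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_DATA_GBPS = 8.16  # 10.2 Gbps raw TMDS minus 8b/10b encoding overhead
DP_1_2_DATA_GBPS = 17.28   # HBR2: 21.6 Gbps raw minus 8b/10b overhead

for hz in (30, 60):
    need = video_gbps(3840, 2160, hz)
    print(f"4K at {hz}Hz needs ~{need:.1f} Gbps: "
          f"HDMI 1.4 {'OK' if need <= HDMI_1_4_DATA_GBPS else 'no'}, "
          f"DisplayPort 1.2 {'OK' if need <= DP_1_2_DATA_GBPS else 'no'}")
```

4K60 needs roughly 12 Gbps of pixel data, comfortably beyond HDMI 1.4's ~8.16 Gbps but within DisplayPort 1.2's 17.28 Gbps, which is exactly why DP 1.2 can hit 60Hz while HDMI has to wait for 2.0.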
Just dragging a window around the screen causes it to shear and stutter in a manner similar to how it looks when you run your PC without graphics drivers installed. Some monitors and GPUs allow dual-HDMI connections to achieve the bandwidth needed, but it’s a kludge and few support it. The fix for HDMI will come with HDMI 2.0, which will easily allow 60Hz at 4K, as well as multi-channel audio, but no monitor or GPU we know of currently has the new interface. So, be sure to verify what your panel supports before buying; if you get a 30Hz panel, you will be very, very sorry. And forget about trying to game on that thing.</p> <h4>The monitors</h4> <p>We're still in the beginning stages of 4K monitor growth. Throughout the year, you should see 4K panels offered from all the usual suspects. The good news is that prices have already dropped from the $3,500 mark down to under $1,000, and we expect many more manufacturers to be offering panels in this lower price range. Whether or not we'll get an affordable 4K IPS panel is a different story, although for gaming, TN is fine. The Dell 24-inch IPS panel is relatively affordable at $1,299—just don't expect to see it hit the $500ish prices of today's 27-inch and 30-inch panels until at least 2015, if not later.</p> <h4>The GPUs</h4> <p>If you thought purchasing a 4K monitor was financially painful, you ain't seen nothing yet. That transaction was merely foreplay for the real pain and suffering that will occur when you have to buy enough graphics firepower to run that display at its native resolution on today's games. As we stated earlier, Nvidia itself recommends at least two GTX 780 cards in SLI, so that's $1,000 worth of GPU, on top of the $800 to $3,500 for the monitor. The cheapest way to get into the 4K ballgame at this point would be to buy two Radeon R9 290 cards—assuming you can even find them for sale anywhere—which will set you back $800. Or you could get two GTX 780s, which will cost you roughly $1,000. 
You can pretty much forget about anything less powerful than these $400 to $500 GPUs though, as we can guarantee you they won't pack enough of a punch to drive a 4K display at anywhere close to 60fps. Even the last generation of dual-GPU boards, such as the GTX 690 and Radeon HD 7990, aren’t up to the task on their own.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/nvidia-geforce-gtx-battlebox-sli-bridge_small_0.jpg"><img src="/files/u152332/nvidia-geforce-gtx-battlebox-sli-bridge_small.jpg" alt="In most cases, the price of a 4K monitor will pale in comparison to the GPUs needed to game on it." width="620" height="512" /></a></p> <p style="text-align: center;"><strong>In most cases, the price of a 4K monitor will pale in comparison to the GPUs needed to game on it.</strong></p> <h4>4K benchmarks</h4> <p>Before you look at the benchmark chart below, we recommend that you walk over to your PC and put a blanket over it. Seriously, it doesn't want to see you staring at these benchmark numbers. When you see how incompetent even the most high-end GPUs on the planet are for running 4K, it will probably make your PC seem, well, inadequate. What we mean is, look at these numbers. Even a $700 GeForce GTX 780 Ti can only hit 23fps in Unigine Heaven 4.0 with everything maxed out, and it hits only 19fps in Metro: Last Light. If there's one takeaway from this benchmark chart, it's this: Most of today's high-end GPUs are still not capable of running 4K at an acceptable level of performance. We're sorry, but that is a fact. Sure, all these games are playable—some more than others—but none of these cards, or combinations thereof, could hit 60fps in any of the games we chose for benchmarking. 
For this generation of GPUs, this is the reality.</p> <p>Since 4K is gaining so much traction, it's very likely that whatever is coming from AMD and Nvidia will be better equipped to handle this resolution, and we certainly hope it is.&nbsp;</p> <p>We've heard nothing about what AMD has up its sleeve, as we expect its Hawaii cards to have a long shelf life. The impending Mantle API update should give the cards a shot in the arm, so to speak. As with Nvidia, though, the current generation is barely capable of running 4K, so we can expect the next-generation cards to be much more capable. The good news is that by the time these newfangled cards arrive, we should have a whole flock of new 4K panels on offer, so it'll be glorious times for well-heeled PC gamers.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 780 Ti</td> <td>Nvidia GTX 780</td> <td>Nvidia GTX Titan</td> <td>AMD Radeon R9 290X</td> <td>AMD Radeon R9 290X Crossfire</td> <td>Nvidia GTX Titan SLI</td> <td>EVGA GTX 780 ACX SLI</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">332.21</td> <td>332.21</td> <td>332.21</td> <td>13.12</td> <td>13.12</td> <td>332.21</td> <td>332.21</td> </tr> <tr> <td>Unigine Heaven 4.0 (fps)</td> <td>23</td> <td>23<br /><strong>&nbsp;</strong></td> <td>21</td> <td>18</td> <td>17</td> <td><strong>37</strong></td> <td>30</td> </tr> <tr> <td class="item">Unigine Valley 1.0 (fps)</td> <td class="item-dark">30</td> <td>30</td> <td>28</td> <td>26</td> <td>23</td> <td><strong>37</strong></td> <td>16</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>28</td> <td>26<br /><strong>&nbsp;</strong></td> <td>25</td> <td>27</td> <td><strong>52</strong></td> <td>44</td> 
<td>44</td> </tr> <tr> <td>Metro: Last Light (fps)</td> <td>19</td> <td>17<strong><br /></strong></td> <td>17</td> <td>18</td> <td><strong>29</strong></td> <td>26</td> <td>26</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>40</td> <td>36</td> <td>35</td> <td>38</td> <td><strong>64</strong></td> <td>60</td> <td>57</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td>44</td> <td>43</td> <td>41</td> <td>38</td> <td><strong>66</strong></td> <td>57</td> <td>57</td> </tr> <tr> <td>Hitman: Absolution (fps)</td> <td>44</td> <td>39</td> <td>40</td> <td>44</td> <td><strong>75</strong></td> <td>55</td> <td>55</td> </tr> </tbody> </table> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. Our monitor is a 32-inch Sharp PN-K321. All games are run at 3840x2160 with no AA. <br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/4k_monitor_2014#comments 1080p 3840x2160 4k april issues 2014 HDTV monitor resolution tv uhd ultra high definition News Features Wed, 13 Aug 2014 21:28:53 +0000 Josh Norem 28140 at http://www.maximumpc.com Minecraft Beginner's Guide http://www.maximumpc.com/minecraft_beginners_guide_2014 <!--paging_filter--><h3><img src="/files/u160391/minecraft.jpg" alt="Minecraft" title="Minecraft" width="250" height="150" style="float: right;" /></h3> <h3>How to get into Minecraft</h3> <p>Minecraft is a veritable juggernaut in the PC gaming world, with a bustling mod community, dedicated Let's Play streamers, and hundreds of variations on play to keep things fresh. Nearly everywhere you go, even in department stores, you see the gaping mouths of Creepers, blank stares of Steve heads, and even diamond pickaxe styluses.</p> <p>It's a phenomenon that's only picking up steam, so what better time than now to get your hands dirty in the wide world of Minecraft? 
Whether you've been digging up informational videos here and there and have a basic understanding of the world or you've never even survived your first night, we're here to help you out. Grab a shovel, your best avatar skin, and let's get started.</p> <p><strong> 1. Getting Minecraft </strong></p> <p>The first step, of course, is procuring a copy of the game. You can purchase a copy at the official website (<a title="minecraft.net" href="https://minecraft.net/" target="_blank">minecraft.net</a>) for $26.95 or you can pick up a game card at the store for the same amount. You can also purchase gift codes from the website, just in case you happen to need a mining buddy. Registering for an account is free, and if you'd simply like to test the waters, there is a <a title="minecraft demo" href="https://minecraft.net/demo" target="_blank">Minecraft demo</a> available for trial purposes.</p> <p><strong> 2. Getting Acquainted</strong> </p> <p>Once you've gotten everything installed and customized to your liking, it's time to get started. Choose the Singleplayer button to create your very first world. It will be a completely randomized spawn, so you will be working with the luck of the draw. Choose Create New World and you'll be prompted to name your new stomping grounds. If you can't come up with any inventive names, don't worry. You can always alter it later. After creating a world, you’ll need to choose a game mode.</p> <p><strong>Game Modes </strong></p> <p><strong>Survival:</strong> Minecraft players usually flock to this game mode, commonly viewed as the "standard" version. Monsters will attack you at night, and you must create shelter, find sustenance, and craft items to stay alive. You will likely die, but if you do, you simply respawn. 
You can choose between difficulties within Survival Mode as well.</p> <p><strong>Creative:</strong> If you're enamored with the ornate and elaborate creations you see time and time again in the Minecraft community, then Creative mode is for you. Think of it as a safe "sandbox" with unlimited resources, building materials, and no survival elements to get in the way of your genius.</p> <p><strong>Hardcore:</strong> If you're reading this guide, chances are you won't want to choose this challenging game mode. It's aimed at experienced miners and those looking to sharpen their "pro skills." If you die, everything will be lost. Think carefully before choosing this mode, as it may be more than you bargained for.</p> <p>After you've chosen a game type, click Create New World and your very first Minecraft kingdom will be generated.</p> <p><strong> 3. The Basics </strong></p> <p>Depending on what was randomly generated for your world, you'll now find yourself smack dab in the middle of verdant greenery, a beachside scene, a desert, or even snow-capped mountains. This terrain is now yours to begin cultivating as you see fit. It's the start of a brand new day, and a cycle that you will want to become acquainted with. Each day cycle lasts ten minutes, so it's prudent to remember to complete important tasks during the day, since monsters roam the countryside at night.</p> <p style="text-align: center;"><img src="/files/u154082/minecraft_ui.png" alt="minecraft UI" title="minecraft UI" width="620" height="342" /></p> <p style="text-align: center;"><strong>The basic Minecraft UI</strong></p> <p>Let's take a look at the user interface quickly before advancing. The bottom of the screen with the boxes is a quick look at your inventory. You start off with nothing, but once you pick something up, your inventory boxes start to fill up quickly. Above the boxes are hearts to the left and what look like delicious pieces of meat. 
The hearts represent your HP, and the meat on the right is your hunger meter. Periodically you must eat in order to keep yourself full, or you will start to lose hearts. When your hearts reach zero, you'll die. When your hunger is low, you may not sprint, either.</p> <p>Additional bars you may see when playing include the armor bar, which will appear above your health meter. This displays the integrity of the armor you have equipped. You may note a bar that looks akin to the progress bar in any standard RPG. That's your experience level, displaying your progress toward the next level. You earn experience by collecting glowing green orbs that are dropped when you kill something.&nbsp;</p> <p style="text-align: center;"><img src="http://static.gamesradar.com/images/mb/GamesRadar/us/Games/M/Minecraft/Everything%20else/Beginners%20guide/Minecraft%204%20-%20inventory%20and%20wood%20block--article_image.jpg" alt="minecraft inventory" title="minecraft inventory" width="542" height="516" /></p> <p>You can open your full inventory screen by pressing E. When your toolbar is full, you can go here to see what else you have in your possession. The toolbar can be seen in this grid as well, and for quick access you can click and drag from here. To the left of your character are slots for armor that you will want to equip later on in the game.</p> <p>The crafting area (2 x 2 squares) is for quick crafting. You can slot raw materials here to create specific items. Organizing your items from the inventory is simple. Use the left mouse button to pick up an entire stack of items. Drag it over to another slot to move it. Double-click to collect items into one stack. Alternatively, you can use the right mouse button to pick up half of a group of items, place one from the group into an empty slot, and hold and drag to place the items across multiple slots.</p> <p>By now you should have a clearer picture of the user interface and how specific parts of the game work. 
It's time to use the daylight to our advantage!</p> <p><strong> 4. Survive the Night! </strong></p> <p>The night is dark and full of terrors. Melisandre had something there. She must have been referring to Minecraft, because the baddies all come out at nightfall. You could technically just dig a hole and seal yourself in, but our method will end up being much more convenient and helpful. We have 10 minutes to gather resources and create some sort of makeshift shelter, so let's make the most of it.</p> <p>Start by looking around for trees. Trees can be "punched" using your fist and will break down into wood blocks. Just hold down the mouse button and punch away until a block of wood topples down. Keep doing this and repeat with several other trees for a quick score of wood. Keeping 5-10 blocks of wood will be more than enough for our crafting purposes. While punching trees, keep a lookout for sheep as well. You'll need them for wool, so any you find wandering the landscape will need to die a swift death.</p> <p><iframe src="//www.youtube.com/embed/MmB9b5njVbA" width="560" height="315" frameborder="0"></iframe></p> <p>Once you've topped up your wood supply, it's time to start crafting. Crafting is integral to survival, so it's best to start learning it now. Go into your inventory screen and choose the entire stack of wood blocks you gathered earlier. Drag it into one square of the 2 x 2 crafting area in your inventory. You'll see a new item appear in the output box beside those four squares. These are wooden planks, and they're important building blocks for other equipment you'll need in the future. Right now we need to build a crafting table to make more advanced objects. Take four wooden planks and fill all four squares of the crafting area. Ta-da! You now have a crafting table.</p> <p>Go ahead and set the crafting table in your toolbar, then close the inventory and select it.
The crafting table will appear and you'll see a 3x3 square in which you can place materials. Take two planks and place one on top of the other. This will create wooden sticks. Once you have a series of planks and sticks, arrange three planks in the top three squares of the crafting table. Place two sticks stacked vertically beneath the middle planks.</p> <p>Congratulations! You've just created your very first tool. You can use the wooden pickaxe to accomplish a number of things like digging, chopping, and killing creatures. Once equipped with a pickaxe, you can chop down more trees for more supplies. It's always a good idea to carry more than you need, especially when you don't know what you might be up against.</p> <p>Once you've got your tools and some extra supplies squared away, it'll be time to start looking for a suitable location to stay for the night. While searching for a good place to camp, keep in mind that you shouldn't wander too far from landmarks you recognize or from your original spawn point. While exploring, it can be prudent to keep an eye on your surroundings, leaving a breadcrumb trail of blocks if you need to.</p> <p>We're going to scout out a place to build your own happy little home. Look out for a nice hill or cliff you can carve into. We're going for function over aesthetic value, after all. On your way to finding some prime real estate, be on the lookout for items like coal (black specks on rock), sand, sheep, and more trees for wood. Collect as much as you can, as these items will prove quite beneficial in the long run.</p> <p>When you've settled on a suitable location, start digging with your pickaxe. Leave a space for a door and enough room inside to set up your crafting table. Once you've set up the crafting table in your new pad, open it up and make some torches (a lump of coal placed atop a stick yields four of them). Torches are extremely important.
The light will keep monsters at bay and shine into the darker areas you find yourself in, like the mines you will eventually create. Place torches all around your room, and revel in the brightness of the sanctuary you've created. Neat!</p> <p>Now you need a door to keep the nasties out. Use six wooden planks and fill up the first two columns of your crafting table, vertically. You should have a suitable area for your door in your small home, so right-click on the floor in the doorway area and place the door. Make sure it's closed, and voila! You're ready to have a fun-filled night at home in your very first Minecraft house. You shouldn't venture out into the dark if you've been slow about gathering your items. If you have some daylight to kill, it's safe to putter about in your home to see what you can accomplish as far as upgrades and augments.</p> <p>For now, this is the basic setup and enough to get you through the night alone. See, that wasn't so hard, was it?</p> http://www.maximumpc.com/minecraft_beginners_guide_2014 Features Wed, 13 Aug 2014 21:07:00 +0000 Brittany Vincent Best Cheap Graphics Card http://www.maximumpc.com/best_cheap_graphics_card_2014 <!--paging_filter--><h3>Six entry-level cards battle for budget-board bragging rights</h3> <p>The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread and butter of the market is made up of models that cost between $100 and $150.
These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.</p> <p>This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models courtesy of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!</p> <h3>Nvidia's Maxwell changes the game</h3> <p>Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts, but that number fluctuates up and down according to spec. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted.
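To put those wattage numbers in context, here's a back-of-the-envelope sketch of the power budget available to these cards. The 75W ceilings for the x16 slot and the six-pin connector are PCIe spec figures rather than numbers from this article, and the constant and function names are our own:

```python
# Power budget available to a graphics card, in watts.
# PCIe spec ceilings: the x16 slot itself supplies up to 75W,
# and each six-pin auxiliary connector adds up to 75W more.
PCIE_SLOT_W = 75
SIX_PIN_W = 75

def board_power_budget(six_pin_connectors: int) -> int:
    """Total wattage a card can draw from the slot plus its connectors."""
    return PCIE_SLOT_W + six_pin_connectors * SIX_PIN_W

# A typical ~110W budget card fits comfortably under one six-pin plug:
assert board_power_budget(1) == 150
# ...while a 55W Maxwell board gets by on slot power alone:
assert board_power_budget(0) == 75
```

The same arithmetic explains why 250W-class flagships need two auxiliary connectors while a sub-75W card needs none at all.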
During 2012, we saw AMD and Nvidia release GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.</p> <p>Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. The goal with its former Kepler architecture was to have better performance per watt compared to the previous architecture named Fermi, and it succeeded, but it's taken that same philosophy even further with Maxwell, which had as its goal to be twice as efficient as Kepler while providing 25 percent more performance.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/maxwell_small_0.jpg"><img src="/files/u152332/maxwell_small.jpg" alt="Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. " width="620" height="279" /></a></p> <p style="text-align: center;"><strong>Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units. </strong></p> <p>Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so by cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process it used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled.
In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, which was a major increase from the 32 cores in each block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but is putting four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per SM. Therefore, it's reduced the number of cores per unit by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.</p> <p>The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.</p> <h4>Gigabyte GTX 750 Ti WindForce</h4> <p>Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures.
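Backing up to the architecture for a moment, the Kepler-to-Maxwell scheduler reshuffle described above is easy to sanity-check with a few lines of Python. This simply restates the article's arithmetic; the table and function names are ours, not Nvidia's:

```python
# CUDA cores managed by one piece of scheduling logic, and the number
# of such clusters per shader unit (SMX on Kepler, SM on Maxwell).
CORES_PER_SCHEDULER = {"fermi": 32, "kepler": 192, "maxwell": 32}
CLUSTERS_PER_UNIT = {"fermi": 1, "kepler": 1, "maxwell": 4}

def cores_per_unit(arch: str) -> int:
    """Total CUDA cores in one shader unit for the given architecture."""
    return CORES_PER_SCHEDULER[arch] * CLUSTERS_PER_UNIT[arch]

assert cores_per_unit("kepler") == 192   # one scheduler juggling 192 cores
assert cores_per_unit("maxwell") == 128  # 4 schedulers x 32 cores each
# Maxwell drops 64 cores per unit, and each scheduler watches only 32
# cores instead of 192 -- both changes save energy.
assert cores_per_unit("kepler") - cores_per_unit("maxwell") == 64
```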
The memory is kept at reference speeds, however, running at 5,400MHz (effective). The board sports 2GB of GDDR5 memory, and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/9755_big_small_0.jpg"><img src="/files/u152332/9755_big_small.jpg" alt="The WindForce cooler is overkill, but we like it that way. " title="Gigabyte GTX 750 Ti WindForce" width="620" height="500" /></a></p> <p style="text-align: center;"><strong>The WindForce cooler is overkill, but we like it that way. </strong></p> <p>In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don’t think you should expect moon-shot overclocking records. All told, this card was rock solid, whisper quiet, and extremely cool.</p> <p><strong>Gigabyte GTX 750 Ti WindForce</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9.jpg" alt="score:9" title="score:9" width="210" height="80" /></div> </div> </div> <p><strong>$160 (street), <a href="http://www.gigabyte.us/ " target="_blank">www.gigabyte.us</a></strong></p> <h4>MSI GeForce GTX 750 Gaming</h4> <p>Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance.
The only downside is the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with “military-class” components for better overclocking and improved stability. It uses twin heat pipes feeding dual 100mm fans to keep it cool, as well. It also includes a switch that lets you boot from an older BIOS in case you run into overclocking issues.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msigtx750_small_0.jpg"><img src="/files/u152332/msigtx750_small.jpg" alt="MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card." title="MSI GeForce GTX 750 Gaming" width="620" height="364" /></a></p> <p style="text-align: center;"><strong>MSI’s Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.</strong></p> <p>Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.</p> <p>The Twin Frozr cooler handles the minuscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office. As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat.
Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.</p> <p><strong>MSI GeForce GTX 750 Gaming</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$140, <a href="http://www.msi.com/ " target="_blank">www.msi.com</a></strong></p> <h4>Sapphire Radeon R7 265 Dual-X</h4> <p>The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small_0.jpg"><img src="/files/u152332/sapphire_radeon_r7_265_dualx_2gb_small.jpg" alt="Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. 
" title="Sapphire Radeon R7 265 Dual-X" width="620" height="473" /></a></p> <p style="text-align: center;"><strong>Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus. </strong></p> <p>The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test, which was a blistering turn in Call of Duty: Ghosts where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. The Dual-X cooler also kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.</p> <p><strong> Sapphire Radeon R7 265 Dual-X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_9ka.jpg" alt="score:9ka" title="score:9ka" width="210" height="80" /></div> </div> </div> <p><strong>$150 (MSRP), <a href="http://www.sapphiretech.com/ " target="_blank">www.sapphiretech.com</a></strong></p> <h4>AMD Radeon R7 260X</h4> <p>The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It’s the only card in the company’s sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including support for TrueAudio, XDMA Crossfire, Mantle (as in, it worked at launch), and it has the ability to drive up to three displays —all from this tiny $120 GPU. Not bad. 
In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost due to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 card with 1GB more RAM, and support for TrueAudio, which we have yet to experience.</p> <p style="text-align: center;"><a class="thickbox" href="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small_0.jpg"><img src="http://www.maximumpc.com/files/u152332/amdrad_r7_260x_1_small.jpg" alt="This $120 card supports Mantle, TrueAudio, and CrossFire. " title="AMD Radeon R7 260X" width="620" height="667" /></a></p> <p style="text-align: center;"><strong>This $120 card supports Mantle, TrueAudio, and CrossFire. </strong></p> <p>In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games at high settings. The fact that it supports all the latest technology from AMD is what bolsters this card's credentials, though. And the fact that it can run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790.
While it’s not the fastest card in the bunch, it’s certainly far from the slowest.</p> <p><strong>AMD Radeon R7 260X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_8.jpg" alt="score:8" title="score:8" width="210" height="80" /></div> </div> </div> <p><strong>$120 <a href="http://www.amd.com/ " target="_blank">www.amd.com</a></strong></p> <h4> <hr />MSI Radeon R7 250 OC</h4> <p>In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it’s this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but when looking at the specs of the Nvidia GTX 750, it is clearly outmatched. One other major problem, at least for those of us with big monitors, is we couldn't get it to run our desktop at 2560x1600 out of the box, as it only has one single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and is just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/msi_radeon_r7_250_oc_2gb_small_0.jpg"><img src="/files/u152332/msi_radeon_r7_250_oc_2gb_small.jpg" alt="Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB." 
title="MSI Radeon R7 250 OC" width="620" height="498" /></a></p> <p style="text-align: center;"><strong>Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.</strong></p> <p>That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.</p> <p><strong>MSI Radeon R7 250 OC 1TB</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_6.jpg" alt="score:6" title="score:6" width="210" height="80" /></div> </div> </div> <p><strong>$90 <a href=" http://us.msi.com/ " target="_blank">http://us.msi.com</a></strong></p> <h4>PowerColor Radeon R7 250X</h4> <p>The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with 1GB of RAM however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of it in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. 
It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/r7-250x-angle_small_0.jpg"><img src="/files/u152332/r7-250x-angle_small.jpg" alt="The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. " title="PowerColor Radeon R7 250X " width="620" height="369" /></a></p> <p style="text-align: center;"><strong>The R7 250X is a rebadged HD 7770, made for cash-strapped gamers. </strong></p> <p>When we put the X-card to the test, it ended up faring a smidgen better than the non-X version, but just barely. It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, and 34 fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100," we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250.
We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is that, even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.</p> <p><strong>PowerColor Radeon R7 250X</strong></p> <p><strong><br /></strong></p> <div class="module-content" style="margin-top: -20px;"> <div class="module-text full"> <div class="verdict"><img src="/sites/maximumpc.com/themes/maximumpc/i/mxpc_7.jpg" alt="score:7" title="score:7" width="210" height="80" /></div> </div> </div> <p><strong>$100, <a href="http://www.powercolor.com/ " target="_blank">www.powercolor.com</a></strong></p> <h3>Should you take the red pill or the green pill?</h3> <p><strong>Both companies offer proprietary technologies to lure you into their "ecosystems," so let’s take a look at what each has to offer</strong></p> <h4>Nvidia's Offerings</h4> <p><strong>G-Sync</strong></p> <p>Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is you must have a G-Sync monitor, so that limits your selection quite a bit.</p> <p><strong>Regular driver releases</strong></p> <p>People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD.
That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.</p> <p><strong>GeForce Experience and ShadowPlay</strong></p> <p>Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.</p> <p><strong>PhysX</strong></p> <p>Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it do so in a way that is simply not that impressive, with one exception: Borderlands 2.</p> <h4>AMD's Offerings</h4> <p><strong>Mantle and TrueAudio</strong></p> <p>AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. Mantle is a lower-level API that allows developers to optimize a game specifically targeted at AMD hardware, allowing for improved performance.</p> <p><strong>TressFX</strong></p> <p>This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games, and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX. <strong>&nbsp;</strong></p> <p><strong>Gaming Evolved by Raptr<br /></strong></p> <p>This software package is for Radeon users only, and does several things.
First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.</p> <p><strong>Currency mining</strong></p> <p>AMD cards are better for currency mining than Nvidia cards for several reasons, and their dominance is not in question. The most basic reason is the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are usually up to five times faster in performing these operations than their Nvidia equivalent. In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.</p> <h3>All the cards, side by side</h3> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>MSI GeForce GTX 750 Gaming</td> <td>Gigabyte GeForce GTX 750 Ti </td> <td>GeForce GTX 650 Ti Boost *</td> <td>GeForce GTX 660 *</td> <td>MSI Radeon R7 250</td> <td>PowerColor Radeon R7 250X</td> <td>AMD Radeon R7 260X</td> <td>Sapphire Radeon R7 265</td> </tr> <tr> <td class="item">Price</td> <td class="item-dark">$120 </td> <td>$150</td> <td>$160</td> <td>$210</td> <td>$90</td> <td>$100</td> <td>$120</td> <td>$150</td> </tr> <tr> <td>Code-name</td> <td>Maxwell</td> <td>Maxwell</td> <td>Kepler</td> <td>Kepler</td> <td>Oland</td> <td>Cape Verde</td> <td>Bonaire</td> <td>Curacao</td> </tr> <tr> <td class="item">Processing cores</td> <td class="item-dark">512</td> <td>640</td>
<td>768</td> <td>960</td> <td>384</td> <td>640</td> <td>896</td> <td>1,024</td> </tr> <tr> <td>ROP units</td> <td>16</td> <td>16</td> <td>24</td> <td>24</td> <td>8</td> <td>16</td> <td>16</td> <td>32</td> </tr> <tr> <td>Texture units</td> <td>32</td> <td>40<strong><br /></strong></td> <td>64</td> <td>80</td> <td>24</td> <td>40</td> <td>56</td> <td>64</td> </tr> <tr> <td>Memory</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>2GB</td> <td>1GB</td> <td>1GB</td> <td>2GB</td> <td>2GB</td> </tr> <tr> <td>Memory speed</td> <td>1,350MHz</td> <td>1,350MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,500MHz</td> <td>1,125MHz</td> <td>1,500MHz</td> <td>1,400MHz</td> </tr> <tr> <td>Memory bus</td> <td>128-bit</td> <td>128-bit</td> <td>192-bit</td> <td>192-bit</td> <td>128-bit</td> <td>128-bit</td> <td>128-bit</td> <td>256-bit</td> </tr> <tr> <td>Base clock</td> <td>1,020MHz</td> <td>1,020MHz</td> <td>980MHz</td> <td>980MHz</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> <td>N/A</td> </tr> <tr> <td>Boost clock</td> <td>1,085MHz</td> <td>1,085MHz</td> <td>1,033MHz</td> <td>1,033MHz</td> <td>1,050MHz</td> <td>1,000MHz</td> <td>1,100MHz</td> <td>925MHz</td> </tr> <tr> <td>PCI Express version</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> <td>3</td> </tr> <tr> <td>Transistor count</td> <td>1.87 billion</td> <td>1.87 billion</td> <td>2.54 billion</td> <td>2.54 billion</td> <td>1.04 billion</td> <td>1.04 billion</td> <td>2.08 billion</td> <td>2.8 billion</td> </tr> <tr> <td>Power connectors</td> <td>N/A</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>N/A</td> <td>1x six-pin</td> <td>1x six-pin</td> <td>1x six-pin</td> </tr> <tr> <td>TDP</td> <td>54W</td> <td>60W</td> <td>134W</td> <td>140W</td> <td>65W</td> <td>80W</td> <td>115W</td> <td>150W</td> </tr> <tr> <td>Fab process</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> <td>28nm</td> </tr> <tr> <td>Multi-card support</td>
<td>No</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>No</td> <td>Yes</td> <td>Yes</td> <td>Yes</td> </tr> <tr> <td>Outputs</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, <br />2x HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, <br />HDMI, DisplayPort</td> <td>DVI-S, VGA, HDMI</td> <td>DVI, VGA, HDMI</td> <td>2x DVI, HDMI, DisplayPort</td> <td>2x DVI, HDMI, DisplayPort</td> </tr> </tbody> </table> <p><em>Provided for reference purposes.<br /></em></p> </div> </div> </div> </div> </div> <h3>How we tested</h3> <p><strong>We lowered our requirements, but not too much</strong></p> <p>We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.</p> <p>For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. 
These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.</p> <h3>Mantle Reviewed</h3> <p><strong>A word about AMD's Mantle API</strong></p> <p>AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance, since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn't going to help it. If, on the other hand, your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.</p> <p>To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, only two games support Mantle—Battlefield 4 and an RTS-style stress-test demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.</p> <p><img src="/files/u152332/bf4_screen_swap_small.jpg" alt="Enabling Mantle in Battlefield 4 does provide performance boosts for most configs."
title="Battlefield 4" width="620" height="207" /></p> <p>We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of just over 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84fps using DirectX to 98fps in Mantle.</p> <p>Overall, Mantle is legit, but it's kind of like PhysX or TressFX in that it's nice to have when it's supported, and does provide a boost, but it isn't something we'd count on being available in most games.</p> <h3>Final Thoughts</h3> <h3>If cost is an issue, you've got options</h3> <p>Testing the cards for this feature was an enlightening experience. We don't usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we'd have to admit that given these cards' price points, we had low expectations but thought they'd all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child's play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080p, meaning the barrier to entry for "sweet gaming" has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.</p> <p>Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it's the best card for gaming at this price point, end of discussion. OK, thanks for reading.</p> <p>Oh, are you still here? OK, here's some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia.
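To make the price-point comparison concrete, you can fold the benchmark numbers into a rough frames-per-dollar figure. A quick Python sketch (the fps results and street prices are taken from the tables in this article; weighting every game equally is our own simplification):

```python
# Street prices (USD) at time of writing, from the spec table
PRICES = {
    "Sapphire R7 265": 150,
    "Gigabyte GTX 750 Ti": 150,
    "MSI GTX 750 Gaming": 120,
    "AMD R7 260X": 120,
}

# Game benchmark results (fps) from our 1080p/high/no-AA runs:
# Crysis 3, Far Cry 3, Tomb Raider, CoD: Ghosts, BF4, Batman, AC4
FPS = {
    "Sapphire R7 265":     [32, 40, 36, 67, 49, 55, 39],
    "Gigabyte GTX 750 Ti": [25, 40, 30, 49, 45, 71, 33],
    "MSI GTX 750 Gaming":  [21, 34, 26, 42, 32, 61, 29],
    "AMD R7 260X":         [26, 34, 31, 51, 40, 43, 21],
}

def fps_per_dollar(card):
    # Average fps across the game tests, divided by price
    avg = sum(FPS[card]) / len(FPS[card])
    return avg / PRICES[card]

ranked = sorted(PRICES, key=fps_per_dollar, reverse=True)
```

By this crude metric the R7 265 comes out on top and the two $120 cards land in a near tie, which matches what we saw subjectively.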
It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that can satisfy our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia's trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.</p> <p>Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there's no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.</p> <p>The bottom rung, which consists of the R7 250(X) cards, was not playable at 1080p at max settings, so avoid those cards. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man's land filled with shattered dreams and sad pixels.</p> <div class="module orange-module article-module"> <div class="module orange-module article-module"><span class="module-name">Benchmarks</span><br /> <div class="module-content"> <div class="module-text full"> <div class="spec-table orange"> <table style="width: 620px; height: 265px;" border="0"> <thead> </thead> <tbody> <tr> <td>&nbsp;</td> <td>Nvidia GTX 750 Ti (reference)</td> <td>Gigabyte GTX 750 Ti</td> <td>MSI GTX 750 Gaming</td> <td>Sapphire Radeon R7 265</td> <td>AMD Radeon R7 260X</td> <td>PowerColor Radeon R7 250X</td> <td>MSI Radeon R7 250 OC</td> </tr> <tr> <td class="item">Driver</td> <td class="item-dark">334.89</td> <td>334.89</td> <td>334.89</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> <td>14.1 v1.6</td> </tr> <tr> <td>3DMark Fire Strike</td> <td>3,960</td> <td>3,974</td> <td>3,558</td> <td><strong>4,686</strong></td> <td>3,832</td> <td>2,806</td> <td>1,524</td> </tr> <tr> <td
class="item">Unigine Heaven 4.0 (fps)</td> <td class="item-dark"><strong>30</strong></td> <td><strong>30<br /></strong></td> <td>25</td> <td>29</td> <td>23</td> <td>17</td> <td>9</td> </tr> <tr> <td>Crysis 3 (fps)</td> <td>27</td> <td>25</td> <td>21</td> <td><strong>32</strong></td> <td>26</td> <td>16</td> <td>10</td> </tr> <tr> <td>Far Cry 3 (fps)</td> <td><strong>40</strong></td> <td><strong>40<br /></strong></td> <td>34</td> <td><strong>40</strong></td> <td>34</td> <td>16</td> <td>14</td> </tr> <tr> <td>Tomb Raider (fps)</td> <td>30</td> <td>30</td> <td>26</td> <td><strong>36</strong></td> <td>31</td> <td>20</td> <td>12</td> </tr> <tr> <td>CoD: Ghosts (fps)</td> <td>51</td> <td>49</td> <td>42</td> <td><strong>67</strong></td> <td>51</td> <td>28</td> <td>22</td> </tr> <tr> <td>Battlefield 4 (fps)</td> <td>45</td> <td>45</td> <td>32</td> <td><strong>49</strong></td> <td>40</td> <td>27</td> <td>14</td> </tr> <tr> <td>Batman: Arkham Origins (fps)</td> <td><strong>74</strong></td> <td>71</td> <td>61</td> <td>55</td> <td>43</td> <td>34</td> <td>18</td> </tr> <tr> <td>Assassin's Creed: Black Flag (fps)</td> <td>33</td> <td>33</td> <td>29</td> <td><strong>39</strong></td> <td>21</td> <td>21</td> <td>14</td> </tr> </tbody> </table> <p><em>Best scores are bolded. Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. 
All games are run at 1920x1080 with no AA except for the 3DMark tests.<br /></em></p> </div> </div> </div> </div> </div> http://www.maximumpc.com/best_cheap_graphics_card_2014#comments 1080p affordable amd benchmarks budget cheap cheap graphics card gpu Hardware Hardware maximum pc may 2014 nvidia Video Card Features Tue, 12 Aug 2014 21:43:32 +0000 Josh Norem 28304 at http://www.maximumpc.com The 11 Best Videogame Water http://www.maximumpc.com/11_best_videogame_water_2014 <!--paging_filter--><h3>Virtual water so beautiful, you'll be able to drown in it</h3> <p>Your fancy GPU may be able to render billions of pixels and triangles a second, but you're not showing off its full technical power unless there's something pretty to look at. You know what's pretty to look at? Videogame water, specifically good videogame water.&nbsp;</p> <h3><img src="/files/u154082/crysis_water.jpg" alt="crysis water" title="crysis water" width="620" height="271" /></h3> <p>We've reached a point where videogame water looks so wonderful and realistic that it seems like you could drink from it, nay, drown in it even.</p> <p>Not all videogame water is created equal, however. To suss out which virtual H2O is worth your GPU's rendering time, we've compiled a list of the 11 best videogame water. In addition to the pictures and descriptions below, make sure to check out the links for videos to see what the water looks like in action.</p> <p>What's your choice for best videogame water?
Let us know in the comments below!</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="https://plus.google.com/+SeanKnightD?rel=author" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="https://twitter.com/SeanDKnight" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="https://www.facebook.com/seandknight" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> http://www.maximumpc.com/11_best_videogame_water_2014#comments 11 best videogame water assassin's creed IV black flag Bioshock Brothers tale of two sons Crysis Dear Esther Empire total war Hydrophobia Prophecy The Elder Scrolls V: Skyrim The Witcher 2 Tomb Raider videogame water water Features Mon, 11 Aug 2014 22:36:34 +0000 Sean D Knight and Jimmy Thang 28210 at http://www.maximumpc.com Graphics Analysis: Wolfenstein: The New Order http://www.maximumpc.com/graphics_analysis_wolfenstein_new_order_2014 <!--paging_filter--><h3>We compare Wolfenstein: The New Order's low, medium, high, and ultra settings with pics and video</h3> <p>For this graphical analysis feature, we examine the graphical capabilities of Bethesda's Wolfenstein: The New Order. When the first-person shooter was released on PC, it had tons of technical issues, including long load times and massive texture pop-in. Luckily, most of these problems have been sorted out with a few patches.</p> <p>The new Wolfenstein title is known for being a beautiful-looking game, so we wanted to take this graphical behemoth for a test run to see how it looks across its different graphics presets. Is this game going to show off your graphics card in all of its glory?
Read on to find out!<iframe src="//www.youtube.com/embed/Neztt453910" width="560" height="315" frameborder="0"></iframe></p> <p><strong>Testing Methodology:</strong></p> <p>We wanted our tests to be easily replicated, so we ran the game at 1080p, using Wolfenstein's preset graphics options: "Low," "Medium," "High," and "Ultra." We should mention that the point of this test is to analyze image quality and visual fidelity. This is not a frame rate performance test.</p> <p>We captured our screenshots and video with a fairly beefy gaming rig, which sports an Intel Core i7-4770K CPU, 8GB of 1600MHz G.Skill RAM, and a GTX 780 video card.</p> <p><strong>The settings we used for each test are shown in the screenshots below:</strong></p> <p style="text-align: center;"><img src="/files/u154280/2014-06-02_00001.jpg" alt="Low settings" title="Low settings" width="600" /></p> <p style="text-align: center;"><strong>Low Settings</strong></p> <p style="text-align: center;"><img src="/files/u154280/2014-06-02_00002.jpg" alt="Medium settings" title="Medium settings" width="600" height="338" /></p> <p style="text-align: center;"><strong>Medium Settings</strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/2014-06-02_00003.jpg" alt="High settings" title="High settings" width="600" height="338" /></strong></p> <p style="text-align: center;"><strong>High Settings</strong></p> <p style="text-align: center;"><strong><img src="/files/u154280/2014-06-02_00004.jpg" alt="Ultra settings" title="Ultra settings" width="600" /></strong></p> <p style="text-align: center;"><strong>Ultra Settings</strong></p> <p><strong>Video Scene Analysis:</strong></p> <p>Note: You can click on the images below to see an animated GIF comparing the scene running across low, medium, high, and ultra settings.</p> <p style="text-align: center;"><a title="Mech scene" href="/files/u154280/output_f9fq3a.gif" target="_blank"><img src="/files/u154280/mech_scene.png" alt="Mech scene"
title="Mech scene" width="600" /></a></p> <p><strong>Mech Scene:</strong></p> <p>The first scene has the main character, William Blazkowicz, inside a mech suit. When the game is rendering at Low settings, we see very little detail in our character's clothing. His sleeves gain more texture and color as we go from Low to Ultra settings. The mech suit also shows some differences going from Low to Ultra, but they're very minimal. For example, the gauges on the left-hand side gain more texture and lighting as we ramp up graphical fidelity. The rest of the scene looks almost the same across all four presets. Yes, there are a few extra textures sprinkled into the landscape at High and Ultra settings, which Low and Medium don't have, but again, this is a very small difference, and you really have to pixel-peep to notice them. &nbsp;</p> <p style="text-align: center;"><a title="Soldier scene" href="/files/u154280/output_qzb5rg.gif" target="_blank"><img src="/files/u154280/soldier_scene_3.png" alt="Soldier scene" title="Soldier scene" width="600" height="338" /></a></p> <p><strong>Soldier scene:</strong></p> <p>In the Soldier scene above, there's less texture and definition in the soldier standing in the left corner of the screen at the lower presets. His clothing gets more texture as we go up in graphical quality. The same can be said for the soldier in the middle of the screen. The texture quality of his clothing is better at Ultra than at Low settings, but the rest of the scene looks almost the same across the presets.
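If you want to move beyond eyeballing animated GIFs, a mean per-pixel difference between two captures of the same frame puts a number on how far apart two presets really are. A minimal numpy sketch (the toy arrays below are stand-ins for real screenshots, which you would load yourself):

```python
import numpy as np

def mean_pixel_diff(img_a, img_b):
    # Mean absolute per-channel difference, in 0-255 units;
    # 0 means the two presets rendered the frame identically.
    a = np.asarray(img_a, dtype=np.int16)
    b = np.asarray(img_b, dtype=np.int16)
    return float(np.abs(a - b).mean())

# Toy 4x4 RGB stand-ins for a Low and an Ultra capture of the same frame
low_preset = np.zeros((4, 4, 3), dtype=np.uint8)
ultra_preset = np.full((4, 4, 3), 8, dtype=np.uint8)

score = mean_pixel_diff(low_preset, ultra_preset)
```

Roughly speaking, a low score means the kind of difference you have to pixel-peep to see.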
The dark colors and low lighting in this scene's background make it hard for us to discern any other meaningful differences.</p> <p style="text-align: center;"><a title="Airplane scene" href="/files/u154280/output_ile9i3.gif" target="_blank"><img src="/files/u154280/airplane_scene.png" alt="Airplane scene" title="Airplane scene" width="600" /></a></p> <p><strong>Airplane scene:</strong></p> <p>The hardest scene in which to tell any difference among the four presets is the Airplane scene. We couldn't see anything inside the aircraft that looked noticeably different. The ocean in the game's Ultra preset looks the best of the four presets, but that's the only noticeable difference we could draw here. The interior of the cabin looks very similar across all four presets.&nbsp;</p> <p><strong>Conclusion:</strong></p> <p>Analyzing Wolfenstein's different visual settings, we were surprised by how hard it was to tell the difference between any of the game's four presets. At times, we really had to dig to bring out the nitty-gritty details. In general, we found that characters look more fully realized the higher you crank up the settings. If you're not looking at soldiers on screen, the game looks very similar across all the settings. The similar look and feel of all four presets is in part due to Wolfenstein's dark color palette. When Wolfenstein adds more textures onto gray, black, and brown surfaces, it can be hard to notice much of a visual improvement. If the game were brighter and offered a wider color palette, it might be easier to pick up on the differences. Regardless, as it stands, it doesn't look like Wolfenstein: The New Order is going to shock and awe anyone at the highest settings, at least not compared to the game's lower presets.</p> <p>Which game would you like us to do a deep-dive graphical analysis on next?
Let us know in the comments below!</p> <p><span style="font-style: normal;">Follow Chris on&nbsp;</span><a style="font-style: normal;" href="https://plus.google.com/u/0/117154316323139826718" target="_blank">Google</a><span style="font-style: normal;">+&nbsp;or&nbsp;</span><a style="font-style: normal;" href="https://twitter.com/chriszele" target="_blank">Twitter</a></p> http://www.maximumpc.com/graphics_analysis_wolfenstein_new_order_2014#comments graphics analysis graphics card maximum pc pc version settings wolfenstein the new order Gaming News Features Thu, 07 Aug 2014 18:03:09 +0000 Chris Zele 28164 at http://www.maximumpc.com A Crash Course to Editing Images in Adobe Lightroom http://www.maximumpc.com/crash_course_editing_images_adobe_lightroom_2014 <!--paging_filter--><p><strong>When your images aren't up to snuff, there's always photo-editing software</strong></p> <p>Photography can be impenetrable, from the gear to actually shooting, and then image-editing software is a whole other uphill battle. Even with Adobe introducing Lightroom as a lightweight Photoshop alternative, it can be daunting to see a screen full of sliders as a complete novice. To help get you from serial Instagrammer to amateur photographer, here's a crash course in making your images look great in just a few steps in Lightroom.</p> <p><img src="http://www.maximumpc.com/files/u170397/lightroom_crash_course_top.jpg" width="620" height="419" style="font-weight: bold;" /></p> <h3 dir="ltr">Why you should shoot in RAW</h3> <p>First off, before we get to editing any images, it's super important to start shooting RAW-format images if you haven't already. Unlike JPEGs, RAW files are uncompressed digital negatives that carry much more information. This in turn makes them easier to work with in Lightroom or any image editor.
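That "much more information" is easy to quantify: the number of tonal levels per color channel doubles with every extra bit. A quick sketch (8-bit is what JPEG stores; 12- and 14-bit are typical RAW depths, though the exact depth varies by camera model):

```python
def levels(bits):
    # Distinct tonal values per color channel at a given bit depth
    return 2 ** bits

jpeg = levels(8)       # 256 levels per channel
raw = levels(14)       # 16,384 levels per channel
headroom = raw // jpeg  # 64x more gradations to pull shadows and highlights from
```

That extra headroom is exactly what lets an editor recover detail from shadows and highlights that a JPEG has already thrown away.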
Thanks to the extra data packed into RAW files, you can rescue images otherwise destined for the trash heap, such as blue-tinged messes or almost completely black frames.</p> <p>If that wasn't enough to sell you on shooting in RAW, this entire guide was done using the format to show off and take advantage of the full image-editing power of Lightroom.</p> <h3 dir="ltr">Getting started</h3> <p><img src="/files/u170397/image_import.jpg" width="620" height="324" /></p> <p>The first thing you'll need to do in Lightroom is import your images, of course. Upon starting Adobe Lightroom, navigate your mouse up to File and select "Import Photos and Video" (Ctrl+Shift+I). Lightroom will also auto-detect any memory cards or cameras plugged into the computer.</p> <p>Lightroom will automatically drop images into dated folders. Unfortunately (or fortunately, for some) this is programmed into the software, but users can always rename their folders. More importantly, keywording your photos will be an indispensable tool to manage, search, and organize your images.</p> <h3 dir="ltr">Getting around inside Lightroom</h3> <p><img src="/files/u170397/main_screen.jpg" width="620" height="333" /></p> <p>Once your images are all loaded into the library, we can start editing one by clicking over to the "Develop" screen (or hitting "D" on the keyboard). On the right edge of the screen, users will find a list of settings that will allow them to tweak their images.</p> <p>There's a lot to take in with Lightroom's interface, but the most important element is the filmstrip along the bottom, used to move between images. Clicking anywhere on the image displayed in the center window, meanwhile, will zoom into the frame.</p> <p>Just beneath the featured picture there's also a box designated with "X|Y" that will allow you to view the original image next to its processed counterpart.
The button to the left of this comparison toggle will return the window to normal, displaying only the final picture. Along the left side of the screen, users will find a history log of all the edits made so far to each individual photo. Speaking of image settings, they're all stacked on the right side of the window. At the bottom of this list of editing options there's also a handy "Previous" button to let users undo one change, or "Reset" to start all over again.</p> <h3 dir="ltr">Fix your framing</h3> <p><img src="/files/u170397/image_rotate.jpg" width="620" height="333" /></p> <p>Sometimes, in the rush to capture that decisive moment, there isn't enough time to line up a perfect composition. But as long as the subject in the photo is in focus and your camera has enough megapixels, there's always the option to crop the image.</p> <p>The crop tool is located on the right, underneath the histogram, and is designated by a boxed grid icon closest to the left. Depending on the shot, it might be smart to cut away some of the background to isolate the subject. Alternatively, cropping could come in handy to remove a busy or boring background (otherwise known as negative space). Sticklers for completely level images can also bring their mouse cursor to the edge of the frame to rotate the picture.</p> <h3 dir="ltr">Red Eye Correction</h3> <p><img src="/files/u170397/red_eye_correction.jpg" width="620" height="364" /></p> <p>Red eyes and flash photography seem to be inseparable despite all our technological advances, but at least it has gotten incredibly easy to fix this niggling issue. Located just two icons to the right of the Framing icon, Red Eye Correction gives you a new cursor with which to select any red eyes in the photo.
Lightroom will then use the point users select to auto-detect and fix red pupils.</p> <h3 dir="ltr">White balance</h3> <p><img src="/files/u170397/white_balance.jpg" width="620" height="418" /></p> <p>Lighting is one of the toughest things in photography, especially when there's a mix of sunlight and a blue-hued lightbulb. Not only do the two different types of warm and cool light clash, they also completely throw off all the colors in your photos. With this in mind, shifting the white balance should be one of the very first stops on your image-editing train. Lightroom comes with a series of preset white balance settings, just as cameras do, with options such as daylight, shade, tungsten, and flash, to name a few.</p> <p>There's also the option to have Lightroom figure it out all on its own, and most of the time it does an admirable job of picking out the right type of lighting. In case anything still looks a little off, there are also sliders that users can move around. Each slider is fairly self-explanatory: shifting the top knob leftwards will make the image take on a blue shade, while shifting towards yellow will give your image a warmer, yellow cast. The one underneath splits the spectrum between green and violet.</p> <p>For those wanting more fine-tuned control with a point-and-click solution, select the eyedropper tool. Simply hover the dropper over a neutral gray or white area and click; Lightroom will take its best guess at the white balance from that one spot.</p> <p><em>Click on to the next page where we'll dive into more editing magic.<br /></em></p> <hr /> <h3 dir="ltr">Getting to the Meat</h3> <p>Now that we've color-corrected the image and fixed up the composition, it's time to adjust the exposure. But before we start, there's no hard-and-fast rule for what makes the perfect image. It does not have to be a perfectly balanced image where everything in the frame is evenly illuminated.
There's nothing wrong with having harsh shadows or a blindingly bright spot; in fact, it can be the thematic part of the picture you want to accentuate.</p> <p>Without further ado, here are the main ways you can use Lightroom to manipulate your images.</p> <ul> <li> <h3 dir="ltr"><img src="/files/u170397/basic_settings.jpg" width="200" height="610" style="float: right; margin: 10px;" /></h3> <p>Exposure: In a nutshell, this lets users make the entire image brighter or darker.</p> </li> <li> <p>Contrast: Contrast changes the difference between the bright and dark parts of the image. Lowering the contrast evens out the exposure, making it helpful if the picture was caught with extremely dark and bright sections. As such, it can help to restore parts of the frame caught in shadows, but the trade-off is that this can also cause the entire picture to turn gray. On the flipside, making photos more contrasty will produce a harsher look and cause colors to intensify.</p> </li> <li> <p>Highlights: Similar to affecting the brightness of the image, Highlights specifically tones down the brightest parts of the frame. In most cases this could be useful for bringing back clouds lost in the blinding sunlight. Alternatively, photographers will want to tweak the highlights when photographing anything with a backlit screen or lights at night.</p> </li> <li> <p>Shadows: On the flipside of highlights, changing the shadows will brighten or darken any areas caught in shade.</p> </li> <li> <p>Whites: Despite the fact that we've already adjusted the bright parts of the frame, changing the White level in the image appears to do the same thing. Appears. What changing the white level really does is affect the lightest (or brightest) tones in the image, whereas highlights control the midtones in the frame.</p> </li> <li> <p>Blacks: At the opposite end of the spectrum, blacks dictate how the darkest parts of the image look.
This can be helpful to make sure dark colors aren't grayed out when you've already brightened up the shadows.</p> </li> <li> <p>Auto Tone: Aside from setting all the parameters manually, Lightroom also has a handy Auto Tone tool. As with auto white balance, Auto Tone automatically adjusts the picture for what the program thinks will look best.</p> </li> </ul> <h3 dir="ltr">Time to get technical</h3> <p>Aside from the mix of sliders and staring at the image preview, a much more technical way of editing is using the histogram, which appears at the very top of the right side panel. Essentially, it displays a graphical overview of the picture's full tonal range, in which darker pixels fill out the left side and lighten towards the right. Every edit we just explained can be done by clicking on parts of this histogram and dragging them around. Either way works, so it's really up to your preference.</p> <h3 dir="ltr">Making photos "pop"</h3> <p>The tonal curve isn't all there is to editing images. Just underneath the exposure settings is something called presence. Starting with Clarity, users can increase the sharpness of their images or give them a dreamy, hazy quality. Saturation intensifies colors in the photo, which can be useful for bringing back some color on gray and cloudy days.</p> <p>Vibrance does a similar job of intensifying colors, except in a slightly smarter fashion than Saturation.
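Under the hood, the basic adjustments described above amount to simple per-pixel math. Here is a toy numpy sketch of exposure and contrast on values normalized to 0..1 (our own illustration of the general shape of these operations, not Lightroom's actual processing):

```python
import numpy as np

def adjust(pixels, exposure_ev=0.0, contrast=1.0):
    x = np.asarray(pixels, dtype=float)
    x = x * (2.0 ** exposure_ev)     # exposure: +1 EV doubles brightness
    x = (x - 0.5) * contrast + 0.5   # contrast: push tones away from middle gray
    return np.clip(x, 0.0, 1.0)      # clamp to the displayable range

bright = adjust([0.25], exposure_ev=1.0)     # a dark tone, one stop brighter
punchy = adjust([0.25, 0.75], contrast=2.0)  # tones pushed toward the extremes
```

Roughly speaking, the highlights, shadows, whites, and blacks sliders apply similar remaps, each restricted to its own slice of the tonal range.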
Rather than uniformly bumping up the hues in the frame, Vibrance increases the intensity of muted colors whilst leaving already bright colors alone.</p> <p><em>Next up Sharpening, Noise Reduction, Lens Correction, and more.<br /></em></p> <hr /> <h3 dir="ltr">Detail control</h3> <p>Located in the "Detail" section below Lightroom's "Basic" editing options you'll find options to sharpen and reduce the noise of photos.</p> <p dir="ltr"><strong>Sharpening</strong></p> <p dir="ltr"><strong><img src="/files/u170397/sharpening_mask.jpg" width="620" height="363" /><br /></strong></p> <p>First, to quell any misconceptions: sharpening won't fix soft focus, camera shake, or any other mistakes made at the time of taking the shot. Rather, sharpening is a tool to accentuate details already in the photo. Just don't overdo it, as oversharpening introduces a slew of new problems, including harsh edges, grainy noise, and smooth lines transforming into jagged zigzags.</p> <p>There are four parameters when it comes to sharpening images:</p> <ul> <li> <p><strong>The Alt key:</strong> Before we actually get started with any settings: holding down the Alt key is an invaluable tool that will give you a clearer, alternate view of what's going on while you move the sliders around.</p> </li> <li> <p><strong>Amount:</strong> As you might have guessed, this increases the amount of sharpening you add. This value starts at zero, and as users push towards the high end they will end up enhancing the noise in the image along with sharpening details.</p> </li> <li> <p><strong>Radius:</strong> Image sharpening mainly refines edges, but the radius can be extended by a few pixels. The radius number corresponds to the number of pixels around each edge to which Lightroom will apply sharpening.
Having a high radius number will intensify details with a thicker edge.</p> </li> <li> <p><strong>Detail:</strong> The Detail slider determines how many edges on the image get sharpened. With lower values, the image editor will only target large edges in the frame, while a value of 100 will include every small edge.</p> </li> <li> <p><strong>Masking:</strong> Although every other slider has been about incorporating more sharpening into the image, masking does the opposite by telling Lightroom which areas should not be sharpened. Just keep in mind that masking works best on images with an isolated background. The sharpening mask's effectiveness is significantly more limited with busy images, where there are edges everywhere.</p> </li> </ul> <p dir="ltr"><strong>Noise Reduction</strong></p> <p dir="ltr"><strong><img src="/files/u170397/noise_reduction.jpg" width="620" height="364" /><br /></strong></p> <p>Noise is unavoidable, whether it's due to shooting at higher ISOs or bumping up the exposure in post. Luckily, there's a way to save images from looking like sandpaper.</p> <ul> <li> <p><strong>Luminance:</strong> Our first stop towards reducing noise. Increasing this value will smooth over any stippling on the photo. Take care not to raise it too high, as Lightroom will begin sacrificing detail and turn the picture into a soft mess.</p> </li> <li> <p><strong>Detail:</strong> In case users want to better preserve the sharp details in their image, they should increase the Detail slider.</p> </li> <li> <p><strong>Contrast:</strong> This is specifically used to tone down the amount of chromatic noise, typically the green and red flecks that make their way into high-ISO images.
Unless there is colored noise in the image, it’s best to leave this set to 0.</p> </li> </ul> <h3 dir="ltr">Lens Correction</h3> <p>Moving on, we’re going to start correcting for imperfections in the lens by scrolling down the right sidebar to "Lens Corrections."</p> <p dir="ltr"><strong>Lens profiles</strong></p> <p dir="ltr"><strong><img src="/files/u170397/lens_correction.jpg" width="620" height="333" /><br /></strong></p> <p>Enter the round hole, square peg problem. No matter how well engineered an expensive lens is, it will always produce some amount of distortion thanks to the nature of curved lenses filtering light onto flat sensors. The good news is that this is the easiest thing to correct for. Simply click "Enable Profile Corrections" on the "Basic" pane of Lens Corrections and Lightroom will do the work for you. Witness as your images are automatically corrected for barrel distortion and vignetting (dark corners). It's pretty much foolproof unless, of course, Adobe hasn't made a lens profile for the lens you shot with. It also might not be necessary to always turn this option on, as some photos might look better with the vignetting and distortion.</p> <p dir="ltr"><strong>Color Fringing</strong></p> <p dir="ltr"><strong><img src="/files/u170397/fringing.jpg" width="620" height="333" /><br /></strong></p> <p>Fringing, for those who don’t know, appears as a purple or green-blue outline when an object is captured against a bright background—the most common example being a tree limb with the bright sky behind it. It can be a minor quibble in most cases, but certain lenses fringe so badly it can make a scene look like it was outlined with a colored pencil.</p> <p>Luckily, getting rid of fringing in Lightroom can be as easy as spotting it and then clicking on it. To start, select the Color pane within Lens Corrections and use the eyedropper just as we did with white balance. 
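Under the hood, a defringe tool is essentially hunting for pixels whose red and blue channels far outrun green, then pulling those pixels back toward neutral. Here is a toy sketch in Python; the threshold and the simple gray-average blend are our illustrative assumptions, not Adobe's actual algorithm:

```python
def defringe(pixels, threshold=20):
    """Desaturate purple-fringe pixels toward neutral gray.

    A pixel is treated as fringe when both red and blue sit well above
    green, producing the purple outline described above. The threshold
    and the gray average are illustrative assumptions, not Lightroom's
    actual math.
    """
    out = []
    for r, g, b in pixels:
        if r > g + threshold and b > g + threshold:
            gray = (r + g + b) // 3  # collapse the fringe color to gray
            out.append((gray, gray, gray))
        else:
            out.append((r, g, b))  # leave non-fringe pixels untouched
    return out

# A purple fringe pixel gets neutralized; a green leaf pixel is untouched.
print(defringe([(200, 50, 200), (10, 200, 10)]))
# → [(150, 150, 150), (10, 200, 10)]
```

Lightroom's eyedropper works on the same principle, just with far more sophisticated edge detection and color targeting.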
Usually fringing appears at points of high contrast, so bring the cursor over to dark edges that meet a bright background. It might take a bit of sniffing around, but stay vigilant and you should be able to spot some misplaced purple or green-blue colors eventually. Some lenses fringe terribly while others control it well, so it’s really up to you whether the flaw is noticeable enough to merit correction.</p> <p dir="ltr"><strong>Chromatic Aberration</strong></p> <p>Since we’re here anyway, go ahead and click the option to remove chromatic aberration—another type of color fringing where wavelengths of light blur together. It’s as simple as turning the option on.</p> <h3 dir="ltr">You Can’t Save Them All</h3> <p><img src="/files/u170397/cannot_save.jpg" width="620" height="333" /></p> <p>Despite how extensive this guide might appear, there’s even more editing magic to mine from Lightroom—we haven’t even gotten to making black-and-white images, or split toning! This is only a crash course to help you make your images look better, and the only way to master photography is to keep on shooting and practicing.</p> <p>In the same breath, however, we’d recommend not using Lightroom as a crutch. Although Lightroom can do a lot to salvage poorly shot images, it’s no excuse to just shoot half-assed and expect to fix things up afterwards. Otherwise, post-processing will end up eating most of your time, and eventually you’ll realize that there are even certain images Lightroom can’t save (as evidenced by the one shown above). 
Image editing software can be a great help, but it’s no substitute for good old skilled photography.</p> http://www.maximumpc.com/crash_course_editing_images_adobe_lightroom_2014#comments Adobe image editing Lighroom Lighroom crash course Media Applications photoshop post processing Software Software Features Wed, 06 Aug 2014 17:43:10 +0000 Kevin Lee 28246 at http://www.maximumpc.com Ask the Doctor: Dual Dual-Boot Questions, Ancient Computers, Router Confusion, and more http://www.maximumpc.com/ask_doctor_dual_dual-boot_questions_ancient_computers_router_confusion_and_more_2014 <!--paging_filter--><h3>The doctor tackles Dual Dual-Boot Questions, Ancient Computers, Router Confusion, and more</h3> <h4>Dual Boot 7 and Linux</h4> <p>I am a bit of a newbie, but 76 years old. Can a Windows user install both Windows 7 and a Linux distro on a solid-state drive—particularly a Samsung 840 EVO drive?</p> <p><span style="font-style: italic;">- Charles Greenwood<br /></span></p> <p><strong>The Doctor Responds:</strong></p> <p>Yes, you can install both Windows and Linux onto the same SSD, just as you would on a mechanical drive, provided the drive is large enough to accommodate both operating systems. The Doc would recommend at least a 256GB drive, if not larger. The best way to do this is to install Windows first, then install your Linux distro. Ubuntu makes this particularly easy, as its install process allows you to install it side-by-side with Windows, and guides you through the process of shrinking your Windows partition to make room for Ubuntu. If you want to shrink your Windows partition from within Windows, see the next Doctor question. 
See this help page for details: http://bit.ly/MPC_WinDB.<strong></strong></p> <p>If you’re not planning to install Ubuntu or one of its variants, the procedure will be slightly different, but the answer is the same: Yes, as long as there’s room, and install Windows first because its bootloader doesn’t play nicely with Linux if Linux is on the drive first.<strong></strong></p> <h4>Dual-boot 7 and 8</h4> <p>I have Windows 7 Ultimate 64-bit and Windows 8 Pro 64-bit. I’d like to have both installed on my system and choose between them when I boot up. I don’t really like Win 8 and really do like Win 7. I have a 128GB Samsung 840 Pro SSD with about 16TB of other hard drives. I am looking to upgrade to a 256GB drive so that I can better run both OSes on. I am hoping you might be able to give me proper instructions as to how to have a dual-system boot on my system.</p> <p><span style="font-style: italic;">- David Dube</span></p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/shrink_partition_small_1.jpg"><span style="font-style: italic;"><img src="/files/u152332/shrink_partition_small_0.jpg" alt="You can use Windows Disk Management to shrink full-disk partitions, enabling you to create another partition on the same drive." width="620" height="545" /></span></a></p> <p style="text-align: center;"><span style="font-style: italic;"><strong>You can use Windows Disk Management to shrink full-disk partitions, enabling you to create another partition on the same drive.</strong><br /></span></p> <p><strong>The Doctor Responds:</strong></p> <p>You can certainly dual-boot Windows 8 and 7, but the Doc thinks it’s a waste of space, since you’ll be using double the disk space to install nearly identical operating systems. 
The desktop portion of Windows 8 is virtually the same as Windows 7’s, except better, and the annoying parts of Windows 8 can be minimized or turned off entirely via the use of Stardock’s Start8 and ModernMix.<strong></strong></p> <p>Anyway, assuming your mind is made up, the best way is to install Windows 7 first, making sure to only use half of the free space on the drive for your Win 7 partition. If you accidentally use the whole thing, no problem. Once Windows 7 is installed, go into Disk Management (right-click on My Computer, select Manage, and go to Storage &gt; Disk Management), right-click the C partition, and select Shrink Partition. Resize it so that about half the space on your drive is free. It’s best to do this right after installing Windows 7, of course, so you have free space on the drive. Leave the newly created free space as unallocated space; don’t put another partition there. Then, shut down your computer and boot from the Windows 8 install media. The Doc is assuming that you bought the physical media or were able to create a bootable flash drive or DVD from the download tool.<strong></strong></p> <p>When the installation wizard gets to&nbsp; the part where it asks “What kind of installation do you want?” select Custom, then choose the unallocated space on your drive, and the rest is a cinch. After it installs and you restart your computer, you should be able to choose which OS to boot into, and which one you boot into by default. <strong></strong></p> <h4>Ancient Motherboard Problems<strong></strong></h4> <p>Doc, I’m having problems with my motherboard. I’m running an MSI K8N Neo4 with a dual-core AMD processor with 1GB of RAM and a 500-watt PSU. I’m trying to reinstall Windows XP. I started a clean install, but halfway through the computer just shuts off. The power light on the computer flashes on and off. I have to wait 15 minutes to restart. At first, I thought it was overheating, but it never gets over 29 C. 
I set the BIOS to not shut down on errors. I also swapped the video card to a smaller one that requires less power. Then, I thought it was the power supply, so I replaced that, but my PC still shuts off.<strong></strong></p> <p>Finally, I started unplugging the power to the motherboard and plugging it back in. By doing this, I found that I could restart the computer right away. The solder and pegs of the power socket are intact. Do you have any ideas, or is the motherboard toast? <strong></strong></p> <p><span style="font-style: italic;">- Darrel W</span></p> <p><strong>The Doctor Responds:</strong></p> <p>The Doc would take a good long look at any capacitors on the motherboard. If any of them are bulged out, it’s quite possible you are a victim of bad caps. You can actually replace the cap yourself, but most people elect to move on. For those who don’t know the ancient history, the electronics industry as a whole was a victim of sub-quality capacitors that urban lore claims was the result of one vendor using a stolen electrolytic formula. Basically, an unknown number of PCs, televisions, and all other sorts of electronic devices have failed due to bulged or bad caps. Vendors are so sensitive to this old issue that they all like to proclaim that they use military-grade capacitors made from unicorn horns in their motherboards.&nbsp; That MSI motherboard is a little late to be part of the bad caps era, but it’s still possible.<strong></strong></p> <p>If the Doc were in your shoes, he’d look for the next possible failure: inadequate cooling due to thermal paste that’s disappeared. If the machine is roughly 10 years old, the thermal paste could be kaput, causing the machine to overheat and reboot. So, consider reseating the processor and reseating the heat sink with fresh thermal paste. You should also try removing one of the pieces of RAM (assuming you have two) and trying to reproduce the issue. Do so with both pieces of RAM. 
The last step may be to actually remove the motherboard from the case and see if it was installed incorrectly in the first place. Sometimes a poorly installed motherboard mount will short out the system. But in all likelihood, the board is bad. You should also know that Windows XP is at end-of-life status and will no longer receive updates from Microsoft. You should upgrade to a newer OS so that you can continue to receive security patches. The nForce 4 chipset on your board is also long dead as a supported product, so maybe you can accentuate the positive and use the opportunity to get something a little fresher.<strong></strong></p> <h4>Chronic Router Model 420<strong></strong></h4> <p>Your February 2014 802.11ac Router Buyer’s Guide has me re-evaluating if I want to continue my subscription to your magazine. Do you folks have your lab in a state that has legalized marijuana? Reflected in the chart labeled AC Routers Compared, you show up to 419Mb/s throughput using the 802.11ac Asus router. This is really fantastic, but where did you get an Internet connection of, what, 500Mb/s? Did you generate that in a lab?&nbsp; <strong></strong></p> <p>Since you did not state your download speed, I will assume it was around 500Mb/s. Who has Internet that fast? Now, if your chart had a statement showing these routers all have linear ratings, and that if you only get 50Mb/s you can divide 50/500 = .1 and then take the Asus router’s speed of 180 Mb/s (in the bedroom at 10ft) and you can be assured that at a download speed of 50 Mb/s, your speed will be: 419 x .1 = 41.9 Mb/s for the 802.11 AC and 180 x .1 = 18 Mb/s for the 802.11N. That could turn this useless chart into a useful chart, if in fact there is a way to deduce these figures. As it stands now, it is absolutely useless crap.<strong></strong></p> <p>If you want to publish a magazine that is helpful to the vast majority of the readers, then tailor it to some level of reality. 
Also, it might have been useful to add that at present, as far as I could find, there are NO Apple 802.11ac adapters on the market as of Feb 16, 2014.&nbsp; And finally, show router input (download speeds and from where) and output (upload speeds). Then add the nifty theoretical specifications for future possibilities.<strong></strong></p> <p><span style="font-style: italic;">- Dave Shaff</span></p> <p><strong>The Doctor Responds:</strong></p> <p>Thanks for your feedback, Dave. While our state hasn’t yet legalized marijuana, it’s pretty easy to get a medical-usage card. <strong></strong></p> <p>First of all, you’re correct that broadband speeds in the United States are pretty dismal, and you don’t need an 802.11ac router to get maximum performance from your broadband connection. But there are several things you’ve overlooked. The first is that we tested performance between the test laptop and a PC wired to the router’s Ethernet jacks. The speed of the broadband Internet connection was not tested, precisely because Internet connection speed varies so widely. <strong></strong></p> <p>Secondly, performance just doesn’t scale down linearly like that, even if we were talking about download speeds from the Internet at large. You can’t take the 419Mb/s the Asus router got on the Wireless-AC test, and the 180Mb/s it got on Wireless-N on the same computer in the same location, and just apply a linear reduction in the way you seem to think you can, even if you are working with a slow Internet connection.<strong></strong></p> <p>The Doc has around a 30Mb/s connection at home, with an Asus RT-AC66U router. So, we ran SpeedTest a bunch of times, just for kicks. A desktop wired directly to the router’s Gigabit switch got 28.48Mb/s down. A laptop with a dual-band Intel AC 7260 Wi-Fi card 15 feet from the router got 28.54Mb/s on its 5GHz AC connection, and 22Mb/s on the 2.4GHz N connection. 
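In other words, the bottleneck math is a min() relationship, not a ratio. A rough sketch (an idealization that ignores protocol overhead, which is why real numbers land a bit lower):

```python
def effective_download(isp_mbps, wifi_link_mbps):
    # The slower hop wins: the Wi-Fi link speed only matters once it
    # drops below what the ISP line delivers.
    return min(isp_mbps, wifi_link_mbps)

# A 30Mb/s cable line through the Asus router's benchmarked link speeds:
print(effective_download(30, 419))  # Wireless-AC: prints 30
print(effective_download(30, 180))  # Wireless-N: prints 30, not 18
```

Only when the local link dips below the broadband speed, as with the 22Mb/s 2.4GHz result above, does the router start to shave anything off.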
There’s no 50 percent linear reduction going from Wireless-AC to Wireless-N across the board. So, even if you have a relatively slow Internet connection, like most of the country, your router isn’t going to bottleneck you at the rate you assumed from the chart. You’d have to have an extremely fast network connection before you started seeing your router limit your download or upload speeds.<strong></strong></p> <p>Thirdly, the reason to get one of these fast routers (and the reason we test their PC-to-PC speeds, not download speeds) is for fast in-network transfers, like streaming HD video from a home server to&nbsp; an HTPC, or backups. But if you’re not doing that, you’re correct, you don’t need to spend $200 on a router. <strong></strong></p> <p>Finally, as far as Apple goes, the 2013 MacBook Airs and Pros have Wireless-AC built in, and have since last year. <strong><br /></strong></p> http://www.maximumpc.com/ask_doctor_dual_dual-boot_questions_ancient_computers_router_confusion_and_more_2014#comments feature maximum pc May issues 2014 Ask the Doctor Features Wed, 06 Aug 2014 14:43:42 +0000 Maximum PC staff 28297 at http://www.maximumpc.com 21 Back to School Tech Gifts http://www.maximumpc.com/15_smart_buys_your_back_school_student_2014 <!--paging_filter--><h3>Smart Buys for Your Back to School Student</h3> <p>Summers never seem to last long enough, and before you know it, you're surfing the web for research rather than the ocean waves for fun. It's a bummer, but only if you let it be. Rather than slip into a deep depression as you count down the number of days until next summer, try focusing on the good things that come with going back to school, like new tech gear!</p> <p>Whether you're going off to college or starting a new year in high school, now is the time to pitch the folks on a new laptop, tablet, or any other must-have tech item that will help you become a better student. 
The back to school shopping season is the perfect time to stock up on gadgets, both because you'll often find new hardware on sale, and also because it's a little easier to convince the parental units that a Chromecast is an economical investment that no student should be without. "It's Economics 101, dad!"</p> <p>Since there's still time to catch some rays, <strong>we thought we'd do you a solid by researching 15 of the best tech gifts</strong> so you can enjoy what's left of your break. Then, when you're ready, browse our gallery below to check out which electronic gear made the grade!</p> http://www.maximumpc.com/15_smart_buys_your_back_school_student_2014#comments back to school backpack charger college features gallery high school laptop phone tech gadgets technology Features Mon, 04 Aug 2014 22:41:26 +0000 Paul Lilly and Jimmy Thang 28195 at http://www.maximumpc.com 9 Horrific Game Launches http://www.maximumpc.com/9_horrific_game_launches_2014 <!--paging_filter--><h3>DRM issues, poor performance, and crashing servers</h3> <p>If you’re like us, you like the Internet, but it unfortunately has its downsides. It seems that over the years, developers have been releasing unfinished, buggy games, hoping to just patch things up later.&nbsp;</p> <p>While some games get better with patches and updates over time, there’s really nothing that can completely erase the memory of a rocky launch. With that said, here’s a look at the 9 worst PC game launches.</p> <p>What’s the worst game launch you’ve experienced? Let us know in the comments below!&nbsp;</p> http://www.maximumpc.com/9_horrific_game_launches_2014#comments april issues 2014 diablo 3 half life 2 maximum pc the list The Sims watch dogs worst game launches Features The List Thu, 31 Jul 2014 21:47:48 +0000 Maximum PC staff 28250 at http://www.maximumpc.com Seagate 1TB Hybrid vs. 
WD Black2 Dual Drive http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014 <!--paging_filter--><h3>Seagate 1TB Hybrid vs. WD Black2 Dual Drive</h3> <p>Every mobile user who is limited to just one storage bay wants the best of both worlds: SSD speeds with HDD capacities. Both Seagate and WD have a one-drive solution to this problem, with Seagate offering a hybrid 1TB hard drive with an SSD cache for SSD-esque performance, and WD offering a no-compromise 2.5-inch drive with both an SSD and an HDD. These drives are arch rivals, so it’s time to settle the score.</p> <h4>ROUND 1: Specs and Package</h4> <p>The WD Black2 Dual Drive is two separate drives, with a 120GB SSD riding shotgun alongside a two-platter 1TB 5,400rpm hard drive. Both drives share a single SATA 6Gb/s interface and split the bandwidth of the channel between them, with the SSD rated to deliver 350MB/s read speeds and 140MB/s write speeds. The drive comes with a SATA-to-USB adapter and includes a five-year warranty. The Seagate SSHD uses a simpler design and features a 1TB 5,400rpm hard drive with an 8GB sliver of NAND flash attached to it, along with software that helps move frequently accessed data from the platters to the NAND memory for faster retrieval. It includes a three-year warranty and is otherwise a somewhat typical drive aimed at the consumer market, not hardcore speed freaks. Both drives include free cloning software, but since the WD includes two physical drives, a USB adapter, and a longer warranty, it gets the nod.</p> <p style="text-align: center;"><a class="thickbox" href="/files/u152332/wd_endeavor_quarter_left_higres_smal_0.jpg"><img src="/files/u152332/wd_endeavor_quarter_left_higres_smal.jpg" alt="WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. 
" title="WD Black2" width="620" height="620" /></a></p> <p style="text-align: center;"><strong>WD’s Black2 Dual Drive is two individual drives in one enclosure, and it has the price tag to prove it. </strong></p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 2: Durability</h4> <p>This category is somewhat of a toss-up, as the WD Black2’s overall reliability is degraded somewhat by the fact that it has a spinning volume attached to it, giving it the same robustness of the Seagate SSHD. There’s also the issue of the WD Black using the slightly antiquated JMicron controller. We don’t have any reliability data on that controller in particular, but we are always more concerned about the SSD controller you-know-whating the bed than the memory, which is rated to last for decades, even under heavy write scenarios. Both drives also use two-platter designs, so neither one is more or less prone to damage than the other. In the end, we’ll have to go with the Seagate SSHD as being more durable, simply because you only have to worry about one drive working instead of two.&nbsp;</p> <p><strong>Winner: Seagate SSHD</strong></p> <h4>ROUND 3: Performance</h4> <p>Seagate is very clear about the performance of its hybrid drives, stating that they “boot and perform like an SSD,” but it never says they’re faster. It also claims the drive is “up to five times faster than a hard drive,” which seems like a bit of a stretch. It’s difficult to actually benchmark a caching drive because it won’t show on standard sequential read tests, and it gets killed by SSDs in access time tests. That said, we did see boot and PCMark Vantage scores improve significantly over time. Our boot time dropped by more than half, going from 2:27 to 1:07 after several boots, and our PCMark Vantage score shot up from 6,000 to 19,000. 
Still, these times are much slower than what we got with the WD SSD, which booted in 45 seconds (the system had three dozen programs installed), and hit 33,000 in PCMark Vantage.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 4: Cloning Package</h4> <p>Both drives include free software to help you clone your old drive and, in an odd twist, both companies use Acronis software to get ’er done. Seagate’s software is called DiscWizard, and works on OSes as old as Windows 98 and Mac OS 10.x. WD throws in a copy of Acronis True Image, though it only works with WD drives attached via the included USB-to-SATA adapter. We tested both software packages and found them to be nearly identical, as both let us clone our existing drive and boot from it after one pass, which can be tricky at times. Therefore, we call the software package a tie since they both perform well and use Acronis. However, WD’s $300 bundle includes a USB-to-SATA adapter that makes the cloning process totally painless. Seagate makes you forage for a cable on your own, which tips the scales in WD’s favor.</p> <p><strong>Winner: WD Black2</strong></p> <h4>ROUND 5: Ease of Use</h4> <p>This round has a crystal-clear winner, and that’s the Seagate SSHD. That’s because the Seagate drive is dead-simple to use and behaves exactly like a hard drive at all times. You can plug it into any PC, Mac, or Linux machine and it is recognized with no hassle. The WD drive, on the other hand, only works on Windows PCs because it requires special software to “unlock” the 1TB hard drive partition. For us, that’s obviously not a problem, but we know it’s enraged some Linux aficionados. Also, the WD drive only has a 120GB SSD. So, if you are moving to it from an HDD, you will likely have to reinstall your OS and programs, then move all your data to the HDD portion of the drive. 
The Seagate drive is big enough that you would just need to clone your old drive to it.</p> <p><strong>Winner: Seagate SSHD</strong></p> <p style="text-align: center;"><strong><a class="thickbox" href="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small_0.jpg"><img src="/files/u152332/laptop-sshd-1tb-dynamic-with-label-hi-res-5x7_small.jpg" alt="Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. " title="Seagate SSHD" width="620" height="639" /><br /></a></strong></p> <p style="text-align: center;"><strong>Seagate’s hybrid drive offers HDD simplicity and capacity, along with SSD-like speed for frequently requested data. </strong></p> <h3 style="text-align: left;">And the Winner Is…</h3> <p style="text-align: left;">This verdict is actually quite simple. If you’re a mainstream user, the Seagate SSHD is clearly the superior option, as it is fast enough, has more than enough capacity for most notebook tasks, and costs about one-third of the WD Black2. But this is Maximum PC, so we don’t mind paying more for a superior product, and that’s the <strong>WD Black2 Dual Drive</strong>. It delivers both speed and capacity and is a better high-performance package, plain and simple.</p> <p style="text-align: left;"><span style="font-style: italic;">Note: This article originally appeared in the April 2014 issue of the magazine.</span></p> http://www.maximumpc.com/seagate_1tb_hybrid_vs_wd_black2_dual_drive_2014#comments Hard Drive Hardware HDD Review Seagate 1TB Hybrid ssd WD Black2 Backup Drives Hard Drives Reviews SSD Features Thu, 31 Jul 2014 19:27:45 +0000 Josh Norem 28103 at http://www.maximumpc.com