China to Microsoft: You Have 20 Days to Explain Compatibility Problems <!--paging_filter--><h3><img src="/files/u69/microsoft_sign_5.jpg" alt="Microsoft Sign" title="Microsoft Sign" width="228" height="133" style="float: right;" />Microsoft must issue a written statement to China within 20 days</h3> <p>Around the same time China banned Windows 8 from government use over concerns that there could be built-in spying mechanisms, authorities also began investigating Microsoft for antitrust violations. The latest in China's antitrust probe over Microsoft's business practices has the <strong>State Administration for Industry and Commerce giving the Redmond outfit 20 days to issue a written explanation</strong>. What for, you ask?</p> <p>The agency wants Microsoft to explain "problems like incompatibility and other issues caused by a lack of released information about its Windows and Office software," according to <a href="" target="_blank"><em>The Wall Street Journal's</em></a> translation of the SAIC's <a href="" target="_blank">online notice</a>. That's an incredibly vague task, though the agency issued the 20-day deadline during a meeting with Microsoft, in which further details were likely given.</p> <p>Citing state media reports, <a href="" target="_blank"><em>Reuters</em> says</a> Microsoft's use of verification codes led to complaints by Chinese companies. Interestingly, verification codes could be one of the ways Microsoft supposedly violated China's anti-monopoly law, though if that's the case, it puts Microsoft in a tough spot. Software piracy in China is a big problem for Microsoft, and it's difficult to see how verification codes could run afoul of antitrust laws.</p> <p>Microsoft isn't China's only foreign target when it comes to anti-monopoly concerns. 
There are dozens of other companies being investigated, including Qualcomm, which China accuses of overcharging customers for its patents.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> antitrust china microsoft office Software Windows News Mon, 01 Sep 2014 16:01:22 +0000 Paul Lilly 28458 at Scholar Feeds Over 12 Million Historic Photos to Flickr <!--paging_filter--><h3><img src="/files/u69/stnicholas.jpg" alt="St. Nicholas" title="St. Nicholas" width="228" height="163" style="float: right;" />Browse millions of copyright-free images spanning hundreds of years</h3> <p>The Internet is a wonderful place filled with text, videos, and images. Lots and lots of images. In fact, Yahoo's popular <strong>Flickr photo sharing service is the lucky recipient of millions of historical images</strong> plucked from 600 million library book pages scanned in by the Internet Archive. The project is spearheaded by Kalev Leetaru, who began work on the massive undertaking while researching communications technology at Georgetown University as part of a fellowship sponsored by Yahoo.</p> <p>One thing that always bothered Leetaru was that digitization projects tend to focus on words while leaving out the pictures. What he's doing is the exact opposite. Leetaru went so far as to write his own software to sidestep the way books had originally been digitized.</p> <p><a href="" target="_blank">According to <em>BBC</em></a>, the Internet Archive used an optical character recognition (OCR) program to analyze all 600 million scanned pages and turn the image of each word into searchable text. The software could detect which parts of a page were pictures, and it would discard them.</p> <p>Leetaru's software taps into the process by taking that information and focusing on parts that the original OCR ignored. Each one was then saved as a separate JPEG picture. 
His software also copied the caption for each image.</p> <p>The end result will be a searchable database of more than 12 million historical copyright-free images available on Flickr. At present, Leetaru has uploaded more than 2.6 million pictures, all of which you can <a href="" target="_blank">browse here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> flickr photography photos News Mon, 01 Sep 2014 15:20:16 +0000 Paul Lilly 28457 at Dell Readies 34-inch Curved Display for Holiday Season <!--paging_filter--><h3><img src="" alt="Dell 34-inch Curved Monitor" title="Dell 34-inch Curved Monitor" width="228" height="104" style="float: right;" /></h3> <h3>Earlier this month, LG unveiled a similar offering</h3> <p>The Alienware Area-51 gaming PC is not the only product that Dell will try to woo gamers with during the upcoming holiday season. The PC vendor is also readying its first curved monitor. Announced alongside the Haswell-E powered Area-51, the <strong>Dell UltraSharp U3415W is a 34-inch curved display that will be available in China in November and the rest of the world in December.<br /></strong></p> <p>The U3415W has an ultra-wide 21:9 display with a WQHD 3440x1440 resolution, according to a press release issued by Dell Friday. 
Further, the 34-incher, whose price hasn’t been revealed as yet, packs “HDMI 2.0, MHL, DisplayPort and Mini DisplayPort connectivity, dual integrated 9 watt speakers, 60Hz performance and, when it’s set up in multi-monitor configuration, a virtually borderless cockpit view across multiple screens thanks to its ultra-thin bezels.”</p> <p>Follow Pulkit on <a href="" target="_blank">Google+</a></p> curved display dell Dell UltraSharp U3415W Hardware monitor News Mon, 01 Sep 2014 06:54:48 +0000 Pulkit Chandna 28456 at Netflix: VHS-level Streaming Quality Forced Us Into Deal with Comcast <!--paging_filter--><h3><img src="" alt="Netflix " title="Netflix " width="228" height="153" style="float: right;" /></h3> <h3>Netflix sheds light on circumstances that led to 'interconnection' deal with Comcast</h3> <p>Currently undergoing regulatory review, the proposed merger between Comcast and Time Warner Cable isn’t something Netflix is excited about. The Los Gatos, California-based company views the deal as a potential threat to online video distributors (OVDs), according to the <strong>“Petition to Deny” (<a href=";_ga=1.235764675.337222476.1409081769" target="_blank">PDF</a>) it recently filed with the Federal Communications Commission (FCC).</strong></p> <p>Netflix, which has, of late, been busy entering into costly agreements with Internet Service Providers (ISPs) to ensure smooth delivery of its video streaming content to customers, fears that an ISP as large as the resulting entity would enjoy unduly large bargaining power “over an OVD in negotiating such access fees because failure to reach an agreement with a terminating access network that accounts for a very large portion of an OVD's customers could have a devastating effect on the finances of the OVD.” </p> <p>To make its point, the company cites the example of its February 2014 “interconnection” deal with Comcast and the exact circumstances that led to it. 
Apparently, the deal came about because, in the two months immediately preceding it, the quality of the company’s service over Comcast’s network had deteriorated to such a degree that customer support calls quadrupled and Netflix began losing customers.</p> <p>“Comcast subscribers went from viewing Netflix content at 720p on average (i.e., HD quality) to viewing content at nearly VHS quality. For many subscribers, the bitrate was so poor that Netflix's streaming video service became unusable,” the petition reads. “The fact that the height of the congestion occurred in December and January is significant. December is one of Netflix's busiest times because members spend more time at home over the holidays and therefore request more streaming video from Netflix and other OVDs. It became clear that Comcast would continue to allow congestion across its network to negatively affect its subscribers' online video streaming experience.”</p> <p>Follow Pulkit on <a href="" target="_blank">Google+</a></p> Comcast fcc Federal Communications Commission merger NetFlix Time Warner Cable TWC News Mon, 01 Sep 2014 06:20:29 +0000 Pulkit Chandna 28455 at Leap Motion Controller Mount for Oculus Rift Now Available <!--paging_filter--><h3>Company is also working on embeddable “mega sensor” for future VR headsets</h3> <p>Palmer Luckey, the creator of the Oculus Rift virtual reality head-mounted display (HMD), recently told <a href="" target="_blank">GamesIndustry International</a> that VR may not become mainstream for quite some time to come. One of the things currently holding VR back, per Luckey, is the use of traditional controllers, which he feels are far from ideal for VR. However, as we wait for the ideal VR input to materialize, <strong>the list of controller alternatives for VR aficionados to experiment with keeps on growing. The latest addition to it is the Leap Motion Controller.<br /></strong></p> <p>Leap Motion, Inc. 
has introduced a Leap Motion controller mount for the Oculus Rift. Called the <a href="" target="_blank">VR Developer Mount</a>, this $20 contraption allows the company’s eponymous motion-sensing camera to snap onto the Oculus Rift. </p> <p>“If virtual reality is to be anything like actual reality, we believe that fast, accurate, and robust hand tracking will be absolutely essential,” the company’s CTO David Holz said in a <a href="" target="_blank">blog post</a> Thursday. “We believe in the concept of other specialized controllers as well, but our hands themselves are the fundamental and universal human input device.”</p> <p>The VR Developer Mount will help developers make the most of a new API that for the first time gives them access to raw infrared imagery from the Leap Motion Controller: “When mounted directly onto a head-worn display, these [infrared] images become stereoscopic windows into the world around you. What it sees, you see.”</p> <p>“This expands the tracking space to be in any direction you’re facing. You can reach forward, turn around, look up and down, and the tracking follows you wherever you go. Because our device’s field of view exceeds that of existing VR displays, you’ll find it can start to track your hands before you even see them.”</p> <p>Holz also revealed that the company is working on a next-generation “mega sensor” that will be offered directly to VR OEMs for embedding in their HMDs. 
Codenamed “Dragonfly”, the new sensor will come with “greater-than-HD image resolution, color and infrared imagery, and a significantly larger field of view.”</p> <p><iframe src="//" width="620" height="340" frameborder="0"></iframe></p> <p>Follow Pulkit on <a href="" target="_blank">Google+</a></p> controller facebook Hardware hmd leap motion controller oculus rift oculus vr Palmer Luckey vr developer mount vr headset vr input News Mon, 01 Sep 2014 03:37:49 +0000 Pulkit Chandna 28453 at Alienware Outs New Area-51 Gaming Desktop <!--paging_filter--><h3>New Triad chassis is the most outré thing to come out of Alienware’s stable in recent times</h3> <p><strong>Alienware is bringing back its Area-51 desktop PC and it’s nothing like the previous iterations</strong>, having received a pretty radical makeover. The 2014 edition Area-51 gaming rig is what can best be described as the love child of a hexagon and a triangle. Perhaps that is why the company has chosen to call the new hexagonal chassis the “Triad.”</p> <p>But what is so special about the new Triad chassis? Per the company, there is more to the design than meets the eye. The angled design is said to be better at heat dissipation as compared to the average rectangular chassis. This is because it affords “a large space for hot air to escape where a traditional, rectangular chassis only leaves a small space between it and the wall.” Further, this design is meant to enable easier access to both the front I/O panel and the rear portion.</p> <p>There is no word yet on the price of the new rig, which will be available for order in October. But it’s a slightly different story when it comes to specs. 
While the exact specifications are unknown, the <a href="" target="_blank">official product page</a> contains this technical outline of the Area-51: “The Alienware Area-51 ushers in a new era of performance with 6-core and 8-core Intel Core i7 Extreme processor options [<a href="" target="_blank">read our Haswell-E review</a>] that come factory overclocked and made possible with the new Intel X99 Express chipset alongside 2133Mhz DDR4 memory — up to 32GB in quad channel. Do insane, intensive multitasking like rendering video or extreme performance 4K gaming at your leisure. Experience a custom balance of solid-state and hard drives, enabling the performance of SSD with the storage capacity of HDD. Plus, 802.11ac – the latest wireless protocol – prioritizes streaming video and gaming, so lag is reduced.”</p> <p><iframe src="//" width="620" height="320" frameborder="0"></iframe></p> <p>Follow Pulkit on <a href="" target="_blank">Google+</a></p> alienware area 51 dell gaming desktop gaming rig Hardware haswell-e x99 News Sun, 31 Aug 2014 23:59:29 +0000 Pulkit Chandna 28452 at Asus Rolls Out Trio of X99 Chipset Motherboards <!--paging_filter--><h3><img src="/files/u69/asus_x99_boards.jpg" alt="Asus X99" title="Asus X99" width="228" height="141" style="float: right;" />Hello, Haswell-E</h3> <p>Now that <a href="">Haswell-E has finally arrived</a>, have you started thinking about a new build? If so, you have plenty of options. In terms of motherboard choice, <strong>Asus today announced three X99 Series boards</strong>, including the X99-Deluxe, X99-A, and X99-Pro. All three sport the latest LGA 2011-v3 socket for Haswell-E processors and feature DDR4 memory support, along with exclusive Asus technologies.</p> <p>Out of the three, the <a href="" target="_blank">X99-Deluxe</a> is the top-end model. 
It comes bundled with an Asus Hyper M.2 x4 expansion card for ultra-fast 32Gbit/s transfer speeds, an onboard 3x3 antenna for 802.11ac connectivity, and an Asus Fan Extension Card for flexible cooling options.</p> <p>One feature that's on all three boards is Asus's patent-pending OC Socket. This is a unique socket with extra pins that connect proprietary circuitry to extra contacts found on Haswell-E's LGA and is fully compatible with Intel's new LGA 2011-v3 chips. According to Asus, its OC Socket allows for higher DDR4 memory frequencies, lower memory latencies, and enhanced stability when overclocking.</p> <p>The X99-Deluxe is available now for $399. Look for the X99-A to land in early September for $279, and the X99-Pro in October with a price that's yet to be determined.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> asus Build a PC Hardware haswell-e motherboard x99-a x99-deluxe x99-pro News Fri, 29 Aug 2014 17:56:05 +0000 Paul Lilly 28451 at IDC's Outlook on PC Shipments is a Mixed Bag <!--paging_filter--><h3><img src="/files/u69/dell_xps_18_0.jpg" alt="Dell XPS 18" title="Dell XPS 18" width="228" height="134" style="float: right;" />High and low points abound in the PC sector</h3> <p>Reading through International Data Corporation's (IDC) latest report on PC shipments is like slapping yourself in the face to kill a blood-sucking mosquito -- you're not sure whether to be relieved or pained. More to the point, IDC's roller-coaster examination starts with adjusting its forecast. 
<strong>IDC now expects worldwide PC shipments to decline by 3.7 percent in 2014</strong> (bummer!), though that's an improvement from its previous forecast of a 6 percent drop (hooray!).</p> <p>The market research firm says PC shipments in emerging regions are being held back by competition from alternative devices and economic challenges (drats!), though commercial demand and a "rekindling" of interest from consumers in mature markets helped to boost results in Q1 and through the rest of the year (sweet!).</p> <p>Pop a Dramamine because the up-and-down ride doesn't smooth out from here. According to IDC, competition from tablets is waning, especially as PCs continue to make progress in competing with slimmer, touch-friendly, and low-cost systems (rock on!), though in emerging regions, competition from other devices is a bigger factor (darn!) and over time, the replacement of Windows XP systems (yay!) will decline as the install base shrinks (d'oh!).</p> <p>"Programs to reduce PC prices, such as Windows 8.1 with Bing, have helped to improve PC shipments in some segments," <a href="" target="_blank">said Jay Chou</a>, Senior Research Analyst, Worldwide PC Trackers. "Coupled with a shift toward more mobile PCs, the market has seen a quickened pace of innovation and a focus on price points. Nevertheless, the prospects for significant PC growth in the long term remain tenuous, as users increasingly see PCs as only one of several computing devices."</p> <p>The bottom line? Quite frankly, we don't know what the flip to think about IDC's assessment, so we'll wrap things up with the firm's current outlook. 
Barring an adjusted forecast down the line, IDC sees desktop PCs going from 133.5 million shipments in 2014 to 121.1 million in 2018; portable PC shipments holding steady at 170 million in 2014 and in 2018; and total PC shipments going from 303.5 million in 2014 to 291.1 million in 2018.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Hardware idc pc shipments News Fri, 29 Aug 2014 17:32:35 +0000 Paul Lilly 28450 at G.Skill Sets Memory Frequency Record Using Ripjaws 4 DDR4 RAM <!--paging_filter--><h3><img src="/files/u69/gskill_oc_0.jpg" alt="G.Skill Ripjaws 4 OC" title="G.Skill Ripjaws 4 OC" width="228" height="151" style="float: right;" />DDR4 memory record sits at 4,004MHz</h3> <p>We said over and over that Haswell-E was just around the corner, and after all that waiting and anticipation, today marks the official launch of the new CPU line from Intel (see our <a href="">review of Haswell-E</a>). It's not just about the processors, though -- it takes a village of components to raise Haswell-E the right way, and if you're looking to set records, G.Skill makes a strong case for its <a href="">Ripjaws 4 Series</a>. At present, <strong>G.Skill and its Ripjaws 4 Series of DDR4 RAM own the DDR4 frequency record</strong> after hitting 4,004MHz.</p> <p>There are always caveats to this level of extreme overclocking, such as cooling. As you probably guessed, it took doses of LN2 to keep things cool enough to set the record. That's a buzz kill if you're only interested in stable clocks using air or liquid cooling, though it's par for the course in the overclocking sector.</p> <p>G.Skill also had to drop down to single-channel mode. In doing so, the company was able to push its Ripjaws 4 to 2,002.2MHz (4,004MHz effective) with 17-25-29-50 timings. 
The memory was plopped into an Asus ROG Rampage V Extreme motherboard with an Intel Core i7 5930K processor.</p> <p>It's only a matter of time before the record is broken -- perhaps by G.Skill -- but for now, this is where the bar has been set.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC ddr4 g.skill Hardware Memory ram ripjaws 4 News Fri, 29 Aug 2014 17:06:04 +0000 Paul Lilly 28449 at Mozilla Experiments with Ads in Firefox Nightly Build <!--paging_filter--><h3><img src="/files/u69/firefox_nightly_logo.jpg" alt="Firefox Nightly Logo" title="Firefox Nightly Logo" width="228" height="168" style="float: right;" />Advertisers can buy sponsored tiles in Firefox's new tab page</h3> <p>Mozilla is in search of a new revenue stream for its Firefox browser, and one proposed solution is to sell sponsored tiles that would appear on a new tab page. More than just a concept at this point, <strong>Mozilla is actively experimenting with sponsored tiles, which now appear in the newest Firefox Nightly build</strong>. These are test builds of the popular browser that contain new features and enhancements that may or may not advance into later builds, including a stable release.</p> <p>This isn't the first time we've heard of this. Mozilla mentioned the possibility of ads back in February 2014 to a mostly unreceptive Internet audience. Though the idea of ads isn't a popular one among users, Mozilla promised that they wouldn't have any tracking features, and would be clearly labeled as ads.</p> <p>Fast forward to today and the time for experimentation is upon us. 
The folks over at <em>The Next Web</em> gave Firefox Nightly a test run and noted that when you first launch the browser, there's a message on the new tab page explaining what the tiles are, a link to a support page describing how sponsored tiles work, a promise that it adheres to Mozilla's privacy policies, and a reminder that you can turn tiles off or opt for a blank new tab page.</p> <p>"It's quite a lot to take in all at once," <em>The Next Web</em> <a href="" target="_blank">writes</a>.</p> <p>According to Firefox Product Manager Bryan Clark, some sites will show up in tiles even when there's no sponsorship deal in place. For example, popular sites like Amazon and Facebook might appear even though they didn't pay for the spot.</p> <p>It's easy to see why Mozilla would consider this approach. The majority of the company's revenue comes from search deals with Google, in which the search giant pays a premium -- hundreds of millions of dollars -- to have its search engine set as the default option in Firefox. While this relationship has worked up to this point, it's hard to fault Mozilla for not wanting to be beholden to a single entity.</p> <p>As to the ads, it's not a foregone conclusion that they'll stick. If they do, the earliest you'd see them in a stable build would be three months from now, which is when the latest version of Firefox Nightly is scheduled to hit the stable channel. 
However, Mozilla's been slow-playing this, so it's probably more likely that we'd see ads in a stable release sometime next year, if that's the direction Mozilla goes.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> ads browser firefox Mozilla Software sponsored tiles News Fri, 29 Aug 2014 16:13:15 +0000 Paul Lilly 28448 at Haswell-E Review <!--paging_filter--><h3>Haswell-E: Meet Intel’s new eight-core game-changing CPU</h3> <p>After three long years of going hungry with quad-cores, red meat is finally back on the menu for enthusiasts. And not just any gamey slab full of gristle with shared cores, either. With its new eight-core Haswell-E CPU, Intel may have served up the most mouth-watering, beautifully seared piece of red meat in a long time.</p> <p>And it’s a good thing, too, because enthusiasts’ stomachs have been growling. Devil’s Canyon? That puny quad-core was just an appetizer. And that dual-core highly overclockable Pentium K CPU? It’s the mint you grab on your way out of the steak house.</p> <p><iframe src="//" width="620" height="349" frameborder="0"></iframe></p> <p>No, what enthusiasts have craved and wanted ever since Intel’s original clock-blocking job on the original Sandy Bridge-E was a true, overclockable enthusiast chip with eight cores. 
So if you’re ready for a belt-loosening bellyful of enthusiast-level prime rib, pass the horseradish, get that damned salad off our table, and read on to see if Intel’s Haswell-E is everything we hoped it would be.&nbsp;</p> <p><strong>Meet the Haswell-E parts</strong></p> <p><span style="color: #ff0000;"><img src="/files/u154082/haswell-e_comparison_chart.png" alt="haswell e comparison chart" title="haswell e comparison chart" width="620" height="241" /></span></p> <p>&nbsp;</p> <p><img src="/files/u154082/lga2011v3socket.jpg" alt="haswell e socket" title="haswell e socket" width="620" height="626" /></p> <p><strong>Despite its name, the LGA2011-v3 socket is not the same as the older LGA2011 socket. Fortunately, the cooling offsets are exactly the same, so almost all older coolers and accessories should work just fine.&nbsp;</strong></p> <p><img src="/files/u154082/lga2011socket1.jpg" alt="lga2011" title="lga2011" width="620" height="556" /></p> <p><strong>Though they look the same, LGA2011’s socket has arms that are actually arranged differently than the new LGA2011-v3 that replaces it. And no, you can’t drop a newer Haswell-E into this socket and make it work.</strong></p> <h4>Haswell-E</h4> <p><strong>The first consumer Intel eight-core arrives at last</strong></p> <p>Being a card-carrying member of the PC enthusiast class is not an easy path to follow. Sure, you get the most cores and priciest parts, but it also means you get to wait a hell of a long time in between CPU upgrades. And with Intel’s cadence the last few years, it also means you get the leftovers. It’s been that way ever since Intel went with its two-socket strategy with the original LGA1366/LGA1156. Those who picked the big-boy socket and stuck to their guns on pure PC performance always got the shaft.&nbsp;</p> <p>The original Ivy Bridge in the LGA1155 socket, for example, hit the streets in April of 2012. 
As a reward for having the more efficient and faster CPU, the small-socket crowd then got Haswell in June of 2013. It wasn’t until September of 2013 that big-boy socket users finally got Ivy Bridge-E for their LGA2011s. But with Haswell already out and tearing up the benchmarks, who the hell cared?</p> <p>Well, that time has come with Haswell-E, Intel’s first replacement for the aging LGA2011 platform since 2011. This time though, Intel isn’t just shuffling new parts into its old stack. For the first time since the original Pentium 4 Extreme Edition, paying the price premium actually nets you more: namely, the company’s first consumer eight-core CPU.</p> <p><strong>Meet the T-Rex of consumer CPUs: The Core i7-5960X</strong></p> <p>We were actually a little leery of Haswell when it first launched last year. It was, after all, a chip seemingly tuned for the increasingly mobile/laptoppy world we were told was our post PC-apocalyptic future. Despite this, we recognized the chip as the CPU to have for new system builders. Clock for clock, its 22nm-process, tri-gate transistors put everything else to shame—even the six-core Core i7-3930K chip in many tasks. So it’s no surprise that when Intel took a quad-core Haswell, put it in the Xerox machine, and hit the copy x2 button, we were ecstatic. Eight cores are decidedly better than six cores or four cores when you need them.&nbsp;</p> <p>The cores don’t come without a cost though, and we don’t mean the usual painful price Intel asks for its highest-end CPUs. It’s no secret that more cores means more heat, which means lower clock speeds. That’s one of the rationales Intel used with the original six-core Core i7-3960X. Although sold as a six-core, the original Sandy Bridge-E was built using an eight-core die on which Intel had permanently switched off two cores. 
Intel said it wanted to balance the needs of the many versus the needs of the few—that is, by turning off two of the cores, the part could hit higher clock speeds. Indeed, the Core i7-3960X had a base clock of 3.3GHz and Turbo Boost of 3.9GHz, and most could overclock it to 5GHz. The same chip packaged as a Xeon with all eight cores working—the Xeon E5-2687W—was locked down at 3.1GHz and mostly buzzed along at 3.4GHz.</p> <p>With the new Core i7-5960X—the only eight-core of the bunch—the chip starts at a seemingly pedestrian 3GHz with a Turbo Boost of one core up to 3.5GHz. Those subsonic clock speeds won’t impress against the Core i7-4790K, which starts at 4GHz. You’ll find more on how well Haswell-E performs against Haswell in our performance section, but that’s the price to be paid, apparently, to get a chip with this many cores under the heat spreader. Regarding thermals, in fact, Intel has increased the TDP rating to 140 watts versus the 130 watts of Ivy Bridge-E and Sandy Bridge-E.&nbsp;</p> <p>If the low clocks annoy you, the good news is the part is fully unlocked, so the use of overclocking has been approved. For our test units, we had very early hardware and tight deadlines, so we didn’t get very far with our overclocking efforts. Talking with vendors, however, most seemed very pleased with the clock speeds they were seeing. One vendor told us overclocks of all cores at 4.5GHz were already obtainable and newer microcode updates were expected to improve that. With even the vaunted Devil’s Canyon Core i7-4790K topping out at 4.7GHz to 4.8GHz, 4.5GHz is actually a healthy overclock for an eight-core CPU.</p> <p><span style="white-space: pre;"> </span>When you dive down into the actual cores though, much is the same, of course. It’s based on a 22nm process. It has “3D” tri-gate transistors and integrated voltage regulation. 
Oh, and it’s also the first CPU to feature an integrated DDR4 memory controller.</p> <p><strong>Click the next page to read about DDR4</strong></p> <hr /> <p>&nbsp;</p> <h4>DDR4 details</h4> <p>If you think Haswell-E has been a long wait, just think about DDR3, which made its debut as main memory back in 2007. Yes, 2007. The only component that has lasted seven years in most enthusiasts’ systems might be the PSU, but even then it’s rare to find anyone kicking around a 500-watt PSU from 2007 these days.&nbsp;</p> <p><span style="white-space: pre;"> </span>DDR4 has been in gestation seemingly as long, so why the delay? From what we can tell, resistance to yet another new memory standard during a time when people thought the desktop PC and the PC in general were dying has been the root of the delay. It didn’t help that no one wanted to stick their head out first, either. RAM makers didn’t want to begin producing DDR4 in volume until AMD or Intel made chipsets for it, and AMD and Intel didn’t want to support it because of the costs it would add to PCs at a time when people were trying to lower costs. The stalemate finally ends with Haswell-E, which integrates a quad-channel memory controller into its die.</p> <p>Initial launch speeds of DDR4 clock in at DDR4/2133. For those already running DDR3 at 3GHz or higher, a 2,133 data rate is a snooze, but you should realize that anything over 2133 is overclocked RAM. With DDR4, JEDEC (the body that sets RAM standards) already has target data rates of 3200 on the map. RAM vendors we’ve talked to are already shopping DIMMs near that speed.</p> <p>The best part of DDR4 may be its density message, though. For years, consumer DDR3 has topped out at 8GB on a DIMM. With DDR4, we should see 16GB DIMMs almost immediately, and stacking of chips is built into the standard, so it’s possible we’ll see 32GB DIMMs over its lifetime. 
On a quad-channel, eight-DIMM motherboard, you should expect to be able to build systems with 128GB of RAM using non-ECC DIMMs almost immediately. DDR4 also brings power savings and other improvements, but the main highlights enthusiasts should expect are higher densities and higher clocks. Oh, and higher prices. RAM prices haven’t been fun for anyone of late, but DDR4 will definitely be a premium part for some time. In fact, we couldn’t even get exact pricing from memory vendors as we were going to press, so we’re bracing for some really bad news.</p> <h4>PCIe lanes: now a feature to be blocked</h4> <p>Over the years, we’ve come to expect Intel to clock-block core counts, clock speeds, Hyper-Threading, and even cache for “market segmentation” purposes. What that means is Intel has to find ways to differentiate one CPU from another. Sometimes that’s by turning off Hyper-Threading (witness Core i5 and Core i7) and sometimes it’s locking down clock speeds. With Haswell-E though, Intel has gone to new heights with its clock-blocking by actually turning off PCIe lanes on some Haswell-E parts to make them less desirable. At the top end, you have the 3GHz Core i7-5960X with eight cores. In the midrange you have the six-core 3.5GHz Core i7-5930K. And at the “low-end” you have the six-core 3.3GHz Core i7-5820K. The 5930K and the 5820K are virtually the same in specs except for one key difference: The PCIe lanes get blocked. Yes, while the Core i7-5960X and Core i7-5930K get 40 lanes of PCIe 3.0, the Core i7-5820K gets an odd 28 lanes of PCIe 3.0. That means those who had hoped to build “budget” Haswell-E boxes with multiple GPUs may have to think hard and fast about using the lowest-end Haswell-E chip. The good news is that for most people, it won’t matter. Plenty of people run Haswell systems with SLI or CrossFire, and those CPUs are limited to 16 lanes. 
Boards with PLX switches even support four-way GPU setups.</p> <p>Still, it’s a brain bender to think that when you populate an X99 board with the lowest-end Haswell-E, the PCIe configuration will change. The good news is that at least the slots will all work, just more slowly. Intel says it worked with board vendors to make sure all the slots will function with the budget Haswell-E part.&nbsp;</p> <p><img src="/files/u154082/mpc_haswell_front-back_1.jpg" alt="haswell e chip" title="haswell e chip" width="620" height="413" /></p> <p><strong>There have been clock-blocking rumors swirling around about Haswell-E being a 12-core Xeon with four cores turned off. That’s not true, and Intel says this die shot proves it.&nbsp;</strong></p> <p><img src="/files/u154082/ivbe.jpg" alt="ivy bridge e" title="ivy bridge e" width="620" height="550" /></p> <p><strong>Ivy Bridge-E’s main advantage over Sandy Bridge-E was a native six-core die and greatly reduced power consumption. Unfortunately, like its Ivy Bridge counterpart, overclocking yields on Ivy Bridge-E were greatly reduced compared to its predecessor, too, with few chips hitting more than 4.7GHz.</strong></p> <p><img src="/files/u154082/snbe.jpg" alt="sandy bridge e" title="sandy bridge e" width="308" height="260" /></p> <p><strong>Sandy Bridge-E and Sandy Bridge will long be remembered for their friendliness to overclocking, and Sandy Bridge-E for having two of its working cores killed Red Wedding–style by Intel.</strong></p> <p><strong>Click the next page to read about X99.</strong></p> <hr /> <p>&nbsp;</p> <h4>X99&nbsp;</h4> <p><strong>High-end enthusiasts finally get the chipset they want, sort of</strong></p> <p><img src="/files/u154082/x99blockdiagram.jpg" alt="x99 block diagram" title="x99 block diagram" width="620" height="381" /></p> <p><strong>Intel overcompensated in SATA on X99 but oddly left SATA Express on the cutting-room floor.</strong></p> <p>You know what we won’t miss? The X79 chipset.
No offense to X79 owners, but while the Core i7-4960X can stick around for a few more months, X79 can take its under-spec’ed butt out of our establishment. Think we’re being too harsh? We don’t think so.</p> <p>X79 has no native USB 3.0 support. And its SATA 6Gb/s ports? Only two. It almost reads like a feature set from the last decade to us. Fortunately, in a move we wholly endorse, Intel has gone hog wild in over-compensating for the weaknesses of X79.&nbsp;</p> <p>X99 has eight USB 2.0 ports and six USB 3.0 ports baked into its platform controller hub. For SATA 6Gb/s, Intel adds 10 ports to X99. Yes, 10 ports of SATA 6Gb/s. That gazongo number of SATA ports, however, is balanced out by two glaring omissions in X99: none of the official SATA Express or M.2 support that came with Z97. Intel didn’t say why it left SATA Express and M.2 off the chipset, but it did say motherboard vendors were free to implement them using techniques they gleaned from doing it on Z97 motherboards. If we had to hazard a guess, we’d say Intel’s conservative nature led it to leave the features off the chipset, as the company is a stickler for testing new interfaces before adding official support. At this point, SATA Express has been a no-show anyway: motherboards with SATA Express became available in May with Z97, yet we still have not seen any native SATA Express drives. We expect most motherboard vendors to simply add it through discrete controllers; even our early board sample had a SATA Express port.&nbsp;</p> <p>One potential weakness of X99 is Intel’s use of DMI 2.0. That offers roughly 2.5GB/s of transfer speed between the CPU and the south bridge or PCH, but with the board hanging 10 SATA devices, USB 3.0, Gigabit Ethernet, and 8 PCIe Gen 2.0 lanes off that link, there is the potential for massive congestion—but only in a worst-case scenario. You’d really have to have a boatload of hardware lit up and sending and receiving data at once to cause DMI 2.0 to bottleneck.
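</p> <p>To put that worst-case scenario in perspective, here’s a rough back-of-the-envelope tally. The per-device figures below are ballpark peak rates we’ve assumed purely for illustration, not Intel’s numbers:</p>

```python
# Back-of-the-envelope check: can X99's DMI 2.0 uplink (~2.5GB/s)
# really be swamped by everything hanging off the PCH at once?
# Per-device peak rates are rough assumptions, not Intel specs.
DMI2_GBS = 2.5  # approximate DMI 2.0 transfer speed, per the text

pch_demand_gbs = {
    "10x SATA 6Gb/s": 10 * 0.55,     # ~0.55GB/s usable per port
    "6x USB 3.0": 6 * 0.4,           # ~0.4GB/s usable per port
    "Gigabit Ethernet": 0.125,
    "8x PCIe Gen 2 lanes": 8 * 0.5,  # ~0.5GB/s per lane
}

worst_case = sum(pch_demand_gbs.values())
print(f"Worst-case PCH demand: {worst_case:.1f}GB/s vs {DMI2_GBS}GB/s link")
# The demand dwarfs the link, but only if you light up a boatload of
# hardware simultaneously -- hence "worst-case scenario."
```

<p>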
Besides, Intel says, you can just hang the device off the plentiful PCIe Gen 3.0 lanes from the CPU.</p> <p>That does bring up our last point on X99: the PCIe lanes. As we mentioned earlier, there will be some confusion over the PCIe lane configuration on systems with Core i7-5820K parts. With only 28 PCIe lanes available from that one chip, there’s concern that whole slots on the motherboard will be turned off. That won’t happen, Intel says. Instead, if you go with the low-rent ride, you simply lose bandwidth. Take an X99 mobo and plug in the Core i7-5930K, and you get two x16 PCIe slots and one x8 slot. Remove that CPU and install the Core i7-5820K, and the slots will now be configured as one x16, one x8, and one x4. It’s still more bandwidth than you can get from a normal LGA1150-based Core i7-4770K, but it will be confusing nonetheless. We expect motherboard vendors to sort it out for their customers, though.</p> <p>Haswell-E does bring one more interesting PCIe configuration: the ability to run five graphics cards in the PCIe slots at x8 speeds. Intel didn’t comment on the reasons for the option, but there are only a few apparent ones. The first is mining configurations, where miners are already running six GPUs. Mining, however, doesn’t seem to need the bandwidth an x8 slot would provide. The other possibility is a five-way graphics card configuration being planned by Nvidia or AMD. At this point it’s just conjecture, but one thing we know is that X99 is a welcome upgrade.
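</p> <p>For reference, the lane budgets work out exactly; here’s a quick sanity check of the slot configurations described above (the three-slot layout is our simplified example, and real boards can split lanes differently):</p>

```python
# PCIe lane budgets for the Haswell-E parts and the three-slot
# configurations described in the text. Simplified illustration only.
slot_configs = {
    "Core i7-5960X": (40, [16, 16, 8]),
    "Core i7-5930K": (40, [16, 16, 8]),
    "Core i7-5820K": (28, [16, 8, 4]),
}

for cpu, (lanes, slots) in slot_configs.items():
    assert sum(slots) <= lanes, f"{cpu} oversubscribed"
    print(f"{cpu}: {lanes} lanes -> " + "/".join(f"x{w}" for w in slots))

# Five-way graphics at x8 also fits the 40-lane parts: 5 * 8 = 40.
assert 5 * 8 <= 40
```

<p>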
Good riddance, X79.&nbsp;</p> <h4>Top Procs Compared</h4> <p><span style="color: #ff0000;"><span style="white-space: pre;"><img src="/files/u154082/top_processors.png" alt="top processors compared" title="top processors compared" width="620" height="344" /></span></span></p> <h4>Core Competency&nbsp;</h4> <p><strong>How many cores do you really need?</strong></p> <p><img src="/files/u154082/haswelletaskamanger.png" alt="haswell task manager" title="haswell task manager" width="620" height="564" /></p> <p><strong>It is indeed a glorious thing to see a task manager with this many threads, but not everyone needs them.</strong></p> <p>Like the great technology philosopher Sir Mix-A-Lot said, we like big cores and we cannot lie. We want as many cores as legally available. But we recognize that not everyone rolls as hard as we do with a posse of threads. With Intel’s first eight-core CPU, consumers can now pick from two cores all the way to eight on the Intel side of the aisle—and then there’s Hyper-Threading to confuse you even more. So, how many cores do you need? We’ll give you the quick-and-dirty lowdown.</p> <p><strong>Two cores</strong></p> <p>Normally, we’d completely skip dual-cores without Hyper-Threading because those parts tend to be the very bottom end of the pool: Celerons. Our asterisk is the new Intel Pentium G3258 Anniversary Edition, or “Pentium K,” which is a real hoot of a chip. It easily overclocks and is dead cheap. It’s not the fastest in content creation by a long shot, but if we were building an ultra-budget gaming rig and needed to steal from the CPU budget for a faster GPU, we’d recommend this one.
Otherwise, we see dual-cores as purely ultra-budget parts today.</p> <p><strong>Two cores with Hyper-Threading</strong></p> <p>For your parents who need a reliable, solid PC without overclocking (you really don’t want to explain how to back down the core voltage in the BIOS to grandma, do you?), the dual-core Core i3 parts fulfill the needs of most people who only do content creation on occasion. Hyper-Threading adds value in multi-threaded workloads and multitasking. You can almost think of these chips with Hyper-Threading as three-core CPUs.&nbsp;</p> <p><strong>Four cores</strong></p> <p>For anyone who does content creation such as video editing, encoding, or even photo editing with newer applications, a quad-core is usually our recommended part. Newer game consoles are also expected to push minimum specs for newer games to quad-cores or more, so for most people who carry an Enthusiast badge, a quad-core part is the place to start.</p> <p><strong>Four cores with Hyper-Threading</strong></p> <p>Hyper-Threading got a bad name early on, when the Pentium 4 and the software of the day could actually see performance drop with it turned on. Those days are long behind us, though, and Hyper-Threading offers a nice performance boost with its virtual cores. How much? A 3.5GHz Core i7 quad-core with Hyper-Threading generally offers the same performance on multi-threaded tasks as a Core i5 running at 4.5GHz. Hyper-Threading helps with content creation, and we’d say that if content creation is 30 percent or less of your time, this is the place to be, and really the best fit for 90 percent of enthusiasts.</p> <p><strong>Six cores with Hyper-Threading</strong></p> <p>Once you pass the quad-core mark, you are moving pixels professionally in video editing, 3D modeling, or other tasks that justify the cost of a six-core chip or more.
We still think that for 90 percent of folks, a four-core CPU is plenty, but if losing time rendering a video costs you money (or you’re just ADD), pay for a CPU with six cores or more. How do you decide if you need six or eight cores? Read on.&nbsp;</p> <p><strong>Eight cores with Hyper-Threading</strong></p> <p>We recognize that not everyone needs an eight-core processor. In fact, one way to save cash is to buy the midrange six-core chip instead, but if time is money, an eight-core chip will pay for itself. For example, the eight-core Haswell-E is about 45 percent faster than the four-core Core i7-4790K chip. If your render job is three hours, that’s roughly an hour back to spend working on other paying projects. The gap gets smaller between the six-core and the eight-core, of course, so it’s very much about how much your time is worth or how short your attention span is. But just to give you an idea, the 3.3GHz Core i7-5960X is about 20 percent faster than the Core i7-4960X running at 4GHz.</p> <p><strong>Click the next page to see how Haswell-E stacks up against Intel's other top CPUs.</strong></p> <hr /> <p>&nbsp;</p> <h4 style="font-size: 10px;">Intel’s Top Guns Compared</h4> <p><img src="/files/u154082/cpus17918.jpg" alt="haswell" title="haswell" width="620" height="413" /></p> <p><strong>The LGA2011-based Core i7-4960X (left) and the LGA2011-v3-based Core i7-5960X (middle) dwarf the Core i7-4790K chip (right). Note the change in the heat spreader between the older 4960X and 5960X, which now has larger “wings” that make it easier to remove the CPU by hand. The breather hole, which allows for curing of the thermal interface material (solder in this case), has also been moved. Finally, while the chips are the same size, they are keyed differently to prevent you from installing a newer Haswell-E into an older Ivy Bridge-E board.</strong></p> <h4>Benchmarks</h4> <p><strong>Performance junkies, rejoice!
Haswell-E hits it out of the park</strong></p> <p><img src="/files/u154082/x99-gaming_5-rev10.jpg" alt="x99 gigabyte" title="x99 gigabyte" width="620" height="734" /></p> <p><strong>We used a Gigabyte X99 motherboard (without the final heatsinks for the voltage-regulation modules) for our testing.</strong></p> <p>For our testing, we set up three identical systems with the fastest available CPUs for each platform. Each system used an Nvidia GeForce GTX 780 with the same 340.52 drivers, Corsair 240GB Neutron GTX SSDs, and 64-bit Windows 8.1 Enterprise. Since we’ve had issues with clock speeds varying on cards that physically look the same, we verified the clock speeds of each GPU manually and also recorded the multiplier, bclock, and the speeds the parts ran at under single-threaded and multi-threaded loads. So you know, the 3GHz Core i7-5960X would run at 3.5GHz on single-threaded tasks but usually sat at 3.33GHz on multi-threaded tasks. The 3.6GHz Core i7-4960X ran everything at 4GHz, including multi-threaded tasks. The 4GHz Core i7-4790K part sat at 4.4GHz on both single- and multi-threaded loads.</p> <p>For Z97, we used a Gigabyte Z97M-D3H mobo with a Core i7-4790K “Devil’s Canyon” chip aboard. An Asus Sabertooth X79 did the duty for our Core i7-4960X “Ivy Bridge-E” chip. Finally, for our Core i7-5960X chip, we obtained an early Gigabyte X99-Gaming 5 motherboard. The board was pretty early, but we feel comfortable with our performance numbers, as Intel has claimed the Core i7-5960X was “45 percent” faster than a quad-core chip, and that’s what we saw in some of our tests.&nbsp;</p> <p>One thing to note: The RAM capacities were different, but in the grand scheme of things, and for the tests we run, it has no impact. The Sabertooth X79 had 16GB of DDR3/2133 in quad-channel mode, and the Z97M-D3H had 16GB of DDR3/2133 in dual-channel mode. Finally, the X99-Gaming 5 board had 32GB of Corsair DDR4/2133.
All three CPUs will overclock, but we tested at stock speeds to get a good baseline feel.&nbsp;</p> <p>For our benchmarks, we selected from a pile of real-world games, synthetic tests, and real-world applications across a wide gamut of disciplines. Our gaming tests were run at very low resolutions and low-quality settings to take the graphics card out of the equation. We also acknowledge that people want to know what they can expect from the different CPUs at realistic settings and resolutions, so we also ran all of the games at their highest settings at 1920x1080 resolution, which is still the norm in PC gaming.&nbsp;</p> <p><strong>The results</strong></p> <p>We could get into a multi-sentence analysis of how it did and slowly build to our verdict, but in a society where people get impatient at the microwave, we’ll give you the goods up front: Holy Frakking Smokes, this chip is fast! The Core i7-5960X is simply everything high-end enthusiasts have been dreaming about.&nbsp;</p> <p>Just to give you an idea, we’ve been recording scores from $7,000 and $13,000 PCs in our custom Premiere Pro CS6 benchmark for a couple of years now. The fastest we’ve ever seen is the Digital Storm Aventum II that we reviewed in our January 2014 issue. The 3.3GHz Core i7-5960X was faster than the Aventum II’s Core i7-4960X running at 4.7GHz. Again, at stock speeds, the Haswell-E was faster than the fastest Ivy Bridge-E machine we’ve ever seen.</p> <p>It wasn’t just Premiere Pro CS6 where we saw that spread, either. In most of our tests that stress multi-threading, we saw roughly a 45 percent to 50 percent improvement going from the Haswell to the Haswell-E part. The scaling gets tighter when you’re comparing against the six-core Core i7-4960X, but it’s still a nice, big number. We generally saw a 20 percent to 25 percent improvement in multi-threaded tasks.&nbsp;</p> <p>That’s not even factoring in the clock differences between the parts.
The Core i7-4790K buzzes along at 4.4GHz—1.1GHz faster than the Core i7-5960X in multi-threaded tasks—yet it still got stomped by 45 to 50 percent. The Core i7-4960X had a nearly 700MHz clock advantage over the eight-core chip as well.</p> <p>The whole world isn’t multi-threaded, though. Once we get to workloads that don’t push all eight cores, the higher clock speeds of the other parts predictably take over. ProShow Producer 5.0, for example, has never pushed more than four threads, and we saw the Core i7-5960X lose by 17 percent. The same happened in our custom Stitch.Efx 2.0 benchmark, too. In fact, in general, the Core i7-4790K will be faster in such light-duty workloads thanks to its clock speed advantage. If you overclocked the Core i7-5960X to 4GHz or 4.4GHz on just four cores, the two should be on par in pure performance on light-duty workloads.</p> <p>In gaming, we saw some results from our tests that are a little bewildering to us. At low-resolution and low-quality settings, where the graphics card was not the bottleneck, the Core i7-4790K had the same 10 percent to 20 percent advantage. When we ran the same tests at ultra settings and 1080p resolution, though, the Core i7-5960X actually had a slight advantage in some of the runs against the Core i7-4790K chip. We think that may be from the bandwidth advantage the 5960X has. Remember, we ran all of the RAM at 2,133, so it’s not DDR4 vs. DDR3. It’s really quad-channel vs. dual-channel.</p> <p>We actually put a full breakdown of each of the benchmarks and detailed analysis online if you really want to nerd out on the performance.</p> <p><strong>What you should buy</strong></p> <p>Let’s say it again: The Core i7-5960X stands as the single fastest CPU we’ve seen to date. It’s simply a monster in multi-threaded tasks, and we think that once you’ve overclocked it, it’ll be as fast as all the others in tasks that aren’t thread-heavy.</p> <p>That, however, doesn’t mean everyone should start saving to buy a $1,000 CPU.
No, for most people, the dynamic doesn’t change. For the 80 percent of you who fall into the average Joe or Jane nerd category, a four-core with Hyper-Threading still offers the best bang for the buck. It won’t be as fast as the eight-core, but unless you’re really working your rig for a living, are made of money, or hate for your Handbrake encodes to take that extra 25 minutes, you can slum it with the Core i7-4790K chip. You don’t even have to heavily overclock it for the performance to be extremely peppy.</p> <p>For the remaining 20 percent who actually do a lot of encoding, rendering, professional photo editing, or heavy multi-tasking, the Core i7-5960X stands as the must-have CPU. It’s the chip you’ve been waiting for Intel to release. Just know that at purely stock speeds, you do give up some light-duty performance to the Core i7-4790K part. But again, the good news is that with minor overclocking tweaks, it’ll be the equal of the quad-core chip or better.</p> <p>What’s really nice here is that for the first time, Intel is giving buyers of its “Extreme” SKU something truly extra for the $999 they spend. Previous Core i7 Extreme parts have always been good overclockers, but a lot of people bypassed them for the midrange chips such as the Core i7-4930K, which gave you the same core counts and overclocking to boot. The only true differentiation Extreme CPU buyers got was bragging rights. With Haswell-E, the Extreme buyers are the only ones with eight-core parts.</p> <p>Bang-for-the-buck buyers also get a treat from the six-core Core i7-5820K chip. At $389, it’s slightly more expensive than the chip it replaces—the $323 Core i7-4820K—but the extra price nets you two more cores. Yes, you lose PCIe bandwidth, but most people probably won’t notice the difference.
We didn’t have a Core i7-5820K part to test, but based on our testing with the Core i7-5960X, we believe minor overclocking on the cheap Haswell-E would easily make it the equal of Intel’s previous six-core chips, which could never be had for less than $580.</p> <p>And that, of course, brings us to the last point of discussion: Should you upgrade from your Core i7-4960X part? The easy answer is no. In pure CPU-on-CPU showdowns, the Core i7-4960X is about 20 percent slower in multi-threaded tasks, and in light-duty threads it’s about the same, thanks to the clock-speed advantage the Core i7-4960X has. There are two reasons we might want to toss aside the older chip, though. The first is X79’s pathetic pair of SATA 6Gb/s ports; frankly, you actually need more of them on a heavy-duty work machine. The second applies to the folks for whom a 20 percent reduction in rendering time is actually worth paying for.&nbsp;</p> <p><strong>Click the next page to check out our Haswell-E benchmarks.</strong></p> <hr /> <h4><span style="font-size: 1.17em;">Haswell-E Benchmarks</span></h4> <p><strong>Haswell-E benchmarks overview</strong></p> <p><span style="font-size: 1.17em;">&nbsp;</span><img src="/files/u154082/haswell_e_benchmarks.png" alt="haswell e benchmarks" title="haswell e benchmarks" width="541" height="968" /></p> <p>&nbsp;</p> <p>&nbsp;</p> <p><strong>Benchmark Breakdown</strong></p> <p>We like to give you the goods in a nice table, but not everyone is familiar with what we use to test and what exactly the numbers mean, so let’s break down some of the more significant results for you.&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p><img src="/files/u154082/cinebenchsinglethreaded.png" alt="cinebench 15 single" title="cinebench 15 single" width="620" height="472" /></p> <p><strong>Cinebench 15 single-threaded performance</strong></p> <p><span style="color: #000000;">We used Maxon’s Cinebench 15 benchmark to see just how fast the trio of chips would run this 3D rendering test.
Cinebench 15 allows you to run it on all of the cores or restrict it to just one. For this test, we wanted to see how the Core i7-5960X “Haswell-E” would do against the others by measuring a single core. The winner here is the Core i7-4790K “Devil’s Canyon” chip. That’s no surprise—it uses the same microarchitecture as the big-boy Haswell-E, but it has a ton more clock speed by default. The Haswell-E is about 21 percent slower running at 3.5GHz; the Devil’s Canyon part is running about 900MHz faster at 4.4GHz. Remember, at default speeds, the Haswell-E only hits 3.5GHz on single-core loads. Despite its better microarchitecture, the Haswell-E also loses to the Core i7-4960X “Ivy Bridge-E,” though not by much, and that’s with Ivy Bridge-E’s 500MHz clock speed advantage. Still, the clear winner in single-threaded performance is the higher-clocked Devil’s Canyon chip.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/cinebenchmulti.png" alt="cinebench 15 multi" title="cinebench 15 multi" width="620" height="428" /></span></p> <p><span style="color: #000000;"><strong>Cinebench 15 multi-threaded performance</strong></span></p> <p><span style="color: #000000;">You don’t buy an eight-core CPU and then throw only single-threaded workloads at it, so we took the handcuffs off of Cinebench 15 and let it render with all available threads. On the Haswell-E part, that’s 16 threads of fun; on Ivy Bridge-E, it’s 12 threads; and on Devil’s Canyon, we’re looking at eight threads. The winner by a clear margin is the Haswell-E part. Its performance is an astounding 49 percent faster than the Devil’s Canyon and about 22 percent faster than Ivy Bridge-E. We’ll just have to continue to remind you, too: this is with a severe clock penalty. That 49-percent-faster score is with all eight cores running at 3.3GHz vs. all four of the Devil’s Canyon cores buzzing along at 4.4GHz. That’s an 1,100MHz clock speed advantage.
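</span></p> <p>A quick way to sanity-check that multi-threaded result is to multiply cores by sustained clock to get “core-GHz,” a crude throughput proxy of our own devising (not an Intel metric), and compare the predicted gap to the measured one:</p>

```python
# Crude scaling check: cores x sustained multi-threaded clock
# ("core-GHz") as a throughput proxy. Our own rough yardstick.
core_ghz = {
    "Core i7-5960X (8C @ 3.3GHz)": 8 * 3.3,
    "Core i7-4790K (4C @ 4.4GHz)": 4 * 4.4,
    "Core i7-4960X (6C @ 4.0GHz)": 6 * 4.0,
}

hsw_e = core_ghz["Core i7-5960X (8C @ 3.3GHz)"]
dc = core_ghz["Core i7-4790K (4C @ 4.4GHz)"]
predicted = hsw_e / dc - 1
print(f"Predicted Haswell-E advantage over Devil's Canyon: {predicted:.0%}")
# Predicts 50 percent, right in line with the measured 49 percent --
# in other words, near-perfect multi-threaded scaling in Cinebench.
```

<p><span style="color: #000000;">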
Ivy Bridge-E also has a nice 700MHz clock advantage over Haswell-E. Chalk this up as a big, huge win for Haswell-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/povray.png" alt="pov-ray" title="pov-ray" width="620" height="491" /></span></p> <p><span style="color: #000000;"><strong>POV-Ray performance</strong></span></p> <p><span style="color: #000000;">We wanted a second opinion on rendering performance, so we ran POV-Ray, a freeware ray tracer whose roots reach back to the Amiga. Again, Haswell-E wins big-time with a 47 percent performance advantage over Devil’s Canyon and a 25 percent advantage over Ivy Bridge-E. Yeah, and all that stuff we said about the clock speed advantage the quad-core and six-core had, that applies here, too. Blah, blah, blah.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/premierepro.png" alt="premiere pro" title="premiere pro" width="620" height="474" /></span></p> <p><span style="color: #000000;"><strong>Premiere Pro CS6 performance</strong></span></p> <p><span style="color: #000000;">One sanity check (benchmark results Intel produces to let you know what kind of performance to expect) said Haswell-E would outperform quad-core Intel parts by 45 percent in Premiere Pro Creative Cloud when working with 4K content. Our benchmark, however, doesn’t use 4K content yet, so we wondered if our results would be similar. For our test, we render out a 1080p-resolution file, using multiple timelines and transitions, from source material we shot on a Canon EOS 5D Mk II. We restrict the render to the CPU rather than using the GPU. Our result? The 3.3GHz Haswell-E was about 45 percent faster than the 4.4GHz Devil’s Canyon chip. Bada-bing! The two extra cores also spit out the render about 19 percent faster than the six-core Ivy Bridge-E.
That’s fairly consistent performance across the different workload disciplines of 3D rendering and video encoding so far, and again, big, big wins for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/handbrake.png" alt="handbrake" title="handbrake" width="620" height="407" /></span></p> <p><span style="color: #000000;"><strong>Handbrake Encoding performance</strong></span></p> <p><span style="color: #000000;">For our encoding test, we took a 1080p-resolution video file and used Handbrake 0.9.9 to transcode it into a file using the Android tablet profile. Handbrake is very multi-threaded and leverages the CPU for its encoding and transcoding. Our results were still fairly stellar, with the Haswell-E CPU performing about 38 percent faster than the Devil’s Canyon part. Things were uncomfortably close with the Ivy Bridge-E part, though, with the eight-core chip coming in only about 13 percent faster than the six-core chip. Since the Ivy Bridge-E cores are slower than Haswell cores clock-for-clock, we were a bit surprised at how close they were. In the past, we have seen memory bandwidth play a role in encoding, but not necessarily in Handbrake. Interestingly, despite our locking all three parts down at a 2,133 data rate, the Ivy Bridge-E does provide more bandwidth than the Haswell-E part. One other thing we should mention: Intel’s “sanity check” numbers to let the media know what to expect for Handbrake performance showed a tremendous advantage for the Haswell-E, which came in 69 percent faster than a Devil’s Canyon chip and 34 percent faster than the Ivy Bridge-E chip. Why the difference? The workload. Intel uses a 4K-resolution file and transcodes it down to 1080p. We haven’t tried it at 4K, but we may, as Intel has provided the 4K-resolution sample files to the media.
If true, and we have no reason to doubt it, it’s a good sign for those who actually work at Ultra HD resolutions that the eight cores can pay off. Overall, we’re declaring Haswell-E the winner here.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/x264pass1.png" alt="x264 pass 1" title="x264 pass 1" width="620" height="496" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.0.1 Pass 1 performance</strong></span></p> <p><span style="color: #000000;">We’ve been using the X264 HD 5.0.1 benchmark to measure performance on new PCs. The test does two passes using the freeware x264 encoding library; a higher frame rate is better. The first pass isn’t as core-sensitive, and memory bandwidth and clock speed pay more dividends here. Haswell-E still gives you a nice 36 percent boost over the Devil’s Canyon, but that Ivy Bridge-E chip, despite its older core microarchitecture, is only beaten by 12 percent—too close for comfort. Of course, we’d throw in the usual caveat about the very large clock differences between the chips, but we’ve already said that three times. Oh, and yes, we did actually plagiarize by lifting two sentences from a previous CPU review for our description.
That’s OK, we gave ourselves permission.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X, but not by much</span></p> <p><span style="color: #000000;"><img src="/files/u154082/x264pass2.png" alt="x264 pass 2" title="x264 pass 2" width="620" height="499" /></span></p> <p><span style="color: #000000;"><strong>X264 HD 5.0.1 Pass 2 performance</strong></span></p> <p><span style="color: #000000;">Pass two of the X264 HD 5.0.1 benchmark is more sensitive to core and thread counts, and we see the Haswell-E come in with a nice 46 percent performance advantage over the Devil’s Canyon chip. The Ivy Bridge-E, though, still represents well; the Haswell-E chip is “only” 22 percent faster than it. Still, this is a solid win for the Haswell-E chip. We also like that we’re seeing very similar scaling of roughly 45 percent across multiple encoding tests. With Intel saying it’s seeing 69 percent with 4K-resolution content in Handbrake, we’re wondering if the Haswell-E would offer similar scaling if we just moved all of our tests up to 4K.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><strong>Click the next page for even more Haswell-E benchmarks.</strong></p> <hr /> <p>&nbsp;</p> <p><span style="color: #000000;"><img src="/files/u154082/stitch.png" alt="stitch" title="stitch" width="620" height="473" /></span></p> <p><span style="color: #000000;"><strong>Stitch.EFx 2.0 Performance&nbsp;</strong></span></p> <p><span style="color: #000000;">Again, we like to mix up our workloads with tasks that aren’t always multi-threaded enough to take advantage of a 12-core Xeon chip. For this test, we shot about 200 images with a Canon EOS 7D using a GigaPan motorized head. That’s roughly 1.9GB in images to make our gigapixel image using Stitch.EFx 2.0. The first third of the render is single-threaded as it stitches together the images.
The final third is multi-threaded as it does the blending, perspective correction, and other intensive image processing. It’s a good blend of single-threaded and multi-threaded performance, but we expected the higher-clocked parts to take the lead. No surprise, the Devil’s Canyon chip’s 4.4GHz clock puts it in front, and the Haswell-E comes in about 14 percent slower with its 1.1GHz clock disadvantage. The clock speed advantage of the 4GHz Ivy Bridge-E also pays dividends, and we see the Haswell-E losing by about 10 percent. The good news? A dual-core Pentium K running at 4.7GHz coughed up a score of 1,029 seconds (not represented on the chart) and is roughly 22 percent slower than the CPU that costs about 11 times more.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/7zip.png" alt="7-zip" title="7-zip" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>7-Zip Performance</strong></span></p> <p><span style="color: #000000;">The popular and free zip utility, 7-Zip, has a nifty built-in benchmark that tells you the theoretical file-compression performance of a CPU. You can pick the workload size and the number of threads. For our test, we maxed it out at 16 threads using an 8MB workload. That gives the Haswell-E a familiar performance advantage—about 45 percent—over the Devil’s Canyon part. Against that Ivy Bridge-E part, though, it’s another uncomfortably close one at 8 percent.
Still, a win is a win, even if we have to admit that if you have a shiny Core i7-4960X CPU in your system, you’re still doing fine.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/sandra.png" alt="sisoft sandra" title="sisoft sandra" width="620" height="421" /></span></p> <p><span style="color: #000000;"><strong>SiSoft Sandra Memory Bandwidth (GB/s)</strong></span></p> <p>Since this is the first time we’re seeing DDR4 in a desktop part, we wanted to see how it stacked up in benchmarks. But, before you get too excited, remember that we set all three systems to 2133 data rates. The Devil’s Canyon part is dual-channel, and the Ivy Bridge-E and Haswell-E are both quad-channel. With the memory set at 2133, we expected Haswell-E to be on par with the Ivy Bridge-E chip, but oddly, it was slower, putting out about 40GB/s of bandwidth. That’s still more than the 27GB/s the Devil’s Canyon could hit, but we expected it to be closer to double what the dual-channel Devil’s Canyon was producing. For what it’s worth, we did double-check that we were operating in quad-channel mode, and we verified the clock speeds of our DIMMs. It’s possible this may change as the hardware we see becomes more final. We’ll also note that even at the same clock, DDR4 does suffer a latency penalty over DDR3. Dwelling on that would be missing the point of DDR4, though. The new memory should give us larger modules and hit higher frequencies far more easily, which will nullify that latency issue.
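</p> <p>For context, the textbook peak numbers at a 2,133 data rate are easy to compute: channels times a 64-bit bus times the transfer rate. A quick sketch (theoretical peaks only; real chips, and our Sandra results, land well below them):</p>

```python
# Theoretical peak memory bandwidth at the DDR-2133 data rate we
# tested at: channels x 8 bytes per transfer x 2133 MT/s.
MEGATRANSFERS = 2133
BYTES_PER_TRANSFER = 8  # one 64-bit channel

def peak_gbs(channels: int) -> float:
    """Textbook peak bandwidth in GB/s for N memory channels."""
    return channels * BYTES_PER_TRANSFER * MEGATRANSFERS / 1000

print(f"Dual-channel (Devil's Canyon): {peak_gbs(2):.1f}GB/s")   # ~34.1
print(f"Quad-channel (Haswell-E/Ivy-E): {peak_gbs(4):.1f}GB/s")  # ~68.3
# Against a ~68GB/s ceiling, Haswell-E's measured ~40GB/s is why the
# quad-channel result struck us as low.
```

<p>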
Still, the winner is Ivy Bridge-E.</p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/3dmarkgpu.png" alt="3d mark" title="3d mark" width="620" height="457" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Overall Performance</strong></span></p> <p><span style="color: #000000;">Even though 3DMark Firestrike is primarily a graphics benchmark, not having a 3DMark Firestrike score is like not having coffee in the morning. Basically, it’s a tie between all three chips, and 3DMark Firestrike is working exactly as you expect it to: as a GPU benchmark.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/3dmarkphysics.png" alt="3d mark physics" title="3d mark physics" width="620" height="477" /></span></p> <p><span style="color: #000000;"><strong>3DMark Firestrike Physics Performance</strong></span></p> <p><span style="color: #000000;">3DMark does factor in the CPU performance for its physics tests. It’s certainly not weighted for multi-core counts as other tests are, but we see the Haswell-E with a decent 29 percent bump over the Devil’s Canyon chip. But, breathing down the neck of the Haswell-E is the Ivy Bridge-E chip. To us, that’s damned near a tie. Overall, the Haswell-E wins, but in gaming tasks—at stock clocks—paying for an 8-core monster is unnecessary except for those running multi-GPU setups.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/valveparticle.png" alt="valve particle" title="valve particle" width="620" height="451" /></span></p> <p><span style="color: #000000;"><strong>Valve Particle Benchmark Performance</strong></span></p> <p><span style="color: #000000;">Valve’s Particle test was originally developed to show off quad-core performance to the world. 
It uses the company’s own physics magic, so it should give some indication of how well a chip will handle game physics. We’ve long suspected the test is happiest with big caches and low RAM latency. That seems to be backed by the numbers, because despite the 1.1GHz advantage the Devil’s Canyon chip has, the Haswell-E is in front to the tune of 15 percent. The Ivy Bridge-E chip, though, with its large cache, lower-latency DDR3, and assloads of memory bandwidth, actually comes out on top by about 3 percent. We’ll again note the Ivy Bridge-E part has a 700MHz advantage, so this is a very nice showing for the Haswell-E part.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtlow.png" alt="dirt showdown low" title="dirt showdown low" width="620" height="438" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown low-resolution performance</strong></span></p> <p><span style="color: #000000;">For our gaming tests, we decided to run the games at 1366x768 resolution and at very low settings to take the graphics card out of the equation. In a way, you can imagine this as what gaming would look like if you had an infinitely powerful graphics card in your system. As most games are not heavily multi-threaded and are perfectly fine with a quad-core with Hyper-Threading, we fully expected the parts with the highest clock speeds to win all of our low-resolution, low-quality tests. No surprise, the Devil’s Canyon part at 4.4GHz schools the 3.3GHz Haswell-E chip.
And, no surprise, the 4GHz Ivy Bridge-E also eats the Haswell-E’s lunch and drinks its milk, too.</span></p> <p><span style="color: #333399;">Winner: Core i7-4790K</span></p> <p><span style="color: #000000;"><img src="/files/u154082/dirtultra.png" alt="dirt showdown ultra performance" title="dirt showdown ultra performance" width="620" height="475" /></span></p> <p><span style="color: #000000;"><strong>Dirt Showdown 1080p, ultra performance</strong></span></p> <p><span style="color: #000000;">To make sure we put everything in the right context, we also ran Dirt Showdown at 1920x1080 resolution at Ultra settings. This puts most of the load on the single GeForce GTX 780 we used for our tests. Interestingly, we saw the Haswell-E with a slight edge over the Devil’s Canyon and Ivy Bridge-E parts. We don’t think it’s a very significant difference, but it’s still technically a win for Haswell-E.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanlow.png" alt="hitman low" title="hitman low" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, low quality, low resolution&nbsp;</strong></span></p> <p><span style="color: #000000;">We did the same with Hitman: Absolution, running it at low resolution and its lowest settings. The Haswell-E came in about 12 percent slower than the Devil’s Canyon part and 13 percent slower than the Ivy Bridge-E.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/hitmanultra.png" alt="hitman ultra" title="hitman ultra" width="620" height="479" /></span></p> <p><span style="color: #000000;"><strong>Hitman: Absolution, 1080p, ultra quality</strong></span></p> <p><span style="color: #000000;">Again, we tick the settings up to a resolution and quality at which people actually play.
Once we do that, the gap closes slightly, with the Haswell-E trailing the Devil’s Canyon by about 8 percent and the Ivy Bridge-E by 9 percent. Still, these are all very playable frame rates, and few could tell the difference.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tombraider.png" alt="tomb raider low" title="tomb raider low" width="620" height="465" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, low quality, low resolution</strong></span></p> <p><span style="color: #000000;">We did the same low-quality, low-resolution trick with Tomb Raider, and while no one needs to see 500 frames per second, it’s pretty much a wash here.&nbsp;</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/tomraiderulti.png" alt="tomb raider ultra" title="tomb raider ultra" width="620" height="472" /></span></p> <p><span style="color: #000000;"><strong>Tomb Raider, 1080p, Ultimate</strong></span></p> <p><span style="color: #000000;">At normal resolutions and settings we were a little surprised, as the Haswell-E actually had a 15 percent advantage over the Devil’s Canyon CPU. We’re not exactly sure why, as the only real advantages we can see are the memory bandwidth and large cache on the Haswell-E part. We seriously doubt it’s due to the number of CPU cores. The Haswell-E also has a very, very slight lead over the Ivy Bridge-E part, too.
That’s not bad considering the clock deficit it’s running at.</span></p> <p><span style="color: #333399;">Winner: Core i7-5960X</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metrolastlight.png" alt="metro last light low" title="metro last light low" width="620" height="503" /></span></p> <p><span style="color: #000000;"><strong>Metro Last Light, low resolution, low quality</strong></span></p> <p><span style="color: #000000;">In Metro Last Light at low settings, it’s a wash among all of them.</span></p> <p><span style="color: #333399;">Winner: Tie</span></p> <p><span style="color: #000000;"><img src="/files/u154082/metroveryhigh.png" alt="metro last light high" title="metro last light high" width="620" height="502" /></span></p> <p><span style="color: #000000;"><strong>Metro Last Light, 1080p, Very High quality</strong></span></p> <p><span style="color: #000000;">Metro at high-quality settings mirrors Hitman: Absolution, and we think it favors the parts with higher clock speeds. We should also note that none of the chips, even with the $500 graphics card, could run Metro smoothly at 1080p at high-quality settings. That is, of course, unless you consider 30 to 40 fps to be “smooth.” We don’t. Interestingly, the Core i7-4960X was the overall winner.</span></p> <p><span style="color: #333399;">Winner: Core i7-4960X</span></p> <p><strong>Conclusion:</strong> If you skipped to the very last page to read the conclusion, you’re in the wrong place. You need to go back to page 4 to read our conclusions and what you should buy.
And no, we didn’t do this just to generate one more click, either, though that would be very clever of us, wouldn’t it?</p> benchmarks cpu haswell e intel ivy bridge e maximum pc processor Review Specs News Reviews Features Fri, 29 Aug 2014 16:00:40 +0000 Gordon Mah Ung 28431 at Google Removes Authorship Information from Search Results <!--paging_filter--><h3><img src="/files/u69/google_news.jpg" alt="Google News" title="Google News" width="228" height="173" style="float: right;" />Boom, no more headshots!</h3> <p>You may have noticed a change on Google's news channels: the headshots that used to accompany articles, along with the information about who wrote each piece, are gone. For better or worse, <strong>Google decided to end its authorship program </strong>on the basis that it just wasn't as useful to readers as Google hoped it would be, and even worse, it had become a distraction in some cases.</p> <p>John Mueller, Webmaster Trends Analyst at Google, <a href="" target="_blank">announced the withdrawal</a> of the authorship program on his Google+ page. According to Mueller, tests show that removing authorship generally doesn't reduce traffic to sites, nor does it increase ad clicks. In other words, no harm, no foul for abruptly ending the program.</p> <p>Some webmasters are likely to celebrate Google's decision. Implementing authorship never seemed to be as easy as it should have been, and sometimes it could turn into a downright convoluted process with unpredictable results. In any event, it's gone now.</p> <p>What's not gone is the ability to see Google+ posts from friends and pages when they're relevant to your search query.
This will continue to happen both in the main results and on the right-hand side.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> authorship Google news News Fri, 29 Aug 2014 15:20:04 +0000 Paul Lilly 28447 at Microsoft is Cleaning up Windows Store, Removes 1,500 Copycat and Fraudulent Apps <!--paging_filter--><h3><img src="/files/u166440/windows_8_logo.jpg" alt="Windows 8 Logo" title="Windows 8 Logo" width="200" height="200" style="float: right;" />Cleaning is good for the Store</h3> <p>It’s easy to be overwhelmed when searching for applications on the various platforms out there. If it’s not apps with confusingly similar names, it’s copycats or outright fraudulent apps. Well, <strong>Microsoft announced that it has removed 1,500 apps from its Windows Store</strong> in an attempt to make finding things easier and provide better quality and choice.&nbsp;</p> <p>To this end, Microsoft has modified its Windows Store app certification requirements. Some of the changes involve the name of an app. An app’s name will need to “clearly and accurately reflect the functionality of the app.” Developers will also need to make sure that apps are properly categorized, and icons must be distinct from those of other apps so that consumers will not mistake one for another. The policies are being applied to both new and existing apps for the Windows and Windows Phone Stores.</p> <p>Windows Store general manager Todd Brix wrote of Microsoft’s progress on the <a title="Microsoft blog" href="" target="_blank"><span style="color: #ff0000;">official blog</span></a> saying, “These revised policies are being applied to all new app submissions and existing app updates for both the Windows and Windows Phone Store.
We’ve also been working on titles already in the catalog, conducting a review of Windows Store to identify titles that do not comply with our modified certification requirements. This process is continuing as we work to be as thorough and transparent as possible in our review. Most of the developers behind apps that are found to violate our policies have good intentions and agree to make the necessary changes when notified. Others have been less receptive, causing us to remove more than 1,500 apps as part of this review so far.”</p> <p>Brix goes on to say that the review is still ongoing and that the company is adding resources to speed it up.&nbsp;</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> apps microsoft Todd Brix Windows Windows apps windows store News Fri, 29 Aug 2014 01:02:52 +0000 Sean D Knight 28443 at No BS Podcast #231: AMD and Origin PC Settle Past Dispute on the Show <!--paging_filter--><h3>Plus: AMD's commitment to high-end CPUs, DDR4, 5-way GPU support, 20nm GPUs, and more!</h3> <p>In a bit of a surprise to us, <a title="amd" href="" target="_blank">AMD</a> and <a title="origin pc" href="" target="_blank">Origin PC</a> wanted to come into the podcast room together for <a title="No BS podcast 231" href="" target="_blank"><strong>episode 231 of the No BS Podcast</strong></a>.
As you may recall, this pairing is kind of surprising considering that last October, Origin PC’s co-founder and CEO Kevin Wasielewski announced that the company would be <a title="origin pc drops amd gpus" href="" target="_blank">dropping AMD graphics cards from its systems</a>, stating, “This decision was based on a combination of many factors including customer experiences, GPU performance/drivers/stability, and requests from our support staff.” He then later added, “Based on our 15+ years of experience building and selling award winning high-performance PCs, we strongly feel the best PC gaming experience is on Nvidia GPUs.”</p> <p>Well, not only did we get Wasielewski in the room, but we also got AMD’s VP of Global Channel Sales Roy Taylor and AMD’s Director of Public Relations Chris Hook to come on. In the show, the two parties settle their past dispute with Taylor suggesting that AMD is now committed to giving hardware partners like Origin PC more support and communication. In the podcast, he outlines some of the strategies to do so. 
Wasielewski also confirmed that you can now get AMD video cards in Origin PCs again and shot down any <a href="" target="_blank">rumors</a> that Nvidia was compensating Origin PC to slander AMD late last year when the announcement came about.</p> <p>Taylor also asserts that AMD’s graphics drivers have gotten a lot better over the past year, but admits this wasn’t always the case and that the company is still getting burned by that bad reputation.&nbsp;</p> <p>While Gordon was away on vacation, he did submit several questions for the rest of the crew to ask on the air, and in the show we cover a ton of ground from topics that range from:</p> <ul> <li>The possibility of 5-way GPU support</li> <li>AMD’s renewed commitment to battling Intel at the high-end CPU market</li> <li>AMD’s plans to start using DDR4</li> <li>Origin PC and AMD’s thoughts on Valve’s upcoming <a title="maximum pc steam machine" href="" target="_blank">Steam Machine</a> initiative</li> <li>AMD’s take on the&nbsp;<a title="oculus rift" href="" target="_blank">Oculus Rift</a>/VR</li> <li>Freesync monitor availability</li> <li>Why <a title="AMD ssd" href="" target="_blank">AMD is getting into the SSD market</a></li> <li>AMD’s presence (or lack thereof) in the laptop/gaming notebook segment</li> <li>20nm GPUs</li> <li>And then we of course top it off with your fan questions!&nbsp;</li> </ul> <p><iframe src="//" width="620" height="349" frameborder="0"></iframe></p> <p>The old format isn’t going away, and Gordon’s rants will return, but in the meantime, give this episode a listen, and let us know what you think!</p> <p><a title="Download Maximum PC Podcast #231 MP3" href="" target="_blank"><img src="/files/u160416/rss-audiomp3.png" width="80" height="15" /></a>&nbsp;<a title="Maximum PC Podcast RSS Feed" href="" target="_blank"><img src="/files/u160416/chicklet_rss-2_0.png" width="80" height="15" /></a>&nbsp;<a href=""><img src="/files/u160416/chicklet_itunes.gif" alt="Subscribe to Maximum PC Podcast on 
iTunes" title="Subscribe to Maximum PC Podcast on iTunes" width="80" height="15" /></a></p> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Subscribe to the magazine for only 99 cents an issue:</h4> <h5><a title="Subscribe to Maximum PC Magazine" href="" target="_blank">In print</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Zinio" href="" target="_blank">On Zinio</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Google Play" href=";hl=en" target="_blank">On Google Play</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on iTunes" href="" target="_blank">On iTunes</a></h5> <h5><a title="Subscribe to Maximum PC Magazine on Amazon Kindle" href=";qid=1406326197">On the Amazon Kindle Store</a></h5> <h5><a title="Subcribe to Maximum PC Magazine on Your Nook" href="" target="_blank">On the Barnes &amp; Noble Nook Store</a></h5> <h4 style="margin: 0px 0px 5px; padding: 0px; border: 0px; outline: 0px; font-size: 19px; vertical-align: baseline; letter-spacing: -0.05em; font-family: Arial, sans-serif; font-weight: normal; color: #990000;">Stalk us in a number of ways:</h4> <p>Become a fan&nbsp;<a title="Maximum PC Facebook page" href="" target="_blank">on Facebook</a></p> <p>Follow us&nbsp;<a href="" target="_blank">on Twitter</a></p> <p>Subscribe to us&nbsp;<a title="Maximum PC Youtube page" href="" target="_blank">on Youtube</a></p> <p>Subscribe&nbsp;<a title="Maximum PC RSS Feed" href="">to our RSS feed</a></p> <p>Subscribe&nbsp;<a href="" target="_blank">to the podcast on iTunes</a></p> <p>email us at:&nbsp;<a href="">maximumpcpodcast AT gmail DOT com</a></p> <p>Leave us a voicemail at 877-404-1337 x1337</p> 231 amd cpu ddr4 episode graphics cards maximum pc No BS Podcast nvidia origin pc rumors Gaming News No BS Podcast Thu, 28 Aug 2014 20:37:32 +0000 The Maximum PC Staff 28441 at 
Civilization: Beyond Earth Hands-On <!--paging_filter--><h3>We play through the first 100 turns of Firaxis' next Civ game</h3> <p>We're still a couple months away from the retail release of Civilization: Beyond Earth (C:BE), but publisher 2K Games couldn't hold back the horde any longer. We've been eager to try it out because it's Civ, but also because it feels like a spiritual sequel to Alpha Centauri, which itself dealt with a nagging question from earlier entries in the series: What happens when you win the game by launching an interstellar ship into space? Where do those people go? At first glance, C:BE looks like a sci-fi Civilization V with an exotic color palette, but a number of new layers unfolded during our time with it.</p> <p>Most Civ games begin with selecting your starting conditions (unless you like to live on the edge and randomize all your choices). Your options include the usual things like world size, continent shape, and faction leader characteristics. In the build that we played, we could choose from three randomly generated planets. We could also let the game randomly choose one of those three for us, or we could tell C:BE to roll the dice and generate three new worlds. If that's not your cup of tea, we could also go to the "Advanced Worlds" menu and choose from about ten worlds with scripted conditions. 82 Eridani e, for example, has no oceans and little water. Or we could choose Archipelago, which was basically the opposite. Eta Vulpeculae b, meanwhile, has one large continent and an abundance of resources and wildlife.</p> <p><img src="/files/u160416/screenshot_terrain_lush02.jpg" width="600" height="354" style="text-align: center;" /></p> <p>Six of the worlds that are accessible from this menu come from the Exoplanets Map Pack, which you get by pre-ordering the game before October 24th. Each of these planets will randomize its geography each time you play, leading to an additional layer of replayability. 
We were not able to dig up a menu that allowed us to fine-tune specific map or gameplay attributes (such as disabling neutral factions or hostile wildlife), but this was not a final build.</p> <p>You can also choose to begin the game with a soldier or worker unit instead of an explorer. Or you could have a clinic installed in your first city automatically. This building improves the city health stat, which indicates population growth and the happiness of your citizens. You will also choose what ship type you want to use to arrive on the planet. This determines bonuses like starting with 100 energy (the currency of C:BE); the initial visibility of coast lines, alien nests, and certain resources; and the size of the fog of war around your first city.</p> <p><img src="/files/u160416/screen_combat_satellitebombard.jpg" width="600" height="341" style="text-align: center;" /></p> <p>Then you choose your colonist type. For example, the Refugee type adds +2 food to every city, which promotes growth. Engineers give you +2 production in every city, which decreases the time it takes to construct buildings. Scientists, unsurprisingly, give you +2 science in every city, which increases the speed at which you research new technology. Lastly, you designate your sponsor, which determines who your faction leader is. There are no historical leaders this time, like George Washington or Gandhi. This new gang consists of fictional characters set in a speculative future. We had eight sponsors to choose from. Going with the African Union grants us +10% food in growing cities when their Health rating is 1 or greater. With the Pan-Asian Cooperative, you get a 10% production bonus for Wonders, and 25% faster workers.</p> <p>So after agonizing over all of those branching decisions, you can finally drop into the game. If you're familiar with the last couple Civ games, the interface should be pretty familiar.
Your resources appear in the upper right-hand corner, with positive and negative numbers indicating gains or losses per turn. Hovering the cursor over each one gives you a detailed breakdown of where the resources are coming from, and how they're being consumed. The lower right-hand corner is for notifications and for running through your list of available actions. The lower left-hand corner shows your selected unit (if any) and its abilities.</p> <p style="text-align: right;"><a href=",1" target="_blank"><strong>Page 2: Exploration, affinities, and virtues</strong></a></p> <hr /> <p>But while the UI should be familiar, this is definitely an exotic planet, with unfamiliar formations like canyons and craters, clouds of poisonous gas, alien critters used for resources, and other alien critters that are actively hostile. It's definitely dangerous terrain for a fledgling civilization. But you'll find resource pods dotted throughout the landscape, which usually contain caches of energy or satellites. Satellites are launched into orbit and extract energy from the planet's surface, though it's not clear how. They stay up for a limited time, though, so you'll need to keep finding them, or produce them on your own. You'll also encounter stations, which behave similarly to city-states in Civ V.</p> <p>And your explorer (scout) unit can excavate native ruins and giant animal bones to grant more bonuses, like free technology. He can only carry one excavation kit at a time, though, and he needs to return to a city to get more. It also takes five turns to excavate something. This slower pace maintains the unit's viability for a longer stretch than in previous games, and compels you to make more agonizing decisions. Competing factions also don't like it when you excavate something that's closer to their territory than to yours.
So you have to balance your desire for discovery against your long-term political risks.</p> <p><img src="/files/u160416/screen_fielding_diplomacy.jpg" title="text-align: center;" width="600" height="341" /></p> <p>Meanwhile, you'll be conducting research on new buildings and units. Instead of going left to right and hitting up pretty much everything along the way, you begin from a central point on the research map and must choose between different branches, each of which contains "leaves" or individual research choices. Each branch has a theme, usually divided into cultural, military, and scientific categories. You can try focusing on one theme, or it might be better to balance as many as you can. Since we were limited to 100 turns, we weren't able to see which turned out to be the better strategy. The things you encounter on the map, the things you build, and the tech you research will frequently trigger binary choices. At one point, the game made us choose between two stations to conduct business with. One station specialized in converting military equipment for civilian use, while another could increase our science score. Both choices have effects on your relationship with the planet's flora and fauna, and you have three affinities to balance: Harmony, Supremacy, and Purity.</p> <p>Each choice grants you a mix of experience points in each affinity, and enough points in one will move you up a level and grant you a bonus. Hovering your mouse over each affinity (located in the upper left-hand corner) tells you what different levels will do. Level 1 of Harmony, for example, reduces the aggression level of the native creatures. Eventually you'll actually gain health from the poison clouds (called "miasma"), and the highest level of your primary affinity grants a critical element for one of the five available victory conditions. At the same time, you'll eventually be at odds with the factions that have different affinities than yours. 
You can attempt to smooth over relations by establishing lucrative trading routes, engaging in joint military actions, and resorting to good old-fashioned bribery. Or you can attempt to wipe them off the map, if you're not into the whole diplomacy thing.</p> <p><img src="/files/u160416/screen_ui_virtues.jpg" width="600" height="341" style="text-align: center;" /></p> <p>And let's not forget about the Virtue system. Virtues operate like Civ V's social policies, but this time there are four of them with nine tiers each, so there's more focus and depth to your choices here. On top of that is a grid of synergies, designed to encourage the exploration of multiple virtues. Activating the first tier of each virtue, for example, gives you a bonus activation of your choosing.</p> <p>Eventually, the 2K staff gently ushered us out the door, and we were reluctant to leave. Beyond Earth has more layers of faction evolution and political intrigue than we're used to seeing in Civ, and we were eager to see the choices that the game would present us with next. We also wanted to build more stuff, of course, and establish more trade routes, explore more of the map, investigate the critters, and maybe start a war or two. Thankfully, we only have about eight more weeks until the game launches into orbit.</p> alpha centauri beyond earth civiliation pc game PC gaming pre-review Sci-fi Sid Meier strategy Games Gaming News Features Web Exclusive Thu, 28 Aug 2014 18:43:23 +0000 Tom McNamara 28439 at