News en Your Broadband Speeds Are About to Quadruple, But Not If Carriers Can Help It <!--paging_filter--><h3>The FCC has a dream, but it needs to dream bigger.</h3> <p>In the February issue of Maximum PC magazine, I wrote about the lack of true broadband speeds in Silicon Valley. That piece didn't even come close to addressing the entire U.S.; it was a rant about how Silicon Valley, of all places, lacks broadband speeds competitive with the rest of the world. Well, it looks like this situation is going to change for the better <a href="">according to a report on <em>The Verge</em></a>.</p> <p>According to FCC Chairman Tom Wheeler, "we are never satisfied with the status quo. We want better. We continue to push the limit, and that is notable when it comes to technology."</p> <p style="text-align: right;"><img src="/files/u191083/fcc_tom_wheeler.jpg" alt="FCC Chairman Tom Wheeler" title="FCC Chairman Tom Wheeler" width="700" height="394" style="vertical-align: middle;" /><br /><em>FCC Chairman Tom Wheeler. (Andrew Harrer / Bloomberg)</em></p> <p>Alongside its 2015 Broadband Progress Report, the FCC voted to <a href="">raise the minimum download speed that defines broadband</a> from 4Mbps to 25Mbps, and the minimum upload speed from 1Mbps to 3Mbps. While this doesn't seem like a lot--and it isn't--it's still a huge improvement for a large percentage of the population. In fact, if the upgrade still seems too paltry to you, FCC Commissioner Jessica Rosenworcel says 100Mbps should be the bare minimum.</p> <p>100Mbps!</p> <p>"We invented the internet. We can do audacious things if we set big goals, and I think our new threshold, frankly, should be 100Mbps. I think anything short of that shortchanges our children, our future, and our new digital economy," Commissioner Rosenworcel said.</p> <p>But things aren't all rosy.
The National Cable &amp; Telecommunications Association (NCTA) sent its objections to the FCC, arguing that the new benchmarks "dramatically exaggerate the amount of bandwidth needed by the typical broadband user."</p> <p><strong>I'm going to say it right now: this is shit.</strong></p> <p>For the country that played the leading role in developing the infrastructure of our modern internet, these political and lobbying bodies don't really give two cents about your requirements as a user. What's important to them are the interests of the broadband providers. Naturally, Verizon was unhappy with the FCC's changes.</p> <p>A Verizon spokesperson told <em><a href="">Ars Technica</a></em>, "we currently do not have any plans to enhance that." Meanwhile, the NCTA told the FCC that 25Mbps isn't required for good 4K streaming--something Netflix is pushing for.</p> <p>The issue around broadband runs deep. Local municipalities are often carved up, dominated by only one or two major providers. Residential communities are even worse. The community where I live only offers Verizon services. Then there's the governmental and political wrangling that goes on. Last but not least, infrastructure is a problem too. Companies shy away from investing heavily in infrastructure improvements because they're costly, and much of that cost comes from tearing up very old legacy infrastructure.</p> <p>Alan Frisbie, who subscribes to the magazine, wrote in to let me know that the area where he resides is lucky to get 768Kbps downstream and an abysmal 384Kbps upstream--for $50 per month. And because AT&amp;T has an iron grip on Alan's area, he's unable to get other services. For users like Alan, who are essentially locked into their providers, the FCC's vote couldn't have come sooner.</p> <p>"The fastest U-verse service they offer in my area is 768K downstream and 384K up, for $50 a month. The only alternative is a T-1 (1.5 megabits) line for $432 a month.
DSL is not offered here," said Alan.</p> <p>The only way this kind of situation changes for the better is for everyone to stand up and speak out. The gorillas in the room are the very providers that bring broadband to you, and they are against delivering improvements. Lining shareholder pockets seems to be the priority.</p> <p>If we reviewed U.S. broadband speeds at Maximum PC, we would give it a score of -10 and an ass-kicking.</p> broadband cable internet dsl fcc ISP verizon fios News Fri, 30 Jan 2015 02:51:38 +0000 Tuan Nguyen 29336 at FCC Changes Definition of Broadband, Increases Download Speed to 25Mbps <!--paging_filter--><h3><img src="/files/u166440/fccseal.jpg" alt="FCC Seal" title="FCC Seal" width="200" height="200" style="float: right;" />New benchmark means 55 million people do not have access to broadband</h3> <p>Back in 2010, the Federal Communications Commission ruled that broadband would be defined by a minimum download speed of 4Mbps and a minimum upload speed of 1Mbps. However, the internet has since experienced exponential growth and a surge in media consumption. As a result, <strong>the FCC has voted to change the definition of broadband by increasing the minimum download speed to 25Mbps and the minimum upload speed to 3Mbps</strong>.</p> <p>The change in definition was announced alongside the publishing of the FCC’s 2015 Broadband Progress Report, which reveals that, under the new benchmark, 17 percent of all Americans (55 million people) do not have access to 25Mbps/3Mbps speeds.
In addition, the report states that around 35 percent of schools lack access to fiber and, in turn, would likely lack access to broadband speeds of 100Mbps per 1,000 students.&nbsp;</p> <p>“While significant progress in broadband deployment has been made, due in part to the Commission’s action to support broadband through its Universal Service programs, these advances are not occurring broadly enough or quickly enough, the report finds,” reads a statement on the <a title="FCC website" href="" target="_blank"><span style="color: #ff0000;">FCC’s website</span></a>. “The report concludes that more work needs to be done by the private and public sectors to expand robust broadband to all Americans in a timely way, and the accompanying Notice of Inquiry seeks comment on what additional steps the FCC can take to accelerate broadband deployment.”</p> <p>Do you think the new 25Mbps/3Mbps minimum is adequate for today’s internet consumption or should it be higher? Sound off in the comments below!</p> <p><em>Follow Sean on&nbsp;<a title="SeanDKnight Google+" href="" target="_blank"><span style="color: #ff0000;">Google+</span></a>, <a title="SeanDKnight's Twitter" href="" target="_blank"><span style="color: #ff0000;">Twitter</span></a>, and <a title="SeanDKnight Facebook" href="" target="_blank"><span style="color: #ff0000;">Facebook</span></a></em></p> 25Mbps broadband definition broadband fcc Federal Communications Commission Internet News Fri, 30 Jan 2015 02:25:47 +0000 Sean D Knight 29337 at Sapphire Adds Triple Fan Cooler to 8GB Radeon R9 290X, Tweaks Clocks and Lowers Cost <!--paging_filter--><h3><img src="/files/u69/sapphire_radeon_r9_290x_8gb_0.jpg" alt="Sapphire Radeon R9 290X 8GB" title="Sapphire Radeon R9 290X 8GB" width="228" height="225" style="float: right;" />More than just a big frame buffer</h3> <p>Sapphire was the first company to release an 8GB version of AMD's Radeon R9 290X graphics card, though it's no longer the only one -- a handful of other graphics card
players jumped on board after AMD gave them a <a href="" target="_blank">reference design</a> to play with. Be that as it may, <strong>Sapphire is intent on standing out from the crowd, so it went and retooled its 8GB R9 290X with a triple fan cooler</strong> and some other changes.</p> <p>According to Sapphire, its Tri-X triple fan cooler is the first in the industry to use a central 10mm heatpipe in addition to four subsidiary heatpipes for even heat distribution throughout the heatsink. The fans themselves have dust-repelling bearings with dual ball races and are equipped with aerofoil-section blades. Topping it off is a fan cowling designed to guide the airflow for maximum cooling efficiency, Sapphire says.</p> <p>The company also points out that it builds its own PCB rather than outsourcing production. In this instance, it's using a 6-phase VDDC power design.</p> <p>You'll find 8GB of GDDR5 memory on the new card, along with a 512-bit interface. The memory is "now clocked at 1375MHz (5.5GHz effective) delivering higher bandwidth than earlier models."</p> <p>Other features include a dual BIOS design, two 8-pin power connectors, and an engine clock of up to 1020MHz.</p> <p>As for pricing? Good question -- Sapphire said the card comes in at a "slightly lower cost" but didn't specify an exact price. It's also not showing up in retail yet, though we'll update this article when/if we hear back from them.
In the meantime, you can see more of the card on its <a href=";gid=3&amp;sgid=1227&amp;pid=2548&amp;psn=&amp;lid=1&amp;leg=0" target="_blank">product page</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> 8GB Build a PC Gaming graphics card Hardware radeon R9 290x sapphire Video Card News Thu, 29 Jan 2015 18:45:34 +0000 Paul Lilly 29334 at Leaked Roadmap Details AMD's Unreleased 'Godavari' APU Line <!--paging_filter--><h3><img src="/files/u69/amd_a_series.jpg" alt="AMD A Series" title="AMD A Series" width="228" height="136" style="float: right;" />A dozen APUs are planned for this summer</h3> <p><strong>A Chinese-language website has posted what it claims is a legitimate roadmap of AMD's forthcoming "Godavari" APUs</strong>. You can think of Godavari as a Kaveri refresh, as the new parts will feature the same Steamroller CPU cores and the same GCN graphics architecture. If the leaked roadmap proves accurate, AMD is planning to release a dozen Godavari APUs this summer, culminating in the A10-8850K.</p> <p>That unlocked part will feature four cores (and a four thread design) clocked at 3.7GHz base and 4.1GHz boost. It will also have a Radeon R7 graphics core based on AMD's Sea Islands GCN architecture with 512 stream processors and an 856MHz clockspeed. Other bits include 4MB of L2 cache, DDR3 memory support up to 2,133MHz, and a 95W TDP. According to <em><a href="" target="_blank">wccftech</a>, </em>pricing will be close to the current generation A10-7850K APU at about $149.</p> <p>Most of the 12 new chips will be branded as AMD's A-8000 series and will remain compatible with current boards based on the FM2+ platform via BIOS updates. It's not yet known what internal optimizations AMD might be making with Godavari, though from the roadmap, the new chips will feature faster clockspeeds without bumping up TDPs.
Performance should improve anywhere from 5 percent to 15 percent.</p> <p>Two of the new chips will be branded as Athlon parts. They include the Athlon X4 870K, a quad-core part clocked at 3.5GHz base and 3.7GHz boost with 4MB of L2 cache and a 95W TDP, and the Athlon X4 850K, also a quad-core chip but clocked at 2.9GHz base and 3.2GHz boost and with a 65W TDP.</p> <p>You can view the <a href="" target="_blank">full roadmap here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> amd apu Godavari Hardware processor roadmap News Thu, 29 Jan 2015 17:44:52 +0000 Paul Lilly 29333 at Qantas Teams with Samsung to Test Virtual Reality on Airplane Flights <!--paging_filter--><h3><img src="/files/u69/qantas_samsung_vr.jpg" alt="Qantas and Samsung VR" title="Qantas and Samsung VR" width="228" height="152" style="float: right;" />Join the mile high VR club</h3> <p>Imagine that you're 40,000 feet above the ground, but instead of peering out a small oval window and looking at clouds (or darkness), you turn your head and see a dingo wandering about. Don't worry, it's not on the plane's wing feasting on wires and electronics; it's in your Gear VR headset. This is what Australian airline Qantas is working towards. <strong>Along with Samsung, Qantas has launched a new trial entertainment service that gives fliers a Gear VR headset during their flight</strong>.</p> <p>At the outset, the initiative is being tested in Sydney and Melbourne International First Lounges, along with first class cabins on select A380 services. The trial will last for three months, after which Qantas will assess customer feedback, presumably so it can decide whether to expand the program or nix it.</p> <p>Qantas sees multiple possibilities here.
From a marketing standpoint, the company can partner with third parties to provide 3D content that might inspire tourism to a particular attraction or region. In fact, Qantas is already working with Tourism NT, which will provide a special 3D experience from Kakadu National Park. Whether or not they'll include dingoes in that experience isn't known, but there's plenty to experiment with there.</p> <p>For live-action content, Qantas has partnered with Jaunt -- it's not clear exactly what the company has in store, though it will include "destination footage."</p> <p><img src="/files/u69/qantas_samsung_vr_first_class.jpg" alt="Qantas and Samsung VR in First Class Cabin" title="Qantas and Samsung VR in First Class Cabin" width="620" height="413" /></p> <p>"From an inflight entertainment perspective, it’s an industry first," <a href="" target="_blank">said Olivia Wirth</a>, Qantas Group Executive, Brand, Marketing &amp; Corporate Affairs. "Qantas is committed to being at the forefront of innovation to give our passengers the very best and latest in-flight experiences, like accessing the virtual worlds of their favorite Hollywood blockbusters from the comfort of their seat 40,000 feet above the ground."</p> <p>The initiative will kick off in mid-February in the First Class Lounge in Sydney and Melbourne, and in mid-March on select A380 flights between Australia and Los Angeles for first class fliers.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Gear VR Hardware Qantas samsung virtual reality vr News Thu, 29 Jan 2015 16:50:02 +0000 Paul Lilly 29332 at Bill Gates Talks HoloLens, Drinking Water Made From Human Waste in Third Reddit AMA <!--paging_filter--><h3><img src="/files/u69/bill_gates_drinking_water.jpg" alt="Bill Gates Drinking Water" title="Bill Gates Drinking Water" width="228" height="152" style="float: right;" />Microsoft co-founder
talks a little of everything in his third Reddit AMA appearance</h3> <p>Guilty as charged -- we used the bit about drinking water made from human waste as a hook, and we'll do you a solid by not wasting any time getting right down to it. According to Bill Gates, the short answer is that it's "just like drinking any other kind of water, except that people get a little freaked out about it." He also has a sense of humor about it, which he proved by posting a photo in Reddit's <a href="" target="_blank">Photoshop Battles </a>section.</p> <p>Moving on, <strong>there's quite a bit more to digest from Bill Gates' third Reddit AMA session</strong>, including his thoughts on HoloLens, life regrets, and more.</p> <p>As for regrets, even billionaire philanthropists have them, and one that Bill Gates laments is not having learned any foreign languages.</p> <p>"I feel pretty stupid that I don't know any foreign languages. I took Latin and Greek in High School and got A's and I guess it helps my vocabulary but I wish I knew French or Arabic or Chinese," Gates said. "I keep hoping to get time to study one of these - probably French because it is the easiest. I did Duolingo for awhile but didn't keep it up. Mark Zuckerberg amazingly learned Mandarin and did a Q&amp;A with Chinese students - incredible."</p> <p>Regarding HoloLens, Gates thinks it's "pretty amazing" and calls it the "start of virtual reality." At the same time, VR technology has to address the issue of nausea.</p> <p>"Making the device so you don't get dizzy or nauseous is really hard -- the speed of the alignment has to be super, super fast. It will take a few years of software applications being built to realize the full promise of this," Gates said.</p> <p>He also shared some thoughts on Microsoft under the direction of CEO Satya Nadella, though not in great detail.
He noted that Nadella gets to take a fresh view of the company, both its strengths and weaknesses, adding that "a new person gets to step back and change the focus in some ways. He is off to a great start."</p> <p>It's an interesting read, as he answers a ton of questions. You can check it out <a href="" target="_blank">here</a>, or filter just his replies by clicking <a href="" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> ama Bill Gates Hololens microsoft reddit News Thu, 29 Jan 2015 16:18:09 +0000 Paul Lilly 29331 at Nvidia Will Help Disgruntled GTX 970 Owners Get a Refund, Says a Driver Update is Coming <!--paging_filter--><h3><img src="/files/u69/gtx_970.jpg" alt="Nvidia GeForce GTX 970" title="Nvidia GeForce GTX 970" width="228" height="156" style="float: right;" />Upcoming driver could improve GTX 970's memory performance</h3> <p>Nvidia really stepped in a pile of PR poo when it was discovered that there was an internal communication gaffe over the way the GeForce GTX 970 handles its 4GB of onboard memory and the resulting specs. In short, the GTX 970 has 56 ROPs and 1,792KB of L2 cache instead of matching the GTX 980's 64 ROPs and 2,048KB of L2 cache as originally advertised. However, <strong>Nvidia wants to make things right and has offered to help GTX 970 owners obtain a refund</strong>, if need be. Should you go that route?</p> <p>In most cases, probably not. Before reading any further, however, we highly recommend familiarizing yourself with the situation by <a href="" target="_blank">reading this</a>. Don't worry, we won't go anywhere -- we'll be right here when you get back.</p> <p>Finished? Great, now here's the deal. Nvidia stated on its forum that it's working on a driver update that will do a better job managing the memory scheme on the GTX 970, and expects to improve performance. 
Granted there's only so much that can be done on the software side to address a physical design, but given that Nvidia built the card the way it did, it stands to reason that it also knows how to properly tune it. We'll see.</p> <p>If you ultimately decide that you don't want the card, however, that's your choice, and Nvidia says it will help you obtain a refund if you're unable to do so on your own. Here's the <a href="" target="_blank">full statement</a>.</p> <p style="padding-left: 30px;">"Hey,</p> <p style="padding-left: 30px;">First, I want you to know that I'm not just a mod, I work for Nvidia in Santa Clara</p> <p style="padding-left: 30px;">I totally get why so many people are upset. We messed up some of the stats on the reviewer kit and we didn't properly explain the memory architecture. I realize a lot of you guys rely on product reviews to make purchase decisions and we let you down.</p> <p style="padding-left: 30px;">It sucks because we're really proud of this thing. The GTX 970 is an amazing card and I genuinely believe it's the best card for the money that you can buy. We're working on a driver update that will tune what's allocated where in memory to further improve performance.</p> <p style="padding-left: 30px;">Having said that, I understand that this whole experience might have turned you off to the card. If you don't want the card anymore you should return it and get a refund or exchange. If you have any problems getting that done, let me know and I'll do my best to help.</p> <p style="padding-left: 30px;">--Peter"</p> <p>It's important to note that Peter says he'll do his best to help, which is different than saying Nvidia will take care of things. In other words, if you're having trouble getting a refund, there's a chance you'll be stuck with it anyway. 
However, given the PR hit Nvidia's already taken on this one, we suspect those scenarios will be few and far between, if they occur at all.</p> <p>For most people, what this boils down to is that your GTX 970 is going to get even faster courtesy of some forthcoming optimizations. And for the few that are truly affected by the way the GTX 970 handles memory above 3.5GB, you now have someone at Nvidia who's willing to help you obtain a refund.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC driver geforce gtx 970 gpu graphics card Hardware nvidia Video Card News Wed, 28 Jan 2015 19:13:53 +0000 Paul Lilly 29330 at Microsoft Sticks a Fork in Surface 2 and Windows RT <!--paging_filter--><h3><img src="/files/u69/surface_2_4.jpg" alt="Surface 2" title="Surface 2" width="228" height="133" style="float: right;" />Windows RT becomes a footnote</h3> <p>Looking back through the years, it's pretty easy to pick out certain forgettable versions of Windows. You know the ones -- Windows ME, Windows Vista (before the first Service Pack), and now Windows RT, the most recent of the bunch. <strong>Microsoft has reportedly stopped producing Surface 2 tablets</strong>, which also means that the future of Windows RT is nonexistent at this point. You'll have to excuse us for not weeping.</p> <p>Microsoft has moved on, with the Surface Pro 3 contributing heavily to $1.1 billion in Surface revenue during Microsoft's most recent quarter, up 24 percent year-over-year.
Surface Pro 3 tablets outsold Surface 2 during the quarter by a ratio of three to one, making it an easy decision for Microsoft to focus on its newest hardware and leave the old behind.</p> <p>"We are no longer manufacturing Surface 2; however, those still eager to buy Surface should visit Microsoft Retail Stores, third-party retailers, and resellers for the latest availability," <a href="" target="_blank">Microsoft told <em>The Verge</em></a>.</p> <p>Windows RT has been a dead OS walking almost from the moment it arrived. Microsoft already had the challenge of convincing users to adopt a brand new interface in Windows 8, but to also pile on a gimped version that couldn't run x86 programs was too much to ask. As a result, Microsoft ended up taking a <a href="">$900 million charge</a> on unsold Surface RT inventory a year and a half ago.</p> <p>Manufacturers could still build devices running Windows RT if they really want to, but with even Microsoft giving up on the OS, there wouldn't be much point.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Hardware laptop microsoft notebook operating system OS Software surface 2 tablet windows rt News Wed, 28 Jan 2015 18:27:13 +0000 Paul Lilly 29329 at Former Opera Boss Just Built a New Browser Called Vivaldi <!--paging_filter--><h3><img src="/files/u69/vivaldi.jpg" alt="Vivaldi" title="Vivaldi" width="228" height="182" style="float: right;" />New browser seeks feedback from power users</h3> <p>Jon von Tetzchner, co-founder and former CEO of Opera Software, is back in the browser game with a new startup. It's called Vivaldi Technologies, of which Tetzchner is also co-founder and CEO, and <strong>he's just made a technical preview of his Vivaldi browser available</strong> to the public.
While still early in development, it's already fleshed out with features that may attract Opera fans, like mouse gestures and a speed dial interface for displaying your favorite sites on the new tab page.</p> <p>Why build a new browser? Tetzchner explains that he's disappointed with the direction Opera has taken in recent years, noting that it's no longer serving its community of users and contributors who helped build it.</p> <p>"So we came to a natural conclusion: We must make a new browser. A browser for ourselves and a browser for our friends. A browser that is fast, but also a browser that is rich in functionality, highly flexible and puts the user first. A browser that is made for you," Tetzchner explains.</p> <p>The build that's available is the first technical preview, and it's intended to show the direction the browser is headed. It's missing some key features and needs a lot more optimizing, though even in its early form, there's quite a bit to play with. You'll even find some new goodies, like being able to combine multiple sites into a single tab -- a feature that could come in handy if you're looking to organize similar sites.</p> <p>You can take notes, too, leaving yourself reminders of why you thought a site was worth bookmarking, or for any other reason.</p> <p>As this is a work in progress, the Vivaldi team is seeking feedback.
If you're interested in kicking the browser's tires and (optionally) offering up your thoughts, <a href="" target="_blank">head here</a> and give it a download.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> browser Internet jon von tetzchner Opera Software Vivaldi News Wed, 28 Jan 2015 17:53:43 +0000 Paul Lilly 29328 at YouTube Punts Adobe Flash in Favor of HTML5 <!--paging_filter--><h3><img src="/files/u69/html5.jpg" alt="HTML5" title="HTML5" width="228" height="152" style="float: right;" />A major win for open standards</h3> <p>Thanks to continued advancements in HTML5, <strong>YouTube has decided to kick Adobe Flash to the curb</strong> and default to the open standard instead for playing videos. YouTube would have made the move earlier, but said there were limitations that prevented HTML5 from becoming its preferred platform -- most notable was the lack of support for Adaptive Bitrate (ABR) streaming, which allows the streaming site to show more videos with less buffering.</p> <p>"Adaptive Bitrate streaming is critical for providing a quality video experience for viewers - allowing us to quickly and seamlessly adjust resolution and bitrate in the face of changing network conditions," YouTube explained in a <a href="" target="_blank">blog post</a>. "ABR has reduced buffering by more than 50 percent globally and as much as 80 percent on heavily-congested networks. MediaSource Extensions also enable live streaming in game consoles like Xbox and PS4, on devices like Chromecast and in web browsers."</p> <p>YouTube said it's been working with browser vendors and the broader community over the last four years to close the gaps and get to this point.
Now satisfied with the state of HTML5, YouTube uses the HTML5 &lt;video&gt; tag by default in Chrome, Internet Explorer 11, Safari 8, and in beta versions of Firefox, the Google-owned streaming site said.</p> <p>"We're also deprecating the 'old style' of Flash &lt;object&gt; embeds and our Flash API," YouTube added. "We encourage all embedders to use the &lt;iframe&gt; API, which can intelligently use whichever technology the client supports."</p> <p>By making the switch to HTML5 for video, YouTube joins content providers like Netflix and Vimeo, along with companies such as Microsoft and Apple, that have backed the open standard.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Adobe flash html5 streaming video youtube News Wed, 28 Jan 2015 17:24:41 +0000 Paul Lilly 29327 at Asus ET2040 All-in-One PC Stays Running if You Trip Over the Power Cord <!--paging_filter--><h3><img src="/files/u69/asus_et2040.jpg" alt="Asus All-in-One ET2040" title="Asus All-in-One ET2040" width="228" height="176" style="float: right;" />An all-in-one with a built-in battery backup</h3> <p>As far as we're concerned, every all-in-one PC should have a built-in battery backup, especially now that they're becoming thin and light enough to move from room to room with relative ease. <strong>Battery backup is one of the features found on Asus' ET2040 AIO</strong> (it stays running on battery for up to an hour), though that's not the only thing unique about this system.
It also ventures off the beaten path by recognizing gestures without having to touch the screen.</p> <p>Using the Hand-Gesture Recognition Software (HGSR) in conjunction with the built-in camera, you can perform gestures in front of the AIO and do things like play songs, adjust player settings, zoom or rotate photos, and more without leaving behind finger smudges on the display (which is non-touch, by the way).</p> <p>Those are some neat amenities on what's otherwise a mostly underwhelming AIO, albeit one that's sufficiently spec'd for a secondary PC or a general purpose machine.</p> <p>The ET2040 comes with a 19-inch HD (1366x768) non-touch display, an Intel Pentium J2900 quad-core processor clocked at 2.41GHz to 2.66GHz, 2GB of RAM, a 500GB hard drive (5400 RPM), 802.11n Wi-Fi, a 1MP webcam, three USB 3.0 ports, three USB 2.0 ports, HDMI output, a 3-in-1 card reader, Windows 8.1 with Bing, and a few other basics.</p> <p>For the time being, this one's only available in India for 25,000 Rupees (about $407 in U.S. currency). Asus didn't say when or if it plans to make the <a href=";node=5731179031" target="_blank">ET2040</a> available stateside.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> all-in-one asus ET2040 Hardware OEM rigs News Wed, 28 Jan 2015 16:37:27 +0000 Paul Lilly 29326 at Gamers Petition for GeForce GTX 970 Refund Over Error in Specs <!--paging_filter--><h3><img src="/files/u69/nvidia_geforce_gtx_970.jpg" alt="Nvidia GeForce GTX 970 Diagram" title="Nvidia GeForce GTX 970" width="228" height="184" style="float: right;" /></h3> <h3>Internal miscommunication at Nvidia led to confusion over the GTX 970's specs</h3> <p>Sometimes the tech world can be like a geek version of a soap opera, and this is one of those times. The main characters in this case are Nvidia and the GeForce GTX 970.
If you're looking for a quick summary of events, it's this: Gamers noticed a slowdown in performance when games tried to access more than 3.5GB of memory on the GTX 970. This in turn led to Nvidia explaining a new memory architecture in the GTX 970, along with clarification of specs that were different than originally reported. In light of all this, <strong>there's a petition floating around demanding a refund for anyone who purchased a GTX 970</strong>, but to really understand what's going on, a deeper explanation is necessary.</p> <p>This all began a week ago when users on various forums began investigating a memory issue with the GTX 970. At a glance, it seemed that the card was only using 3.5GB of its 4GB of GDDR5 memory. Upon closer inspection, it was discovered that a serious performance drop could occur when accessing that final 0.5GB of VRAM, which isn't an issue on the GTX 980.</p> <p>To clarify what was happening, Nvidia issued the following statement:</p> <p>"The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system," Nvidia said. "To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.</p> <p>"We understand there have been some questions about how the GTX 970 will perform when it accesses the 0.5GB memory segment. The best way to test that is to look at game performance. Compare a GTX 980 to a 970 on a game that uses less than 3.5GB.
Then turn up the settings so the game needs more than 3.5GB and compare 980 and 970 performance again."</p> <p>Nvidia's Senior VP of GPU Engineering, Jonah Alben, <a href="" target="_blank">spoke with <em>PC Perspective</em></a> and broke things down even further with quite a few technical details. He also offered a helpful diagram, seen below.</p> <p><img src="/files/u69/gtx_970_diagram.jpg" alt="Nvidia GeForce GTX 970 Diagram" title="Nvidia GeForce GTX 970 Diagram" width="620" height="479" /></p> <p>As you can see in the diagram, there are 13 enabled SMMs, each with 128 CUDA cores for a total of 1,664. There are also three that are grayed out -- they've been disabled from the full GM204 found on the GTX 980. But what's really important is the memory system, which is connected to the SMMs through a crossbar interface.</p> <p>"That interface has 8 total ports to connect to collections of L2 cache and memory controllers, all of which are utilized in a GTX 980. With a GTX 970 though, only 7 of those ports are enabled, taking one of the combination L2 cache / ROP units along with it. However, the 32-bit memory controller segment remains," <em>PC Perspective</em> writes.</p> <p>There are a couple of takeaways there. First, the GTX 970 has fewer ROPs and less L2 cache than the GTX 980, even though it was reported otherwise. Why? 
Nvidia blames the gaffe on an error in the reviewer's guide (usually a PDF or printed document with detailed info on a product that manufacturers send to reviewers prior to launch), compounded by a misunderstanding between the engineering team and the technical PR team on how the architecture actually functioned.</p> <p>Bottom line: the GTX 970 has 56 ROPs and 1,792KB of L2 cache instead of 64 ROPs and 2,048KB of L2 cache like the GTX 980.</p> <p>That's actually not as big a deal as it sounds, as the SMMs are the true bottleneck, not the ROPs.</p> <p>"A quick note about the GTX 980 here: it uses a 1KB memory access stride to walk across the memory bus from left to right, able to hit all 4GB in this capacity," <em>PC Perspective</em> writes. "But the GTX 970 and its altered design has to do things differently. If you walked across the memory interface in the exact same way, over the same 4GB capacity, the 7th crossbar port would tend to always get twice as many requests as the other port (because it has two memories attached). In the short term that could be ok due to queuing in the memory path. But in the long term if the 7th port is fully busy, and is getting twice as many requests as the other port, then the other six must be only half busy, to match with the 2:1 ratio. So the overall bandwidth would be roughly half of peak. This would cause dramatic underutilization and would prevent optimal performance and efficiency for the GPU."</p> <p>There are a LOT more details to digest, and rather than continue to quote bits and pieces, we suggest you read <em>PC Perspective's</em> <a href="" target="_blank">detailed report</a>. If after doing so you come to the conclusion that it's much ado about nothing, great, there's nothing more to see here. However, if you fall on the other side of the fence and feel duped, you can check out and sign the <a href="" target="_blank">petition at</a>.</p> <p>Our take? 
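</p>
<p>The 2:1 load math in <em>PC Perspective's</em> explanation above is easy to sanity-check. The sketch below is a toy model of that argument (uniform requests across the chips, every port slowed to the pace of the busiest one), not Nvidia's actual memory scheduler:</p>

```python
def relative_bandwidth(port_loads):
    """Peak-relative throughput once the busiest crossbar port saturates."""
    scale = 1.0 / max(port_loads)  # every port slows to the bottleneck's pace
    return sum(load * scale for load in port_loads) / len(port_loads)

# GTX 970: six ports with one DRAM each, plus a seventh serving two DRAMs,
# so a uniform stride over all 4GB sends the seventh twice as many requests.
print(relative_bandwidth([1, 1, 1, 1, 1, 1, 2]))  # ~0.571 of peak -- "roughly half"
print(relative_bandwidth([1] * 8))                # 1.0 -- the GTX 980's balanced case
```

<p>That underutilization is why, per Nvidia's statement, the card segments memory into 3.5GB and 0.5GB sections rather than striding across all of its DRAM chips uniformly.</p>
<p>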
It's an unfortunate situation Nvidia created for itself, and gamers have a right to be angry over the misreported specs. At the same time, it appears that the impact on real-world performance is negligible, at least for now -- this could be a bigger issue as higher-resolution gameplay becomes more common. Even so, it remains a great card for the price.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Build a PC Gaming geforce gtx 970 graphics card Hardware petition Video Card News Tue, 27 Jan 2015 20:07:53 +0000 Paul Lilly 29322 at An Inside Look at How Logitech Designs Its Gaming Mice <!--paging_filter--><h3><img src="/files/u154082/dsc01600.jpg" alt="logitech gaming mouse" title="logitech gaming mouse" width="250" height="141" style="float: right;" />The science and testing behind Logitech’s gaming mice</h3> <p><em>This is part two of our in-depth tour of Logitech’s facilities in Switzerland. This article focuses on how Logitech designs and develops its gaming mice. For an inside look at how the company is attempting to reinvent the mechanical keyboard, click <a title="logitech mechanical keyboard" href="" target="_blank">here</a>.</em></p> <p>While Logitech is generally viewed as a peripheral manufacturer, the company views itself as a technology company. 
In an attempt to show PC gamers that it uses cutting-edge design methodologies, Logitech invited us to its headquarters in Lausanne, Switzerland to show us how the company designs and tests its gaming mice.</p> <p style="text-align: center;"><iframe src="//" width="560" height="315" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Logitech explains how its G402 mouse uses two sensors</strong></p> <p style="text-align: left;"><img src="/files/u154082/g402_hyperion_fury.jpg" alt="logitech g402 hyperion fury" title="logitech g402 hyperion fury" width="200" height="214" style="float: left; margin: 5px;" /></p> <p style="text-align: left;"><strong>Logitech G402 Hyperion Fury<br /></strong>The company’s most interesting mouse today is arguably the G402 Hyperion Fury, which it claims to be “the world’s fastest gaming mouse.” Logitech boasts that the G402 can move a blistering 12.5 meters a second. To achieve this, Logitech says it uses a combination of two sensors. At slow-to-moderate speeds, the mouse uses a traditional optical sensor. Optical sensors are arguably the most common sensors used in gaming mice and use high-speed cameras to take blazing-fast images of the surface the mouse rests upon. From here, the sensor then overlaps the images to create a movement map. While the cameras used in Logitech’s optical sensors are orders of magnitude faster than the traditional point-and-shoot cameras you find at your camera store (think about 12,000 shots a second), the company says that even they have detectable lag when you’re trying to move a mouse at 12.5 meters a second. Therefore, beyond a certain speed threshold, the G402 switches over to an accelerometer/gyroscope solution. It uses a small ARM processor that can switch on the fly, and Logitech claims less than a millisecond of delay results from the switch. 
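</p>
<p>In pseudocode terms, that handoff amounts to a speed-gated selector between the two sensors. The sketch below is purely illustrative: the names and the crossover threshold are our assumptions, not values from the firmware running on the G402's ARM processor:</p>

```python
OPTICAL_LIMIT_M_PER_S = 2.0  # hypothetical crossover speed, not Logitech's figure

def pick_motion_delta(optical_delta, imu_delta, speed_m_per_s):
    """Choose which sensor's (dx, dy) report to trust at the current hand speed."""
    if speed_m_per_s <= OPTICAL_LIMIT_M_PER_S:
        return optical_delta  # camera-based tracking is most accurate at low speed
    return imu_delta          # past the threshold, the accelerometer/gyro takes over

print(pick_motion_delta((5, 2), (6, 1), 0.8))   # (5, 2): optical wins in normal use
print(pick_motion_delta((5, 2), (6, 1), 12.5))  # (6, 1): the IMU handles a fast flick
```

<p style="text-align: left;">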
While a gyroscope solution isn’t the most accurate sensor at low speeds, Logitech says it excels when there is a quick burst of movement; thus, the G402 uses a hybrid solution that aims to leverage both sensors’ strengths to achieve its speed.</p> <p style="text-align: center;"><iframe src="//" width="560" height="315" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>An in-depth interview with Logitech's mouse expert Chris Pate</strong></p> <p><img src="/files/u154082/logitech_g302.jpg" alt="Logitech G302 Daedalus Prime" title="Logitech G302 Daedalus Prime" width="200" height="166" style="float: left; margin: 5px;" /></p> <p><strong>Logitech G302 Daedalus Prime<br /></strong>While this hybrid sensor seems advantageous for the end user, we were surprised to hear that the company’s even newer G302 Daedalus Prime mouse opts instead to support a more traditional optical solution. Logitech told us the hybrid solution wasn’t included because the G302 was designed to be a smaller, lighter MOBA mouse, and trying to house two sensors along with the G402’s ARM processor wasn’t ideal to achieve this compact form factor. This isn’t to say the G302 doesn’t have its element of uniqueness, however.</p> <p style="text-align: center;"><strong><iframe src="//" width="560" height="315" frameborder="0"></iframe></strong></p> <p style="text-align: center;"><strong>Logitech says its mice are good for at least 20 million clicks</strong></p> <p>Because MOBAs like League of Legends and DOTA 2 feature tons of clicking, the Daedalus Prime is largely focused on eliminating the travel between the mouse’s buttons and its microswitches that activate commands. The G302 is able to do this by separating the left and right mouse buttons from the body of the mouse (Logitech says most mice use a monolithic design), and having them rest directly on top of the microswitch. This means that there is no air travel between the button and the switch at all. 
In the absence of air travel, Logitech designed a new metal spring tensioning system that rests between the button and the switch. When we asked Logitech if this could potentially add unwanted tension, which could theoretically create microscopic amounts of lag in and of itself, the company assured us that it didn’t, but rather aided in a consistent clicking experience.</p> <p style="text-align: center;"><strong><iframe src="//" width="560" height="315" frameborder="0"></iframe></strong></p> <p style="text-align: center;"><strong>A Logitech contraption that measures mouse accuracy</strong></p> <p><img src="" alt="logitech g602" title="logitech g602" width="200" height="165" style="float: left; margin: 5px;" /></p> <p><strong>Logitech G602<br /></strong>One of the best-selling mice that Logitech currently offers is its G602 wireless mouse. According to Logitech, when you look at the mouse industry as a whole, wireless mice outsell wired ones. This might not be true for gaming, but with the G602, Logitech worked to overcome many of gamers’ fears.</p> <p>The most obvious concern for gamers is lag. According to Logitech, lag on the G602 is imperceptible. The company ran an experiment where it asked a group of gamers if they could detect any noticeable lag using its wireless gaming mouse. People said they believed it felt laggier than a traditional wired mouse. When Logitech plugged in a faux wired cable (that did nothing), the same users said it felt much more responsive. Essentially, Logitech asserts that it was merely the placebo effect at play. According to Logitech, the G602 is capable of delivering a two millisecond response time. The company says that most people can only detect latency at four milliseconds and beyond. 
According to its own studies, some people can’t even perceive 40 milliseconds of lag.</p> <p style="text-align: center;"><iframe src="//" width="560" height="315" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Logitech has a special room that removes all wireless signals to detect wireless dead zones for its wireless mice.</strong></p> <p>Logitech claims it could have gotten the G602’s response time under two milliseconds, but at the cost of battery life, which is actually the true obstacle of a wireless gaming mouse. By scaling it back to two milliseconds, Logitech says it was able to get much more battery life out of the G602, which it asserts is able to get 250 hours of use out of a single charge. How is the company able to achieve those figures? Logitech says that it designed the G602 with battery in mind and created a sensor specifically for gaming wirelessly. The G602 also uses Logitech’s proprietary USB interface. When we asked Logitech why it didn’t use Bluetooth, the company informed us that the response rate of Bluetooth devices is at the mercy of the host (computer) device. The G602, in particular, uses a 1,000Hz polling rate through USB.</p> <p style="text-align: center;"><strong><iframe src="//" width="560" height="315" frameborder="0"></iframe></strong></p> <p style="text-align: center;"><strong>Logitech proving that there is no added acceleration to its mice.</strong></p> <p>Another interesting thing we learned about mice from Logitech is that no sensor is 100 percent accurate. You might see that terminology used to market mice from other vendors, but Logitech asserts that these claims are simply false.</p> <p>Another question we had pertained to laser mice. Several years ago, laser mice were quite popular because they tracked on a wider range of surfaces compared to optical. 
While laser mice aren’t terrible, optical mice have one key advantage over them, and that comes down to accuracy variance, more commonly referred to as “mouse acceleration.” Mouse acceleration is undesired for gaming and generally equates to an inconsistent movement experience. According to Logitech, with laser mice, you get about a five to six percent variance, making for an inconsistent experience, compared to an optical sensor’s one percent equivalent.</p> <p>One final interesting tidbit that we learned is that many gamers prefer braided cables on their mice, but Logitech’s data shows that more pros actually prefer plastic cables as they tend to offer more flexibility. So if you want to play like a pro, you might want to consider ditching the braided cable.</p> <p>For more pictures and information from the event, check out our image gallery below.&nbsp;</p> Daedalus Prime esports G302 G402 g602 gaming mice Hardware hyperion fury logitech moba mouse shooter wireless Gaming News Mice Features Tue, 27 Jan 2015 19:35:46 +0000 Jimmy Thang 29321 at Intel Teases First NUC Desktop with Core i7 Broadwell CPU <!--paging_filter--><h3><img src="/files/u69/nuc_broadwell.jpg" alt="NUC Meets Broadwell" title="NUC Meets Broadwell" width="228" height="140" style="float: right;" />New frontier for the NUC</h3> <p>We were intrigued with the potential of the NUC when it first came out -- here was this tiny box with fairly respectable hardware inside, powerful enough to serve as a secondary PC or, for the right person, a primary system. There have been several follow-up models since then, but the best is yet to come. <strong>Intel has gone and updated its NUC product page with a new model that will be the first to feature a Core i7 processor inside</strong>.</p> <p>Not a lot of details are available on the Core i7 model (NUC5i7RYH), which is one of several new NUCs based on the chip maker's 5th Generation Core processor (14nm Broadwell) line. 
According to the listing, it will feature a Core i7 part, 2.5-inch drive support, and mosey into retail sometime in the second quarter of this year.</p> <p>The updated NUC site also lists six other Broadwell-based systems, half of them sporting Core i5 processors (one with a Core i5 5300U vPro chip and two with Core i5 5250U CPUs) while the other half come equipped with Core i3 chips (Core i3-5010U).</p> <p>What they all have in common is support for up to 16GB of RAM, 2.5-inch and M.2 SSD storage support, four USB 3.0 ports, and 802.11ac Wi-Fi. If you need HDMI output, only the Core i5 models will oblige (and potentially the forthcoming Core i7 model).</p> <p>You can find out more details on each one <a href="" target="_blank">here</a>.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> broadwell core i7 Hardware intel mini pc nuc rigs News Tue, 27 Jan 2015 17:49:52 +0000 Paul Lilly 29320 at Crucial Ballistix Elite RAM Now Available in DDR4 Memory Kits <!--paging_filter--><h3><img src="/files/u69/ballistix_ddr4.jpg" alt="Crucial Ballistix DDR4" title="Crucial Ballistix DDR4" width="228" height="124" style="float: right;" />Another memory option for Intel X99 platforms</h3> <p>The number of DDR4 memory kits is growing and will continue to do so as more people build (or buy) systems based on Intel's X99 chipset. One of the newest is <strong>Crucial's Ballistix Elite line, now available in DDR4 form</strong> as a single 4GB module and in 8GB (2x4GB) and 16GB (4x4GB) kits (Crucial says a 32GB kit is also available, though it's not listed on the company's web store yet). 
As both kits use essentially the same 4GB module, the performance ratings are the same across the board.</p> <p>Crucial's 4GB DDR4 Ballistix Elite module is rated at DDR4-2666 (PC4-21300), which Crucial calls an "introductory" speed -- we take that to mean there should be some overclocking headroom, especially since the Ballistix Elite series is aimed at "extreme enthusiasts, gamers, and overclockers." The sticks also support Intel XMP 2.0 profiles, feature a custom-designed black PCB with anodized aluminum heat spreaders, and sport 16-17-17 timings at 1.2V.</p> <p>If you do plan to overclock, you might want to take advantage of Crucial's exclusive Ballistix Memory Overview Display utility, otherwise known as M.O.D. You can use M.O.D. to read information from the modules, including real-time temperatures from the integrated thermal sensor, voltages, and more.</p> <p>Pricing on <a href="" target="_blank">Crucial's website</a> breaks down as follows:</p> <ul> <li>4GB Ballistix Elite DDR4-2666: $95</li> <li>8GB (2x4GB) Ballistix Elite DDR4-2666: $190</li> <li>16GB (4x4GB) Ballistix Elite DDR4-2666: $380</li> </ul> <p>Newegg also carries the kits, though they're in <a href=";IsNodeId=1&amp;N=100006519%2050001455%2040000147%20600531811&amp;Manufactory=1455" target="_blank">pre-order form</a>. 
Pricing looks like this:</p> <ul> <li>4GB Ballistix Elite DDR4-2666: $100 (out of stock)</li> <li>8GB (single stick) Ballistix Elite DDR4-2666: $220 (releases March 10, 2015)</li> <li>8GB (2x4GB) Ballistix Elite DDR4-2666: $200 (releases February 6, 2015)</li> <li>16GB (2x8GB) Ballistix Elite DDR4-2666: $352 (releases March 10, 2015)</li> <li>16GB (4x4GB) Ballistix Elite DDR4-2666: $380 (releases February 6, 2015)</li> <li>32GB (4x8GB) Ballistix Elite DDR4-2666: $704 (releases March 10, 2015)</li> </ul> <p>Shipping charges range from $1 to $3, depending on the kit.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> ballistix elite Build a PC Crucial ddr4 Hardware Memory ram News Tue, 27 Jan 2015 17:01:16 +0000 Paul Lilly 29319 at
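<p>As an aside, the per-gigabyte math on Crucial's own list prices above is flat across the range; a quick check:</p>

```python
# Crucial.com list prices quoted above: capacity in GB -> price in USD
crucial_prices = {4: 95, 8: 190, 16: 380}

for capacity_gb, price_usd in crucial_prices.items():
    print(f"{capacity_gb}GB: ${price_usd / capacity_gb:.2f}/GB")
# Every tier works out to $23.75/GB -- no volume discount on the larger kits.
```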