News en GDC 2015: Getting to Know the Realm Resistance Training Controller [VIDEO] <!--paging_filter--><h3><img src="/files/u69/realm_controller.jpg" alt="Realm Controller" title="Realm Controller" width="228" height="138" style="float: right;" />Adding physicality to gaming</h3> <p>One of the things that motion controllers have helped popularize is the fitness gaming category. We've seen it on consoles and the PC alike, and it's a trend that isn't going away. Just the opposite, there are new products coming out to make gaming even more physical. <strong>While at GDC, we stopped by to check one of them out -- the Realm resistance training controller</strong>, which is intended to make you forget you're getting a workout.</p> <p>Whether it's slashing zombies with a knife or boxing an opponent, the greater the force you put into your swings, the harder you'll hit your virtual opponent. According to Realm, the fitness benefit is simply a byproduct of playing whatever particular title has you working up a sweat.</p> <p>The controller consists of a strap that goes around your waist. Hand grips attach to both sides. Inside the hand grips are a bunch of sensors that communicate with a camera, be it a Kinect or a webcam, to measure how much force is being used when you flail about. 
It's also a wireless control scheme that communicates with a USB dongle.</p> <p>Check it out below, and if you like what you see, you can <a href="" target="_blank">back the project on Kickstarter</a>.</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> controller GDC 2015 Hardware realm News Fri, 06 Mar 2015 18:03:56 +0000 Jimmy Thang and Paul Lilly 29555 at GDC 2015: Oxide Games and Stardock Discuss Mantle, DirectX 12, and Vulkan [VIDEO] <!--paging_filter--><h3><img src="/files/u69/oxide_gdc.jpg" alt="Oxide GDC" title="Oxide GDC" width="228" height="125" style="float: right;" />Take a peek at the first game using Oxide's Nitrous engine</h3> <p>The future of AMD's Mantle is up in the air since AMD recently <a href="">told developers</a> to focus on DirectX 12 instead. However, it doesn't appear as though AMD is ready to completely dismantle its API, which will have a future in Vulkan, the next version of the OpenGL API. You may recall that Oxide Games was a big proponent of Mantle -- check out <a href="">our interview</a> from a year ago. How does Oxide feel today? To find out, <strong>we headed to Oxide's booth at GDC</strong> and talked about a number of things.</p> <p>The first thing Oxide showed us was a forthcoming game called Ashes of the Singularity. It's a massively large RTS game developed with Stardock and the first to use Oxide's Nitrous engine, which the company claims can render 10,000 individual units at the same time. The goal with Ashes of the Singularity (other than to make money, of course) is to bring "an unprecedented scale" to the RTS category.</p> <p>Oxide tells us the Nitrous engine has been ported to DX12. The company is also working with Vulkan to make sure it emerges as a top class API. 
Unfortunately, Oxide wasn't willing to divulge much about Vulkan at this early stage.</p> <p>One thing gamers with high-end rigs will be happy to know is that Oxide developed Ashes of the Singularity to take advantage of top-shelf hardware, if you have it. The beefier your rig, the more settings you can crank up. On the flipside, owners of lower end hardware can dial things down for a playable experience.</p> <p>Early Access will be available this summer, and if all goes well, the game will release this winter. Here's more.</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> amd directx 12 dx12 games GDC 2015 mantle Oxide Games Software Stardock Vulkan News Fri, 06 Mar 2015 17:40:50 +0000 Jimmy Thang and Paul Lilly 29554 at GDC 2015: An Epic Discussion on Unreal Engine and VR Technology [VIDEO] <!--paging_filter--><h3><img src="/files/u69/epic.jpg" alt="Epic at GDC" title="Epic at GDC" width="228" height="135" style="float: right;" />An Epic perspective on VR</h3> <p>Epic Games earlier this week <a href="">announced</a> that it was dropping its subscription fee to license Unreal Engine 4. Now instead of paying $19 per month on top of any applicable royalties, developers can dive in and get access to UE4's complete C++ source code hosted on GitHub. They can even make a little bit of pocket change without sharing the wealth -- up to $3,000. After that, a 5 percent royalty per quarter applies. Not a bad deal, and <strong>we caught up with Epic at GDC</strong> to talk about this and more.</p> <p>General Manager Ray Davis was manning Epic's booth. He explained that the subscription removal and 5 percent royalty above $3,000 is designed to remove any barriers that developers might have from using UE4. 
Not all projects are going to be hits, and so Epic set up a model where it succeeds only when developers succeed.</p> <p>"These days there's a wide diversity of developers and the types of products we're going for and so we want to make sure that there's no friction for people to get started, to pick up the tools and start actually building whatever their idea may be," Davis explained.</p> <p>Davis also touched on UE4 supporting multiple VR technologies, including the new Steam VR hardware, Oculus Rift, and others. With so many hardware options, we asked if he was concerned about a standards war. For right now the answer is no, as Davis noted that until there's a clear de facto standard, they're all helping to push VR forward.</p> <p>Here's more:</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> <p>We also have some footage of Epic's Crescent Bay Unreal Engine demo. Here it is:</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> crescent bay Epic GDC 2015 Hardware Software Unreal Engine virtual reality vr News Fri, 06 Mar 2015 17:12:05 +0000 Jimmy Thang and Paul Lilly 29553 at GDC 2015: Meeting Up with Crytek (and a T-Rex) [VIDEO] <!--paging_filter--><h3><img src="/files/u69/crytek_t-rex.jpg" alt="Crytek and T-Rex" title="Crytek and T-Rex" width="228" height="138" style="float: right;" />Going face to face with a T-Rex</h3> <p><strong>What is it like to experience VR's latest prototype called "Crescent Bay?"</strong> How does it feel to have a T-Rex breathe down your neck as you stand in a pile of her unhatched eggs? Does the T-Rex really have a <a href="" target="_blank">walnut-sized brain</a>? Awesome, scary, and watch Land of the Lost. Those are our quick answers if you're in a rush.
For everyone else, let us elaborate a bit about what we saw at GDC.</p> <p>We stopped by Crytek's booth to check out its T-Rex demo, which is a modernized version of Dinosaur Island, a 14-year-old demo that was originally designed to show what Crytek could do with graphics. Crytek overhauled the graphics and cleaned it up for a VR demonstration using CryEngine. The response was so overwhelmingly positive that Crytek extended the demo to three and a half minutes for GDC, while collaborating with Oculus VR.</p> <p>Here's a look at the demo in action:</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> <p>So, why doesn't the T-Rex go postal in the above demo and swallow us whole? The demo is actually from the perspective of a baby dinosaur that just hatched. You're surrounded by unhatched siblings when all of a sudden the mother T-Rex comes charging over to investigate.</p> <p>We spoke with Crytek's director of production, David Bowman, who told us that the responses from the T-Rex demo have been mixed. Some people have yelled when the dinosaur comes running up to them, others have tried to play it cool before flinching, and yet others have tried to crouch down and hide behind an egg. All of these are natural responses to what's ultimately a realistic VR experience.</p> <p>Bowman wasn't allowed to talk about Crescent Bay's hardware, but one thing he noted, and that we experienced, is the lack of nausea. Of course, this is a standing demo without a lot of motion, but prior to Crescent Bay, a user might feel nauseous by swinging his or her head from side to side.</p> <p>Here's more of what Bowman had to say:</p> <p><iframe src="" width="620" height="349" frameborder="0"></iframe></p> crescent bay CryEngine Crytek Dinosaur Island GDC 2015 oculus rift virtual reality vr News Fri, 06 Mar 2015 14:17:38 +0000 Jimmy Thang and Paul Lilly 29552 at GDC 2015: Nvidia Shield and GRID <!--paging_filter--><p>Does anyone buy CDs or Blu-ray discs anymore?
You can stream so much stuff for a few bucks a month that it's hard to make an argument for physical media these days. Music and movies have essentially leapfrogged the downloading phase that PC games have been in for a decade, since the dawn of Steam. Now Nvidia is making a push for streaming games, too, and its new Shield console is central to that effort. We sat down today for a talk presented by Eric Young, an engineering manager at Nvidia, who gave us some more details about how the Shield handles streaming from the company's cloud-based service dubbed GRID.</p> <p>Nvidia GRID has been streaming major PC titles to the Shield Tablet and Shield Portable for nine months now, so its existence is probably not news to our readers. In case it is, Nvidia pitches it like this: You can buy a game and be playing it in less than a minute, you can use it on mobile devices that would otherwise choke, games update themselves without you needing to download a patch, and it solves a lot of problems with piracy.</p> <p><img src="/files/u160416/grid_620.jpg" width="620" height="396" /></p> <p>The secret sauce is that the game runs in the cloud itself, then a video of the gameplay is shot down the series of tubes to your computer. You move your mouse, and you see the mouse move in-game, "In half of the blink of an eye," according to company CEO Jensen Huang. If you're close enough to the GRID server farm, Nvidia says that your latency can be as low as 150 milliseconds. That's low enough to play a racing game like Grid 2, which Nvidia demonstrated live at the Shield console press event on March 3rd.</p> <p>Another critical element is hardware-accelerated video encoding and decoding. The latest Nvidia video cards have this, and so does the Shield console. It can do both H.264 and H.265. If your Internet connection can handle bandwidth in the 15-25Mbps range, you can get 60 frames per second at 1080p.
Eric Young says that it takes only 10ms to encode frames on the Shield, and the bundled controller's Wi-Fi Direct tech lowers input latency to 10ms as well. The console takes 5ms to send video to your display, which has its own processing latency of a few milliseconds (this last step varies widely from one TV to another). Users of the Shield Tablet or Shield Portable will be limited to 60 FPS at 720p, unless they plug in an Ethernet cable via an OTG adapter.</p> <p><img src="/files/u160416/grid_620_2.jpg" width="620" height="420" /></p> <p>The Shield also uses its hardware acceleration to play 4K movies and TV shows, where available. Netflix has just started doing that, so the Shield is ready out of the box.</p> <p>Developers who want to integrate their games into GRID will be getting access to the GRID Link SDK "very soon." We suspect that it will coincide with the company's GPU Technology Conference happening later this month.</p> android cloud gaming cloud streaming grid nvidia shield Shield game console News Fri, 06 Mar 2015 04:28:59 +0000 Tom McNamara 29550 at GDC 2015: Xbox Live Comes to Windows 10 <!--paging_filter--><h3>Microsoft aims for a unified experience</h3> <p>If you've heard of Games for Windows Live (GFWL), then you're probably familiar with some of its troubles. The difficulties some users had with fundamental things like logging in and updating GFWL could produce some epic tales of woe. GFWL was deactivated last year, and with it went its online matchmaking system, meaning that games that used this service to create multiplayer sessions either no longer had multiplayer or had to plug into something else, such as Steamworks. With the next big version of Windows coming out this year, Microsoft wants to give it another shot, and thankfully they're using a different set of tools and also introducing some interesting new features.
We sat down for a lecture on the subject, conducted by Microsoft engineers Vijay Gajjala and Brian Tyler.</p> <p>First of all, the Xbox Live service on Windows 10 is running native desktop PC code, in a Windows 10 app, rather than being a port. Windows tablets and phones running Windows 10 all get their native apps, which in theory should mean smoother operation. Microsoft is also talking about single sign-on, so you can use one Microsoft account across all Win10 devices to log into XBL as well as into the device itself. When asked about this service's availability for earlier versions of Windows, and for Android or Mac OS X devices, the company had no firm plans on the horizon. Don't hold your breath.</p> <p><img src="/files/u160416/xblwin10_620_1.jpg" width="620" height="465" /></p> <p>When you log into XBL on Windows 10, you get the complete XBL experience – your profile page, achievements, leaderboards, messages, activity feeds, game clips, privacy settings, basically everything that's exposed to an Xbox One user. There's a "Game DVR" where you can record gameplay and upload it to be viewed on other devices logged into XBL. (And in case you haven't heard, you can stream Xbox One games to your PC and play them there.)</p> <p>As far as Microsoft is concerned, there is no difference between what Xbox One users can do and see, and what PC users can do and see when logged into XBL. Microsoft is also adding a Twitter-like feature where users can be followed, rather than requiring an accepted friend request to see another XBL player's activity. This can be disabled in your privacy settings, if you wish.</p> <p>Windows 10 developers who want to use XBL now get access to a fancy telemetry system. In a nutshell, a dev can flag certain player activities like acquiring a specific weapon, loading a particular level, or completing a lap in a racing game, and have those events uploaded to a "stats engine" in Microsoft's cloud.
This is used to handle achievements, but the data can also be aggregated and analyzed to better understand how people are playing the game. So you could answer questions like, "Is this boss fight too hard?" or "Is this secret room a little *too* secret?" This data feed is an optional service that developers can subscribe to or ignore.</p> <p><img src="/files/u160416/xblwin10_620_2.jpg" width="620" height="351" /></p> <p>Devs also get access to the Windows Dev Center, which is a desktop client where they can do things like add a challenge to the game (a limited-time achievement), or create a leaderboard for a specific activity, like number of bad guys killed or number of explosions caused. These things can be staged on a test server to catch problems before they're published live.</p> <p>In a separate but closely linked GDC session conducted by Ferdinand Schober, a software development engineer also at Microsoft, we gleaned some details about cross-platform multiplayer. Session data is stored in Microsoft's cloud, so it will be difficult to misplace. Like GFWL, Microsoft will handle matchmaking on its own servers, and developers can fine-tune the player skill ranges to search for, and how long the matcher searches before expanding its search range another notch. Like on the Xbox One, invites and joining in progress are also handled with a universal UI, rather than game-customized interfaces or notifications.</p> <p><img src="/files/u160416/xblwin10_620_3.jpg" width="620" height="343" /></p> <p>This stuff isn't available in the Windows 10 Technical Preview yet, but it's never too soon to familiarize yourself with Microsoft's next flagship operating system. Win10 will be free for Windows 7 and 8 users anyway, during the first 12 months of its availability (after which you pay a one-time fee like usual, not a subscription fee as some had feared).
We're looking forward to getting our hands on it.</p> cross-platform multiplayer game streaming microsoft windows 10 xbox live News Fri, 06 Mar 2015 03:01:09 +0000 Tom McNamara 29549 at Nvidia GeForce GTX Titan X: We Touched It <!--paging_filter--><h3>A quick peek into the future</h3> <p>In the land of video cards, Nvidia's GTX Titan is generally considered the king. The original gangster came out in February 2013, followed by the Titan Black a year later, each sporting an unprecedented 6GB of RAM, 7 billion transistors, and more shader processors than you could shake a stick at (eventually tipping the scales at 2880). Nvidia capped it off in March 2014 with the Titan Z, which put two Titan Black GPUs on one card. And now it's been nearly a year since we've seen activity from them on the super-premium end. But the company hasn't been idle. Today we got up close and personal with this obsidian brick of magic, the GTX Titan X.</p> <p>How close? This close:</p> <p><img src="/files/u160416/titanx_620.jpg" alt="Nvidia GeForce GTX Titan X video card" title="Nvidia GeForce GTX Titan X video card" width="620" height="465" /></p> <p>Unfortunately, we were forced to double-pinky swear that we wouldn't give you any specifics about the card just yet, other than the fact that it's got 12GB RAM, eight billion transistors, and is probably the fastest video card on Earth. But we can confirm that it was running several live demos on the show floor of the Game Developers Conference this week, conducted by Epic, Valve and Crytek. This is obviously not going to be a paper launch -- the card is already here. The Titan X is just waiting in the wings until it can get a proper introduction at Nvidia's GPU Technology Conference, which starts on March 17th. In the meantime, we took some nifty photos for you. Hope you brought a bib for the drool!</p> geforce gpu GTC nvidia Titan X Video Card News Fri, 06 Mar 2015 02:00:47 +0000 Tom McNamara 29548 at NVIDIA Shield vs.
Razer Forge TV: Hands On <!--paging_filter--><h3>One of these devices comes out ahead, far ahead</h3> <p>One of the biggest launches to come out of this year's GDC was <a href="">NVIDIA's Shield console</a>. Showing the device off to a packed audience, CEO Jensen Huang demonstrated a console that was a combination of both cloud streaming and local Android-based entertainment. Out of all the Android TV-style devices that have been announced, the Shield is the most interesting.</p> <p>Shield can do several things: play Android games, play triple-A Android games made for Shield, handle your online media needs, and stream from NVIDIA's Grid streaming service. Grid has been in the making for several years, and NVIDIA hopes to be first to deliver a playable, lag-free experience. At launch, NVIDIA will have roughly 50 playable titles, all of which should be the most recent PC hits. NVIDIA's vision is to deliver all games, at maximum graphics settings, without requiring a high-end gaming rig.</p> <p>Backtrack to CES 2015 and you have <a href="">Razer's Forge TV</a>, a device that's meant to allow you to stream all your games to the living room, lag-free. Forge TV is also an Android device, but it doesn't have the power that NVIDIA's Shield has. For reference, Forge TV is equipped with an Adreno 420 GPU, while the Shield's graphics duties are handled by NVIDIA's own Tegra X1, which is based on its current flagship Maxwell architecture. Specs aside, the Shield can do everything the Forge TV can do, and much more. Shield is also 4K ready, while Forge TV is not.</p> <p>We got our hands on both at GDC. What's our impression?</p> <p>Well, to put it simply, the Shield is where it's at. We tried both local content made for the Shield as well as streamed content. The real deal though is what's possible when <a href="">NVIDIA's Grid grows as a platform</a>.
Trying out several games, from platformers to big titles like Saints Row 4 and Batman: Arkham Origins, we honestly couldn't detect any indication that the games were actually being streamed from Grid. It was impressive. Local games were equally impressive, and were of much higher quality than your typical library of Android games.&nbsp;</p> <p style="text-align: center;"><img src="/files/u191083/dscf0095.jpg" alt="NVIDIA Shield Saints Row 4" title="NVIDIA Shield Saints Row 4" width="650" height="975" /><br /><em>NVIDIA's Shield playing Saints Row 4 over Grid</em></p> <p>Many who watched NVIDIA's keynote over Twitch's livestream indicated that Shield was dropping frames. But I think it had to do with the setup and the stream rather than the actual Shield itself. During the keynote, it appeared that frames were dropped during gameplay of The Witcher 3: Wild Hunt, but NVIDIA told us that the game was buggy and the dropped frames were due to the game, and not the Shield.&nbsp;</p> <p>Moving over to the Forge TV felt like a downgrade. Granted, NVIDIA has quite a lot more invested in Shield than Razer does in its own platform, but at the end of the day, both products are vying for your attention. The games on Forge TV are nowhere near as crisp and bold as on the Shield, and the titles aren't big hitters, but this has a lot to do with the two companies' ability to negotiate deals with publishers. I'm sure as these devices become more popular, better titles will be released. However, <a href="">the Maxwell-based Tegra X1</a> is an order of magnitude more capable and powerful than the GPU inside the Forge TV, and NVIDIA's ability to get native content on the Shield to demonstrate its GPU capabilities is worth noticing.
The Shield has a huge lead in graphics, compared not only to the Forge TV, but basically any other Android device.</p> <p style="text-align: center;"><img src="/files/u191083/razer_forge_tv.jpeg" alt="Razer Forge TV" title="Razer Forge TV" width="650" height="975" /><br /><em>Razer's Forge TV playing Asphalt 8: Airborne</em></p> <p>Streaming-wise, both the Shield and the Forge TV are capable of streaming all your local PC gaming content to the living room. However, to get local streaming working over the Forge TV, you'll have to invest another $40 to get Razer's Cortex: Stream, a proprietary solution that handles encoding duties. Razer informed us that it was not demoing Cortex: Stream on the show floor, which seems odd to us since it's arguably Forge TV's most touted feature. Despite the $40 added cost to get Cortex: Stream on Forge TV, NVIDIA's Shield still costs more from the get-go, launching at $199, and Grid hasn't received formal pricing yet, though NVIDIA indicated it would be "similar" to Netflix.</p> <p>NVIDIA's Grid streaming service provides a significant advantage to the Shield, removing the hardware requirements of a PC, and in fact, removing the need for a PC, period. You can literally just get a Shield as a primary gaming system, if entertainment is all you want to do. We're looking forward to testing Shield and Grid in different networking situations, so look out for a report on that later. But for now, Grid works, and it works really well. We'll have to see how it performs in the wild.</p> <p>The question that needs to be asked now is why not just hook up a small form-factor PC to the TV and play all your games in their native glory? Well, consider convenience and price. Both NVIDIA's Shield and Razer's Forge TV cost significantly less than a PC capable of playing the latest games. And if you already have a PC, you can use the local streaming features of both systems to play at 1080p.
NVIDIA's Shield is capable of native 4K output, but we didn't get to confirm whether or not the Shield will render games playable at 4K, and you can probably forget Grid gaming at 4K. Your network and Internet setup aside, the Shield represents an extremely attractive option.</p> <p>If both the Shield and Forge TV were available right now, I'd put my money on the Shield.</p> <p><em>[February 5, 2015 @ 18:17 PST: Edited for Clarity]</em></p> Forge TV GDC 2015 nvidia Nvidia Grid Nvidia shield razer razer forge tv shield News Thu, 05 Mar 2015 21:32:47 +0000 Tuan Nguyen 29547 at First Look: Logitech G303 Daedalus Apex Performance Edition <!--paging_filter--><h3>Same compact G302 chassis, but with new and improved sensor</h3> <p>Logitech recently came by the Lab to show off its new premium gaming mouse, the G303 Daedalus Apex. If you’re thinking it looks just like the G302 Daedalus Prime, that’s because it uses the same lightweight and portable body, which weighs 87 grams and measures 11.5x6.5x3.7 cm. Logitech says this is the enthusiast version of the G302, thus the “Performance Edition” moniker.</p> <p>The main feature that allowed the G302 to stand out was that it was originally designed as a MOBA mouse and had a new metal spring tensioning system. This system is guaranteed for at least 20 million clicks, which Logitech says is equivalent to a pro gamer practicing 10 hours a day, every day, for two years. More importantly, however, the spring mechanism eliminates travel time between pressing the buttons and activating commands. This ensures a speedy, consistent clicking experience.
The G303 maintains that system along with the G302’s five DPI settings, but the Apex also has a few new tricks up its sleeve.</p> <p style="text-align: center;"><img src="/files/u154082/g303_ctg_orange_72_dpi.jpg" alt="g302" title="g302" width="620" height="620" /></p> <p style="text-align: center;"><strong>The Apex features 16.8 million colors.</strong></p> <p>The biggest addition to this Daedalus is the PMW3366 sensor, which Logitech uses in the bigger G502. While it isn’t as fast as the G402’s optical/gyroscope hybrid sensor, which can track movement of up to 12.5 meters a second, Logitech considers the PMW3366 to be its most accurate sensor. Logitech says this makes it about 2-3 pixels more accurate than the G302, and while the company admits that this isn’t a monumental improvement, it says that it should amount to a slightly more responsive and accurate feel for the end user. Logitech also asserts that the sensor mitigates unwanted mouse acceleration and adds zero smoothing. The Apex offers a DPI range of 200-12,000. In addition, the sensor is much faster than the G302’s, going from a cap of 120 inches per second to 300 inches per second. Logitech says this is fast enough for any real-world use, and it’s able to achieve this speed via the sensor’s clock-tuning ability, which also helps prevent degradation of speed over time. This essentially extends the life of the mouse. To top it off, the sensor also features surface tuning, which tunes the mouse’s parameters to match your desk surface for a consistent tracking experience. All of this sits on top of a 32-bit ARM processor.
The G303 features 16.8 million colors (you can count them all to be sure) and you’ll be able to adjust the brightness or even have the LEDs pulsate, or you could just turn off the fancy lights if they don’t tickle your fancy. Wireless mouse fans may be disappointed to hear that it uses a cable, and a braided one at that, but Logitech says it went out of its way to make the cable more flexible than the average braided solution, so that you get the flexibility of a plastic wire with the durability of a braided one.</p> <p>You’ll be able to get your hands on the G303 today for $70. Expect a full review of it sometime in the near future.</p> Daedalus Apex gaming mouse Logitech G303 maximum pc mice Performance Edition News Features Thu, 05 Mar 2015 19:15:57 +0000 Jimmy Thang 29471 at Acer Speeds Up Chromebox CXI Line with Intel Core i3 Models <!--paging_filter--><h3><img src="/files/u69/acer_chromebox.jpg" alt="Acer Chromebox" title="Acer Chromebox" width="228" height="212" style="float: right;" />A faster Chromebox</h3> <p>There are plenty of mini PC options in the Windows space, but does anyone remember that Chromeboxes exist? For those who care, <strong>Acer is expanding its Chromebox CXI line with a couple of new models that have been upgraded with Intel's 4th Generation Core i3 4030U dual-core processor</strong> clocked at 1.9GHz (3MB cache, Hyper-Threading support, 15W TDP), a speedy replacement for the Celeron chips that power the existing models.</p> <p>Acer's primary targets are users in education and small-to-medium businesses (SMBs), along with any consumers who are into Google's Chrome ecosystem. For those who can benefit from a Chrome OS system, the Core i3 upgrade offers enough speed to work on multiple projects at the same time, Acer says.</p> <p>There are two models -- the first is the CXI-i34GKM with 4GB of DDR3-1600 RAM and the second is the CXI-i38GKM with 8GB of memory.
Both sport 16GB of internal storage upgradeable via microSD (up to 32GB), HDMI and DisplayPort outputs, 802.11n Wi-Fi, Bluetooth 4.0, GbE LAN, four USB 3.0 ports, and of course Google's Chrome OS. The systems measure 6.51 by 5.12 by 1.3 inches and are VESA mountable.</p> <p>Upon first boot, you're thrust into Google's ecosystem and signed into its services. The Chromeboxes come with certain web apps already installed, and there are now over 30,000 additional apps, themes, and extensions available in the app store.</p> <p><img src="/files/u69/acer_chromebox_vesa.jpg" alt="Acer Chromebox VESA" title="Acer Chromebox VESA" width="620" height="568" /></p> <p>Should things go wrong and/or there's a need to wipe the system clean and start from scratch, both Chromeboxes have a "Powerwash" option that enables IT to quickly reset them to their original factory states with the touch of a button.</p> <p>The 4GB and 8GB models are available now for $350 and $400, respectively. Just as with Chromebooks, there are Windows-based alternatives that cost about the same, which will make these a tough sell considering their limitations.</p> <p><em>Follow Paul on <a href="" target="_blank">Google+</a>, <a href="!/paul_b_lilly" target="_blank">Twitter</a>, and <a href="" target="_blank">Facebook</a></em></p> Acer chromebox CXI-i34GKM CXI-i38GKM Hardware mini pc OEM rigs News Thu, 05 Mar 2015 14:01:20 +0000 Paul Lilly 29545 at GDC 2015: Virtual and Augmented Reality Roundtable <!--paging_filter--><p>At GDC today, a number of VR and AR developers gathered in a casual forum moderated by Chris Pruett, who does developer relations for Oculus VR. What followed was an interesting jam session as creative minds shared their ideas, triumphs, and frustrations with virtual platforms. Pruett stated at the beginning that he was not there as a representative of Oculus, and in fact he was not the original planned leader of the session.</p> <p>He began by taking an informal survey of the room.
By a show of hands, he estimated that about 70% of the people there were actively developing, though only two people raised their hands when he asked if anyone had been working on VR or AR for five years or more. Most of the room also expected to release their product within eight months. Interestingly, not everyone was in it to make money. A few of them were in it for public feedback and planned to use that to iterate development, possibly into a retail product, but not necessarily.</p> <p>They all agreed that motion sickness was a prevalent problem, and they discussed the ways that they were combating it in software (as opposed to the device itself using a motion-sensing camera to keep the user's head correctly oriented). There was a consensus around creating a virtual copy of the user's body within the world, but it had to synchronize with the user's movement, or else the disorientation and nausea would be even worse than it was without the copy.</p> <p><img src="/files/u160416/hololens.jpg" width="620" height="349" /></p> <p>Creating a goggle-like frame around the edges of the user's vision also helped (such as when a movie camera simulates looking through binoculars). Limiting navigation in the world and instead sending the content to the user was also beneficial, as was establishing a visual horizon and a virtual floor beneath the user's feet.</p> <p>The general consensus around VR and AR was that it felt like the early days of 3D gaming in the mid 1990s. Meaning, developers were still learning how the technique works -- and doesn't work, and implementing hacks to create certain illusions when the hardware couldn't handle the processing requirements of doing it for real. Pruett mentioned a trick he'd figured out with mirrors. Ordinarily, mirrors are a problem in a virtual space, because they force the GPU to render a reflected 3D space in addition to what it was already handling, which can kill performance. 
His workaround was to lower the refresh rate to 30Hz, which looked fine as long as there wasn't a lot of movement in the scene.</p> <p>The attendees largely did not mention their names or what they were working on, but conversation did start to flow once people started letting their guard down a bit. Some hesitation is understandable, since in many cases these people are working on products that aren't ready for the public eye yet, and they may be using clever ideas that they'd prefer to keep to themselves. But the takeaway from this roundtable is that VR and AR could benefit quite a lot from developers sharing some ideas and discoveries with one another. Because while devices like the Oculus Rift are amazing technology, they'll become historical curiosities without compelling content to drive them forward. In an environment as collaborative as software development, two heads are better than one.</p> augmented reality game development oculus rift virtual reality vr News Thu, 05 Mar 2015 07:43:39 +0000 Tom McNamara 29544 at GDC 2015: Nvidia GameWorks in Far Cry 4, Assassin's Creed Unity, and War Thunder <!--paging_filter--><h3>Tools for digital hair, water, light shafts, and shadows</h3> <p class="MsoNormal">Most of us think of Nvidia as a hardware company. Video cards, tablets, and now a game console. But they've been doing a lot on the software side, working directly with developers in a program called GameWorks. This is a set of graphical tools that a developer can select from like a buffet, to fill in gaps in the game-creation process, or to accelerate it. Today, Ubisoft Kiev, the guys who worked on the PC ports of Far Cry 4 and Assassin's Creed Unity, gave some real-world examples of different GameWorks elements that they used to improve their visuals.
Also along for the ride was free-to-play online shooter War Thunder, which makes liberal use of some interesting water effects.</p> <p class="MsoNormal">With ShadowWorks, Ubisoft had tools to smoothly blend together shadows cast by multiple objects, and create better blurring (a shadow can look fake if it's sharp in the wrong places). TXAA is Nvidia's proprietary method of anti-aliasing in a way that causes less "shimmer" than standard MSAA, and without MSAA's performance impact. (Shimmer is a side effect of jagged edges that causes them to ripple as you move your POV around a scene.) They showed AC Unity running in real-time and compared the different AA methods side-by-side. TXAA definitely caused the least shimmer.</p> <p class="MsoNormal">With Far Cry 4, meanwhile, they made liberal use of HairWorks, since the game is full of wild, furry critters. It's similar to AMD's TressFX in that it renders individual hairs and tufts. This doesn't look very good without anti-aliasing, so you can use TXAA once again to sand those rough edges off. They also applied the hair's anti-aliasing in a separate pass from the AA applied to the rest of the scene. HairWorks provides a real-time viewer so that you can see how the effect looks in-game, and you can tweak different settings and see the effects right away. Integration of HairWorks took them about one month, with two technical artists from Nvidia assisting a software engineer at Ubisoft.</p> <p class="MsoNormal"><img src="/files/u160416/nvidia_gameworks.jpg" width="620" height="358" /></p> <p class="MsoNormal">The Far Cry 4 porting team also made use of GodWorks, which is Nvidia's tool for god rays. These are basically light shafts caused by the sun in-game, or another sufficiently bright light source. At first, the team was using gray, smoke-colored rays, but they decided that the aesthetics were much better with yellowish-gold light.
They had to be careful not to overuse the effect, though, or things would get too "foggy" to see clearly. They also made the effect evolve over the course of an in-game day, so that it was lightest around noon, and heaviest in the early morning and late afternoon.</p> <p class="MsoNormal">The developers of War Thunder took the stage next and talked about WaveWorks. War Thunder renders a lot of water in its maps, and the team wasn't getting the visual effect that it wanted, so it turned to Nvidia for some tools. They needed something that looked dynamic and realistic, didn't have a high performance hit, and could convincingly interact with objects in the game. With WaveWorks, they were able to plug in things like reflection, refraction, dynamic ocean foam, bubbles, light scattering, shadows, atmospherics, and displacement.</p> <p class="MsoNormal">They wanted to keep physical interaction the same for all players, so the PhysX part is calculated on the CPU rather than the GPU. This also allows them to more easily deal with the different APIs (DirectX 9, DirectX 11, OpenGL) that GPUs use on different platforms; each API has different limitations and advantages that would be a headache to deal with otherwise. Since the team was developing for Windows, OS X, and PS4, getting everyone's physics on the same page was pretty important. The CPUs in their game servers could also help with physics.</p> <p class="MsoNormal">Once they'd figured out how to make shore waves look realistic – by adding noise, using the seabed to push waves up, and applying some under-the-hood math to take energy out of the waves as they came to shore – the last step was integrating everything into an LOD system. They implemented three levels of detail, because you don't need all of the effects going at full blast when the player is flying a thousand feet above the ocean. This helps with performance on both the server and client side.
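Sketched in Python, a three-tier scheme like the one described can boil down to little more than an altitude check (a hypothetical sketch -- the thresholds and tier contents are invented, not War Thunder's actual values):

```python
# Hypothetical three-tier LOD picker for the ocean surface (illustrative
# sketch only; the thresholds and tier contents are made up, not the
# War Thunder team's real numbers).

def water_lod(camera_altitude_ft):
    """Choose a water detail tier from the camera's height above the sea."""
    if camera_altitude_ft < 200:
        return "high"    # shore waves, foam, bubbles, full displacement
    elif camera_altitude_ft < 1000:
        return "medium"  # displacement and reflections, no fine foam
    else:
        return "low"     # distant flyover: simple shaded water plane

print(water_lod(50))    # high
print(water_lod(600))   # medium
print(water_lod(5000))  # low
```

In practice an engine would also add some hysteresis around each threshold so the tier doesn't flicker while the camera hovers near a boundary.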
The team said that WaveWorks took one man-week to integrate, and the results speak for themselves.</p> Assassin's Creed Far Cry War Thunder HairWorks Nvidia GameWorks ubisoft WaveWorks News Thu, 05 Mar 2015 03:08:32 +0000 Tom McNamara 29543 at Valve’s VR Experience Is the Closest Thing to the HoloDeck We Have <!--paging_filter--><h3>The best VR experience yet</h3> <p>I just walked out of Valve’s SteamVR demo and can say that it is the best VR experience I’ve ever had. And this is coming from a guy who has tried nearly all of the VR headsets out there, including Oculus VR’s newest Crescent Bay prototype. This is the closest thing to a modern-day holodeck we have at the moment.</p> <p>Built in partnership with HTC, and named the "Vive," the head-mounted display (HMD) here uses two 1080x1200-resolution displays, one for each eye. The display uses a low-persistence, global-refresh solution that turns the entire display on and off at the same time.</p> <p style="text-align: center;"><img src="/files/u154082/steamvr_vive.jpg" width="620" height="388" /></p> <p style="text-align: center;"><strong>We couldn't take any pictures of our VR experience but here's what the headset looks like.</strong></p> <p>One Valve rep tells me the FOV is around 100 degrees, while another tells me it's 110; I'm more inclined to believe the former. While I could still see pixels and there is, of course, room for improvement, it’s hardly distracting and is definitely sharp enough for consumer release and, dare I say, slightly sharper than Oculus’s Crescent Bay prototype.</p> <p>Like the Oculus Rift HMD, the Vive will be a wired experience, and like Crescent Bay, it supports a 90Hz refresh rate. Beyond that, however, there are some key differences that set the two HMDs apart. Instead of relying on a single external camera for head tracking, Valve set up two “light towers” on two pillars and placed them on opposite ends of the room I was in (the room measured roughly 25x25 feet).
The light towers simply need to be powered (they don’t need to be plugged into your PC) and they emit red lasers that assist the Vive in mapping out your room, giving you 360-degree, room-scale tracking of your walkable space when you're in VR. The light towers also help to identify where Valve’s new VR controllers are.</p> <p>The controllers are very similar to the Razer Hydra controllers, except they will be wireless (the prototype unit we tested used a wired solution, but we hear there are working wireless ones out there in the wild). The controllers have sensors that work in conjunction with the light towers to allow the HMD to detect where they are in your virtual reality experience. Assuming you're holding these sticks, this essentially means you can see your hands in the game. The controllers have a circular touchpad on the front that is roughly one inch in diameter, a trigger button on the back that essentially allows you to grab things (a la crab hands), and long buttons on the side of the stick that you can squeeze (think stress ball). The controls were nearly 1:1 and are definitely the best VR controllers out there, even better than Sixense’s similar Stem VR system. There are also a bunch of little cameras on the front of the headset that leverage the position of the light towers to provide positional tracking, which not only lets you lean into objects but also walk around. One big problem with VR pertains to response time; I tried shaking my head as fast as I could to see if I could experience any judder and am glad to report that I experienced no such lag.
It felt completely smooth and natural.</p> <p style="text-align: center;"><strong><img src="/files/u154082/dsc03154.jpg" width="620" height="349" /></strong></p> <p style="text-align: center;"><strong>This is more or less how our VR room was set up.</strong></p> <p>While the headset that I used didn't have integrated audio, Valve told me that the consumer version will come with an integrated solution that users will be able to detach, in case they want to use their own high-end audio headset.</p> <p>Now, on to the really fun part: the demos! I tried roughly half a dozen demos during my session with Valve. The first placed me into a white room with a bunch of virtual posters of the demos I was about to experience. What was immediately pretty weird was that I saw the controllers in VR floating my way. It was the Valve rep handing the controllers to me. As soon as I held both controllers in my hands, I immediately felt at home. I quickly came to the realization that the pinpoint precision and accuracy of being able to move my hands on a 1:1 basis was the big piece of the VR puzzle that I had been missing this whole time. I began the demo by using my left hand to press down on the "play" button in front of me. After I did that, I started to see a bunch of little white pillars appearing all around me. These pillars would shift up and down, and there were hundreds of them surrounding me. While it’s a very simple demo, it felt extremely polished and certainly gave me a sense of presence.</p> <p>The next demo was called Blue and it took me to the bottom of the ocean atop an old sunken ship. The point of this demo is to show off three-dimensional depth. I should mention that I'm nearsighted and wear glasses, and prior to starting this demo I was prepared to take them off, but was advised that the HMD “renders to infinity” (I assume this means it renders as far as the human eye can see) and that I could and should leave them on.
With my prescription glasses on underneath the HMD, I looked straight up and it seemed like I was half a mile away from the surface. Faintly in the distance above, I could see the sun’s rays piercing the top of the ocean. I really felt submerged (and this is coming from a licensed scuba diver). Another interesting element of this demo is that the barriers of my real physical space were taken into account within the game. Essentially, the walkable area on the deck of the ship represented the walkable area of space within the room. Valve says these experiences will dynamically shift depending on one’s real space constraints, though our rep didn’t elaborate on how. Considering that all the VR experiences I’ve tried so far have been designed for the seated experience, I still couldn't quite bring myself to trust these markers. Valve says some games will draw boundary lines on the ground or even render virtual walls once you get close to the bounds of the walkable area. Even with these walls in place, however, I just felt safer taking a small step here and there. In this demo, I saw a bunch of fish and manta rays swim around me and it felt extremely polished and immersive. This felt much more real than the Ocean Rift demo on the DK2. But the real kicker came when a giant blue whale swam by the ship and looked at me. I felt like I was on an alien planet, and basically just kept on smiling and nodding my head as if to suggest to myself, “Yep, you guys have done it.”
From there, I dinged the bell sitting atop a table to signal that dinner was ready. It was a cartoony demo in the style of Surgeon Simulator and the graphics weren’t very intensive, but it just felt like a complete joy. Ringing the bell, picking up the various objects, opening the fridge... it all felt incredibly natural and instinctive. It didn't feel like I was experiencing a demo, but instead accomplishing real work.</p> <p>The next experience was called Tilt Brush. It leveraged the full range of motion that Valve’s VR controller provided and allowed me to use my hands to paint floating 3D art in the air. The way it works is that your right hand presents options for you to change your brush type and brush color. You can then use your left hand to point and select what sort of brush you want. You’re not limited to just paint, but can paint with fire, stars, ice particles, and more. So there I was, painting fiery three-dimensional Christmas trees. From here, I could walk around my floating artwork and admire it from all angles. I suggested that Valve should allow users to 3D print their works of art, similar to what Microsoft is doing with its HoloLens and HoloStudio software suite.</p> <p>The next demo I tried was called The Gallery: Six Elements, which is a full-fledged game being designed by Canadian developer CloudHead Games (look forward to an in-depth video interview with them shortly). This demo started me off in an ancient fantasy-style elevator in dark mines; think the Mines of Moria from the Lord of the Rings. I could walk around this elevator and pick up Skyrim-like helmets and nuts and bolts. Off in the distance was a giant rock monster, like something you’d see in God of War. The rock monster talked and seemed friendly enough. Me? I was mainly focused on pulling levers, using my hands to swat at dangling cables, and picking up little bolts throughout the room and inspecting them with childlike wonderment.
The rock monster continued rambling on, so I decided to see if I could chuck a bolt at him, and it worked! Throwing objects felt extremely natural. Eventually, the elevator started falling apart, and walls started falling down all around me. The elevator eventually took me to the top, where I could see an expansive fantasy-like vista with a bridge just in front of me. The rock monster asked me to follow him, and that’s where the demo ended. I wanted more of it, and suffice it to say, I'm eagerly awaiting the game's release.</p> <p style="text-align: center;"><iframe src="" width="560" height="315" frameborder="0"></iframe></p> <p style="text-align: center;"><strong>Here's a short video snippet of Valve's Portal VR demo.</strong></p> <p>The last demo was a pleasant surprise and was developed by Valve itself. It took me to a laboratory within Aperture Labs where I was greeted with narration provided by the opening narrator from the original Portal. The narrator asked me to perform various tasks in the lab, which included opening drawers along a wall. I encountered a bug, however, where I couldn’t pull out one of the drawers and the demo had to be reset (a downside to showing off pre-release hardware and software, I’m afraid). Once the demo booted up again, I was able to pull the drawers out. One of the drawers contained a piece of rotted cake (the cake is real and I have seen it!). Another drawer contained a bunch of little cartoon stick figures working inside a tiny office. The narrator said that because I had looked at them, I was now their god. The drawer then closed and the narrator jokingly suggested that the tiny little community inside would be incinerated. It wouldn’t be Portal without a little Valve humor. Eventually the narrator asked me to walk to the other end of the lab and hold down a latch. Doing so opened up a garage-like door and out came Atlas, one of the robots from Portal 2. He came stumbling out and looked really sick.
The narrator asked me to pull Atlas's face off, and out popped his robotic guts right in front of me. The narrator then said I needed to fix the robot, quickly jabbered a bunch of nonsensical technical instructions, and gave me a short destruction timer. Eventually, Atlas pulled himself together and the walls started collapsing, revealing more of the underbelly of Aperture Labs. Atlas then fell out of the room, and after he fell, none other than a giant GLaDOS came rolling around. She started spouting off about me as she looked at me, and the demo ended.</p> <p>Compared to other VR solutions, Valve is at the top of the heap. Its headset is sharp, offers a great sense of depth, has excellent tracking, allows you to walk around, didn't make me motion sick, and comes with an excellent VR controller that works well. In addition, all of the demos looked excellent and polished. Valve says a dev kit should be released by the summer, and the consumer release should be coming at the end of the year. If I do have one concern about Valve/HTC's solution, however, it pertains to price. All of this sounds expensive, but I might just sell my own legs for this if it meant I could get virtual ones.</p> GDC 2015 htc Review steamvr Valve virtual reality vive vr News Features Thu, 05 Mar 2015 02:43:41 +0000 Jimmy Thang 29542 at GDC 2015: John Carmack on the Future of Mobile VR <!--paging_filter--><h3>Shooter godfather has advice for developers</h3> <p>The lead designer on some games you might have heard of, like Wolfenstein 3D and Doom, has been away from the forefront of first-person shooters for a few years, but he has not been idle. Aside from building rockets that fly into space, John Carmack has also been dabbling in virtual reality. In August 2013, he became Chief Technology Officer of Oculus VR, founded by fellow techno-wunderkind Palmer Luckey.
Perhaps sensing a kindred spirit, Carmack tackled the technical underpinnings of the company's purely mobile plans, specifically the Samsung Gear VR headset, which uses the company's mobile phones to act as the brains and display of the device. Today, in front of a packed house of hundreds of developers and journalists, Carmack gave a talk on how that process had worked, and what he expects of the platform in the future. There were no revelations about the Oculus Rift, but a lot of the work that he's putting into Gear VR can spill over into that.</p> <p>Our story begins at a Samsung R&amp;D facility in Dallas, Texas, a few years ago, where the company was working on their first Gear VR (they've just released its sequel, the Gear VR 2). Carmack's base of operations has been in Texas since the early days of id Software, so it was a natural geographic fit. Carmack was enthused by the engineering challenges of VR and found the mobile variant especially interesting. In fact, he sees devices like Gear VR as the primary platform. By definition, they are far more portable than a device like the Oculus Rift; even if you can throw the Rift in a carry-on bag, you still need to bring your PC with you too.</p> <p>Gear VR, meanwhile, needs only a mobile phone slapped into a headset, though it is admittedly currently limited to a small handful of Samsung phones. Carmack mentioned that you can take the device with you on vacation, giving it more visibility in the headset market than a device that's tethered to a PC. He quipped, "The most fun thing to do with Gear VR is to show it to other people," because their reactions are so entertaining. He called this "an infection vector for virtual reality."</p> <p>But in Carmack's opinion, the content system needed some work. When Facebook bought Oculus VR, they were able to bring some people over from their new owner who could help with the infrastructure behind purchasing and downloading games over the Internet.
Carmack sees the GearVR store as a competitor to Steam, in fact.</p> <p><img src="/files/u160416/gearvr.jpg" width="620" height="402" /></p> <p>With that in place, Carmack seems confident that the hardware itself is suitable to act as a commercial development platform, and they would be aggressively promoting Gear VR to create a user base. There are still some technical limitations compared to the Oculus Rift, chiefly positional tracking. The Rift uses a sophisticated motion sensor to synchronize your head movement with camera movement; so in addition to the Gear VR's ability to detect your head turning, the Rift can tell when you lean forward, lean back, and tilt your head. When this detail is absent, the result can cause nausea. The Rift is also using the power of your PC, so its visual effects can be a lot more complicated.</p> <p>For the Gear VR, Carmack encouraged developers to aim for a level of complexity on par with that of a GameCube game. He noted that Wolfenstein 3D and Doom were essentially Gauntlet from a first-person perspective, so it wasn't necessary to re-invent the design wheel or blow people away with amazing visuals to make a compelling game. You could just iterate on an idea in a way that took interesting advantage of virtual reality. He added, "We still don't know what the best application will be."</p> <p>Even with more modest performance targets in mind, a technique called Asynchronous Time Warp is necessary for the hardware to keep up with the game engine's demands. ATW injects "filler" frames when the device can't maintain 60 frames per second. This avoids judder, which can cause disorientation. Oculus is also getting the word out about their layering system. In a 3D scene, you designate multiple layers for the engine to see as different distances.
Tagging these beforehand means that the GPU doesn't have to figure it out in real time, but it also helps with anti-aliasing, especially with text.</p> <p>They're also working on multi-view rendering, where the same set of 3D engine instructions is sent to both eyes, which also cuts down on the calculations that the GPU needs to make. This raises the ceiling on the things that the CPU part of the phone or tablet can do, such as animation, AI, and some physics.</p> <p>At the end of the talk, Carmack had a Q&amp;A session, during which he gave us his opinion on augmented reality. He saw the platform as not competing directly with VR, saying that the latter would be where innovations happened first. AR also uses cameras to simulate a set of eyes, but since the cameras can't be where your eyes actually are, this spatial gap can create disorientation. Nevertheless, he expressed enthusiasm for <a href="" target="_blank">Microsoft's HoloLens initiative</a>.</p> 3D headset GearVR john carmack oculus rift oculus vr samsung virtual reality News Thu, 05 Mar 2015 00:24:41 +0000 Tom McNamara 29541 at Nvidia Unveils Titan X Graphics Card at GDC <!--paging_filter--><h3><img src="/files/u69/titan_x.jpg" alt="Titan X" title="Titan X" width="231" height="177" style="float: right;" />A new top-end GPU</h3> <p>It was speculated that Nvidia might announce a new Titan graphics card during GDC, and that's what the company did—in a somewhat dramatic fashion. It happened at the tail end of an Unreal Engine panel. As Epic founder Tim Sweeney wrapped up his discussion on the state of Unreal, <strong>Nvidia CEO Jen-Hsun Huang surprised attendees by emerging on stage to unveil the company's Titan X</strong>.</p> <p>He called it the "world's most advanced GPU," though he was short on details. What he <em>was</em> willing to divulge about the card is that it has 12GB of onboard memory and 8 billion transistors.
For the sake of comparison, Titan Black has 7.1 billion transistors and 6GB of GDDR5 memory.</p> <p>"It’s the most advanced GPU the world has ever seen," Jen-Hsun said.</p> <p>He then presented the company's first production unit to Sweeney, though not before autographing the box it came in.</p> <p>Nvidia will release more details about the card during the upcoming GTC event that runs from March 17–20.</p> Build a PC Gaming GDC 2015 graphics card Hardware nvidia Titan X Video Card News Wed, 04 Mar 2015 19:16:25 +0000 Paul Lilly 29540 at