According to Nvidia, Windows 7, which recently reached RTM, will be the catalyst that propels the concept of GPGPU computing into the mainstream.
"Previously, GPUs were almost exclusively limited to rendering and accelerating graphics and video," Chris Daniel, product manager for software at Nvidia, wrote in a Microsoft Partner blog. "With the introduction of Windows 7, the GPU and CPU will exist in a co-processing environment where each can handle the computing task they are best suited for. The CPU is exceptionally good at performing sequential calculations, I/O, and program flow, whereas the GPU is perfectly suited for performing massive parallel calculations."
Nvidia went on to say that by introducing DirectX Compute in Windows 7, Microsoft is giving developers a huge shot in the arm to make better use of the GPU for more than just graphics acceleration. Such tasks include high-quality video playback, high-performance transcoding, enabling new media scenarios, and offering extended control over media libraries.
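The co-processing split Nvidia describes boils down to data parallelism: a small per-element "kernel" that depends only on its own input, so it can run on thousands of GPU execution units at once. Here's a toy Python sketch of that pattern (the names and the brighten example are ours, not Nvidia's; a thread pool merely stands in for the GPU):

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel, gain=1.2):
    """Per-element 'kernel': each result depends only on its own
    input, so every pixel can be processed independently."""
    return min(255, int(pixel * gain))

pixels = [10, 100, 200, 250]

# Sequential, CPU-style: one element after another.
seq = [brighten(p) for p in pixels]

# Data-parallel, GPU-style: the same kernel mapped over all
# elements at once (simulated here with a thread pool; a real
# GPU runs thousands of these lanes in hardware).
with ThreadPoolExecutor() as pool:
    par = list(pool.map(brighten, pixels))

assert seq == par  # [12, 120, 240, 255]
```

Because no element depends on any other, the order of execution doesn't matter, which is exactly what makes the workload a fit for the GPU rather than the CPU.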
"As an example of the real world benefits of DX Compute, you will be able to use the massive parallel capabilities of the GPU to significantly reduce the time it takes to manage your media files compared with just using the CPU alone," Nvidia added.
AMD has been talking up its CPU/GPU combo chip codenamed Fusion for some time now, but it might not see the light of day for another three years, according to the latest rumor.
Initially expected in late 2008 or early 2009, Fusion in 45nm form was ultimately scrapped due to design challenges. The same might be happening with 32nm, says news and rumor site Fudzilla, which claims AMD has now decided to wait until it moves to a 22nm manufacturing process, currently scheduled for the second half of 2012.
That sounds like a long time to wait, especially as Intel applies pressure with a CPU/GPU chip of its own (Larrabee). For that reason, AMD may opt to follow in Intel's footsteps and release a Fusion built from a 32nm IGP and CPU as two separate dies on the same chip. If AMD went this route, it could conceivably have Fusion parts ready by the second half of 2010, Fudzilla says.
Maingear claims its new eX-L18 laptop is the "World's Most Powerful Gaming Notebook," and while we've seen some desktop replacements built around the Core i7 platform that might dispute that title, the eX-L18 is at least one of the fastest spec'd Core 2 notebooks on the block.
Sporting a generous 18.4-inch LCD, Maingear's latest lappy comes configurable with up to an Intel Core 2 Extreme X9300 processor (2.53GHz), 4GB or 8GB of DDR3-1333 memory, up to three 2.5-inch SATA or SSD drives with RAID support, a DVD or Blu-ray optical drive, and the crème de la crème of mobile gaming hardware: A pair of GTX 280M GPUs in SLI.
"Maingear has equipped the eX-L18 with the world's fastest notebook graphics solution," said Rene Haas, GM of the notebook business unit at Nvidia. "With Nvidia GeForce GTX 280M GPUs, Maingear's customers will experience breathtaking in-game physics from titles such as Terminator Salvation or Darkest of Days and are ready for GPU computing applications such as Badaboom, vReveal, and Arcsoft SimHD."
Gamers will also experience a noticeably lighter wallet with pricing starting out at $3,000 for a base configuration. All configurations include a "Nighthawk Black Automotive Finish."
MSI this week announced the R4890 Cyclone series graphics card. Like other HD 4890 videocards, the Cyclone comes equipped with 800 stream processors and 1GB of GDDR5 with a 256-bit memory bus, but what separates this card from the pack is its cooling solution.
According to MSI, the Cyclone is the only HD 4890 to sport a 10cm PWM fan. The cooling solution also packs four 8mm heatpipes, which the company says are 60 percent thicker than traditional heatpipes and offer up to 90 percent better cooling efficiency. The end result is a 1GHz core clockspeed, making the Cyclone the fastest clocked HD 4890 yet.
Taking the marketing blitz to another level, MSI boasts "Military Class Components." These include Hi-c capacitors made of tantalum, an all-in-one solid choke, and all solid caps.
After four months in beta, Microsoft today released its Silverlight 3 web application framework. A direct rival of Adobe Flash, the newest version of Silverlight supports Internet Explorer 6/7/8 and Firefox 2/3 in Windows, and Safari 3/4 on the Mac platform (Opera and Chrome users are left out in the cold).
Silverlight 3 sports a bevy of new features and APIs, not the least of which is support for GPU hardware acceleration. It also includes new codec support (H.264, AAC, MPEG-4), improved logging for media analytics, deep linking, improved text quality, multi-touch support, and a bunch more.
According to Microsoft, "Silverlight 3 also ushers in a new generation of high-quality and high-definition video experiences with true high-definition video in full-screen mode, with stutter-free live and on-demand video." In addition, Silverlight's Smooth Streaming technology allows users to start playing HD content at any point in time, instantly.
Without any press releases that we could find, Nvidia has launched a pair of low-end graphics cards, both of which are being aimed at the OEM market.
The first is the GT220, a half-height card with 48 processor cores chugging along at 615MHz (GPU) with 1GB of GDDR3 memory running at 790MHz on a 128-bit memory interface. That adds up to 25.3GB/s of memory bandwidth. For what it's worth, the OEM card also boasts support for DirectX 10.1.
Also launched is the G210, another half-height OEM card sporting DirectX 10.1 support. As you might have surmised from the numbering scheme, the G210 checks in with lower specs than the GT220. Specifically, 16 processor cores with the GPU clocked at 589MHz, and 512MB of DDR2 memory clocked at 500MHz on a 64-bit bus. The lower clocks and narrower bus chop the memory bandwidth down to 8GB/s.
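For the curious, those bandwidth figures follow directly from the specs: effective transfers per second (double the clock, since GDDR3 and DDR2 move data twice per cycle) times the bus width in bytes. A quick back-of-the-envelope sketch (function name is ours, not Nvidia's):

```python
def memory_bandwidth_gbs(clock_mhz, bus_bits, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s for a DDR-type interface."""
    bytes_per_transfer = bus_bits / 8                       # bus width in bytes
    transfers_per_sec = clock_mhz * 1e6 * transfers_per_clock
    return transfers_per_sec * bytes_per_transfer / 1e9

# GT220: 790MHz GDDR3 on a 128-bit bus
print(round(memory_bandwidth_gbs(790, 128), 1))  # 25.3

# G210: 500MHz DDR2 on a 64-bit bus
print(round(memory_bandwidth_gbs(500, 64), 1))   # 8.0
```

Both results line up with Nvidia's quoted numbers, so the cards are running single data rate multipliers typical of their memory types.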
No word on price or which OEMs are expected to carry the new cards.
AMD's new ATI Graphics Scout is a visual wizard designed to help you find the "perfect" ATI GPU for your needs. Graphics Scout provides feature selections in four categories: video applications, pictures and photos, games, and office applications. Select the most important feature or features in some or all categories, and Graphics Scout (which resembles a Star Wars R2-D2 with a flat-panel upgrade) suggests a suitable match.
Earlier this week, The Inquirer complained that Graphics Scout was pushing out some questionable suggestions. Thankfully, as an update to the original story indicates, ATI's been making some changes, and in our tests today, it made recommendations that make sense:
When we selected video editing, photo editing, DirectX 10+ gaming, and Microsoft Office applications, it suggested the top-of-the-line HD 4890.
When we changed our mind and selected big-screen TV connections with Blu-ray support, photo viewing and editing, online gaming, and web browsing, Graphics Scout suggested the mid-line HD 4550.
The ability to move up and down the GPU line to see the effect of upgrading or downgrading the recommended selection is handy, as is the ability to compare any other card with the recommended card. For its intended UK audience, Graphics Scout is great, as it provides links to various UK dealers. For users in other countries, it's still useful, but you'll need to use a site such as Cnet's Shopper.com to find actual products for sale. Take Graphics Scout for a spin and join us after the jump to chime in on its recommendations.
Some recent reports have suggested that Nvidia is planning to launch their new 40nm GeForce GT 220 and GeForce G210 GPUs at the end of September.
Until now, Nvidia has had to delay the launch of their 40nm GPUs due to low yield rates at TSMC. But the rate has recently improved a great deal, allowing Nvidia to schedule a launch before the end of the year and, most importantly, in time for the holidays!
Think your dual-GPU GTX 295 videocard is anything to write home about? It's still the king of desktop videocards, but it doesn't come anywhere close to offering 800 teraflops of processing power. That's what one Japanese company has to work with, having mashed together nine 73-core chips into a single system. And as daunting as that may sound, it fits inside a typical ATX desktop setup.
Before anyone asks, the answer is 'no,' it won't run Crysis. Not because it can't, but because it's not aimed at gaming. Those 800 TFLOPs of number crunching provide real-time ray-traced rendering, and the system is aimed at automotive design.
As for how the 45nm super GPU works, Ars Technica has put together a fantastic article describing all the gritty details, including the complex bus directing all that traffic.
Give it a glance here, then hit the jump and tell us what you'd like to use this kind of GPU computing power for (Folding, anyone?).
At long last, Nvidia may finally be adding DirectX 10.1 support to its videocards, assuming Fudzilla is right on the money. According to the news and rumor site, Nvidia's GT200 will be refreshed to a 40nm manufacturing process and the new chips will sport DX10.1.
To date, ATI has been the only one to offer DX10.1 support on some of its videocards (yes, we're completely ignoring S3's Chrome series). The spec is a minor extension to DX10 that thus far hasn't meant much for gamers, though that could change with Nvidia jumping on board. To fuel the conspiracy flames, remember that DX10.1 instructions did at one point show a performance boost on ATI cards in Assassin's Creed, but after a patch removed support for the instruction set, some accused Ubisoft of bowing to pressure from Nvidia after the GPU maker sponsored the title with its The Way It's Meant To Be Played program.
In any event, it looks like the refresh will come at the tail end of summer or in early fall.