According to Valve’s November hardware report, a majority of gamers on Steam favor Windows XP, Nvidia graphics cards, and Intel processors.
These numbers come as little surprise. Windows XP has remained dominant for gamers due to a lack of any significant DirectX 10-enabled titles, Nvidia has been heavily strutting its stuff in the graphics game, and Intel is up to its usual benchmark-crushing shenanigans.
The exact numbers show that there really is a startling majority. 70 percent of users were running Windows XP, 65 percent viewing on Nvidia, and 64 percent thinking with Intel.
Be sure to check out the survey yourself and check out what piece of the pie you reside in!
If you sometimes use your computer for something other than gaming, your ultra-powerful GPU might be twiddling its thumbs, waiting for some 3D deathmatches - until now. This week, the Khronos Group released the final 1.0 version of the OpenCL specification, which Nvidia has pledged to support, and which enables programmers to use the power of the GPU for general-purpose data crunching (aka General-Purpose GPU, or GPGPU). OpenCL enables programmers who aren't accustomed to shoving around vertices or telling hardware T&L registers what to do to write code for GPU execution without using OpenGL or DirectX commands.
Nvidia isn't exactly new to GPGPU, as its CUDA parallel processing architecture is somewhat similar to OpenCL. CUDA is currently supported by virtually all current GeForce, GeForce Mobile, and Quadro FX GPUs when equipped with at least 256MB of dedicated video memory.
To demonstrate the "Open" in OpenCL 1.0, Nvidia has worked closely with Apple, which first proposed a parallel processing standard as part of its forthcoming Snow Leopard OS X release, with arch-rival ATI's parent company AMD, and with other partners including 3DLABS, Activision Blizzard, ARM, Barco, Broadcom, Codeplay, Electronic Arts, Ericsson, Freescale, HI, IBM, Intel Corporation, Imagination Technologies, Kestrel Institute, Motorola, Movidia, Nokia, QNX, RapidMind, Samsung, Seaweed, TAKUMI, Texas Instruments and Umeå University.
So, who's managing the OpenCL standard, and what about Microsoft's rival DirectX 11 Compute standard? Updated 12-11-08: And what class of computers can benefit from OpenCL coding? To learn more, and for your chance to sound off, join us after the jump.
According to Engadget, a pair of problems have popped up on Apple's refreshed MacBook line, the first of which has to do with maxing out the RAM. Some users have complained that running 4GB, whether it be from Apple or a third party, is turning their MacBooks and MacBook Pros into pricey paperweights. With 4GB of RAM installed, affected users claim their MacBooks suffer from random freezes and the only solution is to downgrade to 3GB or 2GB. Apple hasn't yet acknowledged any known issues with maxing out the RAM, but forum users aren't the only ones reporting problems - mobile technology blog site jkOnTheRun reports seeing the same thing.
The other issue rumored to be affecting Apple's new MacBooks comes from news and rumor site The Inquirer, who claims that the MacBook Pro's Nvidia 9600M GPU suffers from the same material defect that affected previous MacBook Pros equipped with Nvidia's 8600M GT GPU. As The Inq tells it, to see the problem:
"You would need to buy a MacBook off the shelf, disassemble it, desolder the chips, saw them in half, encase them in lucite, and run them through a scanning electron microscope equipped with an X-ray microanalysis system like this. This is exactly what we did."
The Inq posted several pics with accompanying analysis, which it claims prove that at least some current MacBooks are still using older Nvidia chips containing 'bad bumps,' which in the past have led to blank screens and other video errors in some cases.
Earlier this month Nvidia reiterated interest in the mini-laptop market, essentially saying it was taking a wait-and-see approach. The graphics chip maker must have liked what it has seen since then, because it appears the company isn't going to wait much longer.
According to DigiTimes, Intel and Nvidia are taking their suddenly cozy relationship into the netbook sector. The two, who recently resolved a licensing dispute allowing SLI technology on Intel chipsets, are said to be working together to enable Nvidia chipset support for the Atom platform. If the rumor pans out, Nvidia's MCP7A chipset will be the first to support Atom processors, with Asus, Gigabyte, and MSI ready to take advantage of the collaboration.
Rumors of a partnership between Intel and Nvidia have been going on since last summer. At the time, Nvidia and VIA had entered into an alliance, leading many to speculate the move was intended to give Nvidia a bargaining chip in convincing Intel to let its Atom chip support Nvidia's MCP73 IGP chipset, or face stiff competition from what could be a potent VIA Nano platform.
No matter what prompted the change of heart, this partnership can be viewed as another major win for Nvidia, which has had a tumultuous year. More recently, though, the company has managed to wiggle its way into Apple's refreshed MacBook line, and now appears positioned to profit from one of the few markets withstanding the global economic storm.
If you've been thinking about upgrading to Nvidia's GeForce GTX 260 videocard, you may want to hold off for a few weeks. According to Chinese site Expreview, Nvidia will release a new 55nm-based GTX 260 along with a 55nm GTX 295 (GTX 260 GX2) in January 2009. And if history tells us anything, Nvidia tends to do well with core revisions (G92-based 8800GT, for example). Expreview posted several pics of the revised GTX 260, which it claims were sent in from Zotac.
In addition to a die shrink, the new GTX 260, or at least Zotac's version, looks to be built with a 10-layer PCB design rather than 14 layers as found on current GTX 260/280 videocards, Expreview says. The new revision also upgrades its 3+2 phase power modules to 4+2 phase.
Other specs look to remain the same, such as the number of stream processors (216) and core and memory frequencies. This means you might not see a leap in stock performance, but in theory, the power consumption, heat output, and overclocking potential should all be improved.
No word yet on projected pricing, which could either sweeten or spoil the whole deal.
Nvidia's nZone website has posted download links to new beta videocard drivers, version 180.84, for both Vista and XP. Little information has been given about the new drivers, other than that they're intended to improve gameplay with Rockstar's new Grand Theft Auto IV videogame.
"Nvidia recommends that you update your system with the following GeForce v180.84 driver for the best experiences on Grand Theft Auto IV," nZone writes.
Users who have installed and played GTA IV on the PC have complained of varying issues, including missing textures and intermittent crashes. GTA IV's support page lists several troubleshooting steps, one of which recommends users download the newest drivers with a link to the nZone page containing the beta release. However, no specific bug fixes or performance issues have been identified with the new drivers, so it might be hard to tell what difference they're making.
As always, take proper precautions whenever experimenting with pre-release code. As Nvidia discloses regarding beta drivers, they "may include significant issues." When you're ready to take the leap:
Last week several Xbox 360 and Roku set-top box owners complained of loss of quality and irritating delays when firing up a movie through Netflix's streaming download service. At the time, the glitch had Netflix stumped, but the company now appears to have identified and fixed the problem.
"This was a temporary issue that we believe we have resolved," Netflix wrote on its blog site. "Working with our content distribution partners and key carriers, we made some specific changes that should restore everyone's experience to where it was before - high quality streaming."
However, there might still be work to do. Netflix posted its update on Friday, December 5th, but users throughout the weekend were still reporting lingering issues in the comments section.
If Rambus could find a way to take people to court just for using the word 'memory,' we have little doubt it would. In the meantime, the legal beagles at Rambus have set their sights on Nvidia, and the company has been granted its request by the U.S. International Trade Commission (ITC) to investigate the GPU maker, along with any company using Nvidia products believed to be infringing.
"In its complaint, Rambus has alleged infringement of nine Rambus patents," Rambus wrote in a press release. "The accused products include NVIDIA products that incorporate DDR, DDR2, DDR3, LPDDR, GDDR, GDDR2, and GDDR3 memory controllers, including graphics processors, and media and communications processors."
The dispute over Nvidia's products isn't a new one and dates back to July, when Rambus accused Nvidia of violating 17 patents covering chipsets, graphics processors, and media communication processors. At the time, Rambus claimed it had spent six years trying to sell Nvidia a license to use its technology, and wanted an injunction preventing Nvidia from selling allegedly infringing products.
It's hard to fathom anyone using a netbook as their primary PC. There's only so much you can do with an under-powered ultraportable ill-equipped to run Photoshop, let alone attempt any kind of gaming. But as a secondary unit, the pint-sized PCs have proven extremely popular. Is there potential for netbooks to be even more?
Nvidia this week reiterated interest in the mini-laptop market, however hesitant the company might be. Taking a wait-and-see approach, Marv Burkett, the company's chief financial officer, said "we're not saying we're not interested; it's a matter of how the market will evolve." Ironically enough, Nvidia jumping on board might be just the evolutionary step the netbook market needs.
Hit the jump to find out what impact Nvidia could have on the netbook market, and why you should care.
“Personal” and “supercomputer” aren’t words that would usually appear side by side, unless you’re a mastermind at Nvidia. With the announcement of their latest machine, the Tesla Personal Supercomputer, they’re looking to bring what was once thought of as room-filling down to desk size.
The Tesla only costs 1/100th of what a normal supercomputer cluster would cost, and only takes up a small fraction of the space. Thanks to heterogeneous computing, an approach that has CPUs acting in tandem with GPUs, it all fits right into a desktop form factor.
It’s reported that the Tesla is based on Nvidia’s CUDA architecture, making it possible for the system to be programmed in the C language. 960 cores can be working side by side inside the system, and it’s claimed that these systems are already in use at MIT, Cambridge and other institutions.
How much will your own personal supercomputer run you? An admittedly reasonable 10 large. Hey, 960 cores is a bargain at that rate.