If you've been thinking about upgrading to Nvidia's GeForce GTX 260 videocard, you may want to hold off for a few weeks. According to Chinese site Expreview, Nvidia will release a new 55nm-based GTX 260 along with a 55nm GTX 295 (GTX 260 GX2) in January 2009. And if history tells us anything, Nvidia tends to do well with core revisions (G92-based 8800GT, for example). Expreview posted several pics of the revised GTX 260, which it claims were sent in from Zotac.
In addition to a die shrink, the new GTX 260, or at least Zotac's version, looks to be built with a 10-layer PCB design rather than 14 layers as found on current GTX 260/280 videocards, Expreview says. The new revision also upgrades its 3+2 phase power modules to 4+2 phase.
Other specs look to remain the same, such as the number of stream processors (216) and core and memory frequencies. This means you might not see a leap in stock performance, but in theory, the power consumption, heat output, and overclocking potential should all be improved.
No word yet on projected pricing, which could either sweeten or spoil the whole deal.
Nvidia's nZone website has posted download links to new beta videocard drivers, version 180.84, for both Vista and XP. Little information has been given about the new drivers, other than that they're intended to improve gameplay with Rockstar's new Grand Theft Auto IV videogame.
"Nvidia recommends that you update your system with the following GeForce v180.84 driver for the best experiences on Grand Theft Auto IV," nZone writes.
Users who have installed and played GTA IV on the PC have complained of varying issues, including missing textures and intermittent crashes. GTA IV's support page lists several troubleshooting steps, one of which recommends users download the newest drivers with a link to the nZone page containing the beta release. However, no specific bug fixes or performance issues have been identified with the new drivers, so it might be hard to tell what difference they're making.
As always, take proper precautions whenever experimenting with pre-release code. As Nvidia discloses regarding beta drivers, they "may include significant issues." When you're ready to take the leap:
Last week several Xbox 360 and Roku set-top box owners complained of loss of quality and irritating delays when firing up a movie through Netflix's streaming service. At the time, the glitch had Netflix stumped, but the company now appears to have identified and fixed the problem.
"This was a temporary issue that we believe we have resolved," Netflix wrote on its blog site. "Working with our content distribution partners and key carriers, we made some specific changes that should restore everyone's experience to where it was before - high quality streaming."
However, there might still be work to do. Netflix posted its update on Friday, December 5th, but users throughout the weekend were still reporting lingering issues in the comments section.
If Rambus could find a way to take people to court just for using the word 'memory,' we have little doubt it would. In the meantime, the legal beagles at Rambus have set their sights on Nvidia, and the U.S. International Trade Commission (ITC) has granted the company's request to investigate the GPU maker, along with any company using Nvidia products believed to be infringing.
"In its complaint, Rambus has alleged infringement of nine Rambus patents," Rambus wrote in a press release. "The accused products include NVIDIA products that incorporate DDR, DDR2, DDR3, LPDDR, GDDR, GDDR2, and GDDR3 memory controllers, including graphics processors, and media and communications processors."
The dispute over Nvidia's products isn't a new one and dates back to July, when Rambus accused Nvidia of violating 17 patents covering chipsets, graphics processors, and media communication processors. At the time, Rambus claimed it had spent six years trying to sell Nvidia a license to use its technology, and wanted an injunction preventing Nvidia from selling allegedly infringing products.
It's hard to fathom anyone using a netbook as their primary PC. There's only so much you can do with an under-powered ultraportable ill-equipped to run Photoshop, let alone attempt any kind of gaming. But as a secondary unit, the pint-sized PCs have proven extremely popular. Is there potential for netbooks to be even more?
Nvidia this week reiterated interest in the mini-laptop market, however hesitant the company might be. Taking a wait-and-see approach, Marv Burkett, the company's chief financial officer, said "we're not saying we're not interested; it's a matter of how the market will evolve." Ironically enough, Nvidia jumping on board might be just the evolutionary step the netbook market needs.
Hit the jump to find out what impact Nvidia could have on the netbook market, and why you should care.
“Personal” and “supercomputer” aren’t words that usually appear side by side, unless you’re a mastermind at Nvidia. With the announcement of its latest machine, the Tesla Personal Supercomputer, the company is looking to shrink what was once the domain of room-filling clusters down to desktop scale.
The Tesla costs only 1/100th of what a typical supercomputer cluster would, and takes up a small fraction of the space. Thanks to heterogeneous computing, in which CPUs act in tandem with GPUs, it all fits into a desktop form factor.
It’s reported that the Tesla is based on Nvidia’s CUDA architecture, making it possible to program the system in C. Up to 960 cores can work side by side inside the system, and these systems are claimed to be already in use at MIT, Cambridge, and other institutions.
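To give a flavor of what "programmed in the C language" means here, below is a minimal, hypothetical CUDA sketch of the heterogeneous model described above: the CPU stages data and launches a kernel, and the GPU's many cores each handle one element in parallel. It requires the CUDA toolkit and an Nvidia GPU to build and run; the kernel name and sizes are illustrative, not anything from the Tesla product itself.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements; the CPU launches the
// kernel, and the GPU's cores do the data-parallel work in tandem.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // one million elements
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers
    float *a = (float *)malloc(bytes);
    float *b = (float *)malloc(bytes);
    float *c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Device (GPU) buffers
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

    // One thread per element, 256 threads per block
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", c[0]);       // expect 3.0 on a working device

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(a); free(b); free(c);
    return 0;
}
```

The point of the sketch is the division of labor: the serial setup stays on the CPU while the embarrassingly parallel loop is replaced by a single kernel launch spread across hundreds of cores.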
How much will your own personal supercomputer run you? An admittedly reasonable 10 large. Hey, 960 cores is a bargain at that rate.
Nvidia has released new WHQL-certified videocard drivers for GeForce 200-series, 9-series, and 8800-series GPUs only (owners of older videocards need not apply). The approximately 73MB download finally brings to fruition a license agreement between Nvidia and Intel by enabling SLI on SLI-certified Intel X58-based motherboards. The new driver also adds multi-monitor support in SLI configurations, which previously had been available only with beta drivers, and enables PhysX acceleration upon installation.
On the gaming front, Nvidia claims double-digit percentage performance gains in a number of titles, including a giant 80 percent boost in Lost Planet: Colonies. Far Cry 2 is the other big beneficiary with a purported 38 percent performance gain. Devil May Cry 4, Assassin's Creed, BioShock, Company of Heroes: Opposing Fronts, Crysis Warhead, Race Driver: GRID, and World in Conflict all receive performance gains ranging from 10 percent to 25 percent, according to Nvidia. And for you benchmarking gurus, 3DMark Vantage's performance preset should perform 10 percent better as well.
Mirror's Edge may not be wall-running onto PCs until January, but at least it's sticking the landing. Today, DICE announced that -- if your machine has the cojones to run it -- Mirror's Edge will support PhysX's Newtonian prowess, giving Faith's PC adventure console-eclipXing effects.
"With the NVIDIA PhysX physics engine, the world of Mirror's Edge comes to life with real affects of wind, weapons impact, and in-game movements. Every-day objects within the game become part of the overall experience. Cloth, flags, and banners can now impact weapons and players; ground fog interacts with the player's footsteps; explosions fill the air with smoke and debris; and weapon impacts are enhanced with interactive particles," read the press release.
But how's it look? Well, GameTrailers has a new trailer if you'd like a tantalizing taste of the eye-candy.
So then, MPC readers, now that DICE is sliding a few pieces of realistically billowing cloth under the table, are you cool with the seemingly arbitrary delay? Or is your rage simply too fiery -- fueled by your 143rd run through Mirror's Edge 2D and the completion of your stark white Mirror's Edge skyscraper case mod, complete with custom Faith action figure?
A report by Jon Peddie Research (JPR) earlier this week confirmed that AMD's recent success with its Radeon 4000 series has helped the company take back some market share from rival GPU maker Nvidia, while also forcing Nvidia and its partners to lower prices on the recently released GTX 200 series. It appears even more cuts are on the way.
DigiTimes, citing unnamed sources at graphics card makers, says that Nvidia "is planning to cut its graphics card prices in an attempt to curb further loss of market share" to AMD. For its part, AMD isn't finished taking it to Nvidia and anticipates grabbing 50 percent of the market following lowered prices on its ATI Radeon HD 4000 series.
In short, it continues to be a great time to be a PC gamer, and it only looks to get better as AMD and Nvidia battle on the pricing front.
Nvidia looks to take on both Intel and Apple and make a bid for the mobile device market with its Tegra chip. The low-powered "computer on a chip" boasts an ARM-based processor core, an HD video decoder capable of 1080p playback, a variation of the GeForce graphics core, an integrated media processor, and more.
Right now the chip is in the development phase, which company president and CEO Jen-Hsun Huang says is going exceedingly well. Barring any snags, Huang says we can expect to see Tegra shipping sometime between April and June of 2009. The launch would likely kick off with the Tegra 600 running at 700MHz, the Tegra 650 running at 800MHz, and the Tegra APX running at 600MHz.
It remains unclear which partners plan to utilize Tegra, but given the specs, it shouldn't be hard to find willing manufacturers.