Nvidia has reiterated that it won’t provide open source drivers for Linux, claiming there is no need for them. It does, however, provide binary Linux drivers and has open sourced a few drivers.
According to ZDNet, Nvidia said in a statement: “NVIDIA supports Linux, as well as the Linux community and has long been praised for the quality of the NVIDIA Linux driver. NVIDIA’s fully featured Linux graphics driver is provided as binary-only because it contains intellectual property NVIDIA wishes to protect, both in hardware and in software.” The statement came in response to developers’ direct criticism of hardware vendors that produce only closed source drivers.
Followers of the ongoing soap opera between Intel and Nvidia know that no love has been lost between the two tech titans over the years. When AMD and ATI merged back in July of 2006, the internet was abuzz with rumors that an Intel/Nvidia merger couldn’t be far behind. As time pressed on and that possibility began to seem increasingly unlikely, a competitive culture formed between the two companies. The saber rattling has reached deafening proportions of late, and a seemingly endless stream of jabs has dominated the headlines. Any merger pushed through now might require barbed wire to separate the water coolers. Both organizations seem determined to earn a slice of the other’s market share, and for once they seem willing to do it the hard way: through innovation. As Intel pushes into accelerated graphics with its Larrabee platform, Nvidia wants us to believe that the CUDA API for its graphics cards will let the GPU dominate the CPU.
What is CUDA, and will it allow your GeForce to replace your CPU? More after the jump.
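In a nutshell, CUDA lets developers write C-like code that runs as thousands of lightweight threads on the GPU, offloading data-parallel number crunching that would otherwise tie up the CPU. Here’s a minimal, illustrative sketch (the kernel name and buffer sizes are our own, not taken from Nvidia’s documentation): each GPU thread adds one pair of elements in a vector sum.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: each GPU thread computes one element of c = a + b.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers on the GPU
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", hc[0]);        // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

Compiled with Nvidia’s nvcc toolchain, this is the kind of embarrassingly parallel workload where a GeForce can run circles around a CPU; whether that generalizes to replacing the CPU outright is exactly the question at hand.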
Owning the performance crown isn't enough; Nvidia wants to rule the mainstream, too. The GPU maker's highly popular G92 core has nearly defined the term 'bang-for-buck,' and Nvidia plans to tweak the core one more time to steal some thunder from AMD's upcoming Radeon HD 4850. The new 9800 GTX+ will shrink the G92 core from 65nm to 55nm and push the core, shader, and memory clock speeds to 738MHz, 1836MHz, and 1000MHz, respectively. The new card will retail at $249. And if that weren't enough, the original 9800 GTX will drop to $199. In other words, game on, AMD.
There's never been a better time to get more GPU bang for your gaming buck, but if you're shopping for an Nvidia graphics card, be prepared to be inconvenienced. Why? Because for some inexplicable reason, Nvidia has decided to enforce minimum advertised pricing (MAP), which prevents e-tailers from advertising a price below a predetermined cutoff. So instead of seeing the actual price you can expect to pay, you must first add the item to your shopping cart to see the final cost. E-tailers that choose not to comply face a series of penalties, but it's you, Joe Consumer, who really pays for this new pricing strategy.
Watching the ongoing race between AMD and Nvidia to build the ultimate graphics processor reminds us of the tale of the tortoise and the hare. AMD has played the hare, aggressively bounding ahead of Nvidia in process size, number of stream processors, frame-buffer size, memory interface, die size, and even memory type. Yet Nvidia always manages to snag the performance crown, and the GeForce 200 series is but the latest example. We lay hands on the smokin’-fast GeForce GTX 280. Could this be the graphics processor that finally tames Crysis? We reveal what makes the card unique and how its architectural advances translate into benchmark results!
This week, we test the theory that absence makes the heart grow fonder and bring back the podcast crew after a two-week layoff. On the show, Tom, Will, Gordon, and Andy explain just what could have taken them away from their podcasting duties. As Maximum PC nears its 10th anniversary, we also take a look back at our greatest accomplishments... and our biggest blunders! You can add your thoughts on this subject in the comments.
Compared to AMD’s gracefully engineered Radeon 3870 X2, Nvidia’s GeForce 9800 GX2 (represented here by Gigabyte’s implementation) is something of a kludge. But considering the performance Nvidia’s design delivers, it’s hard to complain about the lack of elegance.