Some time ago I purchased a Dell E1705 laptop with almost all the options. I was very happy with the laptop and its GeForce 7900 GS, which allowed me to play just about any game on the market. Everything was great until I upgraded my machine to Vista; now I can't find any Vista drivers for my 7900 GS.
I’ve been waiting for more than a year now, and there’s still nothing from Nvidia or Dell. So I was wondering: Do you know how I can get my card to work right? I would even take homemade drivers at this point if I knew where to find some!
Sporting almost the same configuration as the reference design we previewed last month, BFG’s GeForce GTX 280 delivers amazing performance with the second-generation DirectX 10 chipset from Nvidia. It soundly spanks ATI’s new 4870, as well as all but the dual-GPU graphics solutions from the previous generation—and even against those, the GTX 280 wins all but a few benchmarks. The real question we’re asking is, Do we need this much power?
When it comes to graphics, killing two birds with one stone means squeezing better performance out of a newly released GPU while also reducing its power draw, and that's exactly what Nvidia has done. The 9800M and 9700M graphics cores are Nvidia's newest additions to its mobile GeForce line, bringing desktop-like performance to laptops.
The 9800M comes in three models, with the 9800M GTX taking up residence at the top of the heap. Boasting the same G92 core that was so popular on the desktop, the 9800M GTX comes clocked at 500MHz and uses 112 shaders running at 1,250MHz each. Combined with a 256-bit memory interface, that translates into 420 gigaflops of processing power (see the quick math below), putting it nearly on par with its desktop counterpart, the 8800 GT. And for hardcore mobile gamers, the flagship model is SLI capable. As for the rest of the cards:
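Quick math on that gigaflop figure: it checks out if you assume each G92 shader can retire three floating-point operations per clock (a multiply-add plus a multiply), which is our assumption here rather than a number Nvidia quotes in this context. That works out to 112 shaders x 1,250MHz x 3 ops per clock = 420 gigaflops. The 256-bit memory bus doesn't add to that total; its job is to keep the shaders fed.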
Followers of the ongoing soap opera between Intel and Nvidia know no love has been lost between the two tech titans over the years. When AMD and ATI merged back in July of 2006, the internet was abuzz with rumors that an Intel/Nvidia merger couldn’t be far behind. As time pressed on and that possibility began to seem increasingly unlikely, a competitive culture took root between the two companies. The saber rattling has reached deafening proportions of late, and a seemingly endless stream of jabs has dominated the headlines. Any merger pushed through now might require barbed wire to separate the water coolers. Both organizations seem determined to earn a slice of the other’s market share, and for once they seem willing to do it the hard way: through innovation. As Intel pushes into accelerated graphics with its Larrabee platform, Nvidia wants us to believe that the CUDA API for its graphics cards will allow video accelerators to dominate the CPU.
What is CUDA, and will it allow your GeForce to replace your CPU? More after the jump.
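In a nutshell, CUDA lets you write small C functions, called kernels, that run across the hundreds of shader processors on a GeForce instead of the handful of cores in your CPU. The listing below is our own bare-bones illustration of the idea, not Nvidia sample code: it adds two million-element arrays on the GPU, with each GPU thread handling a single element.

    // Bare-bones CUDA sketch: add two arrays on the GPU.
    // Our own illustrative example, not from Nvidia's documentation.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    // __global__ marks a function that runs on the GPU; each thread
    // computes exactly one element of the result.
    __global__ void addArrays(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main(void)
    {
        const int n = 1 << 20;                // one million elements
        size_t bytes = n * sizeof(float);

        // Host (CPU) buffers.
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device (GPU) buffers, plus copies of the input data.
        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        addArrays<<<blocks, threads>>>(da, db, dc, n);

        // Copy the result back and spot-check it.
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("hc[0] = %f\n", hc[0]);        // prints 3.000000

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

You build it with nvcc, the compiler in Nvidia's CUDA toolkit. The catch is that only highly parallel, data-crunching jobs like this one map neatly onto the GPU, which is why your GeForce won't be running Windows anytime soon.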
If you’re already gaming with a G92-based 8800 GTS, there’s very little reason to move up to a G92-based 9800 GTX such as PNY’s XLR8. The architecture in both GPUs is nearly the same, with 128 stream processors, a 256-bit interface, and 512MB of GDDR3. Slightly faster clock speeds yield only a modest bump in performance. That’s not to say the 9800 brings nothing to the table, but you’ll have to decide for yourself whether its offerings are worth the price.
Watching the ongoing race between AMD and Nvidia to build the ultimate graphics processor reminds us of the tale of the tortoise and the hare. AMD has played the hare, aggressively bounding ahead of Nvidia in terms of process size, number of stream processors, frame buffer size, memory interface, die size, and even memory type. Yet Nvidia always manages to snag the performance crown. The GeForce 200 series is but the latest example. We lay hands on the smokin’-fast GeForce GTX 280. Could this be the graphics processor to finally tame Crysis? We reveal what makes the card unique and how its architectural advances translate into benchmark performance!
Compared to AMD’s gracefully engineered Radeon 3870 X2, Nvidia’s GeForce 9800 GX2 (represented here by Gigabyte’s implementation) is something of a kludge. But when we consider the performance that Nvidia’s design delivers, it’s hard to complain about elegance.