Thanks to fierce competition between two GPU juggernauts and a worldwide economic recession, never has there been a better time for gamers to trade in their scratch for the latest videocard technology from either AMD/ATI or Nvidia. The price-to-performance ratio is at an all-time high, but before we get too spoiled on falling prices for increasingly powerful GPUs, AMD has made it clear that it has no intention of duking it out with Nvidia in a price-slashing war.
"Are we interested in winning share by losing money on every GPU we ship? No," said Rick Bergman, AMD's senior vice president. "We're not going to engage in that and we haven't had to."
Bergman's comments came in response to questions about what the chip maker is doing to compete with Nvidia at the low end. According to Bergman, AMD has been able to entice OEMs with better stability and performance per dollar in the face of Nvidia's aggressive pricing strategy.
"If you go and look at Dell, HP, or Acer's website, you'll actually see a lot of ATI graphics at the entry level," Bergman added.
Bergman also brushed off any concerns AMD might have with Intel's upcoming Larrabee, adding that a year from now, AMD will "have something new and exciting," though he did not elaborate on what that might be.
Lenovo’s IdeaPad S12 is the soul of a netbook trapped in the anatomy of a notebook. It has now become clear that Lenovo plans to release three variants of this 12-inch netbook, which it had announced as the world’s first Ion-based netbook last month – the Ion-based SKU will be available later in the summer. Lenovo has begun accepting pre-orders for a Nano-based variant of this netbook. Of course, an Atom-powered SKU is also available.
The VIA Nano-powered IdeaPad S12 features a VIA Nano ULV 2250 processor and VIA Chrome9 HC3 graphics. The combination is expected to outperform the Atom-based S12 variant, which features the Atom N270 processor along with Intel's 945GSE chipset. The Nano-powered S12 can be ordered for $449, whereas its Atom-toting counterpart is priced at $499.
Just in case you missed our review of the new GTX 295 reference board last month, we’ll revisit the high points. To make a GeForce GTX 295, Nvidia sandwiched a fairly large heatsink between a pair of boards—that’s one kick-ass sandwich!
The GTX 295’s GPUs are basically modified GTX 280 GPUs. They’ve got the same shader core configuration as the GTX 280, but Nvidia shrunk the chip’s die from 65nm to 55nm, and lowered the core clock speed to 576MHz (the same as the GTX 260). These two adjustments help keep power requirements and heat generation under control, while the full complement of 240 shader cores keeps the frame rate up in shader-limited benchmarks, such as Crysis and Far Cry 2.
At long last, Nvidia may finally be adding DirectX 10.1 support to its videocards, assuming Fudzilla is right on the money. According to the news and rumor site, Nvidia's GT200 will be refreshed on a 40nm manufacturing process, and the new chips will sport DX10.1.
To date, ATI has been the only one to offer DX10.1 support on some of its videocards (yes, we're completely ignoring S3's Chrome series). DX10.1 is a minor extension to DX10 that thus far hasn't meant much for gamers, but that could change with Nvidia jumping on board. And to fuel the conspiracy flames, remember that DX10.1 instructions did at one point show a performance boost on ATI cards in Assassin's Creed; after a patch removed support for the instruction set, some accused Ubisoft of bowing to pressure from Nvidia, which had sponsored the title through its The Way It's Meant To Be Played program.
In any event, it looks like the refresh will come at the tail end of summer or in early fall.
Zotac, a relative newcomer to the videocard market, has doubled the amount of GDDR3 memory found on most GTX 275 videocards, to 1792MB. Sparkle and EVGA are the only other two GPU partners to pack the same amount of memory on the GTX 275.
"We try to deliver the best performance value for gamers. With the new Zotac GeForce GTX 275 1792MB, we've managed to achieve a balance of performance and value for those that demand more video memory for gaming at extreme HD resolutions," said Carsten Berger, marketing director, Zotac International.
Additional memory aside, Zotac's GTX 275 closely follows Nvidia's reference specification, with core, shader, and memory clockspeeds checking in at 633MHz, 1404MHz, and 2268MHz, respectively, along with 240 stream processors and a 448-bit memory interface.
Finally, here’s a 3D gaming solution that doesn’t send us headfirst into a vomit bag. GeForce 3D Vision is Nvidia’s attempt to revive stereoscopic 3D, a century-old technology that has never been implemented successfully in PC gaming (despite many headache-inducing efforts in the late ’90s). Along with wireless shutter glasses and an IR emitter, this $200 kit comes with the promise that you’ll be able to enhance your existing library of DirectX games by turning them into true 3D experiences, provided you’re running a GeForce 8800 GT or better videocard. And for the most part, that promise is delivered on, though not without some serious issues.
Rick Bergman, AMD’s senior vice president for platforms, says he and his crew are looking to beat Nvidia to the world of DX11.
According to Bergman, “We want to supply hardware to Microsoft and software developers so they can make DX11 games on our hardware first.” This would put AMD ahead of Nvidia, something that hasn’t happened for several years, thanks to Nvidia’s dominance in the DX10 market. “We were kind of fighting from behind, but with DX11 it feels like we’re ahead this round.”
Despite reports that very few game titles would take advantage of DX11, Bergman is keeping up his enthusiasm. Reportedly, he knows of a handful of independent software vendors that are working “eagerly” to release games.
Nvidia this week released new WHQL videocard drivers, now in version 186.18. A handful of bugs have been squashed in this newest update, most of which relate to Windows 7. Some of the resolved issues include:
Resolves issue where PhysX option would be disabled by default in multi-GPU configurations (XP)
Resolves issue where the system would not resume from Standby mode when running a 9800 GX2 (Vista and Windows 7)
Changes made to program settings from the Nvidia Control Panel in 3D Settings are now preserved after closing and reopening when running a 9500 GS (Windows 7)
SLI focus display can now be switched using "Set SLI configuration" for GTX 260 owners (Windows 7)
Not a whole lot has been done to improve performance if you're upgrading from a previous 186.xx or 185.xx driver release. However, if you're upgrading from 182.xx, Nvidia claims double digit performance boosts in a number of titles, including up to 45 percent in Mirror's Edge with antialiasing enabled, 30 percent better performance in Half-Life 2 engine games with tri- and quad-SLI enabled, and a 25 percent boost in The Chronicles of Riddick: Assault on Dark Athena.
For some time now there’s been speculation as to just what processor is under the hood of the Zune HD. Now it has finally been confirmed that Nvidia’s Tegra is the chip that will let users view video in HD.
PC Perspective’s Ryan Shrout was able to confirm the news after hearing about the Tegra’s role in the new Zune at Computex in early June. The Tegra was chosen for its ability to decode a video stream using only 150 milliwatts of power and output audio at just 20 milliwatts.
With its 3.3-inch 480x272 OLED display, the Zune HD will be able to play back H.264 content and output video via HDMI at 720p.
ASRock, a subsidiary of Asus that made a name for itself in the socket 939 days by offering hybrid AGP/PCI-E motherboards without a performance penalty, plans to release a netbook built around Nvidia's Ion platform. Or, as ASRock prefers to call it, a Multibook.
The 12.1-inch Multibook G22 will come with Intel's dual-core Atom 330 processor (1.60GHz), 2GB of DDR2-667 memory, Nvidia Ion graphics, 320GB hard drive (with support for up to 500GB), a 10-in-1 card reader, 1.3MP webcam, DVD burner, 3 USB 2.0 ports, HDMI, and a bunch of other connections.
At 3.3 pounds sans battery and over an inch thick, it might be tough to classify the G22 as a netbook, which seems to be just fine with ASRock.