It's been a rough ride for Nvidia lately. The company has not only had to contend with a suddenly competitive ATI, but also finds itself battling a bad batch of mobile GPUs (a problem that might turn out to be bigger than initially stated). The struggles have turned financial, with the graphics chip maker reporting a net loss of $120.9 million in the second quarter, or 22 cents a share. That's in stark contrast to one year ago, when the company posted a profit of $172.7 million, or 29 cents a share.
The quarter's results include a $196 million charge Nvidia took to cover warranty, repairs, and other costs associated with an "abnormal failure rate" among its mobile GPUs. Nvidia executives are hopeful for a somewhat better third quarter, saying they expect revenue to grow "slightly."
"We didn't lose any share, the market just got soft on us," said chief executive Jen-Hsun Huang. And while Huang admitted that the second quarter results are "disappointing," the company still saw its shares rise by 10 percent after announcing a $1 billion boost to its stock buyback program.
AMD Cinema 2.0 is a technology every gaming aficionado, game developer, movie buff, and filmmaker would die for. Photo-realistic 3D rendering is the Holy Grail that researchers and developers have been chasing for a long time. Now that AMD is unwrapping its Cinema 2.0 tech one layer after another, it seems as though the technological wall that has stood between virtual reality and the real world is about to be razed to the ground.
But for more details on the groundbreaking technology, you'll have to make the "jump" to the rest of this entry.
Overclock.net forum member nitteo claims to have built a Folding@Home farm with no fewer than 51 GPUs, and he has the pics to prove it. In them are a mixture of 8800GT and 8800GS videocards spread out across a variety of MSI and Gigabyte motherboards. Final numbers are still being tallied, but nitteo estimates he'll pull in over 250,000 points per day on his new setup, and things only look to get better with the CUDA-based folding client.
That's all well and good for Overclock.net (and the Folding community in general), but that also means Team Maximum PC has to keep it kicked up into high gear. Maximum PC currently holds the 4th spot in team rankings and could use your help. If you want to Fold for your favorite magazine, add team 11108 to your client's profile, and drop by the forum for tips on how to optimize your production.
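If you're running the classic console client, joining the team is just a matter of pointing it at the right team number during configuration (or editing client.cfg directly). Here's a rough sketch of the relevant section, with field names from memory, so treat it as a guide rather than gospel:

    [settings]
    username=YourHandle
    team=11108

The GUI client exposes the same username and team fields through its configuration dialog, so either route gets your work units credited to Team Maximum PC.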
No one has been more critical of Nvidia than rumor and news outlet The Inquirer, which recently declared that all of the chipmaker's G84 and G86 parts are bad. The extent of the problem is still to be determined, but here's what's known so far.
A batch of bad GPUs has found its way into the wild, causing an "abnormal failure rate" among certain laptop models
To deal with the problem, Nvidia said it was setting aside a one-time charge of $150 million to $200 million to cover warranty and repair costs associated with the faulty mobile parts
Both HP and Dell have released lists of notebook models potentially affected by the faulty GPUs and are encouraging owners to update their BIOS as a preventive measure (the newer BIOS kicks on the cooling fan earlier than it normally would). HP has also extended its warranty for the affected models.
Nvidia has since moved on to its 9-M series GPUs, and in the process has presumably solved whatever problem affected the previous generation parts, right? Not so fast, says The Inq. According to the rumor site, the fundamental flaw in the manufacturing process still exists, and now G92 and G94 parts are reportedly failing. The Inq claims that no fewer than four partners are already seeing the new chips go bad at high rates, and believes that Nvidia "is simply stonewalling everyone" about the alleged problem.
If true, another batch of bad parts could be disastrous for the chip maker, which continues to lose graphics market share to Intel and has seen its stock price plummet in the wake of a disappointing 8-K filing.
Is the problem bigger than Nvidia's letting on, or will it be this latest rumor that ultimately turns out to be the dud?
AMD's 4870 X2 videocards arrived with no shortage of pre-release hype, and they lived up to every bit of it by obliterating the competition in this year's Dream Machine (a single 4870 X2 churned out twice as many frames as Nvidia's GTX280 in 3DMark Vantage). And they did it months before they were supposed to go public, which means there were architectural tweaks yet to be made.
The wait is over: AMD has finally announced what it rightfully calls the world's fastest graphics card, the ATI Radeon 4870 X2. Built on a 55nm manufacturing process, the dual-GPU videocard packs the computational muscle to deliver 2.4 teraFLOPS, and ATI can still lay claim to being the only manufacturer to support DirectX 10.1 instructions. Rounding out the feature set, the 4870 X2 ships with 2GB of GDDR5, 1600 stream processors, and a 750MHz core clockspeed (reference). MSRP has been set at $549, with stock available now.
AMD also made mention of its upcoming 4850 X2 videocard. As the name implies, this card will also be a dual-GPU solution (clocked at 625MHz), and like its bigger brother it will come with 1600 stream processors. Instead of GDDR5, the 4850 X2 will ship with 2GB of GDDR3. Look for availability this September at an estimated sub-$400 street price.
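That 2.4-teraFLOPS figure, by the way, is easy to sanity-check: each stream processor can retire a multiply-add (two floating-point operations) per clock, so 1600 SPs at 750MHz works out to 2.4 TFLOPS. Here's the back-of-the-envelope math as a quick snippet (the two-ops-per-clock assumption is ours, though it matches how ATI arrives at its number):

    /* Peak throughput: stream processors x FLOPs/clock x clock speed */
    #include <stdio.h>

    int main(void)
    {
        const double sps = 1600;        /* stream processors on the 4870 X2 */
        const double flops_per_clk = 2; /* one multiply-add = 2 FLOPs (assumed) */
        const double clock_ghz = 0.750; /* 750MHz reference core clock */

        printf("Peak: %.1f TFLOPS\n", sps * flops_per_clk * clock_ghz / 1000.0);
        return 0;
    }

Run the same numbers with the 4850 X2's 625MHz clock and you get 2.0 TFLOPS, which is why the cheaper card gives up surprisingly little on paper.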
As Intel gears up to sample Larrabee later this year, the chip maker continues to build hype over the architecture's x86 roots. Intel is quick to point out that developers will be able to program in C or C++ just as they're used to doing on x86 processors, giving them an easy way to port applications from other platforms over to Larrabee.
Meanwhile, Nvidia also wants to build hype, but over its competing CUDA architecture. DailyTech has posted Nvidia's comments on the issue, which read:
CUDA is a C-language compiler that is based on the PathScale C compiler. This open source compiler was originally developed for the x86 architecture. The NVIDIA computing architecture was specifically designed to support the C language - like any other processor architecture. Competitive comments that the GPU is only partially programmable are incorrect - all the processors in the NVIDIA GPU are programmable in the C language.
NVIDIA's approach to parallel computing has already proven to scale from 8 to 240 GPU cores. Also, NVIDIA is just about to release a multi-core CPU version of the CUDA compiler. This allows the developer to write an application once and run across multiple platforms. Larrabee's development environment is proprietary to Intel and, at least disclosed in marketing materials to date, is different than a multi-core CPU software environment.
Andrew Humber from Nvidia also went on to clarify that CUDA is the brand name for the C compiler, not two different things.
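For a sense of what "programmable in the C language" means in practice, here's a minimal CUDA sketch of our own (not from Nvidia's materials) that scales an array on the GPU. The kernel body is ordinary C; the __global__ keyword and the <<<...>>> launch syntax are the CUDA-specific additions:

    // Kernel: runs on the GPU, one thread per array element.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    // Host code: copy data to the GPU, launch 256-thread blocks, copy back.
    void scale_on_gpu(float *host, int n, float factor)
    {
        float *dev;
        cudaMalloc(&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
        scale<<<(n + 255) / 256, 256>>>(dev, factor, n);
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);
    }

Anyone comfortable with C could follow that, which is exactly the point Nvidia is making.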
Anyone else feel chilly when Nvidia and Intel are in the same room?
It doesn't matter if you seek solace in Creationism or subscribe to the theory of evolution; everyone should be equally stoked about what Nvidia's calling "Big Bang II." No, the graphics chip maker isn't gearing up to end the debate on man's existence. Even better, the company will improve man's quality of life with a new driver package that looks poised to earn its codename by bringing gamers at least one big, long-overdue improvement.
Bang Part I
The biggest news associated with Nvidia's ForceWare Release 180 (R180) is the introduction of SLI multi-monitor support. Ever since Nvidia introduced SLI, the inability to run a second monitor while gaming has been a major complaint, all the more so as LCD displays have fallen in price. That finally changes with the new driver release: gamers will be able to frag opponents while simultaneously keeping an eye on their email inbox, incoming IMs, and everything else that would previously be blacked out on a second monitor.
Find out what else is bangin' with the new driver after the jump.
Having already moved on to its 9-M series GPUs, Nvidia has presumably solved whatever problem led to an "abnormal failure rate" in what the company still contends is only a limited batch of previous generation GPU and MCP products. Exactly how limited that batch is might never be fully disclosed, but it appears the problem may be more widespread than consumers were led to believe.
Just over a week ago, Dell made available a list of its notebooks that could be affected by the GPUs believed to be suffering higher-than-expected failure rates, and the company is recommending owners update their BIOS to reduce the risk of running into a problem. The updated BIOSes modify the fan profile to help regulate GPU temperature fluctuations, but as Dell notes, the new parameters won't help customers who are already suffering video-related issues.
Dell isn't alone: HP has now also released a list of models that qualify for 'Warranty Service Enhancement' (curiously absent is the DV97xx series). And like Dell, HP is recommending owners update their BIOS as a preventive measure.
So are all G84 and G86 parts bad, as The Inq surmised early in July? No one but Nvidia knows for sure, but looking over the lists of affected models would seem to indicate the allegation could hold some merit.
Did Nvidia drop the ball harder than it's letting on?
AMD's acquisition of graphics chip maker ATI continues to be a sore point whenever the company talks about its finances, most recently coming up when AMD said it would take a nearly billion-dollar charge in the second quarter. Given AMD's financial status, it's easy to criticize the company's decision to overpay for a company that has yet to benefit impatient investors. That could change if AMD's Fusion ends up revolutionizing the PC landscape.
Up to this point, AMD hasn't gone into specifics regarding its upcoming CPU+GPU chip, but according to TGDaily, industry sources aren't being as tight-lipped. If the rumblings are to be believed, the first Fusion processor (code-named Shrike) will consist of a dual-core Phenom CPU and an ATI RV800 GPU core. Previous rumors had the first-run Fusion chips built around a dual-core Kuma CPU and an RV710 graphics chip, but those plans appear to have gone by the wayside as AMD has had more time to develop a low-end RV800-based core.
The sources also indicate that Fusion will likely be introduced as a half-node chip built around a 40nm manufacturing process, and will later move to 32nm, possibly by the beginning of 2010.
3D graphics technology has grown by leaps and bounds since 3DFX first laid its Voodoo on the computing world, and today's videocards boast everything from multiple GPUs in a single package to the promise of physics processing. And it's not just for gaming: fanatical Folders can crunch through more proteins by utilizing their GPU, or decode a high-definition movie on a new big-screen TV.
Leading the charge into this new era of 3D computing are Nvidia and ATI, two companies that have recently gone for each other's throats with aggressive price cuts and a deluge of new videocards while simultaneously chasing the performance crown. But for all their battles, both old and new, it's Intel, CPU maker extraordinaire, that continues to lead the market.
Find out how much catching up Nvidia and ATI have to do after the jump.