As Intel gears up to sample Larrabee later this year, the chip maker continues to build hype over the architecture's x86 roots. Intel is quick to point out that developers will be able to program in C or C++ languages just as they're used to doing on x86 processors, giving them an easy way to port applications from other platforms over to Larrabee.
Meanwhile, Nvidia also wants to build hype, but over its competing CUDA architecture. DailyTech has posted Nvidia's comments on the issue, which read:
CUDA is a C-language compiler that is based on the PathScale C compiler. This open source compiler was originally developed for the x86 architecture. The NVIDIA computing architecture was specifically designed to support the C language - like any other processor architecture. Competitive comments that the GPU is only partially programmable are incorrect - all the processors in the NVIDIA GPU are programmable in the C language.
NVIDIA's approach to parallel computing has already proven to scale from 8 to 240 GPU cores. Also, NVIDIA is just about to release a multi-core CPU version of the CUDA compiler. This allows the developer to write an application once and run across multiple platforms. Larrabee's development environment is proprietary to Intel and, at least disclosed in marketing materials to date, is different than a multi-core CPU software environment.
Andrew Humber from Nvidia also went on to clarify that CUDA is the brand name for the C compiler itself, rather than two separate things.
Anyone else feel chilly when Nvidia and Intel are in the same room?
It doesn't matter if you seek solace in Creationism or subscribe to the theory of evolution, everyone should be equally stoked about what Nvidia's calling "Big Bang II." No, the graphics chip maker isn't gearing up to end the debate on man's existence, but even better, the company will improve man's quality of life with a new driver package that looks poised to earn its codename by bringing gamers at least one big, long overdue improvement.
Bang Part I
The biggest news associated with Nvidia's ForceWare Release 180 (R180) is the introduction of SLI multi-monitor support. Ever since Nvidia introduced SLI, the inability to run a second monitor while gaming has been a major complaint, and even more so as LCD displays have fallen in price. That finally looks to no longer be the case with the new driver release, and gamers will be able to frag opponents while simultaneously keeping an eye on their email inbox, incoming IMs, and everything else that would previously be blacked out on a second monitor.
Find out what else is bangin' with the new driver after the jump.
Having already moved on to its 9M series GPUs, Nvidia has presumably solved whatever problem led to an "abnormal failure rate" in what the company still contends is only a limited batch of previous-generation GPU and MCP products. Exactly how limited that batch is might never be fully disclosed, but it appears the problem may be more widespread than consumers were led to believe.
Just over a week ago Dell made available a list of its notebooks that could possibly be affected by the GPUs believed to be suffering higher than expected failure rates and is recommending owners update their BIOS to reduce their risk of running into a problem. The updated BIOSes modify the fan profile to help regulate GPU temperature fluctuations, but as Dell notes, the new parameters won't help customers who are already suffering video-related issues.
Dell isn't alone, and now HP has also released a list of models that qualify for 'Warranty Service Enhancement' (curiously absent is the DV97xx series). And like Dell, HP is recommending that owners of those models update their BIOS as a preventive measure.
So are all G84 and G86 parts bad like The Inq surmised early in July? No one but Nvidia knows for sure, but looking over the list of affected models would seem to indicate the allegation could hold some merit.
Did Nvidia drop the ball harder than they're letting on?
AMD's acquisition of graphics chip maker ATI continues to be a sore point whenever the company talks about its finances, most recently coming up when AMD said it would take a near billion dollar charge in the second quarter. Given AMD's financial status, it's easy to criticize the company's decision to overpay for a company that has yet to benefit impatient investors. That could change if AMD's Fusion ends up revolutionizing the PC landscape.
Up to this point, AMD hasn't gone into specifics regarding its upcoming CPU+GPU chip, but according to TGDaily, industry sources aren't being as tight-lipped. If the rumblings are to be believed, the first Fusion processor (code-named Shrike) will consist of a dual-core Phenom CPU and an ATI RV800 GPU core. Previous rumors had the first-run Fusion chips built around a dual-core Kuma CPU and an RV710 graphics chip, but those plans appear to have gone by the wayside as AMD has had more time to develop a low-end RV800-based core.
The sources also indicate that Fusion will likely be introduced as a half-node chip built around a 40nm manufacturing process, and will later move to 32nm, possibly by the beginning of 2010.
3D graphics technology has grown by leaps and bounds since 3DFX first laid its Voodoo on the computing world, and today's videocards boast everything from multiple GPUs in a single package to the promise of physics processing. And not just for gaming: fanatical Folders can crunch through more proteins by utilizing their GPU, or decode a high definition movie on their new big screen TV.
Leading the charge into this new era of 3D computing are Nvidia and ATI, two companies who have recently started going at each others' throats with aggressive price cuts and a deluge of new videocards while simultaneously chasing the performance crown. But for all their battles, both old and new, it's Intel, CPU maker extraordinaire, who continues to lead the market.
Find out how much catching up Nvidia and ATI have to do after the jump.
Not only can Intel lay claim to being the current king of chip technology, but its upcoming Nehalem microarchitecture looks poised to keep the silicon studs on top of the competition well into 2009. AMD hasn't threatened Intel's position since Conroe, and while the company remains confident under new head honcho Dirk Meyer, it's still playing catch-up to Intel's 45nm technology.
The situation gets a little more competitive when switching from CPUs to GPUs, and according to Tomshardware, sources at both ATI and Nvidia are saying they will each have a 40nm GPU manufacturing process by the first half of 2009, possibly to be unveiled at next year's CeBit.
Assuming either company meets their target, the accomplishment will unseat Intel as the technological leader in terms of the smallest chip structures, even if only for a short time. The road won't stop at Nehalem and Intel is already busy developing 32nm CPUs, which many expect to be shown off in prototype form at the company's spring development forum in H1 2009. Volume shipments could come as early as Q3 next year.
Even so, if 40nm GPUs materialize as reported, it will mark the first time GPUs have overtaken CPUs in terms of production nodes. That won't necessarily make for a better chip, but you can expect plenty of fanfare should Nvidia and/or ATI dethrone the silicon king.
An "abnormal failure rate" among Nvidia's 8M series GPUs hasn't stopped the graphics chip maker from moving forward; the Santa Clara company expanded its lineup last week with the GeForce 9800M and 9700M parts for notebooks. If you've been waiting for these parts to reach the marketplace, you can now whip out the credit card and place your order.
Best Buy already stocks Toshiba's sexy Qosmio X305-Q701, which comes equipped with Nvidia's 9700M GTS GPU sporting 512MB GDDR3 memory. According to DigiTimes, over 20 other notebook vendors are expected to soon follow suit, including other top tier vendors. Gaming notebooks have also been announced from the likes of Sager, CyberSystem, Infinity, Pioneer Computers Australia, and other regional PC companies.
If Toshiba's Qosmio isn't your style, look forward to a flurry of other 9M based notebooks making their debut as Intel's Centrino 2 platform emerges full force.
Last month Nvidia said it planned to tweak its 9800GTX videocard with a die shrink and faster clockspeeds resulting in the 9800GTX+, and today the release becomes official with immediate availability. Along with the 9800GTX+, Nvidia fleshes out its GeForce 9-series line with two other videocards, the 9800GT and 9500GT.
All three cards are available now, and each one brings support for Nvidia's PhysX and CUDA technologies, two areas currently exclusive to Nvidia.
"The addition of the new 9800GTX+, 9800GT, and the 9500GT GPUs brings a new level of visual computing capability to additional mainstream market segments," said Ujesh Desai, general manager of desktop GPUs at Nvidia. "Nvidia GPUs deliver the best bang for the buck in each price category, and with support for CUDA, PhysX, and 3D stereoscopic technology, consumers can now experience the unique, innovative, and immersive computing experience that only Nvidia can deliver."
Claiming victory in the bang-for-buck war would have been a tough sell just weeks ago, but such claims become easier to swallow with the 9500GT taking up residence in the sub-$70 pricing tier. Both the 9800GT and GTX+ can be bought for under $200, with the latter going head to head against ATI's HD 4850 videocard. For you old schoolers, it hasn't been this fun to shop for a GPU since the TI4200 days.
It's been a rocky summer for Nvidia, who earlier this month saw its shares tumble after announcing it would take a one-time charge of $150 to $200 million to cover warranty and repair costs associated with an "abnormal failure rate" in its mobile graphics parts. Now it appears that tough times are still ahead for the graphics card maker.
Citing unnamed sources, DigiTimes claims that the faulty mobile parts have led some channel vendors to demand that graphics card partners issue a recall for desktop videocards using the same GPU core. Nvidia has maintained that the problematic parts affect only a few specific notebook models and no desktop cards, but some have suggested the problem could include all G84 and G86 parts.
This isn't the first rumor Nvidia's been entangled with in recent times, and as with all hearsay, take this one with a grain of salt.
Have you ever run into an old ex-girlfriend only to realize she's nothing like you remembered? Or fired up that retro game and wondered what you found so appealing about it in the first place? Every once in a while a blast from the past (like bringing WarGames back to theaters) will make a worthwhile comeback, but more often than not, old relics are best left buried, and Albatron might be finding this out.
Earlier this month the company let it be known it would be bringing Nvidia's 8-series videocards in 8400, 8500, and 8600 trim to the PCI bus, but those plans have hit a snag and it might be a while before we see another PCI videocard. Even though the PCI bus has been around since close to the dawn of time, not all motherboards stick to the same signaling implementation for the PCI interface, and Albatron fears that compatibility with different motherboards could become a problem.
Sam Nada, Albatron's International PR representative, says the company's engineers are working on optimizing the BIOS to ensure a smooth rollout, but it will be a couple of weeks before the new Retrotechnology cards make a debut. But what's a couple of weeks if you've already staved off the upgrade bug for this long?