There's been plenty of coverage surrounding Nvidia's admitted "abnormal failure rate" among what remains an unknown number of GPUs, but in case you missed it, here's the CliffsNotes version: Earlier this summer, Nvidia announced it would take a one-time hit of $150-$200 million to cover warranty and repair costs associated with a bad batch of mobile GPUs. The chip maker insisted (and still does) that the failures were an isolated incident, but that claim has come into question. News and rumor site The Inquirer has been particularly vocal in questioning how widespread the problem really is, raising the possibility that the defect could affect both mobile and desktop parts, including G92- and G94-based GPUs.
Now that you're caught up, it's TGDaily bringing more speculation to the table. Citing industry sources, the news site claims that Nvidia's future 45nm GPUs, which have recently entered the qualification stage, are being built with high-lead solder bumps. Earlier speculation pointed to Nvidia having switched to eutectic solder in reaction to the GPU failures, and if that's the case, a return to high-lead bumps raises more questions than it answers about what's going on, and whether or not the problem has been solved or is ongoing.
Nvidia isn't commenting on the latest news, and it's a pretty safe bet that this won't be the last you'll hear on the matter.
Since netbooks deploy quaint technology compared to their full-blown cousins, it can be difficult to believe that they are actually aimed at the future. But that is exactly what Rob Enderle, principal analyst at the Enderle Group, thinks. His reasoning is that netbooks will become more practical and fun once WiMAX becomes ubiquitous in the near future. A netbook quickly turns into a worthless, nondescript device when there is no internet access to breathe life into it. Enderle's point about netbooks being useless without the internet might appear to be a mere reiteration of the obvious, but it is actually a very insightful observation.
The second-generation Slacker personal radio player is smaller, slimmer, and even better than the first. There may be no better way to listen to free music. Slacker announced a new version of its portable radio today, and we’re happy to say the Slacker G2 kicks just as much ass as the original product we reviewed last April.
Here’s Slacker in a nutshell, if you don’t want to re-read our previous review: Slacker radio is much like Pandora or Last.FM in that you can listen to music on the Internet for free (along with an occasional advertisement) while the service analyzes your expressed taste in music and recommends new artists it thinks you’ll enjoy.
The trade-offs are that you can't always choose which songs you want to hear, and you can skip only a limited number of tracks. Slacker also offers a subscription plan ($7.50 per month if you pay for a year at a time) that eliminates the ads, enables you to call up saved tracks at will (as long as you maintain your subscription), and allows you to skip an unlimited number of tracks.
Read on for our full review.
Take note, Rainier Wolfcastle, because these goggles may actually do something. Nvidia’s latest visual computing venture is a serious foray into stereoscopic 3D, a technology that has not found success among mainstream consumers (or even enthusiasts) in recent history. 3D movies and gaming at home have always been seen as gimmicky, a perception that can largely be attributed to the fact that you have to wear some pretty goofy glasses to experience the effect. In fact, past iterations of stereoscopic 3D technology (including efforts by the now-defunct company ELSA) have been especially troublesome because they required bulky headgear (that had to be tethered to your PC) with a tendency to give gamers headaches after just a few minutes of use. Nvidia wants to reinvigorate the stereoscopic 3D market by developing its own glasses hardware and driver software, which it hopes will avoid the pitfalls of previous efforts.
Do we have the technology to make stereoscopic 3D practical? And more importantly, is this something that, as a gamer, you’d be open to embracing?
We invariably refer to the video memory in modern videocards as GDDR, differentiating it only by version (GDDR2, GDDR3, GDDR4, and now GDDR5), but the technology’s full acronym is actually GDDR SDRAM, which stands for Graphics Double Data Rate Synchronous Dynamic Random Access Memory.
“Double data rate” describes the memory’s capacity for double-pumping data: Transfers occur on both the rising and falling edges of the clock signal. This endows memory clocked at 800MHz with an effective data-transfer rate of 1.6GHz. “Synchronous” refers to the memory’s ability to operate in time with the computer’s system bus. This allows the memory to accept a new instruction without having to wait for a previous instruction to be processed, a practice known as instruction pipelining.
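The double-pumping arithmetic is easy to sketch. The snippet below computes the effective transfer rate and peak bandwidth from the base clock; the 800MHz clock matches the example above, while the 256-bit bus width is a hypothetical figure chosen purely for illustration.

```python
# Sketch: effective data rate and peak bandwidth of double-pumped (DDR) memory.
# The 800MHz clock mirrors the example in the text; the 256-bit bus width is a
# hypothetical value for illustration only.

def effective_rate_mhz(clock_mhz: float, pumps_per_cycle: int = 2) -> float:
    """DDR transfers on both clock edges, doubling the effective rate."""
    return clock_mhz * pumps_per_cycle

def peak_bandwidth_gbps(clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: effective rate (MT/s) * bytes per transfer / 1000."""
    return effective_rate_mhz(clock_mhz) * (bus_width_bits / 8) / 1000

print(effective_rate_mhz(800))        # 1600.0 MT/s ("1.6GHz effective")
print(peak_bandwidth_gbps(800, 256))  # 51.2 GB/s peak
```

The same arithmetic explains why marketing materials quote "effective" clock speeds at double the actual clock.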
Google’s chief of mobile platforms, Andy Rubin, seems to believe in the cliché that the first impression is the last impression. He told Reuters that the success of the Android platform will depend on the reception of its first phone, leaving very little margin for failure. That first Android phone will be T-Mobile’s HTC Dream, rumored to be scheduled for release later this month.
Intel today announced the official release of their Dunnington-based Xeon 7400 server CPU. The six-core chip is monolithic, meaning that all six cores are on one die, and is the first Xeon CPU to sport that design. The previous 7300 series CPU, dubbed Tigerton, was a quad-core processor with two dual-core chips on a single module (like existing quad-core consumer chips). As expected, Dunnington is still of the Penryn architecture (45nm High-K manufacturing process), and will be compatible with current Tigerton Socket 604 motherboards.
Speed-wise, Intel claims a 50 percent performance increase for the 7400 over the 7300 series based on TPC-E database benchmark testing (TPC-E simulates the online transaction workload of a large brokerage firm). More impressive is Intel’s claim that even with the improved performance, Dunnington actually uses 10 percent less power than the previous generation. The gains are largely attributed to the presence of a new 16MB level-3 cache, in addition to the extra compute power of two more cores. Xeon 7400 CPUs will launch at 2.66GHz with either four or six cores, and will be priced from $856 to $2,729.
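Taken together, those two claims imply a sizable performance-per-watt improvement. Here's a quick back-of-the-envelope sketch using the article's claimed figures (not measured data):

```python
# Back-of-the-envelope: combine Intel's claimed 50% performance gain with its
# claimed 10% power reduction into a relative performance-per-watt figure.
# These are vendor claims from the announcement, not independent measurements.

def perf_per_watt_gain(perf_gain: float, power_change: float) -> float:
    """Relative performance per watt vs. the previous generation."""
    return (1 + perf_gain) / (1 + power_change)

gain = perf_per_watt_gain(0.50, -0.10)  # 1.5 / 0.9
print(f"{gain:.2f}x")  # roughly 1.67x Tigerton's performance per watt
```

That combined figure, rather than raw clock speed, is what matters most to the datacenter customers Intel is targeting.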
What does this mean for consumers? Unfortunately, not much. Intel has no current plans to release a six-core CPU to the mainstream market, and few applications would be able to scale well enough to take full advantage of the additional two cores. Intel seems to be pushing Nehalem for the consumer market, which will launch as a quad-core. Dunnington customers – large Web 2.0 companies like Myspace – will be the ones who benefit most from the extra performance and power efficiency, which may enable them to develop compute-intensive features like high-definition video sharing.
Forget about dual-, quad-, or even eight-core processors, all of which would prove woefully inadequate next to the system being called Blue Waters. The 200,000-processor-core supercomputer got the green light at the University of Illinois at Urbana-Champaign, which finalized a contract with IBM to build what will be the world's first sustained petascale computational system.
For anyone not up on their flops, a petaflop is the equivalent of roughly one quadrillion calculations per second, presumably just enough to get a decent framerate out of Crysis. Joining the 200,000 processor cores will be more than a petabyte of memory and more than 10 petabytes of disk storage. And yes, that would hold a lot of porn, though Blue Waters will spend its time on scintillating real-world scientific and engineering applications.
Specifically, the National Science Foundation says that Blue Waters will wade into the study of complex processes like the interaction of the Sun's coronal mass ejections with the Earth's magnetosphere and ionosphere. Other examples include the formation and evolution of galaxies in the early universe, understanding the chains of reactions that occur within living cells, the design of novel materials, and other decidedly nerdy topics that have nothing to do with propelling Folding at Home team 11108 ahead of the competition.
The job of a whistleblower is a dangerous one, and Robert Delaware has paid the price for speaking out against Microsoft. The contracted game tester had worked closely with the Xbox line, and particularly with Bungie Studios, since early 2005. For those who haven’t been following the story, Delaware’s testimonial was the basis for an article that made headlines last week regarding Xbox 360 hardware failures at launch. In the VentureBeat article, Delaware detailed the known quality issues with the 360 and claimed that management ignored multiple warnings in order to gain an advantage over the not-yet-released PlayStation 3. Legally, Microsoft was within its rights to fire Delaware for his unauthorized interview, but he remains defiant, claiming he was aware of the possible ramifications and willing to take the risk. Upon termination, Delaware was also warned by an HR representative that he faces possible lawsuits from both Microsoft and the company that contracted him out. The interview, conducted by VentureBeat’s Dean Takahashi, remains unconfirmed by Microsoft, which had only this to say in response: "This topic has already been covered extensively in the media. This new story repeats old information, and contains rumors and innuendo from anonymous sources, attempting to create a new sensational angle, and is highly irresponsible.”
Did Robert Delaware do the right thing? Or was he just looking for publicity?
NEC said yesterday it would join IBM and six other semiconductor companies who are focused on developing new methods of manufacturing 32nm processors. The other six include Chartered Semiconductor, Freescale, Infineon Technologies, Samsung, STMicroelectronics, and Toshiba, with the College of Nanoscale Science and Engineering at the University at Albany in New York also contributing.
The IBM-assembled alliance is attempting to create chips that use standard, bulk CMOS (complementary metal oxide semiconductor) technology in the manufacturing process. Benefits of going this route include a 35 percent increase in performance over 45nm parts, while also cutting power consumption in half. That double whammy would prove particularly attractive for mobile computing.
For its part, Intel is also working on a 32nm design. Chips built on the shrunken process are expected to debut in mid-2009. No date has been set for when IBM and its collaboration of companies will bring 32nm processors to market.