Take note, Rainier Wolfcastle, because these goggles may actually do something. Nvidia’s latest visual computing venture is a serious foray into stereoscopic 3D, a technology that has not found success among mainstream consumers (or even enthusiasts) in recent history. 3D movies and gaming at home have always been seen as gimmicky, a perception largely attributable to the pretty goofy glasses you have to wear to experience the effect. In fact, past iterations of stereoscopic 3D technology (including efforts by the now-defunct company ELSA) were especially troublesome because they required bulky headgear, tethered to your PC, that tended to give gamers headaches after just a few minutes of use. Nvidia wants to reinvigorate the stereoscopic 3D market by developing its own glasses hardware and driver software, which it hopes will avoid the pitfalls of previous efforts.
Do we have the technology to make stereoscopic 3D practical? And more importantly, is this something that, as a gamer, you’d be open to embracing?
We invariably refer to the video memory in modern videocards as GDDR, differentiating it only by version (GDDR2, GDDR3, GDDR4, and now GDDR5), but the technology’s full acronym is actually GDDR SDRAM, which stands for Graphics Double Data Rate Synchronous Dynamic Random Access Memory.
“Double data rate” describes the memory’s capacity for double-pumping data: Transfers occur on both the rising and falling edges of the clock signal. This endows memory clocked at 800MHz with an effective data-transfer rate of 1.6GHz. “Synchronous” refers to the memory’s ability to operate in time with the computer’s system bus. This allows the memory to accept a new instruction without having to wait for a previous instruction to be processed, a practice known as instruction pipelining.
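As a quick back-of-the-envelope sketch (our illustration, not part of the spec), the double-pumping arithmetic works out like this:

```python
def effective_rate_mhz(clock_mhz: float, transfers_per_cycle: int = 2) -> float:
    """Double data rate: one transfer on the rising clock edge and one on
    the falling edge, so the effective rate is twice the base clock."""
    return clock_mhz * transfers_per_cycle

# Memory clocked at 800MHz yields an effective 1600MHz (1.6GHz) transfer rate.
print(effective_rate_mhz(800))  # 1600.0
```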
Google’s chief of mobile platforms, Andy Rubin, seems to believe the cliché that the first impression is the last impression. He told Reuters that the success of the Android platform will depend on the reception of its first phone, leaving very little margin for failure. That first Android phone will be T-Mobile’s HTC Dream, rumored to be scheduled for release later this month.
Intel today announced the official release of its Dunnington-based Xeon 7400 server CPU. The six-core chip is monolithic, meaning all six cores sit on a single die, and it is the first Xeon CPU to sport that design. The previous 7300-series CPU, dubbed Tigerton, was a quad-core processor built from two dual-core dies on a single package (like existing quad-core consumer chips). As expected, Dunnington is still based on the Penryn architecture (45nm high-k manufacturing process), and will be compatible with current Tigerton Socket 604 motherboards.
Speed-wise, Intel claims a 50% performance increase for the 7400 over the 7300 series based on TPC-E database benchmark testing (TPC-E simulates the online transaction workload of a large brokerage firm). More impressive is Intel’s claim that, even with the improved performance, Dunnington draws 10% less power than the previous generation. The gains are largely attributed to a new 16MB level-3 cache, in addition to the extra compute power of two more cores. Xeon 7400 CPUs will launch at up to 2.66GHz with either four or six cores, and will be priced from $856 to $2,729.
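Taken together, those two claims imply a bigger performance-per-watt improvement than either figure suggests on its own. A quick sketch of the arithmetic (the 1.5 and 0.9 factors are just the 50%-faster and 10%-lower-power figures from the article):

```python
def perf_per_watt_gain(perf_ratio: float, power_ratio: float) -> float:
    """Relative perf/watt vs. the previous generation.

    perf_ratio:  relative throughput (1.5 means 50% faster)
    power_ratio: relative power draw (0.9 means 10% lower power)
    """
    return perf_ratio / power_ratio

# 50% more performance at 10% less power ≈ 67% better perf/watt.
print(round(perf_per_watt_gain(1.5, 0.9), 2))  # 1.67
```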
What does this mean for consumers? Unfortunately, not much. Intel has no current plans to release a six-core CPU to the mainstream market, and few applications would be able to scale well enough to take full advantage of the additional two cores. Intel seems to be pushing Nehalem for the consumer market, which will launch as a quad-core. Dunnington customers – large Web 2.0 companies like Myspace – will be the ones who benefit most from the extra performance and power efficiency, which may enable them to develop compute-intensive features like high-definition video sharing.
More pics of the sizable chip and Intel's press conference after the jump.
Forget about dual-, quad-, or even eight-core processors, all of which would prove woefully inadequate next to the system being called Blue Waters. The 200,000-processor-core supercomputer got the green light at the University of Illinois at Urbana-Champaign, which finalized a contract with IBM to build what will be the world's first sustained-petascale computational system.
For anyone not up on their flops, a petaflop is roughly one quadrillion calculations per second, presumably just enough to get a decent framerate out of Crysis. Coupled with the 200,000 processor cores will be more than a petabyte of memory and more than 10 petabytes of disk storage. And yes, that would hold a lot of porn, though Blue Waters will spend its time on scintillating real-world scientific and engineering applications.
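To put that scale in perspective, here's a rough sketch (the 10 GFLOPS desktop figure is a hypothetical ballpark for a circa-2008 CPU, not a number from the announcement):

```python
PETAFLOP = 10**15        # ~one quadrillion floating-point operations per second
DESKTOP_FLOPS = 10 * 10**9  # hypothetical desktop CPU sustaining ~10 GFLOPS

# How long would that desktop need to match one second of petascale work?
seconds = PETAFLOP / DESKTOP_FLOPS
print(seconds)                    # 100000.0 seconds
print(round(seconds / 3600, 1))   # 27.8 (about a day and change)
```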
Specifically, the National Science Foundation says that Blue Waters will wade into the study of complex processes like the interaction of the Sun's coronal mass ejections with the Earth's magnetosphere and ionosphere. Other examples include the formation and evolution of galaxies in the early universe, understanding the chains of reactions that occur within living cells, the design of novel materials, and other decidedly nerdy topics that have nothing to do with propelling Folding at Home team 11108 ahead of the competition.
The job of a whistleblower is a dangerous one, and Robert Delaware has paid the price for speaking out against Microsoft. The contracted game tester had worked closely with the Xbox line, and particularly Bungie Studios, since early 2005. For those who haven’t been following the story, Delaware’s testimony was the basis for an article that made headlines last week regarding Xbox 360 hardware failures at launch. In the VentureBeat article, Delaware detailed the known quality issues with the 360 and claimed that management ignored multiple warnings in order to gain an advantage over the not-yet-released PlayStation 3. Legally, Microsoft was within its rights to fire Delaware for his unauthorized interview, but he remains defiant; Delaware claims to have been aware of the possible ramifications but was willing to take the risk. Upon termination, Delaware was also warned by an HR representative that he faces possible lawsuits from both Microsoft and the company that contracted him out. Microsoft has not confirmed the interview, conducted by VentureBeat’s Dean Takahashi, and had only this to say in response: "This topic has already been covered extensively in the media. This new story repeats old information, and contains rumors and innuendo from anonymous sources, attempting to create a new sensational angle, and is highly irresponsible.”
Did Robert Delaware do the right thing? Or was he just looking for publicity?
NEC said yesterday it would join IBM and six other semiconductor companies focused on developing new methods of manufacturing 32nm processors. The other six are Chartered Semiconductor, Freescale, Infineon Technologies, Samsung, STMicroelectronics, and Toshiba, with the College of Nanoscale Science and Engineering at the University at Albany in New York also contributing.
The IBM-assembled alliance is attempting to create chips that use standard, bulk CMOS (complementary metal oxide semiconductor) technology in the manufacturing process. Benefits of going this route include a 35 percent increase in performance over 45nm parts while also cutting power consumption in half. That double whammy would prove particularly attractive for mobile computing.
For its part, Intel is also working on a 32nm design. Chips built on the shrunken process are expected to debut in mid-2009. No date has been set for when IBM and its collaboration of companies will bring 32nm processors to market.
We'd all love to run a pair of 4870 X2 videocards, but for certain workstation tasks, these gaming-centric videocards would prove inappropriate. For those who put work before play, AMD today introduced two new workstation cards, one at each end of the performance spectrum.
Taking its place on the high rung, AMD's new flagship FirePro V8700 is based on the 4800 series with 800 stream processors. The company claims the V8700 is about 40 percent faster than its previous flagship offering. The card comes with 1GB of GDDR5 memory and sports a total bandwidth of 108.8 GB/s. Two DisplayPorts and a single dual-link DVI interface round out the feature-set.
On the lower end, the FirePro V3750 drops the stream processors down to 320 and comes with a more conservative (that's a nice way of saying 'much lower') 256MB of GDDR3, resulting in a bandwidth of 22.4 GB/s.
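For the curious, those bandwidth figures fall straight out of effective transfer rate times bus width. The 256-bit and 128-bit bus widths below are assumptions for illustration, not specs from AMD's announcement:

```python
def bandwidth_gbs(transfer_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times
    bytes moved per transfer (bus width / 8)."""
    return transfer_rate_gtps * (bus_width_bits / 8)

# Assuming a 256-bit bus, 108.8 GB/s implies GDDR5 at an effective 3.4 GT/s.
print(bandwidth_gbs(3.4, 256))  # 108.8
# Assuming a 128-bit bus, 1.4 GT/s GDDR3 yields the V3750's 22.4 GB/s.
print(bandwidth_gbs(1.4, 128))  # 22.4
```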
Both the V8700 and V3750 will be available sometime this quarter with an MSRP of $1,499 and $199 respectively.
It can be argued that AMD didn't start to build an enthusiast following until the Barton days. Back then, the company's efficient processors not only held their own in performance, but destroyed Intel when it came to the bang/buck factor, both in regards to processor pricing and the overall platform (you could pick up a high-end AMD motherboard for under $200). Ever since Intel finally responded with its Core 2 architecture, AMD has had a tougher time competing on the performance front, forcing AMD to slash prices, and that's what's happening again. In addition to price cuts, AMD is also expanding its tri-core line.
The newly announced Phenom X3 8450e comes clocked at 2.1GHz and the Phenom X3 8250e putters at 1.9GHz. Both processors sport 512KB of L2 cache and 2MB of L3 cache, and both also come rated with a 65W TDP, compared to 95W for AMD's standard Phenom tri-core line. No pricing information has yet been announced for either model.
On the higher end, AMD's Phenom X3 8750 Black Edition will bring an unlocked multiplier to the table and cruise along at 2.4GHz. It will come with the same amount of L2 and L3 cache as the 8450e and 8250e processors, but rated at the aforementioned 95W TDP. Pricing has been set to $134 for bulk orders.
So what about the price cuts? AMD will drop its X3 8450 (without the 'e' designation) down to $104 and its X3 8650 down to $119, both in bulk.
We knew Microsoft wouldn’t forget about us gamers. Yesterday, the company debuted a new mouse-tracking technology in the Explorer mouse, which is targeted toward “productivity” users. We were a little skeptical of BlueTrack’s application for gaming, since the Explorer only has a 1000 dpi sensor. Well, Microsoft has assuaged all fears with the announcement of the Sidewinder X8, a BlueTrack mouse with a sweet 4000 dpi sensor. This high-end gaming mouse is a step up from the original Sidewinder (which will remain in production), and retains the features we like from the series: a dpi adjuster with LCD indicator, vertical thumb buttons, and customizable weights (features which were omitted from the lower-end X5 model). We got some hands-on time with the X8, and were able to put it side-by-side with its non-BlueTrack siblings.
Hit the jump to check out the entire Sidewinder family.