As the tech world waits with bated breath for Intel's Nehalem architecture to crash the Core 2 party, we still don't know what name to put on the banners, but we might have a pretty good idea. It's not yet official, but according to the latest rumor, Intel will dub its newfangled Nehalem as Core i7, which would put to rest any speculation that the chip maker might drop the 'Core' designation in its new nomenclature.
For anyone who hasn't been reading Maximum PC on a regular basis (shame on you) or who has been living under a rock (you get a free pass), Nehalem is Intel's next big processor microarchitecture, representing the 'tock' in the company's tick-tock update cycle. Along with tri-channel DDR3 support, Nehalem will usher in Intel's move to an integrated memory controller and finally do away with the crowded front-side bus. Gordon Mah Ung covered the architecture in detail last week, and while you're brushing up on the nuances of Nehalem, be sure to check out what the first Nehalem system looks like.
Getting back to the naming scheme, we'll have to wait for official word, but in the meantime, speculation is welcome. Do you like the rumored name change?
Nvidia has licensed Transmeta’s power-conserving technology for a sum of $25 million. The technologies Transmeta has licensed to Nvidia include its flagship power management technologies, LongRun and LongRun2. Transmeta has quickly mastered its current business model of licensing IP to bigger companies, and its coffers are loaded with cash.
It shouldn’t surprise anyone that Nvidia has licensed Transmeta’s power management technology, as most chip manufacturers are concentrating on increasing power efficiency.
AMD knows it doesn't have a processor line capable of competing with Intel's Core 2 architecture clock for clock, so instead the chipmaker looks to push a new chipset that promises improved overclocking performance. The new 790GX chipset is intended to target the "performance" gaming community, filling the spot just below its 790FX, which homes in on the ultra-enthusiast market.
According to AMD, the 790GX makes it possible to "shift your system performance into next gear with Advanced Clock Calibration that allows you to get the highest overclocking out of your AMD Phenom CPUs." To illustrate the effect, AMD uses a graph showing a 2.5GHz Phenom topping out at 3.0GHz with "standard overclocking," but jumping to 3.2GHz and beyond with its Advanced Clock Calibration.
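Taking AMD's graph at face value, the claimed gains work out to roughly 20 percent headroom by conventional means versus 28 percent with Advanced Clock Calibration. A quick back-of-the-envelope check (using AMD's own marketing figures, not independent test results):

```python
# Back-of-the-envelope check of AMD's claimed overclocking headroom.
# All figures come from AMD's own 790GX marketing graph, not independent tests.
base_ghz = 2.5          # stock Phenom clock
standard_oc_ghz = 3.0   # claimed ceiling with "standard overclocking"
acc_oc_ghz = 3.2        # claimed ceiling with Advanced Clock Calibration

def headroom_pct(overclock, base):
    """Return overclocking headroom as a percentage of the base clock."""
    return (overclock - base) / base * 100

print(f"Standard overclock headroom: {headroom_pct(standard_oc_ghz, base_ghz):.0f}%")
print(f"ACC overclock headroom:      {headroom_pct(acc_oc_ghz, base_ghz):.0f}%")
```

Whether real-world chips hit those ceilings is another matter, since AMD's graph says "and beyond" without committing to hard numbers.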
Hardcore gamers are likely to be turned off by the 790GX's integrated Radeon HD 3300 graphics and will opt to add in a discrete GPU solution. By doing so, gamers can take advantage of ATI's Hybrid Graphics technology and utilize both GPUs at the same time.
AMD also looks to push the budget angle, pointing out that gamers can pair a quad-core Phenom X4 9850 with a 790GX-based motherboard for $355, or $90 less than a comparable Intel rig sporting a quad-core Q9300 slapped on a P45-based motherboard.
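Working out AMD's own math (these are AMD-quoted figures, not verified street prices), the comparable Intel setup comes to $445, which makes the AMD combo about 20 percent cheaper:

```python
# Price comparison using the figures AMD quotes in its pitch
# (marketing numbers, not independently verified street prices).
amd_combo = 355                     # Phenom X4 9850 + 790GX motherboard
savings = 90                        # AMD's claimed difference
intel_combo = amd_combo + savings   # Core 2 Quad Q9300 + P45 motherboard

discount_pct = savings / intel_combo * 100
print(f"Intel combo: ${intel_combo}")
print(f"AMD combo is {discount_pct:.1f}% cheaper")
```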
Does AMD have a winner on its hands with the 790GX?
AMD's acquisition of graphics chip maker ATI continues to be a sore point whenever the company talks about its finances, most recently coming up when AMD said it would take a near billion dollar charge in the second quarter. Given AMD's financial status, it's easy to criticize the company's decision to overpay for a company that has yet to benefit impatient investors. That could change if AMD's Fusion ends up revolutionizing the PC landscape.
Up to this point, AMD hasn't gone into specifics regarding its upcoming CPU+GPU chip, but according to TGDaily, industry sources aren't being as tight-lipped. If the rumblings are to be believed, the first Fusion processor (code-named Shrike) will consist of a dual-core Phenom CPU and an ATI RV800 GPU core. Previous rumors had the first run Fusion chips built around a dual-core Kuma CPU and RV710 graphics chip, but those plans appear to have gone by the wayside as AMD has had more time to develop a low-end RV800-based core.
The sources also indicate that Fusion will likely be introduced as a half-node chip built around a 40nm manufacturing process, and will later move to 32nm, possibly by the beginning of 2010.
Social networking site Facebook finds itself needing to update its data center infrastructure to support new media applications, and Intel will be the one to help it do so. The two companies on Thursday announced a joint agreement that will see Facebook use "thousands" of Xeon 5400 quad-core processors built on a 45nm manufacturing process.
More than just providing hardware, Intel will also work with Facebook to optimize its software for the bevy of Xeon chips, with extra focus on making the software take advantage of the additional processor cores. Moreover, Intel will look to send a message that its microarchitecture can handle the massive data centers underpinning cloud-computing infrastructures.
"It's a big win for Intel in the general category of web infrastructure and by that I mean categories like cloud computing," said John Spooner, an analyst with Technology Business Research. "Facebook has a large computing infrastructure that delivers these types of web services on demand and it requires the same level of service and infrastructure as a cloud-computing provider."
Facebook wouldn't comment on which OEMs would build the new servers, but according to eWeek, multiple sources have confirmed Dell and HP would be involved.
Intel can not only lay claim to being the current king of chip technology, but its upcoming Nehalem microarchitecture looks poised to keep the silicon studs on top of the competition well into 2009. AMD hasn't threatened Intel's position since Conroe, and while the company remains confident under Dirk Meyer as the new head honcho, it's still playing catch-up to Intel's 45nm technology.
The situation gets a little more competitive when switching from CPUs to GPUs, and according to Tom's Hardware, sources at both ATI and Nvidia are saying they will each have a 40nm GPU manufacturing process by the first half of 2009, possibly to be unveiled at next year's CeBIT.
Assuming either company meets its target, the accomplishment will unseat Intel as the technological leader in terms of the smallest chip structures, even if only for a short time. The road won't stop at Nehalem, and Intel is already busy developing 32nm CPUs, which many expect to be shown off in prototype form at the company's spring development forum in H1 2009. Volume shipments could come as early as Q3 next year.
Even so, if 40nm GPUs materialize as reported, it will mark the first time GPUs have overtaken CPUs in terms of production nodes. That won't necessarily make for a better chip, but you can expect plenty of fanfare should Nvidia and/or ATI dethrone the silicon king.
Much has been made over Intel's Atom processor, the 45nm wonder-chip finding its way into more netbooks than production can seemingly keep up with. But lest the world forget, VIA also has a low power chip of its own, one the company claims delivers "truly optimized performance for the most demanding computing, entertainment, and connectivity applications."
VIA's 65nm Nano processor saw an official launch a full two months ago, but it's Intel's Atom that keeps getting the attention. Is it justified? A pair of review sites looked to answer that question by pitting an Intel Atom 230 (1.6GHz) against a VIA Nano L2100 (1.8GHz), and both sites came to the same conclusion: VIA's Nano is the faster processor.
Clocked 12.5 percent faster than the Atom chip, the Nano L2100 unsurprisingly churns out better performance numbers, but it's the margin of victory that might turn a few heads. In some cases, the Nano chip outpaced the Atom by a margin of 15 to 20 percent, showing it deserves to be treated as more than just an also-ran.
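A rough per-clock comparison, using the chips' clock speeds and the 15 to 20 percent margins the reviews reported (an illustration, not a rigorous IPC measurement), shows the Nano's lead can't be explained by clock speed alone:

```python
# Rough per-clock comparison of the VIA Nano L2100 vs the Intel Atom 230.
# The 15-20% performance margins are the range reported by the reviews;
# this is an illustration, not a rigorous IPC measurement.
atom_ghz, nano_ghz = 1.6, 1.8
clock_advantage = (nano_ghz - atom_ghz) / atom_ghz * 100  # 12.5%
print(f"Nano clock advantage: {clock_advantage:.1f}%")

for perf_margin in (15, 20):
    # If performance scaled purely with clock, the margin would be ~12.5%.
    # Anything beyond that implies a per-clock advantage.
    per_clock_gain = (1 + perf_margin / 100) / (1 + clock_advantage / 100) - 1
    print(f"{perf_margin}% faster overall -> ~{per_clock_gain * 100:.1f}% faster per clock")
```

In other words, even after normalizing for its higher clock, the Nano still comes out ahead at the top end of the reported range.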
Of course, it's all for naught if VIA can't win the one contest that matters most: Vendor support.
"These benchmark results are the latest evidence of the clear value that Quad-Core AMD Opteron processors offer an Internet business - or any data center that requires the ultimate i performance, reliability, and power efficiency," said Patrick Patla, AMD's general manager of Server and Workstation Business.
The press release makes no mention of where or by whom the benchmarks were run, but it did say an HP ProLiant DL385 G5 server equipped with two Opteron 2356 processors scored 30,007, while an HP ProLiant DL585 G5 server running two 8356 processors posted a score of 43,854.
Ask anyone who's ever been married and they'll tell you how difficult it is to read between the lines. The same holds true in the tech world, where rumors start with bits and pieces of information assembled like a puzzle, but the pieces don't always fit. Such is the case involving AMD's fabrication plants in Germany.
With the chip maker struggling to turn a profit and announcing a restructuring plan to get there, some speculated AMD might be gearing up to sell off some of its Fabs. Then more recently, AMD CEO Dirk Meyer told the Austin American-Statesman that the company plans to spin its manufacturing operations off into a separate company with new ownership. Could that be taken as confirmation of the earlier rumor?
It can and it was, but AMD is saying not so fast. Contrary to what The Inq maintains is still true, Drew Prairie, an AMD spokesperson, claims Meyer's comments referred to how the company manufactures its wafers, and were not indicative of any plans to sell off its Fabs.
So while it appears that AMD's fabrication plants are safe for now, this likely won't be the last bit of speculation involving the chip maker. AMD remains tight-lipped about its restructuring plans and 'asset-smart' strategy, and with the recent departure of Hector Ruiz as CEO, it's anyone's guess what the company might be planning. Any guesses?
You've heard of Paper Mario, but a paper processor? That might be taking things a bit too far, but a team of Portuguese scientists has created the first Field Effect Transistor (FET) made with cellulose fiber-based paper. The new approach takes a common sheet of paper and uses it as the dielectric layer on oxide FETs, with devices fabricated on both sides of the paper sheet. And while other teams have reported using paper as the physical support (substrate) of electronic devices, this method is the first that also allows the paper to be used as the interstrate component. In other words, it's really cool.
More than a proof of concept, the team envisions its new paper transistors being used in disposable electronic devices like paper displays, smart labels, smart packaging, bio-applications, RFID tags, and more. Full details will be published in the September 2008 issue of IEEE Electron Device Letters, but until then, you'll have to wade through translated text.