AMD Fusion Details Leaked

5 Comments


Pixelated

Hybrid SLI


Caboose

This could also be used as a step above nVidia's multi-GPU setup (honestly, the name escapes me), where discrete graphics and integrated graphics are in the same system: when high performance is needed, the discrete graphics kicks in; when it's not, it's shut off in favor of integrated graphics. I could see this being a step further than nVidia, especially in the power consumption department.

Heck, just imagine a desktop system with one of these on board, plus discrete graphics and a full-power quad- (or octo-) core CPU. When the system is idle, or when non-CPU-intensive applications are running, the full CPU is shut down, as is the discrete graphics, and the CPU-GPU is used. Plus, when extra computing power is needed, the full CPU PLUS the CPU-GPU are used.

Extra power when needed, lower power consumption, and cost savings?
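A minimal sketch of the kind of switching policy being described (purely illustrative: the struct, the thresholds, and apply_policy() are all hypothetical names, and real gating logic would live in firmware or the driver, not application code):

```c
/* Hypothetical sketch of the power-switching policy described above.
 * All types, names, and thresholds are illustrative, not a real API. */
#include <stdio.h>

typedef enum { UNIT_OFF, UNIT_ON } unit_state;

struct platform {
    unit_state big_cpu;       /* full-power quad/octo-core CPU   */
    unit_state discrete_gpu;  /* discrete graphics card          */
    unit_state fused_cpu_gpu; /* low-power CPU-GPU (Fusion) die  */
};

/* Pick a power state from coarse load estimates (0-100). */
void apply_policy(struct platform *p, int cpu_load, int gpu_load)
{
    /* The fused die stays on as the low-power baseline. */
    p->fused_cpu_gpu = UNIT_ON;

    /* Wake the big parts only when the load justifies them. */
    p->big_cpu      = (cpu_load > 60) ? UNIT_ON : UNIT_OFF;
    p->discrete_gpu = (gpu_load > 60) ? UNIT_ON : UNIT_OFF;
}

int main(void)
{
    struct platform p;
    apply_policy(&p, 10, 5); /* idle: only the fused die stays on */
    printf("big CPU %s, discrete GPU %s\n",
           p.big_cpu == UNIT_ON ? "on" : "off",
           p.discrete_gpu == UNIT_ON ? "on" : "off");
    return 0;
}
```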

 

-= I don't want to be dead, I want to be alive! Or... a cowboy! =-


Shalbatana

HMMM... this sounds too good. (Quick, think of something negative to say... there must be something negative.)

Um... It's too bad that this will lead to increased packaging around retail CPUs and thus increase waste and pollution.

(That'll get 'em!)

After seeing that AMD/ATI can still pull rabbits out of their hats, I'm cautiously optimistic. That is, of course, until Intel can make one of their own. Still, it must be further along than we think, as ATI wouldn't have leaked it until they were sure they could get theirs out well before any competitors.

 

_______________________________

"There's no time like the future."


Keith E. Whisman

Well, let's just see what happens here. Sounds very exciting. Perhaps in the future motherboards will come with two ZIF (zero insertion force) sockets: one for the CPU and the other for the GPU. That GPU would come fully equipped with the processor and video memory; on-die high-speed cache would act as a buffer to slower memory for communications with the CPU, and output would go through the then-standard monitor interface built onto the motherboard. But for the near future, this just kicks butt. Not for high-end gaming, but how about really good laptops with good graphics that sip power? That's freaking awesome.

I love advances in technology; they bring us one step closer to a class 3 civilization and the Enterprise-E.


tabernak

A PCI Express slot isn't far removed from a ZIF socket. The point of this technology is to package the GPU and CPU together so no external hardware or wiring is needed to connect them. I guess that means they have to be nice about sharing the RAM, too. Since they'd be on the same die, communications between the two should be really fast. I'm sure there are plenty of other advantages; perhaps the CPU and GPU could even offload some tasks to each other. There's potential in all these small internet devices as well.

My guess is that this is initially going to be more of an attempt at creating a decent alternative to integrated graphics in laptops and desktops.
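To make the shared-RAM point concrete, here's a toy sketch (gpu_run() and both offload functions are made-up stand-ins, not any real API): with a discrete card, the data has to be copied across the bus into video memory first; with a fused part, both sides already see the same RAM, so "sending" work to the GPU can be as cheap as handing over a pointer.

```c
/* Toy illustration of on-die memory sharing vs. a discrete card.
 * Every function here is hypothetical, not a real driver API. */
#include <stdlib.h>
#include <string.h>

/* Stub standing in for a GPU kernel launch. */
static void gpu_run(float *data, size_t n)
{
    for (size_t i = 0; i < n; i++)
        data[i] *= 2.0f; /* pretend the GPU did this */
}

/* Discrete card: data must cross the bus into video RAM. */
void offload_discrete(const float *host, size_t n)
{
    float *vram = malloc(n * sizeof *vram); /* stand-in for VRAM */
    if (!vram)
        return;
    memcpy(vram, host, n * sizeof *vram);   /* the costly copy   */
    gpu_run(vram, n);
    free(vram);
}

/* Fused CPU-GPU: both see the same RAM, so pass the pointer. */
void offload_fused(float *host, size_t n)
{
    gpu_run(host, n); /* no copy, no bus transfer */
}

int main(void)
{
    float data[4] = {1, 2, 3, 4};
    offload_fused(data, 4);    /* shared RAM: just hand over the pointer */
    offload_discrete(data, 4); /* discrete path pays for the copy        */
    return 0;
}
```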
