AMD Expects APUs to Wipe Out Entry Level Discrete Graphics Cards

7 Comments


jackal49

I agree. I built a new HTPC with the AMD E-350 Fusion APU. I have to say that Blu-rays look even better than on my GeForce 9600 GT, and surprisingly better than on my low-profile Palit GeForce 450.

Obviously it can't touch the 1 GB 450 in games, but the overall media experience is superior. With the Fusion APU I can use a smaller case and power supply. That means less heat, noise, and power draw. I was going to use a GT 430 in case I decide to go with a 3D HDTV. I think APU-style integrated CPU/GPU combos are the future.


avenger48

AMD would certainly hope so, wouldn't they? They have very little to lose and a lot to gain as Nvidia's low-end solutions go away. Nvidia realizes this too; that's why they still haven't released a low-end DirectX 11 part.


Wingzero_x

I am more than pleasantly surprised with my Acer E-350 laptop, and with a few upgrades (8GB RAM, and the DVD drive replaced with an SSD) even more so. I'll admit at first I did have problems, but once I got everything tweaked... it's pure joy. Nimbus, there's a world of difference between the onboard 4200 and the on-die 6xxx series. Definitely one proud owner; I highly recommend it.


Holly Golightly

Thomas Seifert is absolutely 100% correct. It makes sense for low-end users to just have everything compacted into one small solution. If you are going to play casual games, why would you need to spend extra money on a low-end discrete graphics card when your CPU can handle all of that? It just makes sense. With that said, I hope AMD works on making their APUs good enough to one day replace mid-range video cards. I really feel the future is definitely with APUs and not discrete GPUs. Still, hardcore gaming will always require something a little more dedicated. This is one thing that will make Radeon last for generations to come.


Brad Nimbus

I remember when I built my first computer. It was a 250 X2 with just integrated graphics (a Radeon 4200, I believe). With that combo I was able to run Left 4 Dead with max settings at 1440-something resolution. I think I would rather have an integrated graphics chipset on board than an APU. Think about it: Asus and Gigabyte will keep pushing out drivers and updates for that chipset for a long time after release to keep it optimized.

It just makes more sense to me to have a separate chipset controlling graphics. Is there any bandwidth or other penalty associated with having it on die?


BasiliskSt

The new AMD Fusion processors' (E-series and A-series) ability to smoothly and efficiently stream 1080p high-definition video both on-screen and out through HDMI to a TV or external monitor, while providing battery-efficient web browsing and e-mail, is a real advantage for AMD. Combining the CPU and a discrete-quality graphics processor on a single silicon die makes the new Accelerated Processing Units (APUs) the future, available today. Most users are not doing heavy parallel math processing.

AMD is leveraging ATI's historic strength in GPUs to create a competitive edge. AMD needs the profit center as David versus Intel's Goliath, but Fusion looks to be the right product for consumers and the right strategy for AMD. Intel lags behind AMD in integrated graphics and has a poor record of maintaining and optimizing graphics drivers on an ongoing basis. (No company is perfect on driver support, but in my experience AMD is the best at keeping even historic hardware usable with updated drivers.)

Intel's i7 may rule parallel math processing, but AMD's APUs and high-end discrete graphics kill Intel on the graphics front, and Nvidia may ultimately need a CPU partner, or to learn how to do more than just graphics.


bpstone

There's really no point anymore in buying a low-end discrete card separately when you can get more for your buck purchasing a chipset with one already integrated. AMD APUs are nice for portable platforms. I've considered buying an APU laptop for traveling. They have enough juice to run high-definition movies plus modern games on low-to-mid settings. Personally, I think that's the smart way to go. High-end discrete graphics probably aren't going anywhere for a long time. The quantum computing era is what will finally do them in. (^<_^)
