The State of GPU Computing: Is the CPU Dead Yet?

8 Comments


wumpus

Parallelism is the future, and always will be.

Parallel code and great speed have been synonymous since roughly the 1980s, when single-chip processors first surpassed old-fashioned board-level designs.  From that point on, it has been easier to stamp out multiple copies of a chip (or core) than to build one core that is twice as fast.

You may have noticed that not everything is parallel yet.  Some things are easy: graphics is typically considered "embarrassingly parallel", so GPUs have been steadily increasing in power in proportion to their transistor counts for years while CPUs have stumbled.  If your problem can be broken down into wide, parallel sections (especially producer/consumer threads that share no memory other than queues), you can take advantage of parallelism.  If you have to share memory and semaphores: prepare for the heisenbugs of doom.
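
For illustration only, here's a minimal sketch of that easy case in Python; the two threads share nothing but the queue (names and numbers are made up for the example):

```python
# Minimal producer/consumer sketch: the only shared state is the queue itself.
import threading
import queue

work = queue.Queue()

def producer():
    for i in range(10):
        work.put(i * i)    # hand results off through the queue only
    work.put(None)         # sentinel so the consumer knows to stop

def consumer():
    while True:
        item = work.get()
        if item is None:
            break
        print("consumed", item)

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
```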

Then there is Amdahl's law.  The catch is that the part you can't split up dictates your overall speed.  Expect that serial part to be a significant chunk, and to limit your program to whatever one core of the CPU can do.  Adding more GPU gives noticeably diminishing returns after that.
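
To put rough numbers on it (just a back-of-the-envelope sketch, assuming a workload that is 95% parallel):

```python
# Amdahl's law: with parallel fraction p and N cores, speedup = 1 / ((1 - p) + p / N).
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even if 95% of the work is parallel, the serial 5% caps you near 20x,
# no matter how much hardware you throw at it.
for n in (4, 16, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 1))
# roughly: 4 -> 3.5, 16 -> 9.1, 64 -> 15.4, 1024 -> 19.6
```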

All this is just for CPU parallelism.  From what I've seen of OpenCL, it looks pretty weak at merging data computed on other threads back together for the next pass.  Presumably this will improve soon, but don't expect everything to be as easy as it is on a CPU.
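
To be clear about what I mean by "merging back together", here is the shape of that fan-out/merge pattern in plain Python on the CPU (not OpenCL code, just an illustration of the step that's awkward on a GPU):

```python
# Fan-out / merge sketch: workers compute independent partial results,
# then a reduction step combines them before the next pass.
from multiprocessing import Pool

def partial_sum(chunk):
    return sum(x * x for x in chunk)          # the easy, independent part

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # split the work four ways
    with Pool(4) as pool:
        partials = pool.map(partial_sum, chunks)
    total = sum(partials)                     # the merge step in question
    print(total)
```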


orbonsj

I can't speak to the gaming/consumer marketplace, but prior to retirement I worked in scientific computing starting in 1965. The problem of utilizing multi-CPU resources has been around since the Cray-1 (or even before). There are two issues. The first is financial. Many commercial codes were developed and verified on single-CPU systems. Porting these codes to a parallel computing platform requires an enormous investment, particularly if you're talking about safety issues (think nuclear). The second is theoretical. Some algorithms require sequential operations, while others can be converted to parallel algorithms. These are being worked on, but the adoption of these techniques is relatively slow.

In short, it's the "If it ain't broke, don't fix it." attitude.

 


HokieTechie

I use the same Phenom II X4 system for gaming and photo editing, so I "happened to have" a Radeon 6850 installed when DxO added OpenCL support to their image processing software. As a result, processing 16 megapixels of 14-bit RAW data from a Nikon D7000 now takes less than 10 seconds. When I was doing this CPU-only, the same task took 45-50 seconds.

And I only needed to install one Catalyst update to get the system stable . . .

 


Ulrich

I use the open-source Blender 2.61, which has full GPU rendering. In fact, when using the Cycles engine in Blender 2.61 you get real-time rendering in the viewport, which is really nice for setting up lighting and materials. It has the option of using the GPU or the CPU (only CUDA is supported right now, so no ATI support yet). On my laptop the GPU is substantially faster at rendering than my CPU: NVIDIA GeForce GT540 graphics with 2.0GB of video memory vs. an Intel Core i7-2617M at 1.5GHz (2.6GHz Turbo Mode, 4MB cache).

If you are interested, just Google "Blender Cycles".


Ghost XFX

Huhuhuhu.

Is this what AMD is gambling their future on? It seems to me they've put a whole lot of effort into their GPUs as of late compared to their actual CPUs. Personally, I don't like the way they're going right now. Hope they prove me wrong...


The Corrupted One

1. Thermal issues. Even if the CPU stays the CPU, there will come a point where Moore's Law breaks down, because you simply can't fit that many transistors on one chip.

2. Modularity. CrossFireX, 'nuff said.


Jiffy

They don't even perform the same operations; of course it's not dead yet.


vashts1985

You should actually read the article.
