Caustic Graphics Demos Their Ray Tracing Processor

11 Comments

Mighty BOB!

You know they're already experimenting with real-time ray tracing on Intel's new Larrabee hardware. Research engine tests have real-time ray tracing working in Quake 4 (Google it). It's pretty cool.

I think in the (probably not all that distant) future we'll have a mix of raster rendering and ray-traced rendering for different visual aspects (like the ray tracing handling all the reflections and caustics).

NAYRhyno

This really will simplify game development if/when it takes off. The developer will simply have to create the 3D models and define the materials they are made of. The ray tracing engine would then know the reflectance parameters of those materials and could render them. The game could define atmospheric conditions and light sources to affect the way the world looks.

Imagine how cool night vision in a game would be. You could just have a switch to tell the ray tracer what wavelengths of light to use. Imagine an AVP game with this feature.

The world of real-time ray tracing is an amazingly advanced one, and if it can be achieved at HD resolutions and gaming framerates, we will be in for a revolution. Much more so than with PhysX, although that type of completely deformable world interaction is no doubt still on the horizon.

______________________
Game-Central.org
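
To make that materials-first idea concrete, here is a minimal Python sketch. Every name and value in it is hypothetical, not Caustic's actual interface; the point is just that the developer supplies physical parameters and the tracer, not hand-written effects code, decides how the surface looks.

from dataclasses import dataclass

@dataclass
class Material:
    reflectance: float       # fraction of incoming light bounced back (0..1)
    refractive_index: float  # 1.0 = air, roughly 1.52 = glass
    emission: float          # light the surface gives off on its own

# The developer just tags geometry with a material...
CHROME = Material(reflectance=0.95, refractive_index=1.0, emission=0.0)
GLASS = Material(reflectance=0.05, refractive_index=1.52, emission=0.0)

def shade(material, incoming_light):
    # ...and the engine derives the look from the physical parameters.
    return material.emission + material.reflectance * incoming_light

print(shade(CHROME, 1.0))  # a mirror-like surface returns most of the light

The night-vision switch would then amount to giving shade() a wavelength argument and each material a per-wavelength response curve.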

Balgaroo

So the whole idea behind this is that developers and programmers will no longer have to implement tricks or spend time developing these effects. Once the hardware catches up to the performance needed, it should make developing a game that much easier and faster.

Cool

I do hope they include this in an existing API and keep it away from NVIDIA and ATI. Make them develop hardware that supports this.

PhelanPKell

So, who is taking bets on how long before nVidia tries to buy this company and integrate their engine into CUDA?

No takers? Psht. 1.8 years. oO

Actually, I think this will likely be reprogrammed to work in the "OpenCL" API (I think that's what it's called) in DX11. That is, if OpenCL truly turns out to be worth a damn and companies start coding for that instead of CUDA and that thing ATi is supposedly doing.

Will this card make it as an add-in by itself? No. We already have video cards, and they're increasing in strength and numbers at what almost seems like a daily rate. Will it mark a somewhat revolutionary step towards more realistic gaming? I'd wager 'yes', though to a lesser degree than Ageia's PhysX.

In 5 years, we'll need Sexta-SLi so you can hyper render every scene with 32x AA, 5120x3200 resolution on a 56" monitor, ultra-fast (40fps) ray-tracing, physics calculated to the macro-level in every game...and we'll have no reason to leave our houses 'cause CryEngine 6 will ACTUALLY have made jungles look real. ^^

Keith E. Whisman

I'm not sure if you got the gist, but this is supposed to be a coprocessor that works with your graphics card. It will offload the ray tracing from the rest of the system so your existing graphics card can keep working on all the maps and stuff.

I just wish some of these coprocessors would take off, because they would really add a lot of power to our computers, and the software doesn't need to be coded to take advantage of it as long as it's connected through DirectX or something like that. That's the way I understand it.
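
A toy sketch of that "no special coding" point, assuming (hypothetically) that everything funnels through one standard entry point the way Direct3D calls do: the application calls a single function and never learns whether a dedicated board did the work. Nothing here is a real DirectX or Caustic API.

def trace_rays_cpu(rays):
    # Slow fallback path: shade every ray on the host processor.
    return [0.5 * r for r in rays]  # placeholder "shading"

def trace_rays_board(rays):
    # Fast path: hand the whole batch to the coprocessor (stubbed out here).
    raise RuntimeError("no ray tracing board installed")

def trace_rays(rays, board_present=False):
    # The one application-facing call; routing happens behind it, so the
    # same program speeds up when a board shows up in the machine.
    if board_present:
        try:
            return trace_rays_board(rays)
        except RuntimeError:
            pass  # board missing or busy: fall back transparently
    return trace_rays_cpu(rays)

print(trace_rays([1.0, 2.0, 3.0]))  # works with or without the hardware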

PhelanPKell

I fully understand that it's "SUPPOSED" to be a co-processor to the GPU, but so was the PPU to the CPU, and look what happened.

Part of what makes that co-processor work is the coding that tells the system how to utilize it efficiently for ray tracing.

Quite often it takes someone coming up with something innovative for other people to find a cheap and efficient method of integrating it into the rest of the system.

Keith E. Whisman

Heck, maybe Intel will wow the world with its Larrabee GPU and change the way we play our games. I still like the idea of coprocessors. There was supposed to be a card that turns your computer into a supercomputer, but alas, I can't find it anywhere. I remember the first 3D cards sat alongside your 2D graphics card and gave you 3D. The now-dead PhysX card was intriguing.

But like you said, there is always someone who has more money and a better idea. Sometimes that better idea is bad for the consumer, as it takes us backward technology-wise instead of forward in the name of keeping prices down.

Keith E. Whisman

So 14 times faster than slow is better, I guess. All I know is I want one. Please give me one... Mommy... please give me one. I promise to be good.

Looks like it's using SO-DIMMs for memory, and if the production boards use GDDR5 they'll be a lot faster. Is it going to use a PCIe x16 slot? God, I wish I could find a genie to give me three wishes.

My fourth wish would be world peace. Too bad there are only three wishes. 

CTskifreak

Programs have to be written to do this. I don't think this is destined to fail, but it will be a long time before anything becomes available for consumers. From Wikipedia:

"In computer graphics, ray tracing is a technique for generating an image by tracing the path of light through pixels in an image plane. The technique is capable of producing a very high degree of photorealism; usually higher than that of typical scanline rendering methods, but at a greater computational cost. This makes ray tracing best suited for applications where the image can be rendered slowly ahead of time, such as in still images and film and television special effects, and more poorly suited for real-time applications like computer games where speed is critical. Ray tracing is capable of simulating a wide variety of optical effects, such as reflection and refraction, scattering, and chromatic aberration. "

Basically, it's the next step in graphics, but it's hard to do in real time.
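
For anyone who wants that Wikipedia definition in code form, here is about the smallest possible illustration: fire one ray per pixel through an image plane and test it against a single sphere, printing a hit/miss map. The scene values are invented; a real tracer adds shading, reflection, and refraction on top of this loop, which is where the computational cost explodes.

import math

def hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t; a hit
    # exists when the quadratic's discriminant is non-negative.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4.0 * c >= 0.0  # direction is unit length, so a = 1

WIDTH, HEIGHT = 32, 16
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a point on an image plane at z = -1.
        u = (x + 0.5) / WIDTH * 2.0 - 1.0
        v = 1.0 - (y + 0.5) / HEIGHT * 2.0
        norm = math.sqrt(u * u + v * v + 1.0)
        ray = [u / norm, v / norm, -1.0 / norm]
        row += "#" if hits_sphere([0, 0, 0], ray, [0, 0, -3], 1.0) else "."
    print(row)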

 

Nuxes

Could someone explain this to me? Do 3D applications have to be written specifically to support ray tracing, kind of like PhysX?  If so, this is destined to fail, just like the PhysX hardware.

sasquatch42

I think they do.

 
