
Maximum PC


All times are UTC - 8 hours




 Post subject: Using Pixel Shader 3.0 for Commercial Rendering
PostPosted: Tue Feb 08, 2005 7:55 pm 
Little Foot

Joined: Sat Jan 29, 2005 1:04 pm
Posts: 131
I've searched the web and have not found this. The only thing close I've found has been Nvidia's Quadro FX line for display rendering. It seems to me that the current Nvidia consumer cards (I don't include ATI because they are limited to PS2.0) are nearly as beefy as their $6,000 big brother, the Quadro FX 4000 SDI.

Now, I don't know a ton about PS3.0, but I do know that it is nearly unlimited in terms of applying shaders, and (I could be wrong) I've heard talk that PS3.0 could do ray-tracing. I've seen tons of cool things being done with PS3.0 on the internet. Most people are trying to get this stuff to work in real time (which is good for gaming), but some of it can't. I still see tons of value in this field for commercial ray-tracing software. A few packages that come to mind are LightWave 3D, 3D Studio Max, and Maya. If some of the ray-tracing work could be ported over to the graphics card, that would increase the value of SLI for graphic designers.
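To make the idea concrete, here is a tiny sketch of the kind of work involved. Real PS3.0 shaders are written in HLSL or GLSL and run on the GPU; this is plain Python on the CPU, purely illustrative, but it shows the key property: each pixel runs the same small independent program (here, a ray-sphere intersection test), which is exactly the shape of workload a pixel shader evaluates in parallel.

```python
import math

def trace_pixel(u, v):
    """Per-pixel program: cast a ray from the eye through (u, v) on a
    virtual image plane and test it against a single sphere. Each pixel
    is independent of every other pixel -- shader-friendly work."""
    # Ray origin at the eye, direction through the image plane at z = 1.
    ox, oy, oz = 0.0, 0.0, 0.0
    dx, dy, dz = u, v, 1.0
    # One hard-coded sphere at (0, 0, 3) with radius 1 (illustrative scene).
    cx, cy, cz, r = 0.0, 0.0, 3.0, 1.0
    # Solve |o + t*d - c|^2 = r^2, a quadratic in t.
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (dx * (ox - cx) + dy * (oy - cy) + dz * (oz - cz))
    c = (ox - cx) ** 2 + (oy - cy) ** 2 + (oz - cz) ** 2 - r * r
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return 0.0  # ray misses the sphere: background
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest hit distance
    # Fake "shading" by depth falloff instead of real lighting.
    return max(0.0, 1.0 - (t - 2.0))

def render(width=64, height=64):
    """Run the per-pixel program over every pixel (a GPU would do these
    in parallel; we just loop)."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            u = (x + 0.5) / width * 2.0 - 1.0
            v = (y + 0.5) / height * 2.0 - 1.0
            row.append(trace_pixel(u, v))
        image.append(row)
    return image

img = render()
```

A production renderer adds bounces, lights, and real materials on top of this loop, but the per-pixel independence is what makes the GPU port plausible.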

Tell me if I'm wrong, but isn't it possible? I am a graphic designer as well as a gamer on the same machine, and I would love an excuse to get a dual-CPU, dual-GPU box with LightWave 3D (9 or whichever), treating it as four compute resources in total (one for each CPU and one for each GPU). I also think the GPU has more horsepower than a CPU because of the number and depth of its pipelines. And the PCI-E bus allows high bandwidth in BOTH directions, so the GPUs could send info back to main memory and the CPU(s).
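The "four compute resources" idea is basically work farming: split the frame into independent chunks and hand them to whatever processors you have. Here is a minimal CPU-only sketch of that pattern (scanlines handed to a pool of four workers; the per-scanline function is a stand-in for real shading work, and the worker count matching "2 CPUs + 2 GPUs" is just the assumption from the post):

```python
from concurrent.futures import ThreadPoolExecutor

def render_scanline(y, width=64):
    # Stand-in for real per-scanline shading work; any deterministic
    # function of (x, y) will do for the sketch.
    return [((x * y) % 255) / 255.0 for x in range(width)]

def render_frame(height=64, workers=4):
    """Farm scanlines out to a fixed pool of workers, the way a renderer
    could farm buckets out to two CPUs and two GPUs."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves scanline order, so rows reassemble into a frame.
        rows = list(pool.map(render_scanline, range(height)))
    return rows

frame = render_frame()
```

Since ray-traced pixels don't depend on each other, this kind of split scales about linearly with the number of workers, which is why adding a second GPU (SLI) would pay off.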

I just had this thought and was wondering if it is plausible, and also if any of you have heard anything similar. If I'm the first one, money is good :wink: but I would really like to see graphic design software (LightWave 3D, WorldBuilder, 3D Studio Max, etc.) using this technology. I would think it would greatly decrease rendering time (as well as create a market for PS3.0 sooner and force ATI to switch over in order to compete).

Anyways, just a thought... Should be an interesting thread if my thought is correct. Let me know what you guys think.

Arquero


 Post subject:
PostPosted: Tue Feb 08, 2005 8:03 pm 
Team Member Top 100

Joined: Fri Sep 17, 2004 5:35 pm
Posts: 1176
Actually, I have thought about this before. My dad said they already use video cards to help with rendering. Not sure if it's true, though.


 Post subject:
PostPosted: Tue Feb 08, 2005 8:20 pm 
Little Foot

Joined: Sat Jan 29, 2005 1:04 pm
Posts: 131
Do you know if they only use it on the "Workstation" class cards, or do they use "Pixel Shader v.XX"? As far as I can tell, if Pixel Shader 3.0 could do everything a rendering engine needs to do, then the consumer cards could be rendering slaves as well as gaming cards. Man, if it is possible... that would be a bad thing for those who invest thousands of dollars in "Workstation class" cards, when consumer-grade cards could deliver the same rendering boost...

I still think that if it is possible, it would be worth it for programmers to add support and list it as a "Feature"... More people would buy their products, and card manufacturers could sell more cards (for SLI mode... ATI doesn't count in this until they support PS3.0 and their answer to SLI comes out).

Oh, and I also know that "Workstation Class" cards help with the "real time" rendering (OpenGL and DirectX) of objects... Check out this. That kind of thing isn't exactly a "rendering engine" in my terms... I'm talking about a scene with 100 dynamic lights, fully reflective surfaces, and landscapes with grass that blows in the wind, etc. One frame takes 30 mins to 3 hrs. on my P4 2.4C with 512MB RDRAM 1066... If a GFX card could run a custom PS3.0 program as a "rendering engine," then look out... SLI value could increase.

Oh... And someone at Maximum PC... Why not give more benchmarks using LightWave? I've only seen one benchmark that included LightWave (and that was the dual-CPU showdown, or maybe it was the Apple vs. P4 and AMD... I think). I think rendering times are a decent benchmark, and graphic designers would benefit as well (also provide the scene file so we could see how slow our machines are). It would help with my future purchases. Thanks again!

Arquero






© 2014 Future US, Inc. All rights reserved.