Nvidia Responds to GTX 480 Heat Concerns

68 Comments

Nickompoop

If there's too much heat, you don't get the performance you want out of a card.  Heat is bad, especially without a water-cooled card.

crazitrain02

Nvidia said the same thing when I got my 7950GX2 back when it was $600. They've always had a heat problem, and probably always will. Right now they just pushed this thing out to try to keep pace with ATI.

Now if ATI could only fix their DX10/11 problem with BF:BC2, I would completely rid myself of Nvidia.

geared

This card better make my breakfast!

ErikTheGreat

I wish more attention would be placed on the heat load a component creates. Living in Florida, I really have to be careful what I use in my system, as some components have raised the temperature of my office quite a bit. It doesn't make sense for me to run a high-BTU space heater/computer when the air conditioner is on. I could really use some hard numbers on heat generation to help me compare parts for my builds. I really like my ATI 5700 series over the old 4800, which really cooked the case. Manufacturers need to start offering low-temp, low-power, high-performance components and let the tweakers crank them up if they don't care about the waste heat and high power consumption.

ETG

sotoa

"Thermi"... haha!!  That's the best thing I've heard all week!

I used to not care about power and heat, but ever since it was shown that you can have high performance while still taking power and heat into consideration... I've been swayed.

Nvidia lost it this time around.

Shckr57

I have two 9800GX2s, and the only thing I hate is SLI. I'll never use it again, or buy two cards; I want the best single GPU out there, and that is the 480 at the moment. However, I am going to wait until they get all the cores up and running, not with one disabled like on the 480. That, and with new drivers to come, the performance will just keep going up. I don't care about power or heat. I want sheer performance. I like speed, and if it means a hot computer, so be it. I will probably watercool the card I get, just so I can overclock it even further.

schmitty6633

If you want sheer performance, why don't you just get the 5970?

Shckr57

1. It takes Nvidia two weeks after a game launches to get SLI working properly; by that time I have already beaten the game. And what I mean by SLI does not have anything in common with multiple cores. I have a quad core and I love it, but SLI would be like getting a second mobo and trying to sync the two to work together. Nvidia's drivers are good and they get new ones out often; on the other hand, I have never liked ATI's stuff. Their cards are made with lower quality parts, and their drivers almost always have bugs in them. I want a single chip on my GPU, no matter how many cores are in it. I don't want two chips and having to use SLI to get all the performance. Also, with SLI, or even CrossFire, there is a slight loss of performance for every card you add. Just because you have two GPUs doesn't mean you are getting a 200% performance increase over one; you're getting about 160-180%.
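On that scaling point, here is a minimal sketch (purely illustrative numbers, not benchmarks from any review) of how a per-card scaling factor turns into the total frame rate you would actually see with a second GPU:

def multi_gpu_fps(single_card_fps, num_cards, scaling_per_extra_card=0.7):
    # Each extra card is assumed to contribute only a fraction of a full card's frames.
    return single_card_fps * (1 + (num_cards - 1) * scaling_per_extra_card)

base = 60.0  # hypothetical single-GPU frame rate
for eff in (0.6, 0.7, 0.8):  # roughly the range the comment above mentions
    total = multi_gpu_fps(base, 2, eff)
    print(f"2 cards, {eff:.0%} scaling per extra card: {total:.0f} FPS ({total / base:.0%} of one card)")

At 60-80% scaling for the second card, that works out to the 160-180% total the comment quotes, rather than a perfect 200%.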

burpnrun

As the very unproud ex-owner of three Nvidia 8600GT cards that bit the dust because of Nvidia's secret and disastrous "Bumpgate" chip manufacturing defects, I'd really advise folks to take a pass on "Thermi".

It sounds too much like Nvidia's up to their old tricks again. Besides, why is XFX (perhaps NV's biggest add-in board partner) passing completely on Nvidia's Thermi 480/470 and refusing to manufacture them? With their "double lifetime warranty," could it be they sense a disaster in the making here?

smashingpumpin

"Clearly, this is not for me. I still prefer my radio on!"- Kudos for getting the point for todays corny qoute of the day. lol

_______________________________________________

he's pwning with a trackpad? oh really? oh reheheheeally?

imagonex

Call me when the GTX485 comes out.

Danthrax66

Everyone has to realize that Nvidia doesn't have to compete with ATI. Their notebook technology is at the top of its game, they have the best netbook platform with Ion, and CUDA is in much wider use than OpenCL. As long as Nvidia releases the most powerful card, they will remain on top; too many companies are partners with them for them to lose money just because some of the gamers and system builders like us won't buy them this generation. You can vote with your wallet all you want, and so can everyone else who doesn't like what they are doing; they will still make money from this. There is also a report that the heat was a BIOS issue with the cards and that they will run cooler when they are released, which is another possible reason for the delay. And not only is the card getting more fps, it is getting more fps with more sampling turned on. Everyone seems to be overlooking this, or just going by the MPC review, which was very generic and didn't explore what these cards excel at: turning the filtering up.

Cruzg10

How does the BIOS affect the card's heat???

Danthrax66

Well, if I have my Phenom at 1.5 volts it runs above 60C under load; if it is at 1.425 volts it runs 50C max under load. So if they had the wrong voltage set, the card could run hotter. A driver could possibly fix this too.
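For readers wondering why a lower core voltage runs cooler, here is a rough back-of-the-envelope sketch. The voltages are just the Phenom figures from the comment above, and the f * V^2 relationship for switching power is a standard approximation, not a measurement of any specific chip:

def relative_dynamic_power(v_new, v_old, f_new=1.0, f_old=1.0):
    # Dynamic (switching) power scales roughly with frequency * voltage^2.
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = relative_dynamic_power(1.425, 1.5)
print(f"1.425 V vs. 1.500 V at the same clock: ~{(1 - ratio):.0%} less switching power")

Even that modest 5% voltage drop cuts switching power by roughly 10%, which lines up with the several-degree temperature difference the commenter describes.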

DBsantos77

If anything, it would be incorrect voltage settings.

-Santos

LilHammer

Seems like nVidia has forgotten their humble beginnings.  What made them a great company is how they created a product that performed better than the competition at a competitive price.  Now it's almost the inverse - a middling product at a less than competitive price.

They're arrogant.  Instead of listening to what consumers are looking for and developing a product around that, they are telling us what we want by selling a graphics card with crazy power requirements and crazier thermals.  Maybe they've forgotten they aren't the only piece of equipment in your PC case (nVidia: Your GPU isn't the only component generating heat in my case so you DO need to play nice).  Not everyone has watercooling and/or not everyone wants to void their warranty to use watercooling on a card not factory-designed for it.

It wasn't long ago that the tables were reversed and ATI was the arrogant company that thought it couldn't be knocked off the performance throne. Are you listening, nVidia, or will you remain deaf until you're no longer relevant?

A 10% to 20% average performance benefit does not justify an almost 50% price premium for what you are calling your next-generation card. Consumers will vote with their dollars, and you will find you've pumped millions into the river... and lost huge market share to ATI in the process. Change your ways soon, nVidia, and remember: better to speak softly and carry a powerful performance-per-dollar graphics card...

JohnP

I just think NVidia was caught off guard by the elegance of the competing 5800 series. Fermi is a research product that had to be hastily turned into a working product.

IFLATLINEI

Heat = Waste

It's obviously not in line with the direction we are going these days. Efficiency is going to be key when talking about producing and using energy; this is not news. Now, I dunno if drivers will affect the high heat this card seems to exhibit, but I think it's important that we as consumers make sure this is a priority for both Nvidia and AMD going forward. They have made some advances, but they are clearly focusing a bit too much on who's the fastest. Speed is good, but efficiency is king.

Sebie Kay

I remember when the Nvidia 5000 series cards came out, there were a lot of people thrilled about the 'performance' the cards had. Turns out, the reason they were so fast was that they wouldn't fully compute graphics commands, thus making FPS shoot higher.

Once this was discovered and customers demanded a fix... performance dropped dramatically.

This is common among all tech vendors and has almost come to be expected. I never put much trust in 'specs' or hype generated by pre-launch reviews. I want to see the numbers myself, thank you very much.

-=Do unto others... THEN RUN!!=-

Karite36

Will no one just wait and see what happens when stable drivers come out? Drivers mean everything, and as far as I know, they're still in beta. "Theoretically," these cards should blow the Radeons out of the water, but the drivers aren't there yet. Wait a month after general availability and watch prices drop (slightly) and performance increase by like 50%.

Danthrax66

Most games support SLI and CrossFire, but the drivers have to be modified to tell the game, "hey, there are two cards here." Just look at the HardOCP SLI review; they work fine in SLI.

zaternine

Anyone please correct me if I'm wrong, but I don't see how drivers can make a card's hardware run cooler, especially an architecture designed to run hot to begin with. I can see them increasing FPS in games and such, but not dropping temps by 10 degrees. Plus, I have never seen drivers increase performance by 50%.

I'm a mathlete not an athlete 

Danthrax66

The latest ATI drivers increased performance by 40%. Drivers can make the difference in games, especially after a few releases. Give it 6 months and you will probably see 60% or more.

ThunderBolt

I ran my EVGA 9800GX2 with fan speeds at 100% all the time. Over 2 years, I RMA'd it 4 or 5 times; basically, the card dies after 6 months. It idled at 60C, full usage 95-110C. Sound familiar? The GTX 480/470 will be the same. I don't doubt that.

Glycerin

Holy crap, it IDLES at 60? That's terrible! Why do companies let these products get into the hands of consumers with those kinds of numbers?  That is totally unacceptable.

spurdy

After using Nvidia my whole computing life, I was set to purchase two GTX 480s. But after waiting months for a card (that I still can't get) that sucks power and money and emits heat like no other, I did something I should have done months ago: I bought an XFX 5970, and the performance is stunning. I hope this answers some of your questions about what consumers are thinking, Nvidia.

SP

Dexter243

Nutcracklng snack

Is this going to be another 9800 GX2?

The card runs so hot the system will restart at random unless you use RivaTuner to force the fan to run at 80% to 100%, because the default is 45% in the drivers. No matter how hot the card gets, the fan will not speed up, and there is no way to control the fan in the default drivers. How stupid.

And after you force it to run at 80%, the card sounds like a jet taking off.

No thanks, Nvidia. I already spent $600 on a piece of crap video card; I will never do that again.

I'll trade it for a card that has performance AND efficiency.

I'll stick with ATI for now. Let's face it, they just have a better design at this time, and that is just a fact.

Oh, I know Nvidia will turn it around, just like they did when they dumped the FX for the kick-ass 6800, and just like ATI went from the crap 3000 series to the 4000 series. They always do. And if Nvidia's sales do not turn around, they too will get off their butts and build a fast and much cooler card.

charcaroth

Hmmm... kinda reminds me of the Voodoo 5 series video cards. Waaaay too much heat, waaay too much power required. And we know how THAT worked out. I hope nVidia stays relevant, if only for competition's sake.

RottenMutt

3dfx missed a product cycle and made a tough business decision they couldn't weather; they were already dead when the Voodoo 5 released. 3dfx was a leader, the first to require additional power to the card, and now everyone does it.

Grun

In the Tom's Hardware review of the GTX 480, the reviewer measured 160 degrees F (71 degrees C) on the exposed metal on the card during normal gameplay... after he burned his hand on the card.

st0rm4200

1. The chip is made to withstand the temp.

2. It's very highly doubtful you will have to run the fan at 100% on your desktop. Set a fan profile for your games if you're that worried.

3. It isn't going to cost any more to run than any other card unless you run it 24/7.

4. It damaging something around it? I hope that was a joke... Maybe your hand, if you touched it while doing some intense gaming...

5. It's a great card... I'm buying one.

Danthrax66

Most of the heat will get channeled out of the back, so it won't affect temps much for other parts, and if 1-2C is enough to screw up stability, then you are pushing your stuff too hard to begin with.

Emokidnotrly

Why get it if you're not gonna make full use of it? -_- Ever thought about that?

Brock Kane

Now that NVidia is admitting the 480 runs hot, does this mean the fan will be running at 100% all the time? Loud and quite annoying, I'm sure. So when I am playing solitaire, will my PC sound like I am playing Crysis 2?

I have always been a big fan of NVidia, so much so that I haven't even considered looking ATI's way. But this concerns me, and it should. I like my PC case nice and cool. I spent a lot of time and money building my gamer, and I really don't want to see it melt in the summer heat that the 480 will deliver.

I do want performance, but I'm having second thoughts on this one. I am running 275s in SLI; I might keep that configuration for one more year, or maybe try the 5870s.

chinomon

I just hope those heat pipes on the 480 aren't just for looks.

krait_2777

Did someone say 107C? That's like putting a boiling kettle in my case... sweat... I was hoping ATI would push them to come up with something better (in terms of heat management, power consumption, and raw power). Those three items were what swayed me to buy the HD 5850 over the GTX 2xx. Now that I know, I won't regret it...

thesmilies

If that sort of temp were true, then it would be impossible to use water cooling.

CentiZen

Not really. Water cooling doesn't actually use plain water; it uses a nonconductive, electrolyte-balanced mixture, which I'm sure would have a higher boiling temp.

SHEILA: AMD X4 965 3.2GHZ ; 4 GB G.SKILL GAMING RAM ; RADEON HD 5770 1GB

JohnP

 <With a scottish accent> "Capt'n, you're pushing the GTX 480 too hard! It's ready to blow!" "Just a moment more, Scotty... Just a> BOOM, HISS, Steam pouring out of the case with a slagged Fermi crystal melted to the motherboard. "Capt'n the Fermi engine is down, we have no long range scans!"

Danthrax66

The water wouldn't turn to steam. Just because the core is at that temp doesn't mean the water is at that temp too; in fact, the water never reaches core temps, it is always much cooler.

Cruzg10

All this ATI-Nvidia fanboyism is getting out of hand. We need to balance it out. So I'm throwing my Intel/SiS/VIA fanboyism into the mix! That's right! I get high performance and FPS in the 10s when I play hardcore games like Yahoo chess or Solitaire, and you don't even notice the heat!

zaternine

Fanboy or not, that's an awesome card. However, here's the way I see it: if I were to purchase this card, I would have to upgrade my PSU to accommodate the power consumption. Everyone complaining about electricity bills is blowing things way out of proportion, as you would not see a difference, especially with 80 Plus certified PSUs on the market. As for the heat, I have a basic mid tower with decent cooling, but if I bought this card, thinking about how much heat it could emit, I would be looking at another case for better airflow, even though what I have now is good enough for a 9800 GTX.

All this means is that the AVERAGE user would have to spend more money on parts alone for a card that would only get a handful of FPS more than the competition. Now, for those who have the best PSU, a HAF case, and a big budget, go for it. But I think I can speak for the average user, who would rather opt for a 5870 or 5850, or a 260 or 280 as they come down in price.

VICTORY NOT VENGEANCE
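On the electricity-bill point in the comment above, here is a minimal cost sketch. The wattage delta, gaming hours, electricity price, and PSU efficiency are all assumed example values, not measurements of the GTX 480 or any other card:

def monthly_cost_delta(extra_gpu_watts, hours_per_day, price_per_kwh, psu_efficiency=0.85):
    # What the wall outlet sees is the DC load divided by PSU efficiency.
    wall_watts = extra_gpu_watts / psu_efficiency
    kwh_per_month = wall_watts * hours_per_day * 30 / 1000.0
    return kwh_per_month * price_per_kwh

# e.g. ~100 W more under load, 3 hours of gaming a day, $0.12/kWh, an 80 Plus-class PSU
print(f"Roughly ${monthly_cost_delta(100, 3, 0.12):.2f} extra per month under those assumptions")

Under those assumptions it works out to a dollar or two a month, which is in line with the comment's point that the bill difference is small; the heat dumped into the room and case is the more noticeable cost.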

Neufeldt2002

If I had air conditioning I might think about this card, but since I don't, I think I will go ATI.

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________

michaelbart0n

Heat matters because it will affect the rest of the components in your system and possibly screw up overclocks!

Ghostryderflyby

I waited patiently the last several months since the Radeon 5000 series cards came out, and didn't jump on the ATI bandwagon. I was waiting for the Fermi cards before taking the plunge into the next generation of cards. Now that Fermi is here, though, I have serious concerns about the high power consumption and high heat. My factory-overclocked 200 series GeForce never gets over 85C, and Nvidia wants me to think 107C is normal and healthy for my new video card and subsequently my entire system?

After waiting 6 months for Fermi, I now wish I hadn't. I have finally gone ahead and ordered a Radeon 5870 OC card after seeing the power consumption reports and operating temps for the GeForce. 

This reminds me of the Intel P4 vs. AMD Athlon battle, when Intel tried to brute-force things with the P4 and AMD trounced it with the Athlon's more efficient, less power-hungry design. We can only hope that Nvidia responds the same way Intel did and fires back with a Core 2-equivalent comeback.

zaternine

Haha, yes, I remember that P4/Athlon battle, and that's why I went with the Athlon. Shoot, I still have that system in the garage for toying around with Linux stuff; it never gave me any problems.

VICTORY NOT VENGEANCE

M-ManLA

Filter won't let me post. 

Electronically charged

Jeffredo

I feel your pain - don't understand what's going on with it lately.

Cruzg10

I'm sure if it was November, no one would have a problem, but summer is fast approaching, and I'll be damned if I'm going to roast just for a 5% performance improvement.

Sebie Kay

Amen! It's just the first week of April, and it's already been hitting 85-90 degrees here!

I want to add a video card to my computer... NOT a space heater!

-=Do unto others... THEN RUN!!=-
