Nvidia Pits the GTX 480 vs. Radeon 5870

68 Comments


Thiazolium

According to VR-Zone the prices may be $499 for the GTX480 and $349 for the GTX470.


NOOBZ1LLA

If Nvidia releases something that's not as good as what's already out for 2x the money, they will anger a lot of consumers and enthusiasts. Sure, a few people will plop down a small fortune for a Fermi card, but those will be the dudes that you really don't like. If Fermi is not close to an ATI 5970 while at the same price, consumers will get angry and feel that Nvidia is trying to rip them off.


NOOBZ1LLA

It seems like Nvidia started as a company that created GFX cards that catered to gamers. If Fermi isn't that great, then they've lost their way and my days of buying Nvidia products are over.

 


ju1i3andcandy

I am really not surprised at all that it seems to only perform slightly better than the ATI at a possible price point of nearly 50% more. I think that maybe Nvidia has been putting a lot of their resources into mobile graphics/computing lately, because their demonstrations of the Tegra chip were amazing.

I switched to ATI once I plopped one into my girlfriend's rig for half the price of mine and got superior performance overall on essentially the same machine.


dcblues355

LewTeNantDan: I've been using nVidia since my VooDoo 3000EFX crapped out and a hardware upgrade called for something new. I know nVidia doesn't pay much attention to these comments. I've owned everything from the TI4500 to my current dual 8800GTs (G92) and was holding out for a new affordable performance card from them (nVidia). I guess I may finally 'cross over' to ATI (I already use the AMD 965 BE). Like Intel, nVidia has priced themselves out of my range. I'm retired military with youngsters in college and can't drop that kind of jack on new tech architecture. Sorry nVidia, you lost me.


JohnP

A group managed to sneak a benchmark in at a show:

Nvidia's Fermi graphics cards are still not due until the end of March, but heise online has exclusive first benchmark values from a pre-release version of the GeForce GTX 470.

http://www.heise.de/newsticker/meldung/Nvidias-Fermi-Leistung-der-GeForce-GTX-470-enthuellt-946411.html

You have to use Chrome to get a kinda good translation out of this but the numbers they got were interesting:

In 3DMark Vantage (Extreme preset) the GTX 470 reached 7511 points, while a Radeon HD 5870 was clearly ahead at 8730 points. An HD 5850 stood at 6430 points, and a GeForce GTX 285 reached 6002 points on the same test system. The Performance preset shows a similar picture, except that here the Radeon HD 5870 (17,303) is only slightly ahead of the GeForce GTX 470 (17,156). A Radeon HD 5850 reached 14,300 points.

And

The Unigine benchmark is based on DirectX 11 and uses tessellation. Nvidia had announced in advance that Fermi-based graphics cards would significantly outperform the HD 5800-series cards at tessellation, and the preliminary benchmark results seem to confirm this. With 4x anti-aliasing, the GeForce GTX 470 averaged 29 frames per second (fps), a Radeon HD 5870 around 27 fps, and the HD 5850 only 22 fps. With 8x anti-aliasing, however, the GeForce GTX 470's performance drops sharply: it reached only 20 fps, so a Radeon HD 5870 at 23 fps is faster, while the Radeon HD 5850 (19 fps) trails it only slightly.

So my take is that the Fermi will be on par with the ATi 5870, slightly faster here or there, and slower in other places. In other words, don't look for a great leap of technology with Fermi.
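
To put those figures in perspective, here's a quick bit of Python, just arithmetic on the preliminary scores quoted above, so take it for what it's worth:

# Preliminary scores quoted from the heise article above -- pre-release
# GTX 470 driver, so treat these as rough indicators only.
scores = {
    "3DMark Vantage Extreme":     {"GTX 470": 7511,  "HD 5870": 8730},
    "3DMark Vantage Performance": {"GTX 470": 17156, "HD 5870": 17303},
    "Unigine Heaven 4xAA (fps)":  {"GTX 470": 29,    "HD 5870": 27},
    "Unigine Heaven 8xAA (fps)":  {"GTX 470": 20,    "HD 5870": 23},
}
for test, s in scores.items():
    delta = (s["GTX 470"] - s["HD 5870"]) / s["HD 5870"] * 100
    print(f"{test}: GTX 470 vs HD 5870 = {delta:+.1f}%")

That works out to roughly -14% and -1% in Vantage, +7% in Heaven at 4xAA, and -13% at 8xAA, which is exactly the "on par, faster here, slower there" picture.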

 


Keith E. Whisman

The biggest problem here is that the new Nvidia Fermi card is only a little faster than an older ATI card, one that ATI will either be adding value to with driver updates or superseding with new hardware within the next six months. So Nvidia won't hold the speed crown for long, since ATI can release something new about six months before Nvidia can.

You see, Nvidia is now behind ATI in technology. You have to see that at least.


spoonard

How is a $700 price point anywhere even in the ballpark of acceptability? Twice the price and not even 25% better performance than the ATi equivalent? Are they depending on hype alone to sell this card and not actual performance??


avenger48

Yours is the second post like this. As I stated in my comment, the $700 price is a RUMOR, and not a very solid one at that.

 Come on! 


dpgdog187

I think the ultimate conclusion is that even if the Fermi architecture is priced in the $500 range as opposed to the $700 one, the slight performance gain doesn't justify the upgrade in parts. Anyone who buys a GTX 480 is looking at a PSU upgrade and possibly a new motherboard, making the overall price for the slight performance enhancement impractical. AMD/ATI ftw in this battle if you ask me.


ju1i3andcandy

Although I do want to wait and see I am not really hopeful that Nvidia will break their mold of pricing so dang high. They've been doing it for years, what's going to change that now?


chaosdsm

Actually, it is a very solid rumor... look for no-frills 480s starting at $679, with manufacturer-overclocked models coming later and going for upwards of $749. Hell, even the top-end manufacturer-overclocked Radeon 5970s are currently at $700+.


k11k

I will still have to wait for some real-world testing. Even a video can be edited. If they are releasing this so the fanboys will wait for this product based on this crappy attempt at a benchmark, then the fanboys are idiots too. Boy, how far will Nvidia try to be Apple? If we price it high, they will come.


ju1i3andcandy

My thinking exactly: Nvidia is the Apple of the graphics market. I used to buy into their malarkey, but man, I went ATI once and I'm never going back.


JohnP

I have to assume that Nvidia played fair with the Heaven benchmark. There are a lot of options buried in the settings that could have been turned off by Nvidia to increase their framerate, but I will take it that it was an apples-to-apples benchmark.

I ran the same benchmark with my ATI 5870 and my i7 set to 3.2 GHz with 6GB of memory on Win7 32-bit. I saw very similar frame rates for the ATI card to what was in the graph. (Which is kinda funny, as my screen was set to 1920 by 1200 instead of their 1920 by 1080.)

What I would have liked to see is what effect AA would have had on the benchmark. Nvidia turned it off completely. Anisotropic filtering was also turned down to 1. ATI does struggle with AA turned on in Heaven, so Nvidia would have been better off turning it on for THEIR card if it has less of an effect there. So does Fermi play well with AA?

Just got to wait and see I guess. At least they are letting SOMETHING out! Heh.

 


snapple00

Why on earth don't you run a 64-bit OS?


DBsantos77

 LOL, good point snapple, although I'm sure there's a reason for it :)

-Santos

Gigabyte 785GX Micro Atx

AMD Phenom II 720 (Quad @ 3.6 Ghz 1.47v.)

6 GB DDR3 1333

Corsair 500w

Arctic Cooling Freezer Pro Rev.2

HIS HD 5850 @ 940/1175/1175v

500 GB


JohnP

I have tried Win7 64-bit 3 or 4 times now but ALWAYS run into one of my must-have apps not working (Password Tracker Deluxe, Directory Opus, Tune up utilities, Reg Supreme Pro, my HP 7210xi printer, and on and on). I fiddled with compatibility settings, but Win7 64 barks so badly at most everything I try to do that it's JUST NOT WORTH IT! Besides recognizing more of my 6GB of memory, I see NO difference in speed on anything I do in the real world.

Nope, Win7 64 is just not ready to roll out to the masses.


QuakindudeMod

You must be talking about older apps. So far in Win7 64-bit, I've had only ONE printer issue. And that printer is older than dirt too.

All of my games, 99.9% of my programs, and all of my hardware are running with Win7 64-bit. If you manipulate large photo files, as more and more people are doing every day with these better cameras, then having more than 4GB of RAM is a MUST. And having an OS that can take advantage of that RAM is critical to your performance.

But even a 32-bit OS chokes on 4GB of RAM after the BIOS maps memory addresses for all the onboard hardware. HDDs have 32MB caches. Video cards are normally equipped with 1GB or at least 512MB. By the time you get your PC booted, Windows is only reporting 2.8-3.4GB of RAM that's addressable.

Maybe it's the way I use my computer. But running a VM, 15 tabs open in FF or Chrome, my little info apps all running, modifying an Excel doc, all while running multiple programs in the VM, and still not having any lag because I've got 6GB of RAM to work with is much more important to me than a few apps not working. Plus, except for the one printer driver (and even then... they say Win7 support is coming), the one or two programs/apps I've had that will not run under Win7 are easily fooled into running anyway by right-clicking them and setting them to run as WinXP SP2.

*****MaximumPC Moderator. Report inappropriate/SPAM comments to
QuakindudeMod at Gmail--dot--com with a link. My personal comments do not necessarily
reflect the opinions of MaxPC or Future US*****


TechJunkie

Not ready to roll out to the masses for you, that is. I have been running Win7 64 since it was released as an RC, and now the upgrade version, without a hitch, none, nada, and that's on my desktop and my netbook. I can understand the plight of older apps not working correctly, but that is what the Win XP virtual machine is for (free with Win7 64 Pro on up)! But still, if you're using older apps (that you like) and older hardware that you just can't seem to part with, then I see your point. But don't say it's not ready for the masses just because it doesn't work for you.


NOOBZ1LLA

Yeah dude, also don't forget that being limited to 4 gigs of memory with a 32-bit OS is not all. YOUR VIDEO CARD MEMORY COUNTS AGAINST THE 4 GIGS!!!

That means if your video card has 512MB of memory, your OS can only utilize about 3.5 gigs of your RAM!!!
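
For anyone who wants to see the arithmetic, here's a rough Python sketch of why a 32-bit OS usually ends up reporting roughly 3 to 3.5 gigs. The reserved sizes below are just illustrative examples, not measurements from any particular board:

# Illustrative only: a 32-bit OS has a 4 GB physical address space, and
# memory-mapped devices get carved out of it, leaving less RAM addressable.
MB = 1
GB = 1024 * MB
address_space = 4 * GB
reserved_mmio = {
    "video card aperture": 512 * MB,  # e.g. a 512 MB graphics card
    "PCI/PCIe + chipset":  400 * MB,  # ballpark; varies by motherboard/BIOS
    "BIOS / ACPI / other":  64 * MB,
}
addressable_ram = address_space - sum(reserved_mmio.values())
print(f"Addressable RAM: ~{addressable_ram / GB:.1f} GB of the 4 GB installed")

With those example numbers it lands around 3.0 GB, which is right in the 2.8-3.4GB range QuakindudeMod mentioned above.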


fx2006

"Who the fuck is going to drop $700 on a video card, when you can buy an XBox or PS3 for half that?"

Me :-]


avenger48

Please read my comment.


johnny3144

I don't understand why everybody is hating on Fermi. I think the price is quite reasonable, because the card isn't intended for mainstream users. The card is meant for those who must have the fastest/newest thing money can buy, kind of like how Intel's Extreme line of processors works: $1000 for a CPU when you can get almost the same thing for $600 four months later. I can't afford something in that range, so I will just wait for something mainstream to release a few months down the road. Besides, performance per buck is not a linear function; a 20% gain on a $500 card isn't going to be as cheap as a 20% gain on a $100 card.


NOOBZ1LLA

Dude, the difference between what Nvidia is doing and what Intel does is huge. No one gets angry at Intel for selling an overpriced CPU and subsequently stops buying their products.

Everyone expects there to be a $1k Extreme Edition from Intel at all times, like death and taxes. The thing is, Intel also sells great chips for around $200-300. That's what people want from Nvidia: a good board for around that price.


TechJunkie

It is not... repeat... it is not... 20% faster in all areas. Just in the tessellation portion of nVidia's own benchmark. In all other areas, according to their own numbers, it is only about 5%, plus or minus, faster in games played today.


johnny3144

I understand it's not 20% faster all around; it's only 20%+ faster in their selective benchmark. But that doesn't mean it won't be the fastest graphics card available on the market a few weeks from now. There is a market for this card, just not for those of us on a budget. Why do you think Intel's Extreme chips sell? Those don't even get a performance gain, just an unlocked multiplier for better overclocking. They sell because they are the newest and fastest thing around (which is why Intel releases them first for this market, AKA bragging rights).

Btw, where is this "according to their own it's only 5% faster"? I highly doubt Nvidia would just walk out and say their card is only 5% faster. And in some games tested (Far Cry 2) the GF100 showed significant improvement over the 5870 with a pre-release driver.


TechJunkie

http://www.engadget.com/2010/03/06/nvidia-gtx-480-makes-benchmarking-debut-matches-ati-hd-5870-per/

There ya go. Just look at the chart (benchmark) provided by nVidia and do the math... then watch the video. He explains that when tessellation is turned on, it beats the 5870, then shows that chart but offers no explanation as to why the 5870 matches it in all other areas. That is the only thing the nVidia guy touts.


white_sereph

Why hate Fermi?

Simply put: it barely matches a six-month-old ATI card for twice as much money as said ATI card.

There are a lot of good points here about the benchmark settings, and ultimately I think we agree that Fermi MAY be better at scenes with heavy tessellation, by 5 to 20% in optimized conditions. Why pay double for that?


johnny3144

Hardware Canucks simulated the Fermi test system for Far Cry 2 and tested a 5870 on an almost exact replica testbed. Fermi is 20% faster all around in that benchmark. Fermi isn't just slightly faster at tessellation only; it's supposed to be much faster than the 5870. But with the currently limited benchmarks, it's hard to say for sure. Guess we just have to wait for it to be released. I have high hopes for it. Hopefully it will start a price competition and I can get a nice card for cheap a few months later ^_^


Neufeldt2002

How can you trust someone who is "Simulating" hardware that isn't out?

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________


johnny3144

It's a process called induction. Since the GTX 285 performs almost the same on both testbeds, within ±2%, we can assume the 5870 would perform similarly on the same hardware. Therefore, we can assume the comparison between the GTX 480 and the 5870 is accurate to within about 2%.
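
To put that in numbers, here's a small Python sketch; the ±2% tolerance is the figure claimed above, and the 20% lead is the Far Cry 2 claim from earlier in the thread, so the output is only as good as those inputs:

# If the GTX 285 lands within +/-2% on both testbeds, a 5870 score carried
# over from the replica testbed should have roughly the same error bar.
tolerance = 0.02
hd5870_replica = 100.0                   # normalize the 5870's replica-testbed result
low  = hd5870_replica * (1 - tolerance)  # worst-case projection on Nvidia's testbed
high = hd5870_replica * (1 + tolerance)  # best-case projection
gtx480_claimed = 120.0                   # the ~20% Far Cry 2 lead claimed above
print(f"Projected HD 5870 range on Nvidia's testbed: {low:.0f} to {high:.0f}")
print(f"Claimed GTX 480 result: {gtx480_claimed:.0f}, a lead of at least {gtx480_claimed - high:.0f} points")

In other words, even in the worst case the ±2% uncertainty only shaves the claimed 20% lead down to about 18%.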


Neufeldt2002

The logic is flawed; you are not taking into account unforeseen variables that can be game changers. New architecture, raw speed, transistor count, etc., etc. Your numbers are "speculation," but you quote them as fact. The fact is, these cards aren't out, and any benchmark has to be taken with a grain of salt, as they are completely one-sided. I take it you buy nVidia, and only nVidia?

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

I wanted a signature, but all I got was this ________


johnny3144

How is it flawed? What unforeseen variables? Where the hell do new architecture and transistor count come in? It's a released benchmark of a Fermi card, compared with a 5870 tested on the replica testbed. They took the same hardware used on Nvidia's testbed and tested an ATI card on it. There is no "unforeseen" variable. I'm going to assume you never even read the article on Hardware Canucks, so go read it first.

And no, I am not an Nvidia fanboy; I am simply pointing out that Fermi doesn't fail, the card has potential. Nvidia isn't stupid enough to just release a card with the same performance and charge $300 extra. If there is any variable, it's the pre-release driver for Fermi, which will only improve at release. You are too quick to judge the card a fail.


Keith E. Whisman

Nvidia and ATI have both been caught fiddling with benchmarks to artificially inflate their scores. Remember the Nvidia GeForce FX 5800 Ultra "leaf blower"? Those scores were all inflated for that card. I'd bet the actual performance will be about the same as the ATI 5870, and if it's faster, it's only slightly so.

But really, the numbers for these cards are so high that you really won't notice the performance difference while playing your games.

As for these cards, you had better have plenty of RAM and at least a dual-core processor at 2GHz or faster, or you're looking at not seeing the full potential of these cards.

And check this out too: if all you're running is a 22" 1680x1050 display, then you ain't gonna see any difference, and likewise if you're running a 24" display at 1920x1080, still no difference.

I myself am going for the ATI when I build a new machine, but if Nvidia has something faster when that time comes, I'll go with them. But only if it's not crazy expensive.


white_sereph

You missed a subtlety, as I do remember the scandal with the benchmarks on the nVidia cards of yore.

It wasn't that they were cheating on the benchmarks via the benchmark settings, like it is feared to 'possibly' be the case here.

The GeForce 5xxx issue was way worse than that. nVidia wrote a driver that cheated, driving the numbers higher by not rendering all the visual information in the first place.

ATI would run their card and generate their numbers (Radeon 9800), and nVidia would run the same benchmark on their card WHILE THE NVIDIA DRIVER WAS SET TO EXCLUDE MASSIVE AMOUNTS OF DATA FROM BEING PROCESSED behind what was ultimately rendered. If I remember right, at one point they tried to justify it by saying the driver selectively rendered only what was within the scope of the point of view rather than the whole environment, or something to that effect. Through that driver-based cheating they produced numbers they tried to sell off as comparable to ATI, which was rendering everything without exclusion. Please, someone correct me if I'm in any way inaccurate on this.


Thursday

No need to correct you at all, as you are pretty much bang on with your recollection of the events. It was pretty ingenious by nVidia. Totally and completely cheating, but still ingenious... lol.


Keith E. Whisman

Great memory or googling, either way.

Kinda like Kirk cheating to win the Kobayashi Maru.


avenger48

This much-cited $700 USD price tag comes from a no-name e-tailer (SabrePC) posting a mis-speced GTX 480 (With no picture) on their website.  It has never been in stock and is probably just a cheap way to get advertising for SabrePC from legit tech websites.  This $700 price hasn't been confirmed by nVidia or anyone else legitimate and hasn't been echoed at any legitimate online stores (Such as Amazon or Newegg).  Also, to make it that much worse, the price on SabrePC is now down to $600 USD.  

 http://www.sabrepc.com/p-174-nvidia-geforce-gtx-480-2gb-gddr5-pci-express-x16.aspx

 Honestly, do we really need to use information which is most likely blatantly false and cite it as a legitimate rumor?


TechJunkie

There at SabrePC, they list the 480 as 2GB. I thought nVidia's own specs said 1.5GB? But anywho, is there such a thing as a "legitimate rumor"? A rumor is a rumor, and it might or might not be true. So far, rumors about nVidia's pricing in the last few years have been almost... I say again... almost on par. Even at $600, that's way too pricey for its performance.


avenger48

The 2GB was what I was talking about with the mis-specced comment.  As far as the legit rumor, no, there isn't, but there are much better sources than a no-name store, such as tech crunch, cnet, etc.  As I recall, the 280 was VERY expensive out the gate, but the price dropped within a couple weeks.  And the $600, I was trying to point out that this isn't a reliable source, since they are constantly changing their made-up price tag.


Baer

I have both NV and ATI solutions, with my latest two builds having BFG 285 OCE cards. I like them, but I will upgrade for the following two reasons:

1. Ease of setting up three monitors without having to dump my present 1920 x 1200 Samsung 245Ts; so far ATI seems to be winning that race.
2. Ability to run three monitors in expanded mode with one card, even if it is the two-GPU version; this will require a lot of rendering power. So far school is out on that one.

Price is a factor, but if there is value I will pay a higher price. If there is no real value I will not.

It is only a few weeks before we will start to get unbiased reports. Right now all we have are opinions. Let's wait for the release and reviews.


Spyfighter

This benchmark is more of the same tactic Nvidia has been using all along. What's with all the secrecy? I think everyone knows this tactic is aimed at potential ATI buyers in order to get them to wait until Nvidia releases its new cards. Speaking for myself, I couldn't care less which card is better, because I am on the verge of quitting gaming altogether. Almost all games now come with SecuROM. As gamers, we need to boycott every game which is released with DRM, especially SecuROM. Now I just play those which don't have it, but that means I am not buying any new games.


Baer

While I totally agree about extreme DRM like SecuROM or having to be online to play, the last three games I bought do not have SecuROM; Dragon Age: Origins and Napoleon: Total War, for example, do not have it. I would not have bought them if they did.

BTW, here are the best SecuROM removal instructions that I have seen. They have worked for me. http://www.reclaimyourgame.com/index.php?option=com_content&view=article&id=68&Itemid=40

 


QUINTIX256

...so that it looks like only pirates who couldn't care either way are playing them is a stupid idea. The more people actually pay for what they use, the less draconian these attempts to get people to actually pay will be.

You can have your recession. I'm not participating.


DBsantos77

The only way this card is going to sell with that performance gain over the 5870 is CUDA. Otherwise, they had better start coming up with reasons to justify the $700 over $400.

-Santos

Gigabyte 785GX Micro Atx

AMD Phenom II 720 (Quad @ 3.6 Ghz 1.47v.)

6 GB DDR3 1333

Corsair 500w

Arctic Cooling Freezer Pro Rev.2

HIS HD 5850 @ 940/1175/1175v

500 GB


AntiHero

I have to say it: that looks pretty weak if the price point is over $450. The ATI 5870 is from the previous half-year's lineup, and it's competing VERY well with that card. I have to say I'm a little unimpressed.


LatiosXT

 Why are people treating rumors like a fact?

 Wait for the damn thing to come out first and have everyone properly benchmark it before throwing a [blank]storm.


johnny3144

Because they are not rumors? They are actual benchmarks released by Nvidia. The benchmarks might have been selective (running tests at resolutions they are good at), but they're still real benchmarks that reflect the card's performance (more or less).


TechJunkie

Let's break this down...

nVidia's Fermi (sounds like an STD to me) GTX 480 is rumored to be $700 and is only slightly (+ or - 5%?) faster than its counterpart, the Radeon 5870, which is only roughly $380. The 5970 (dual GPU) is $100 cheaper than the rumored price of the 480 and still makes it its bitch. This is a no-brainer.

Even if the 480 is upwards of 10 to 15 percent faster than the 5870 in real-world testing, it still doesn't justify the $700 cost, especially with the high thermals they claim it to have. I certainly will not buy one of these cards and then have to buy a new PSU just to power the damn thing. So in essence, people are going to pay a rumored extra $320 for a 5% increase in speed, more heat, and the name... nVidia... more power to ya. I can take a 3-day cruise and get pampered for $320.

Get real, folks... this card is an epic FAIL on price to performance.
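
If you want to put rough numbers on that price-to-performance argument, here's a quick Python sketch; the prices and performance ratios are the rumored/estimated figures tossed around in this thread, nothing official:

# Rumored/estimated figures from this comment thread -- not confirmed pricing
# or benchmarks, just the numbers people have been throwing around.
cards = {
    "GTX 480 (rumored)": {"price": 700, "perf_vs_5870": 1.05},  # ~5% faster claim
    "HD 5870":           {"price": 380, "perf_vs_5870": 1.00},
    "HD 5970":           {"price": 600, "perf_vs_5870": 1.40},  # rough dual-GPU guess
}
for name, c in cards.items():
    dollars_per_perf = c["price"] / c["perf_vs_5870"]
    print(f"{name}: ~${dollars_per_perf:.0f} per 5870-equivalent unit of performance")

On those assumptions the rumored GTX 480 works out to roughly $667 per unit of 5870-level performance versus $380 for the 5870 itself, about 75% more for every frame you get.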
