Nvidia GTX 670: Faster Than The Radeon 7970?

33 Comments

Strangelove_424

If the compute benchmarks are as pitiful as the 680's, which I'm 99% sure they will be, I'll be going with a 580 to replace my 560Ti. These 6-series cards are overpriced one-trick ponies. The 580 will still do fine on games, be good for rendering 3D, and support Adobe's Mercury engine. With the 680, you shell out more money and get one out of three of those things.

JohnP

Sigh, that's what I thought too, until I downloaded FRAPS and saw that I was getting only 40-50 FPS in Skyrim (GTX 580 on a 27-inch monitor with all the bells and whistles turned on). I am getting GPU fever...

Strangelove_424

I hear what you're saying. I use a big monitor, but I switch to an HDTV for gaming. I would like more frames in BF3. Skyrim is CPU dependent, so I turned off all software-based AA and AF and cranked it via the Nvidia Control Panel. That might help. Also, because of the bad, unoptimized coding, OCing does gangbusters. For every 0.1 GHz increase I saw 2-3 more frames, until I hit 55-60 FPS (it's capped for me with V-Sync) at 4.4-4.5 GHz (2600K). The game also tends to run cool, so if you're using a Skyrim-specific OC you could even push it a little. But like I said, I'm on 1920x1080 output, so YMMV.
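
To put rough numbers on that, here's a back-of-the-envelope Python sketch. The 3.4 GHz stock clock is just the 2600K's spec, and 2.5 fps per 0.1 GHz is my own ballpark from watching FRAPS, so treat it as an estimate, not gospel:

stock_ghz = 3.4        # 2600K base clock
oc_ghz = 4.4           # where my Skyrim OC topped out
fps_per_tenth = 2.5    # ~2-3 extra frames per 0.1 GHz, per my FRAPS runs
steps = (oc_ghz - stock_ghz) / 0.1
print("estimated gain: ~%.0f fps" % (steps * fps_per_tenth))  # ~25 fps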

My problem with the 6 series is it forces you to buy two cards. I personally cannot afford to spend ~$1,200 on a workstation card on top of a ~$300 GTX gaming card, when that workstation card is effectively a GTX-series chip with a kick-in-the-balls price tag for paranoid workstation buyers to feel good about their purchase. My 560Ti gets virtually the same accuracy benchmarks in Cinebench as a Quadro card: 99.456% vs. 99.457%. Nvidia can take their .001% accuracy improvement and shove it. It's a shame that they pulled a Tonya Harding on the 6 series. Unfortunately, my wagon is GTX580 or Bust.

Biceps

I have a 570 GTX right now. Trying to decide if I should upgrade to a 6-series or just go SLI on my current rig. Those 6-series are pricey!

Strangelove_424

Good point. I didn't even think about SLI. Also, just so everyone knows I ain't talkin' trash, I re-benchmarked my system with Cinebench R11.5. Here are the results from my system, and from a reference workstation with a $3,800 Quadro card...

Reference benchmark from 3DBoxx 8520:
PROCESSOR=Intel Xeon CPU W5590
OPENGLVENDOR=NVIDIA Corporation
OPENGLCARD=Quadro FX 5800/PCI/SSE2
OPENGLVERSION=3.2.0
CBTYPE=64 Bit
OSVERSION=Windows XP Professional x64 Edition Service Pack 2 (build 3790)
CBCPU1=1.195368
CBCPUX=11.694880
CBOPENGL=45.521294
CBOPENGLQUALITY=99.467697

My system:
PROCESSOR=Intel Core i7-2600K CPU
OPENGLVENDOR=NVIDIA Corporation
OPENGLCARD=GeForce GTX 560 Ti/PCIe/SSE2
OPENGLVERSION=4.2.0
CBTYPE=64 Bit
OSVERSION=Windows 7, 64 Bit, Home Premium Edition Service Pack 1 (build 7601)
CBCPU1=1.771644
CBCPUX=8.504305
CBOPENGL=64.810318
CBOPENGLQUALITY=99.463425

And finally, here is the GTX 680 benchmarked in Cinebench:

http://www.youtube.com/watch?v=sBDb1R3Rqec

So I was technically wrong: it's not a .001% improvement, it's a .004% improvement. My bad. I still think it's an absolute #^$&ing joke that my humble, OCed-to-the-edge 560Ti beats both a Quadro card and a GTX 680 in Cinebench's OpenGL test, and came within a few thousandths of a percent of a $3,800 Quadro card on accuracy.
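
If anyone wants to check the arithmetic, here it is as a quick Python sketch. The two values are copied straight from the CBOPENGLQUALITY lines above; nothing else is assumed:

quadro = 99.467697      # CBOPENGLQUALITY, Quadro FX 5800
gtx560ti = 99.463425    # CBOPENGLQUALITY, my GTX 560 Ti
print("delta: %.6f points" % (quadro - gtx560ti))  # 0.004272, i.e. the ".004%" above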

jcollins

Is this going to be another paper launch like the 680/690?

Philippe Lemay

What do you mean, paper launch? The 680 and 690 did come out... they're out now.

Well... mostly out of stock, but they have sold.

btdog

I could be wrong, but I think the comment was meant to suggest exactly what you're saying: while the 680/690 are "available," no one can actually get one, effectively making it a paper launch.

Budman_NC

Ho-hum, 670 > 7970, OK. Those of us who have been around since the original Voodoo days have seen this battle rage for a couple of decades now. Booooorrrring! Well, this IS MaxPC, so I guess it's still relevant. 3D Vision, Eyefinity, I couldn't care less. Give me playable frame rates at a decent rez and I'm good to go. Inefficient game coders will just bog it down anyway. Wake me up when Intel stops ticking and starts tocking again. Ivy Bridge! hahahaha

gruvsf

Seriously, my AMD FX CPU gives me plenty of frames paired with a Radeon 5850 (BF3, Skyrim), and I didn't have to spend the 30-40% premium to run Intel's silicon (which I deliberately avoided when I spec'ed out my last upgrade). I've been around since before Voodoo was king of the hill, and I'm still interested in the Nvidia vs. AMD battle. It only leads to better frame rates (which you seem to care about) and lower prices for consumers.

Danthrax66

Your comment is dumb.

Budman_NC

OK, I get it. I switched topics, from Nvidia/AMD to Intel.

alex_dh9

Really dumb.

limitbreaker

Lol

Valor958

You know what, for the performance the current gen of Nvidia cards is dumping out, I'll take a $100 price cut for a GREAT card. So what if it's not the 'best'? Really... no one needs 100+ FPS in a game; it's just for show and fluff, or bragging rights. I care about actual performance. Fact is, right now even the 670 will give everything we need, with two of them for those running 'uber' setups for the sake of it. At this point, the 680s/690s and the AMD equivalents are for bragging rights only. Most people should know this, but I felt the urge to say it anyways. I'm running a GTX 460 OC (factory) and can run BF3 and every other game just fine.

Penterax

People still running at 1024x768 should get a clue, unless they just enjoy demonstrating their ignorance in public.

PCWonder

Agreed. Give me a card that's a little cheaper but plays my games at 40-50 FPS all the time over a more expensive card that gives me 100+ FPS. Why pay stupid prices for 30 or 40 more frames? Your eyes and brain won't see the difference. It's just paying more money to brag.

limitbreaker

Unless you have 3D Vision Surround or Eyefinity, where suddenly your overpowered Nvidia/AMD card becomes weak.

JohnP

Or Skyrim running on a 27-inch monitor with AA turned on...

Strhopper

True dat! AA is a killer! I don't even use it in most games. I game at 2560x1600.

Bun

Or for 120Hz 2D gaming.

Valor958

Still unnecessary. To play a game on high settings (not necessarily max, but high), one 670 will be plenty. If you choose to go higher, you 'need' more, but only to fulfill your want. Gaming is hardly a 'need' anyways, and being able to play a game with great graphical quality doesn't take a ton of hardware anymore. BF3 pushes the envelope some, but the 7970 and 680 are enough to satisfy 90% of gamers who want great graphics.

illusionslayer

No one needs graphics cards. We don't even need computers.

So your "no one needs more than a 670, because no one actually needs it, they just want it" logic is garbage.

Valor958

Your faux-rage/QQ completely ignores the point of what I said. I wasn't speaking of philosophical want/need scenarios, I was speaking of gaming want/need scenarios. My logic is actually fairly sound in that those who 'need' dual/tri graphics solutions are generally enthusiasts with a lot of money to blow, indulging their 'want' for higher FPS or higher benchmark scores. Getting the full game experience with graphics that impress can be achieved fairly easily with one current-gen card.

And, for the record, I don't game at 1024x768, and haven't in a very long time. A GTX 460 still has plenty of power, and I have NO issues at all with any games I play. Playing Skyrim with everything maxed and the HD pack does slow my system down a bit, but that doesn't affect my experience in the game, and since beating it I don't play it anymore anyways.

On top of all that, playing with higher graphics settings doesn't make you a better player. I prefer the eye candy, but sometimes it can get in the way due to the added 'realism'. An old friend of mine would play BF2, and now BF3, on low settings (on a high-end PC) to cut out some of the distracting effects and make enemies 'pop out' easier. He always does well, and is usually in the top three when he plays.

To each their own, but SLI is not a 'need' by a long shot to enjoy a game.

limitbreaker

You're forgetting what you said? Let me quote you: "680's/690's, and the AMD equivalents are for bragging rights only." I pointed out that that's not true; some people actually need that power and more, and it has nothing to do with bragging rights. OK, maybe you don't "need" a $1,000 card to play any of today's games on a 1080p screen, but you also don't "need" an HDTV to watch a movie; an old analog screen will tell you the story just the same. That doesn't mean we should go on forums and tell everyone that innovation is useless and that we should all stop caring. If everyone listened to you, technology would be 20 years behind.

Peanut Fox

Even if it doesn't beat the 7970 and the two are just close, at $100 cheaper... ouch.

Strhopper

+1

Bun

The 680 is looking more and more like it really was meant to be a 670 Ti.

Supall

Someone willing to explain to me how this is possible? I'm assuming they're comparing reference cards. If the 7970 can hold its own against the 680 clock for clock (as is apparent when you OC the two), how can the 670 beat the 7970? And would that mean the 670 should outright beat a stock 680 when OCed?

jcollins

That's what I was thinking. If the 680 is toe to toe with the 7970, and the 670 beats the 7970, why would I spend the extra $100+ to get a 680 over a 670?

Danthrax66

The 7970 can't really hold its own; most reviews showed the 680 beating it with launch-day drivers, while ATI had already had a few months of driver improvements. And in a lot of tests it destroyed the 7970.

USraging

Yea... I don't really think this article is true to performance. Like others have said, the 7970 only loses by a few frames to the 680, so how can the 670 get higher frame rates? The real selling point for the 680 is half the power draw of last gen, which in turn means less heat.

Danthrax66

They actually aren't that close anymore now that new drivers are out: http://hexus.net/tech/reviews/graphics/39013-asus-geforce-gtx-680-directcu-ii-top/?page=3 has results for both the beefed-up Asus card and the vanilla 680. Also note that the 7970 sitting above the 680 in those tests is actually CrossFire, and in some games the 680 gets close to that performance.
