Nvidia GeForce GTX 680 May Launch Ahead of Schedule

43 Comments


8IronBob

I kind of find it hard to believe: when the 3GB GTX 580 came about, that was the card every high-end user was after... then the GTX 590 came out, and the price tag scared away everyone but the most serious graphaholics. So now we have the GTX 680 due out, and I have to question whether this upcoming GPU will have what it takes to be the new GTX 580 in terms of performance... Seems rather iffy to me.

However, considering that this one will only have 2GB of RAM, that makes me wince, since the higher-end GTX 580 has 3GB, and we all thought that was the holy grail. What are everyone's thoughts on this?


JCCIII

Even 3 GB of RAM, with today's texture mods, is minimal.

Skyrim, with my three GTX 480s, is running out of memory, majorly!

With the enthusiasm for mods, 3 GB is the minimum. I could not believe Maximum PC said, in last month's issue, that 3 GB was merely bragging rights and not even needed. What? I had to wipe my eyes.

If they do not put the boot back into maximum, I am going to be calling my issues MinimumPC when they arrive.

I thought it was odd that the 480s were released with 1.5 GB of RAM, and now I know why.

Do not buy high-end cards with less than 3 GB; believe me.

Sincerely, Joseph C. Carbone III


Mombasa69

Well don't forget about PCIe 3.0, oh and a motherboard that uses PCIe 3.0, then a new CPU that uses PCIe 3.0, oh and of course a new GPU. $$$$$$$$$££££££££££££££$$$$$$$$$$$$$$$£££££££££££££££$$$$$$$$$$$$$

And lastly, no game developer is actually going to bother to program games that use this new technology, well, not until the new Xbox 720 comes out in a few years' time. Might as well go to the toilet now and flush your cash straight down it. =)

My 3-way factory-OC'd 570s can run any current game maxed out, no problem. My last upgrade cost me a fortune; I ain't gonna be stung again anytime soon. Going to wait and spend my cash when I need to.


Nimrod

I have a one-million-K monitor and these cards suck. I can't even play Tetris at my resolution with three of these in Crossfire. Don't buy them; also, they overheat if you need to take the fan out.


nadako

And I found this chart posted; I wouldn't say that everything here is fact. Take it with a grain of salt.

Model    SKU    SP x ROP = cores   Clocks core/shader/mem (MHz)   Blend   TMU   Bus (bit)   Memory         TDP    Price (USD)
GTX 690  GK100  32 x 24 = 768      825 / 1650 / 4600              72      96    512         2.00GB GDDR5   270W   $499
GTX 680  GK100  32 x 22 = 704      750 / 1500 / 4200              66      88    448         2.00GB GDDR5   230W   $399
GTX 670  GK104  48 x 12 = 576      825 / 1700 / 4600              48      96    384         1.50GB GDDR5   190W   $299
GTX 660  GK104  48 x 11 = 528      725 / 1450 / 4200              44      88    320         1.25GB GDDR5   155W   $229
GTX 655  GK104  48 x 10 = 576      650 / 1300 / 4000              40      80    256         1.00GB GDDR5   130W   $189
GTS 650  GK106  48 x 6  = 288      800 / 1600 / 4600              24      48    256         1.00GB GDDR5   105W   $149
GTS 640  GK106  48 x 5  = 240      700 / 1400 / 4200              20      40    192         1.00GB GDDR5    80W   $119
GT 630   GK108  48 x 3  = 144      750 / 1500 / 2000              12      24    128         1.00GB GDDR3    60W    $89
GT 620   GK108  48 x 2  = 96       700 / 1400 / 2000               8      16    128         1.00GB GDDR3    45W    $69
GT 610   GK108  48 x 1  = 48       700 / 1400 / 2000               4       8     64         1.00GB GDDR3    30W    $59
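
If you want a rough sense of what shader counts and shader clocks like these imply for raw throughput, the usual back-of-the-envelope figure is shaders x shader clock x 2 FLOPs per clock. A quick Python sketch using the rumored GTX 680 row above and the actual GTX 580 for comparison (treat the 680 numbers as unconfirmed):

    # Rough theoretical single-precision throughput: shaders x shader clock x 2 FLOPs per clock.
    cards = {
        "GTX 680 (rumored)": (704, 1500),   # shaders, shader clock in MHz, from the chart above
        "GTX 580 (actual)":  (512, 1544),   # current Fermi flagship for comparison
    }
    for name, (shaders, sclk_mhz) in cards.items():
        gflops = shaders * sclk_mhz * 2 / 1000
        print(f"{name}: ~{gflops:.0f} GFLOPS")   # 680: ~2112, 580: ~1581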


dgrmouse

If this is accurate, NV is in huge trouble.


THE_REAL_MAVERICK

Now, let's see the GTX 580's price drop like a rock. Well, we can only hope. :)


mortalrage

I think it's a bogus rumor to slow down AMD's sales. We have seen Nvidia do this before...


newegg911

To be honest, everyone I know either has more horsepower than they need or simply isn't interested in graphics. I know we need progress to keep the industry going, but it seems like hardware is so insane these days that people aren't even using what they have. People blame the consoles for this a lot, but I think they are a symptom rather than the problem. I think things just progressed so fast in the '90s that we can't keep up that pace without getting totally burnt out.

 

Perhaps I'm overly cynical and psychologically projecting it onto the world, I dunno.


0ly1r3m@1ns

Well, that is true, unless you're running three displays; then you need that extra horsepower.


Baer

I am. I run three 24" 1920x1200 monitors at 5760x1200 using a pair of overclocked GTX 580s, and they let me max out almost everything.

I am all for the next-gen GPUs, but I just do not have the incentive or need to shell out $1,500 or so unless there is a game that needs it, and lately that is not the case. The last few generations of games have basically been console ports that my two GPUs can handle just fine, even in three-monitor surround.
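
For perspective on why triple-monitor surround needs the extra horsepower, here is the pixel arithmetic for the setup described above versus a single 1080p display (simple math, nothing card-specific):

    # Pixels per frame: 3x1 surround of 1920x1200 panels vs. a single 1080p display.
    surround = 5760 * 1200      # three 1920x1200 monitors side by side
    single   = 1920 * 1080
    print(surround, single, round(surround / single, 1))   # 6912000 2073600 3.3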


kamikaji

Or if you want to play Crysis, BF3, or Metro 2033 on max settings. Those games really ramp up GPU usage like crazy. A GTX 580 can run all of them at around 30-60 FPS with everything turned up. I can't imagine any less powerful card being able to run them at the highest settings.

 

So yes, people DO use the really powerful cards.  They aren't just a waste.


The Corrupted One

You can run anything with two 6970s in Crossfire.

Hell, unless you have a giant screen setup, two 6870s will do the trick most of the time (I need to get a second 6850 in my rig).

If you need more than two GTX 590s, you are either brute-forcing your way into NORAD or Bitcoin mining.


aarcane

I hope it brings down the 500-series prices. To hell with ATI/AMD parts; they're crap anyway. I know there's little if anything on the market that can use these cards, but progress for the sake of progress is always good for the future! Once this console-port fad is over, we'll all be glad for the Fermi series and beyond.

Beyond the release date, I'm more interested in MaxPC getting their hands on a preview part and seeing what this bad boy can do, and whether it's worth the $700 apiece that I'm sure it'll debut at.


warptek2010

AMD parts are crap? How exactly do you justify such a statement? Benchmarks only? I am writing this on a computer I built with AMD, Gigabyte, G.Skill, Crucial, WD, and Seagate parts. I suppose, according to you, the AMD parts (1100T CPU and Radeon HD 6950 video) are the weakest link in the chain? Sorry, but I've had this computer running for almost six months straight. Not ONE time have I ever blue-screened, not once has it ever refused to run ANY DX11 title on ULTRA settings with smooth-as-butter frame rates, nor have I had ANY issue related to the AMD parts or any other parts at all. So go right ahead and buy the very latest and greatest and pay premium prices for power you don't need and won't need for at least three more years.


solomonshv

AMD parts are UBER crap. Their crappiness is unprecedented. I made this account on MaximumPC just to tell you how awful they are.

I had a pair of MSI HD 6950s in CF with an Intel 2600K CPU and 16GB of Corsair memory (1866MHz CL9). When running games like Crysis 2, Metro 2033, and BF3, the micro-stutter was unbearable. At first I had an OCZ 120GB MAX IOPS SSD and thought it might be the problem, so I got a Corsair 120GB Force GT 3; same damn thing. Tried swapping memory, motherboard, PSUs, etc. Then got a pair of Sapphire HD 6950s. Nothing helped. Same thing.

After the Catalyst 10.10 driver failed to fix anything, I changed to a pair of GTX 570s (one ECS and one EVGA) and was in instant gaming heaven!

I don't care about the FPS count; all I care about is a smooth and enjoyable gaming experience. AMD can't provide that. As a bonus, the Nvidia cards were cooler, quieter, and used much less power because they freaking worked right. I have an APC unit that tracks the power being drawn from the wall.
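
Micro-stutter is about frame-time consistency rather than average frame rate, which is why the FPS counter alone can look fine while the game feels rough. A minimal sketch of how you could quantify it from a per-frame time log (the numbers below are made up for illustration; a tool such as FRAPS can record the real log):

    # Quantify stutter: compare average frame time against the worst-case percentile.
    frame_times_ms = [16.7, 16.5, 33.9, 16.8, 17.1, 34.2, 16.6, 16.9]   # hypothetical log

    avg = sum(frame_times_ms) / len(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]  # ~99th percentile

    print(f"average: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
    print(f"99th percentile: {p99:.1f} ms -- smooth play wants this close to the average")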

I ended up putting all the worthless AMD cards on flea bay, item number of the last one that I sold: 300621476937

Understand that money isn't an issue for me and I can afford to blow a butt-load of cash on this; at one point I had five HD 6950s. All were equally worthless. I've had performance problems in the past with ATI cards too. Never going ATI/AMD again. They can consider one of their insane money-burning customers gone.


lichi110

Thing is... it's all based on personal experience. In my case, I purchased two 6970s, which I have in Crossfire... and I am not at all impressed. Only BF3 runs as it should; I get 20 fps when driving in Saints Row: The Third, I have to disable Crossfire to get good performance in Assassin's Creed: Revelations, Rage was a pain in the ass, and I get stuttering in Arkham City... I have clean-installed drivers I don't know how many times so far... so I understand how some may think AMD is not good.


DU00

Same kind of experience here. The hardware is capable, but their multi-GPU support in the drivers is horrid. Granted, I'm still on 5850s, but I'd expect them to perform a little better in some games than they do, which I think is all down to the drivers.


aferrara50

This isn't the flagship card. Nvidia is launching the midrange cards first. The GTX 780 will be the flagship and is coming out close to summer. This should compete in the high-$200 to low-$300 bracket.


austin43

Um... no. "This is supposed to offer similar performance to ATI's 7970." The 700 series will probably come in 2013. This will be a $500-$700 card. They won't just skip a whole hundred-series; they'll stick to the x60/x70/x80/x90 nomenclature.


aferrara50

You obviously haven't seen the roadmap. The 600 series is being released for OEMs and mobile, while the 700 series will be for desktops. Low end coming out first, then mid, and finally high end to follow, which we should see around summer. The dual-GPU versions will be out next winter.


kiaghi7

Apparently it is YOU who hasn't seen the roadmap...

http://en.wikipedia.org/wiki/GeForce_600_Series

 

Entry-level GPU GTX 650

Mid-range GPU GTX 660, GTX 660 Ti

High-end GPU GTX 670, GTX 680, GTX 690

 

The future 700 series isn't going to be on a Kepler-based chip anyway; by then it will be "Maxwell," and it will also be 2013, as their own roadmap clearly shows.

 

http://www.tomshardware.com/news/fermi-kepler-maxwell-gigaflop-watt,11339.html

 

 

Please in the future, if you're going to pretend to know everything, start out by knowing something...


aferrara50

And there is this, too:

 

http://cdn.overclock.net/7/79/600x393px-LL-79132af7_133a-635x416.jpeg


kiaghi7

You do know that link proves my point right?

 

Here, let me educate you some more:

!BY YOUR OWN LINK!

 

GK112 is slated for 2013; that's the 780 card. The GK110 is a dual-processor card based on dual GK104 dies...

GK107 is their mobile chip:

http://www.legitreviews.com/news/11583/

 

 

Now if you need more education, here you are:

http://www.madsmik.dk/wordpress/tag/gk106/

http://www.madsmik.dk/wordpress/wp-content/uploads/2011/07/Nvidia_Kepler_GPU_Specifications.png

 

You know, it looks an AWFUL lot like every last one of those GK### monikers is taken by a 600 series card save for GK112... Meanwhile the 680 is very clearly noted... So what does that leave?!?!

 

No don't worry, I know you don't know, so I'll help you!

 

Yes indeed, the GK112 chip is Maxwell! That's why it's a 2013 product, perfectly in line with every roadmap and document from Nvidia! And note how it is radically more powerful than even the 680? That's because it's not a 680, but rather a 780...


aferrara50

GK is Kepler... GF was Fermi. GM will be Maxwell. GK covers all Kepler-based GPUs.


aferrara50

If you read the link posted, the 580 and 590 replacements are both dual-GPU cards. If it were Maxwell, it would be a single GPU.


kiaghi7

Indeed, READ the link; you will see the 580 replacement is -NOT- a dual-GPU card, and the dual-GPU card is the GK110...

Please, this is really just getting sad at this point...

I'm teaching you the entire subject, WHILE you're trying to pretend you aren't/weren't completely wrong...


aferrara50

/facepalm. That came out in 2010. How about something recent? Here it is. 

 

Before being rude, you might actually want to read. Follow my builds on OCN and you'll see that, for the most part, I do.

 

http://www.tweaktown.com/news/21936/leakedtt_nvidia_to_skip_600_series_jump_straight_to_geforce_gtx_780_did_i_mention_it_is_nearly_twice_as_fast_as_the_gtx_580/index.html


kiaghi7

Necro-posting, but I want to make sure you're properly embarrassed for being such a troll about it:

http://www.maximumpc.com/article/news/nvidias_full_kepler_lineup_leaked_web

BO-TO-THE-YAAAHHH!!!

In other words, you were 100%, completely, utterly, and absolutely WRONG on every single front!


kiaghi7

Please read your own link. First, it's not even remotely official; the author goes on and on about theory and presumptions based on absolutely nothing. Also, it doesn't say what you think it does, for example:

 

"Kepler-based GTX 780"

 

We know that the 780 is NOT going to be a Kepler chip; that's what Maxwell is about... Nvidia's own official documents and roadmap clearly spell that out...

 

Just because you wanted to be "Mr. Know-it-all" doesn't mean you knew this subject... which apparently you don't, because we are having to teach it to you while you argue back with links that don't even say what you think they say.


austin43

From the tweaktown article...

 

'Remember: This is just a leak, rumor, so TweakTown advise that you're up-to-date with your grains of salt and skill levels of pinching it.'

 

We're all playing off conjecture, but it would make less sense for them to skip an entire hundred in their naming convention, no?


aferrara50

Nvidia went from the 200 series to the 400 series for consumers and had a 300 series for mobile and OEMs. It wouldn't be anything new to do the same with the 600 series for mobile and OEMs and the 700 series for consumers.

 

Either way, this isn't Nvidia's flagship card, no matter what it's named. The 2GB of VRAM is near proof of that.


voltagenic

There was never a GTX 380 released or even announced, which leads me to believe that the 600 series will be targeted at normal consumers, and perhaps the 700 series at OEM/mobile only.

ALSO, since when does NVIDIA release the low-end cards first?!? I've never heard of or seen such a thing before. Maybe I just don't remember correctly, but I could've sworn that it has always been the flagship card first (for example, in the 400 series, the GTX 480), followed by midrange and then low end.

 

Don't make me find links to follow this up to prove it.  I'm just right :P

 


aferrara50

No GTX 380 existed, but the 300 series did. There's no way in hell this is the flagship. They might as well burn the card now if it's a flagship with only 2GB of VRAM. You can't even play BF3 on a 2560x1440/1600 monitor with that much VRAM without issues.

http://www.engadget.com/2010/01/13/nvidia-outs-300m-mobile-graphics-series-causes-little-excitemen/

Naming has changed so often in the last few generations that "GTX" doesn't really mean much, since it wasn't always the highest end. Remember the GX2 cards? Most likely this will be the highest-end mobile GPU.
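
For a rough sense of where 2GB goes at 2560x1600, here is a back-of-the-envelope sketch of render-target memory alone; the buffer count and MSAA level below are illustrative assumptions, not BF3's actual pipeline, and textures still have to fit on top of this:

    # Illustrative render-target math at 2560x1600 (assumed buffer count and MSAA level).
    width, height = 2560, 1600
    bytes_per_pixel = 4            # 32-bit color or depth
    buffers = 8                    # G-buffer layers, depth, HDR target, post buffers (assumed)
    msaa = 4                       # 4x MSAA multiplies the storage for those targets

    render_targets_mb = width * height * bytes_per_pixel * buffers * msaa / 1024**2
    print(f"~{render_targets_mb:.0f} MB in render targets alone; textures come out of what's left of 2GB")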


rawrnomnom

He is (right)... Nvidia launching a card that can only match the card that just beat its flagship, and then putting out a midrange card instead of trying to take the title back, is stupid at best... Every day they delay their new flagship is a day they waste and lose sales to AMD... And never in the history of the company has Nvidia released a low-end or midrange card before releasing its flagship; why would they start now...


aferrara50

Because that's what they are doing, and here is the link.

 

http://cdn.overclock.net/7/79/79132af7_133a-635x416.jpeg

 

You'll see that the replacement for the 550 comes soonest. Nvidia delayed its flagship launch with the GTX 480 as well, so the gap between ATI's flagship launch (at the time) and Nvidia's was six months: the 5870 came out in September, while the GTX 480 came out the next March.


Baer

(double post, sorry)


Baer

Sounds good, but why do we need it? Most games being released lately are watered-down console ports that the present generation of GPUs hardly breaks a sweat running.


aferrara50

Because even 4-way SLI GTX 580s can't run 7680x1600 surround at maxed-out settings in newer games. TBH, even the 7970s aren't fast enough for higher-end monitor setups.


mattman059

I'd be interested in statistics showing how many people actually play on those resolutions.


SPreston2001

I agree; while I'm sure there are some people who game with extreme settings like that, most people do not. My GTX 590 slaughters everything I throw at it, albeit I only game on a 50" 1920x1080 LCD. Still, I really see no need to upgrade anytime soon. But I'm all for better, newer technology! I just wish game devs would start making software to push these beastly graphics cards we have now!


mattman059

Double post


ceator3571

My same thoughts. :(
