Nvidia GeForce GTX 780 Still Headed for a May 23 Release

36 Comments

Crazycanukk

I skipped an upgrade when the 680s were released and went with two GTX 580s instead.

Yes, yes, I know: why upgrade to a 780 when 580 SLI will do just as well at 1080p? For me there are other factors I look at as well.

I will be moving the two 580s into two other PCs that badly need upgrades (living room gaming PC and office PC), and TBH the drop in power consumption and heat is another factor I'm considering, although it forms a very small part of my overall thinking. Keeping two GTX 580s cool isn't easy: my fans are regularly turned up to 75%+ during heavy gaming just to keep both cards (the top one especially) below 85 C. Yes, they are cleaned; no, I will not spend money on aftermarket GPU coolers; and yes, my Antec Twelve Hundred is well ventilated with no fewer than seven fans, good cable management, and regular cleaning. I know the cards can safely run hotter than 85 C, but I don't like my stuff running hot at all.

Plus I can cut my power requirements versus SLI 580s almost in half, and that is significant considering how much time my system is on. At full load when gaming, my UPS tells me it's pushing 760 W to my system. That's a lot of juice. With a GTX 780 I'm guessing (extrapolating from Titan specs) that will drop to approximately 450 W, or potentially slightly less. That's like turning off three 100 W light bulbs, and with the hydro rates we pay, oy vey! That's a decent savings over a year's worth of gaming, IMO.
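
For perspective, here is a minimal sketch of that savings arithmetic in Python. The hours-per-day and $/kWh figures are illustrative assumptions; the comment doesn't state either.

```python
# Rough annual-savings estimate for the ~310 W drop described above.
watts_saved = 760 - 450    # W at full load: SLI 580s vs. estimated GTX 780 draw
hours_per_day = 4          # ASSUMED heavy-gaming hours per day
rate_per_kwh = 0.12        # ASSUMED electricity ("hydro") rate in $/kWh

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * rate_per_kwh:.0f}/year saved")
# -> ~453 kWh/year, ~$54/year saved
```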

I see speculation that the GTX 780 won't be much above the GTX 680 in performance. Let's not put the cart before the horse and tear each other's faces off until we see some benchmarks.

If there isn't much of an increase in performance (say 10-15% maximum) and this looks like more of a "tock" release in the tick/tock cycle of development (which it appears to be), then I'd say wait for the 680s to come down in price and buy one of those, which is an option I'll explore in the interim. Or wait and see when the new architecture comes out, which could be a year or more from now, weighed against what games are coming and what performance I expect from those titles.

Either way, I don't think anyone should rush out on day one and hand their cash over. It should be a wait-and-see: check the benchmarks and user feedback from the people who did buy it at release.


John Pombrio

I want to see the benchmarks before changing out my GTX 680 (running at 1084 MHz).


shommy2002

John Pombrio, why do you upgrade every time a new generation comes out? Instead of wasting your money, donate to a charity, because wasting money is all you're doing. I know it can be interesting to read up on all the latest technology, but buying it just so you can get your e-peen on and brag about benchmarks doesn't make sense and isn't practical. At least wait for a generation skip or something.


John Pombrio

Waste money? No, not really. With two sisters, four in-laws, two grown sons, eight nieces and nephews and a bunch of their kids, and a retired friend from HP who loved my gift of a full Sandy Bridge machine with a Dell 24-inch monitor, I keep them all in desktop computers. My "old" graphics cards go skipping down the line until the four- and five-year-old cards at the far end are finally retired and recycled. Right now a nephew is off to WPI, so his graduation gift will be a full-blown machine built from my spare parts.

I have also learned to keep fairly up-to-date spares of new parts (bought when they're on sale) in case something on somebody's machine goes bad. Keyboards, headphones, and power supplies are at the top of my list. After my son lost his GTX 680 a couple of months ago, that too would be worth having a spare of.

As I do have the money, I give to Doctors Without Borders and help support my late wife's mother, who has only Social Security to keep her afloat.

Besides, what I spend on computers pales in comparison to, let's say, a big sailboat. The "mortgage" payment, taxes, docking, storage, and upkeep alone on a friend's boat cost more per year than I have spent on ALL of my computers ever. My hobby is a cheap one (and it keeps me home at night, heh).
Good enough?


captainjack

This is for you, John.
http://www.ski-epic.com/gifs/g007_citizen_kane_slow_clap.gif


Crazycanukk

We think the same, John.

I game because it's:

1) Fun and enjoyable (most important)
2) Cheap in a relative sense. I have friends who spend more on one trip to the mechanic to fix a snowmobile (which they use for maybe 12 weeks here, and ONLY if there is enough snow) than I spent all year on gaming.

I upgrade for other reasons and do new builds. I don't have family to pass my old PC components on to, aside from my mother, who gets an upgrade every few years. But I will donate my old desktops to charities or help out a friend if they need it.

I upgrade because I like to keep my system current within a two-year window, just in case I find myself short on money due to job loss or illness. If something happens, at least I have a reasonably current system that will last me through a few years of new releases, and enough "current" parts in a few boxes that if something dies, I most likely have a working replacement without needing to spend money.

I hope this GTX 780 has some decent benchmarks compared to a GTX 680, since I skipped the 680 upgrade and went with SLI 580s instead. I'm not asking for a 50% increase, but 20% would be nice.


pixelpopper

I currently own a factory-overclocked Asus GTX 580 with 1.5GB of GDDR3. I plan on retiring it to a second machine and purchasing one of these bad boys for my main PC. Looks like a decent upgrade.


pixelpopper

Meant GDDR5, lol.


shommy2002

Why would you want to upgrade from a GTX 580? I have a factory-overclocked GTX 570 1.2GB, and it plays every game at maximum settings at 1080p except for maybe one or two titles. I'm probably going to wait till the 800 series, or until there's a sudden increase in game requirements from the next-gen consoles coming out at the end of the year.


Baer

And do not forget those of us who run surround. Getting high frame rates at 5760x1200 takes some serious memory on the GPU. (1080p? What a toy.)
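
A quick sketch of why surround is so much heavier than 1080p: shading work scales with pixel count, and every render target grows with it too. The byte figure below assumes a plain RGBA8 buffer and ignores AA and intermediate targets, which multiply the cost.

```python
# Pixel counts: triple-monitor surround vs. a single 1080p display.
surround = 5760 * 1200   # 6,912,000 pixels
single = 1920 * 1080     # 2,073,600 pixels
print(f"{surround / single:.1f}x the pixels to shade per frame")  # -> 3.3x

# Rough framebuffer cost at 4 bytes/pixel (RGBA8); real games keep many
# such buffers plus z-buffers and high-res textures, which dominate VRAM.
print(f"{surround * 4 / 2**20:.0f} MiB per full-res buffer")      # -> 26 MiB
```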


shommy2002

Not everyone has that kind of money to burn on a gaming system... Besides, playing on a 70-inch 1080p TV is better.


vrmlbasic

But if you're spending serious money on GPUs, why not buy some serious monitors? 2560x1440 and higher per monitor, or bust.


praetor_alpha

This. I found that out recently.


russellmorris52

Cool card, but I'll be getting the 7990.


shommy2002

If I were you, I'd wait till AMD fixes their microstutter problem. I hear it's very noticeable on that card. It can make the frame rate feel much lower than it actually is.
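
A minimal sketch of why microstutter makes average FPS misleading (the frame times below are made-up illustrative numbers, not measurements from the 7990):

```python
# Two hypothetical frame-time sequences with the same average FPS.
# Alternating short/long frames (classic multi-GPU microstutter)
# feel far choppier than the average suggests.
smooth = [16.7] * 8        # ms per frame, evenly paced
stutter = [8.0, 25.4] * 4  # same mean frame time, uneven pacing

for name, frames in [("smooth", smooth), ("stutter", stutter)]:
    avg_fps = 1000 / (sum(frames) / len(frames))
    worst_fps = 1000 / max(frames)  # the perceived floor
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps")
# smooth:  avg 60 fps, worst frame 60 fps
# stutter: avg 60 fps, worst frame 39 fps
```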


devin3627

This is a joke. These cards aren't worth their price for the performance they give compared to the previous gen. 4GB cards should be standard by now. Nvidia sucks for still wanting to push 1GB cards. If AMD ups the gigs, I might just leave Nvidia for the competitor. All Nvidia cares about is performance, and I mean it: they push less memory and more of their "graphics" chips. That's why it takes a console to push the memory limit. I had 1GB several years ago and still have it on a 560. I'm sorry, but Nvidia needs to push their standards on memory, not performance. Even if Nvidia isn't selling memory, they need to push it so their graphics chip can be utilized. The 1GB 560 I have isn't doing the job, because games now need 2GB or they run into problems.


Chronologist

Nvidia isn't using anything less than 2GB of GDDR5 on their GTX 660 non-Ti GPUs and above. And saying that you had 1GB "several years ago and still have it on a 560" is entirely idiotic, as the GTX 600 series has been out for quite some time now. Additionally, the AMD 6000 series mostly ran 1GB cards, except for their most powerful models. And again I ask you: show me a game that eats into the full 2GB of memory at 1080p. If you're running multiple monitors, or higher-resolution ones, you shouldn't be using a bang-for-your-buck card, but rather a multi-GPU setup, or at least a single GPU designed to be put into a multi-GPU setup (with extra memory).

And I still have no idea what you mean by "Nvidia needs to push their standards on memory and not performance." Specs DO NOT equal performance. The term performance encompasses EVERYTHING, including the software drivers and memory.

As for your statement that "these cards aren't worth their price for performance": it's obvious that Nvidia is still banking on the 600 series to fill the price points BELOW the 700 series. Both GPU makers have been doing this, using previous-gen GPUs to fill lower price points until the new series can completely pan out. If they set the price of the 780 comparable to a 680 or a 7970, they would only hurt their own overall sales. In doing so, they would essentially be telling consumers, "Look, these GPUs are now entirely obsolete, and the new generation about to come out is going to destroy them." While that should ALWAYS be true, it isn't the marketing angle any GPU maker wants. They want to move as many last-gen parts as possible to make way for newer cards.


devin3627

I'm fine; I'll wait for the Maxwell architecture. With stacked memory we'll have more of it. The current-generation 700 series is still Kepler (not worth it).


devin3627

Consoles reflect the future-proofing of PCs. The next gen has 8GB of GDDR5 memory, and their video power is two years old. Video cards aren't going anywhere with DirectX 11, which will be around for a long time, especially when it's embedded in the next console. If you want to be future-proof, go for more memory on your video cards, because that's what game designers are asking for. It's about how much the user sees, not how pretty the screen is. I don't care how smooth or graphical the effects are; I care about detail on-screen.


devin3627

Max Payne 3 needs 2GB to run at full settings. I went from a 1GB 260 to a 1GB 560; still little performance difference.


Chronologist

Max Payne 3 needs that at resolutions higher than 1080p. Additionally, a GTX 260 had GDDR3, while a 560 has GDDR5. And if you honestly couldn't tell the difference playing games between a GTX 560 and a GTX 260, you should probably get your eyes checked.

Also; "thats why it takes a console pushing the memory limit" Are you kidding me? Really now? Lets take the supposed specs of a PS4, and punk them down next to an Intel Sandy-bridge E machine. Simple comparison here, folks. PS4: 8gb of shared GDDR5 memory. PC: 128 gb of DDR3 memory, and up to 6gb GDDR5 memory from the GPU. Who's pushing the memory limit? PC's have got so much memory we're making RAM discs.


shommy2002

Who exactly is making RAM disks? The 0.1% of all PC users?


devin3627

I'm telling you, the graphics market is taking baby steps. This stuff is going to be a letdown in the near future. 1080p hasn't been fully exploited; much more can be done on screen. Even with shared GDDR5 memory, it's direct, hand-in-hand CPU/GPU. You are only as strong as your weakest link, and if you don't have the 8GB of GDDR5, you aren't living the full experience. The average DLC will be a little less than 8GB. PCs aren't utilizing the memory bandwidth or quantity they could be. ALL PC GAMES ARE PORTS OF CONSOLES. The future will change with the next console releases. Don't waste your money on these cards; they are the last of their make.


Chronologist

The console port trend is shifting. Look at BF3. Made for PC, scaled down to consoles. Hell, look at Steam. Look at the F2P trends.

And what in the world does "the average DLC will be a little less than 8GB" mean? If you don't know the difference between storage and RAM, I have no words for you.

Also, next-gen consoles ARE essentially PCs now, with the AMD-sourced hardware and the PS4's x86 approach.

"PC's aren't utilizing memory bandwith, or (quality?) they could be." Yes, that has ALWAYS been the case. Consoles are weighing PC's down with their long refresh cycles.

"1080p hasnt been fully exhibited. much more can be done on screen". I can predict the future too: Graphics will improve over time. All your broad generalization says is "Technology will improve". I'm pretty sure everyone knows that.


devin3627

When I said a little less than 8GB, I was talking about entire worlds in 8GB. What we don't know is that every multiplayer match shares a 512MB map/level/world or whatever you call it. When playing online, each area must fit in blocks of 512MB on current consoles so that everyone sees the same thing, like Call of Duty zombie maps. Like you said about cutting to the point: we should NOT purchase a video card when the second console is being announced this June. It's Nvidia consumerism, clearing their inventory of long-overdue crap. Nvidia complained about console hardware while their own cards carry a high price tag just to look pretty.

Microsoft is the boss, and game developers have struggled with memory quantity and coding for the longest time. The 32MB of ESRAM is something people take for granted. That's not there for shits and gigs; PCs don't have that, and YES, that's pure shared video memory right there. It's not the bad type of shared memory we know of. ESRAM is pushing 8GB of pure GDDR5 bandwidth, and that means vivid textures and details. Even if you have to throw anti-aliasing to the curb, it still looks better than a dull, bare room on the 360. It's a different path, and more memory is a very creative way to shape the gaming future. We are going to look back and see how bare our older games' environments were. No more limited creativity for the developers.

We are going to look back at these series and realize we threw away money during the big jump in technology from AMD's hardware in the consoles. Sorry, I was an Nvidia fan, but from the 260 to the 560 it wasn't worth the jump. I still can't play Assassin's Creed on even medium settings on a Black Edition Phenom quad-core.


Cube

Why?

They already have the Titan, and it runs everything I've tried on it just fine.


shommy2002

You don't even have to go anywhere near a Titan to max everything out, unless you're playing at some ridiculously high resolution.


gamecrusader

I'm the OCD of us!! :)


iplayoneasy

Too expensive. Too freaken expensive. Considering how well the current generation destroys the current round of games, I can see only the OCD among us upgrading from their GTX 680s.


shommy2002

I agree 100%; this sounds like a huge rip-off. You're probably only going to get about a 10-15 FPS lift over the 680...


btdog

Couldn't agree more. A year ago, Nvidia bucked this trend by introducing the 680 at $500 (AMD was pricing the lesser, non-1GHz 7970 at $550 at the time). Now it's $1300 to SLI; that's the cost of a solid gaming build.

Disappointed. That's all I can say.


TheFrawg

Current-gen games run at maximum settings on SLI'd GTX 680s. While I'm always eager for faster hardware, I can't see spending $1300+ unless I can see the performance increase.


praetor_alpha

My GTX 680 can't maintain 60 fps at 5760x1200 in FC3 Blood Dragon on medium/high settings. A better card is still useful.


shommy2002

ROFL, go cry me a river.


iplayoneasy

A second 680 would be even more useful. Two of those will beat a GTX Titan black and blue every day.


Chronologist

Yep. Anyway, next-gen GPUs are never meant to target previous-generation buyers; they're meant to get people running cards two or three generations behind to make the jump.
