Nvidia Announces Dual-GPU GTX 690 Set for Release May 3rd

40 Comments

streetking

Seriously, can I proofread your articles for you?? I'll do it for minimum wage if I can do it from home... How do you miss this stuff?

JLloyd13

$1000... you could build an entire machine for that which could max out the settings on most games.

Pruneface

So, who exactly can afford this aside from JohnP???

morrowindsky

The GTX 690, as designed by the UAC for use at their Delta Labs installation on Mars.

zaternine

That's all good and all, but the real question is: can it play Crysis 3???? Huh? Huh? Riddle me that! Seriously, the only ones buying this kind of card are doing it for bragging rights, plus the others who really need this kind of power for all that quantum computing going around.

germanogre

Why bother? Really. DirectX gets updated frequently enough that, by the time any games use half this card's potential, it'll be obsolete.

My GTX 285 may only be running one HDTV, but it still handles most games pretty easily. However, I'm still stuck with DX10.

Why not get a 670 or two, whenever they're released, and start saving for DX12?

Cy-Kill

Where the hell is AMD with the 7990?

damicatz

Thanks but no thanks.

One of the things that many people don't know is that nVidia deliberately cripples the double-precision performance of the card in the drivers so that they can sell overpriced Tesla cards. I'm not about to pay $1000 for a card where nVidia will deliberately cripple the performance.

This is not the first time nVidia has done something shady like this. They also deliberately prevent GPU-accelerated PhysX from working in mixed-vendor systems (ATI and nVidia GPU both present in the same system).

Couple that with incompetently written drivers (seriously, the TDR issues have gone on far too long; either hire someone who knows how to program or open-source the drivers and let competent programmers fix the problem) and a lack of openness and transparency regarding hardware specifications (AMD releases the source code of their Linux drivers so people can write their own), and I have no intention of ever buying another nVidia product.

maverick knight

You really need to educate yourself. GeForce and Tesla are two different cards. Tesla uses CUDA-architecture cores, but it's built for computing, not graphics. Your argument that it's crippled is moot since they are designed differently. Quadro specs may look similar to GeForce, but it's far slower, so it's not meant for gaming.

Yes, the GTX 570 may be cut down from the 580, but Nvidia delivers the performance it advertises, so it's not shady at all. Most of the time, companies don't make their money from consumers but from savings on manufacturing costs. These are business tactics, and if you were a professional or educated about them you would understand. Stop spreading misinformation.

AngryDemon

I wonder why Nvidia doesn't stop making cards with two GPUs and instead create a dual- or quad-core GPU?
It seems feasible with today's tech, so why hasn't anyone done it?

vrmlbasic

The Voodoo 5 6000 returns!

steven4570

The Tesla card is not meant for gaming anyway; this card is... so I really don't know what you're talking about.

theabsinthehare

What he's saying is that if the "gaming" cards were not crippled, they would achieve close to the same double-precision floating-point performance as the Tesla cards. Specifically, they cut the double-precision performance down to a quarter of what the equivalent Tesla card can do.

If they both perform the same, then there's no reason to buy a Tesla card specifically for workstation rendering or whatever it is you need to compute; you can just save 2 grand and buy something from the GeForce line.

damicatz

Actually, the GTX 680 is even worse; they crippled it to the point that it's faster to just do DP in software.

GTX 680 DP performance is 1/24 of SP performance.

It would be one thing if the Tesla used a different GPU that was genuinely faster at DP. It doesn't. The GK104 is perfectly capable of faster DP calculations, except that the driver intentionally slows it down.
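
If anyone wants to see that gap for themselves, here's a minimal CUDA microbenchmark sketch; the kernel, launch dimensions, and iteration count are my own illustrative choices, not anything from Nvidia. It times the same dependent FMA loop in float and in double and prints the ratio:

```cuda
// Minimal sketch: compare single- vs double-precision FMA throughput.
#include <cstdio>
#include <cuda_runtime.h>

template <typename T>
__global__ void fma_loop(T *out, T seed, int iters) {
    T a = seed + T(threadIdx.x);
    T b = seed * T(1.000001);
    for (int i = 0; i < iters; ++i)
        a = a * b + b;                                   // one fused multiply-add
    out[blockIdx.x * blockDim.x + threadIdx.x] = a;      // keep the result live
}

template <typename T>
float time_ms(int blocks, int threads, int iters) {
    T *out = nullptr;
    cudaMalloc((void **)&out, (size_t)blocks * threads * sizeof(T));
    fma_loop<T><<<blocks, threads>>>(out, T(1.5), iters);  // warm-up launch
    cudaDeviceSynchronize();

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    fma_loop<T><<<blocks, threads>>>(out, T(1.5), iters);  // timed launch
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(out);
    return ms;
}

int main() {
    const int blocks = 1024, threads = 256, iters = 100000;
    float sp = time_ms<float>(blocks, threads, iters);
    float dp = time_ms<double>(blocks, threads, iters);
    printf("FP32: %.2f ms   FP64: %.2f ms   FP64/FP32 ratio: %.1fx\n",
           sp, dp, dp / sp);
    return 0;
}
```

With this many threads in flight, the loop is throughput-bound rather than latency-bound, so on a card whose FP64 rate is capped well below its FP32 rate the printed ratio should land far above 1x.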

Mayhemm

That... is a thing of beauty.

jcollins

They've got enough problems getting a supply of 680s; now they want to use the few chips they have on a 690 as well? So much for a new video card before Christmas...

JohnP

Y'know, I am curious about this; I asked Brad about it in another post. The Kepler chip shortage at the fab was big news in mid-February, but not a peep since.
So the question is: are GTX 680 cards scarce because of chip shortages or because of hyper demand?
I guess we won't know until Nvidia shows us their sales figures, or at least their profit margin this quarter.
Meantime, good luck getting a Kepler card!

nbrowser

I'll take two, please, and relegate one of my currently SLI'ed GTS 450 cards to PhysX duty :)

fuddco

True, it's spendy, but some early adopters will buy it, and like all tech the price will come down, or at least future generations of cards will benefit from the R&D that went into this one. Like a Ferrari: not for everyone, but for those who can, I am envious...

Eternal

What is the point when you can get 2 680's for the same price?

bartyh5

As Erriwin said, it takes up a valuable PCIe slot. I am an ITX gamer who loves power but also small form factors. This would be a helluva card and amazing in an ITX/DTX build. Having a MaximumPC isn't always about ATX.

erriwin

The point, dear sir, is to keep moving forward. You COULD get two 680s for the same price, but that would take up two valuable PCIe slots. Whereas you could have two of THESE beefcakes and have almost the equivalent of FOUR 680s. Yes indeedy, two of these would be overkill for any game on the market right now, but 4K and even 8K resolutions are on the near horizon. And new resolutions bring all sorts of new shinies to the table: new higher-res TVs and monitors, new Blu-ray players and media, new video cameras, and, most importantly for this type of video card, new games.

JohnP

Depends on how close you sit. Higher-res monitors and TVs make sense only up to a point; our eyes simply cannot distinguish a higher resolution beyond a certain distance. The new Apple iPad needs it because it is in your face, but put it three feet away and there's no real difference. Same with TVs: 1080p on a 27-inch TV, for instance, is a total waste, just a selling point. Lots of studies back this up (you can wiki this).
Yet a 4K monitor on my desktop sounds so sweet!

twann

In 5 weeks, I will embark on building my first enthusiast PC for gaming, video editing, and my graphic design business. The last time I built my own PC (a really cheap build) was back in 2001, so all the latest technology is new to me and I've been playing catch-up. I am thinking of getting the new Ivy Bridge processor when they go on sale (although I do like the Extreme Edition Sandy Bridge processors too).

Now, I initially planned on getting two EVGA GeForce GTX 580 SuperClocked video cards. Do you think I should get this GTX 690 instead? Also, which motherboard manufacturers do you suggest I look at for the best quality?

erriwin

This is true, but not only because of how close you sit: monitor/TV size also makes a difference because of pixel density, or PPI (pixels per inch). Quick example: compare a 21" 1920x1080 monitor with a 42" 1920x1080 TV, and you should notice the monitor producing much clearer, sharper images, because the PPI on the smaller screen is higher (around ~100). The TV has a lower PPI because it's a larger screen at the same resolution. The main factor is the human eye, which can detect a max PPI of about 250. So, currently, a 9" diagonal screen is about the sharpest 1080p viewing panel you can find.

Still, bigger screens make for different experiences. I love to watch movies on my friend's projector setup because of the gigantic image thrown at me; it encompasses you, it demands full attention. Even though the PPI of his 1080p 90-some-odd-inch screen is very low (viewed side by side with a smaller version of the same media, it would seem low quality in comparison), I love the experience of massive picture and sound. If his projector could display 4096x3072 (4K max), it would be a whole new world of awesomeness. 'Course, actual movie theater projectors currently DO display in 4K, except their screens are even larger, meaning the PPI is still degraded... soon they'll be in 8K, though.

Basically, what I'm trying to say with my blather is: the higher the res, the bigger the screen can be while still hitting the max human-detectable PPI, and more pixels at higher PPI make for more lifelike images. So by the time we make it to 8K (16x more pixels than 1080p), images and movies will look more realistic than ever, and we'll need at least 35" monitors to display all those pixels in their full brilliance. Boy, will our eyes hurt, though. Also: a 27" 1080p monitor makes perfect sense at 2-3 feet away; it's a clearer image than anything larger :)
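
For reference, that ~100 figure falls straight out of the pixel-density formula (just a back-of-the-envelope check, with $w \times h$ the resolution in pixels and $d$ the diagonal in inches):

$$\mathrm{PPI} = \frac{\sqrt{w^2 + h^2}}{d}, \qquad \frac{\sqrt{1920^2 + 1080^2}}{21} \approx 105 \;\text{(21" monitor)}, \qquad \frac{\sqrt{1920^2 + 1080^2}}{42} \approx 52 \;\text{(42" TV)}$$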

Eternal

Unless your face is touching the screen, I don't see why the iPad would need a resolution that high. Even so, it seems a little ridiculous.

tekknyne

MaxPC should get a demo unit in and then have a raffle/give-away for said demo unit.

Just sayin...

Almightysarlac

Seems positively good value next to a single i7-3960X.

illusionslayer

I can't wait for other companies to start cranking these cards harder.
Something like an MSI 690GTX Lightning Xtreme would be insane.

Also, I wonder how the rumors of Big Kepler will pan out. A 685 that's decently faster and more efficient than the 680 would provide a saner option for the high end, and a dual-685 card, a 695, would blow anything AMD has to offer out of the water for a few years.

szore

May 3rd just happens to be my birthday...

Hmm...

aldenf

While $1k is a lot of money that I won't be paying for a GPU, it is interesting and pretty tech... I was hoping nVidia would release their mainstream Kepler parts. You know, products people will actually buy in numbers? My 9600GT is getting tired.

Obsidian

The design, the thermal envelope, the raw power: all awesome. Sadly, the price is just as awesome. We have to pay to play, I guess.

There are practical questions that need to be answered regarding this bleeding-edge money-maker.

  • Does it shut off one core entirely when doing normal desktop stuff?
  • What's the idle power consumption when the monitors dim?
  • Can we run 3 or 4 displays for desktop applications without sucking down a ton of power?
  • How long will this kind of technology realistically last before it burns itself down?
  • What's the real-world dBA on that fan setup when idle and gaming?

The one thing that would make this even remotely worth it is if this SINGLE card supported triple-monitor gaming in titles like Skyrim, MW3, Portal 2, current-gen racing games, etc. Has NVIDIA finally figured out a way to support multi-monitor gaming with different monitor types? It's my understanding that if you want to multi-monitor game with NVIDIA, you have to buy the exact same make/model of monitor across the entire setup.

cmasupra

One of the selling points of the 680 was that it could handle 3 monitors on 1 card. I assume the 690 would also be able to do that.

irishsj

For multi-monitor gaming, you would be much better off buying a pair of 4GB 680s. These are limited to 2GB per GPU.

szore

Nvidia's website says 4GB...

cmasupra

The card has a total of 4GB of VRAM, but each GPU only has 2GB. Each GPU also has to store the exact same copy of each texture, so you essentially only have 2GB of VRAM. It's still plenty, though, unless you have a multi-monitor gaming setup.
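
A quick way to see this is that a dual-GPU board enumerates as two separate CUDA devices, each reporting only its own memory pool. A minimal sketch, assuming a CUDA toolkit is installed:

```cuda
// Minimal sketch: list each CUDA device and its own memory pool.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s, %.1f GB global memory\n",
               i, prop.name,
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

On a 690 you'd expect two entries of roughly 2GB each rather than one 4GB device.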

Obsidian

Yes, the card as a whole has 4GB, but each chip is limited to accessing 2GB. I think that's what is being said above. It remains to be seen whether that would actually affect triple-monitor gaming.

gatorXXX

I'm sorry, but no game is worth throwing 999 smacks at just to have bragging rights. Don't get me wrong, it's a beautiful card with a helluva lotta horsepower that should kick some serious cojones, but sad to say, it's not worth it.

B.A.Frayd

You're thinking like a non-independently-wealthy person...
