Nvidia Reportedly Prepping GTX Titan Ultra and Titan LE SKUs





I’m glad they are trying to push the envelope. With the next-generation consoles coming, game graphics should advance significantly, and with 4K set prices becoming reasonable (Tiger Direct raised the price on the Seiki 50” set to $1,499 from $1,299), high-end vid cards are going to have to push out new games at 3840 × 2160 or more, hopefully at 120 fps so you can have 3D at 60 fps per eye. Personally, I’m hoping to see the Oculus Rift team come out with a premium VR headset, and I’d like my next rig and vid cards to be able to push the pixels and polygons at those higher resolutions and frame rates.

Right now I can’t see any reason to upgrade from my 580 OC SLI setup, as nothing I do seems to tax that system much.



I'm with you: Stereoscopic 3D at 4K resolutions at a reliable 60 fps per eye with no microstutter... Hell Yeah!

***2D gaming: A Waste***
People go through life with two eyes and stereoscopic vision. It seems a shame to effectively not use both eyes while gaming. I have played games (way back) in black and white; lacking 3D support now seems equally crippling. If you have stereoscopic vision (roughly 7% of people do not), this is not, in my opinion, an exaggeration.

***Incidental benefits of 3D gaming***
- SLI scaling in 3D is roughly 95%. This means you effectively get a far bigger payoff from SLI when running in 3D: e.g. 60 fps per eye (120 fps total) with SLI vs. ~30 fps per eye (60 fps total) without.
- Better image quality--with 3D you're getting twice the information "piped" to your eyes: effectively double the pixels of 2D (since each eye receives a slightly different image). The result is image quality higher than you'd expect at a given resolution (e.g. I game sitting literally 1.5 ft. from a 27-in. 3D monitor with no image quality issues whatsoever).
- Better depth perception--you can easily and intuitively gauge distances, which has real value in first-person shooters (e.g. lobbing grenades or other projectiles that travel in an arc, intuitively knowing a firearm's spread at distance, or deftly jumping from small surface to small surface [it's quite easy to be acrobatic]).
- 3D effectively increases your monitor's size--you can sit far closer to a monitor when gaming in 3D (since you can adjust the convergence, allowing your eyes to focus far off in the distance). This effectively makes a 27-in. monitor fill more of your view than a 100-in. television(!). Eyestrain is not an issue.
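
For what it's worth, the per-eye numbers above work out like this (a quick sketch; the 95% SLI scaling figure is the poster's claim, not a measured value):

```python
# Illustrative arithmetic for stereoscopic 3D frame rates.
# In frame-sequential 3D, each eye effectively sees half of the total frames.

def per_eye_fps(total_fps):
    """Per-eye frame rate in frame-sequential stereoscopic 3D."""
    return total_fps / 2

single_gpu_total = 60     # one card pushing 60 fps total
sli_scaling = 0.95        # claimed SLI scaling in 3D (assumption from the post)
sli_total = single_gpu_total * (1 + sli_scaling)

print(per_eye_fps(single_gpu_total))  # 30.0 fps per eye on one GPU
print(per_eye_fps(sli_total))         # 58.5 fps per eye with SLI
```

So with near-ideal scaling, a second card roughly doubles the per-eye rate, which is why SLI matters so much more in 3D than in 2D.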

***True-3D, not Crap-3D***
By the way, if you think you've experienced effective 3D, you may be mistaken: many movies have used it ineffectively, making things look "flat" (two exceptions are Life of Pi and Avatar). Likewise, many games use it poorly (think Crysis 2 and 3, with their "pseudo-3D" crap). Additionally, Nvidia's default profile settings set 3D depth to only 15% (making things appear flat) and don't assign keyboard shortcuts for convergence (which adjusts how "large" things appear). Proper configuration can take a while to get down, but when things are set up correctly, the payoff is well worth it.

**3D--The closest to "being there"**
Stereoscopic 3D is the closest that gamers currently can get to "being there". It's like being on one side of a transparent window with a real, living world on the other side. It's fantastic.



I agree with you 100%. I have a 23-inch 120Hz screen, and when playing games that don't properly support 3D, I feel like the game is handicapped. Unfortunately, Nvidia's profiles rate some games' 3D Vision support as good or excellent when they have 3D artifacts that completely ruin the experience.

Crysis 2 uses a very sophisticated post-process that takes information from the game engine to drive a 2D-to-3D conversion from a single frame. This gives you 3D at almost no performance cost, which was great for the PS3 and lower-performing PCs.



Yeah, sure, FINALLY ATI has something that comes close(ish) to Nvidia... but it's still one GPU (Nvidia) vs. two GPUs (ATI). So it's evident Nvidia would still be king whether or not they introduce an upgraded Titan.

Still gonna wait for DX12...



Really? ATI? What is this, '05? You gonna complain about George Bush and his weather machine creating Katrina while you're at it?

Troll-bait rants aside, AMD is currently where it's at if you're a gamer. A single 7970 will push out more-than-smooth-enough framerates for 1080p gaming and actually beats the 580 in a lot of games, depending on drivers and game engine. Add the fact that AMD will play Santa Claus for anyone buying their graphics parts, and you have a recipe for success.

As for the Titan, it was never intended as a graphics part. It was designed with the Titan supercomputer in mind, because if Nvidia didn't design a higher-performing part, AMD was almost certain to get the deal thanks to its much superior GPGPU performance, which can be nearly five times faster on AMD hardware (see ExtremeTech's article on Bitcoin mining). The real quest for the fastest single card is played out in the realm of dual-GPU parts.

Also, the 7990 will almost certainly be faster than the Titan. Logic? 690 > Titan; 6970 < 580, and yet 6990 > 590; 7970 = 680 (give or take). Therefore, with some aggressive clocking and well-engineered cooling, the 7990 stands a huge chance of taking the prize.



While it is technically incorrect to refer to AMD video cards as ATI, most PC enthusiasts know ATI was bought by AMD. Referring to the cards as ATI has simply evolved as shorthand for AMD's video cards, as opposed to their CPUs.

If you're a gamer, you're most likely looking at the sub-$300 market, not the top-of-the-line GPUs. A 660 GTX non-Ti offers a superb amount of bang for your buck and has occasionally dropped to $185 on Amazon with rebates. AMD has a hard time beating the 660 GTX non-Ti at its normal price point of $210, let alone sub-$200.

And why are you comparing a 7970 to a 580 GTX? In case you haven't noticed, there's an entire new 600 GTX series, against which AMD had to re-release their cards as GHz Editions and introduce aggressive price cuts and Never Settle bundles to compete.

And Nvidia has traditionally held the edge in GPGPU performance, and only recently forfeited that advantage in its GeForce line because, realistically, most people buying these cards are gaming, not Bitcoin mining. Additionally, if you put down $200+ for any current-gen card, you'll get "more than smooth enough framerates for 1080p gaming". If you don't have a screen with better-than-1080p resolution (or multiple 1080p screens), you really shouldn't be spending $500+ on GPUs.

Additionally, the GK110 was supposedly slated to be the 680 GTX, and the GK104 was pegged as the 660 GTX Ti. That changed when AMD fell short of expectations.

As for dual-GPU cards, the 690 GTX won't get beaten all that badly by a reference-spec 7990, if at all. The Ares II required "aggressive clocking and well-engineered cooling" to beat the 690 GTX, and it's fair to say the reference 7990 won't have a liquid-cooling loop installed. Therefore, no 1100 MHz clock for the 7990.

And as for your logic, again, why are you using last-generation cards in your comparisons? We all know the 590 GTX was a half-assed card; that's why Nvidia released the 690 GTX after the 680 GTX instead of a 670 GTX. Additionally, it would be bad marketing for the Titan to beat the 690 GTX. The idea behind the Titan is to get 690 GTX-like performance on a SINGLE card, so you can double, triple, or quadruple SLI them.

As a last point, the 7990 SHOULD beat the Titan. No one expects it not to. If it can't, I have no clue what AMD is doing anymore.



Rumors are flying like hotcakes.


Seems I'll be getting a new card soon.


Dorito Bandit




Titan LE looks sexy. Hoping it comes in around $350 (or less).



Thank you!



Oops double article post.
