Nvidia GTX 750 Ti Benchmarks

26 Comments

Ninjawithagun

The single biggest disappointment for me with this first iteration is that it isn't a single-slot card. Nvidia could have easily made a reference cooler that spanned the length of the card instead of that huge 'hump' cooler protruding perpendicular to the card. A lot of folks have small-form-factor builds that only allow for a single-slot card. I'm not buying one until either Nvidia or a third-party manufacturer releases a true single-slot GTX 750 Ti.

Obsidian

Too bad they didn't put SLI connectors on this card. At $150 it could be a really interesting way to see what 2, 3 or 4 of these could do with game benchmarks and at what resolution. Even with 4 of them equating to $600 and milking the motherboard for 240 watts (maybe too much) it's still less expensive than a single Titan or 780 Ti Black.

vrmlbasic

Seems to me that you have determined why Nvidia didn't put those SLI connectors on. ;)

I'd still be interested, given the "diminishing returns" of multiple cards on the PCIe bus as each one beyond 2 drops the bandwidth for them all, IIRC. As these are single-slot (I can hardly believe it) I would be very interested to see 4 GPUs in the space of 2.

joshnorem

They are dual-slot cards.

vrmlbasic

Thanks for that correction.

jason2393

Doesn't less power usually equal less heat? I approve of that, since my current computer rivals my vacuum for noise levels... 13 fans in total in my current rig.

Bullwinkle J Moose

Yes, less power usually equals less heat.

Even a 100-watt liquid-cooled CPU generates the same amount of heat as a 100-watt fan-cooled CPU, but the liquid-cooled CPU sheds the heat more efficiently.

Voltage squared divided by resistance equals watts (P = V^2/R), so a 40-watt CPU with low-resistance copper interconnects produces as much heat as a 40-watt light bulb with tungsten conductors.

Lowering the resistance of the interconnects will not give you less heat if the part still dissipates 40 watts, as far as I am aware.

I was going to say 'at least until room-temperature superconducting CPUs are available,' but if 40 watts is used somewhere along the line, then it is still 40 watts of heat, regardless of how efficiently that power is delivered to the resistive components that consume it.
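
To put the arithmetic in one place, here's a minimal sketch of that reasoning; the 1.0 V and 0.025-ohm figures are made-up placeholders, not specs for any real part:

# P = V^2 / R for a purely resistive load; every watt drawn ends up as heat.
def power_watts(voltage_v, resistance_ohm):
    return voltage_v ** 2 / resistance_ohm

# Hypothetical numbers: a 1.0 V rail across an effective 0.025-ohm load
print(power_watts(1.0, 0.025))  # 40.0 -- and 40 W drawn means 40 W of heat, copper or tungsten alike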

vrmlbasic

Wouldn't a superconducting CPU still generate a fair amount of heat? While the "leads" between the components would be lossless, I'm thinking that the very nature of the "semiconductors" used would cause heat generation.

Though wasn't there an article a few weeks back about some techno-wizardry that promised to make transistors ~100x more efficient or so?

LatiosXT

P = V^2/R applies if the electrical power is being dissipated directly through a resistance.

P = C * V^2 * (a * f) is the equation you're actually supposed to use for a chip's dynamic switching power (http://software.intel.com/en-us/blogs/2009/08/25/why-p-scales-as-cv2f-is-so-obvious-pt-2-2).
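
For illustration, here's a minimal Python sketch of that dynamic-power equation; the capacitance, voltage, activity factor, and clock values below are made-up placeholders, not measured figures for any real chip:

def dynamic_power(capacitance_f, voltage_v, activity, frequency_hz):
    # Switching power in watts: P = C * V^2 * a * f
    return capacitance_f * voltage_v ** 2 * activity * frequency_hz

# Hypothetical numbers: 1 nF switched capacitance, 1.1 V, 30% activity, 1.02 GHz clock
print(dynamic_power(1e-9, 1.1, 0.3, 1.02e9))  # roughly 0.37 W for this block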

joshnorem

Usually, yes. This card runs at 63-67 C under load so it's pretty cool.

Innomasta

This tech is the nail in the coffin for AMD's non-APU mobile GPUs. Ultrabooks and thin gaming laptops everywhere will benefit greatly from this efficiency, and I can't think of an OEM anywhere who wouldn't want this in their laptops.

vrmlbasic

Not to worry, Ultrabooks will still continue to be saddled with the woefully underpowered Intel iGPUs :(

If Nvidia could spare Intel-based laptops from the "birth defect" of only having an Intel iGPU, then more power to them. Whether it be the HD 2000, 3000, 4000, 4600, or even the oddly-unavailable Iris iGPU, one trait remains constant: they are abysmal for gaming.

LatiosXT

Looking at other review websites that go into the nitty-gritty of the architecture, this makes me worried about AMD. AMD's Rx series appears to be just a performance-at-any-cost approach with GCN, while NVIDIA did some architectural restructuring to make things more efficient.

If the flagship Maxwell keeps this power efficiency, not only will it look much better on all fronts, but this means NVIDIA can also ship higher-tier GPUs that, owing to a possibly better thermal headroom, continue to dominate the single-GPU card market.

What's kind of funny is NVIDIA is taking Intel's approach with this: design a chip meant to scale well for mobile applications and avoid sharing resources.

vrmlbasic

Piledriver taught us that Revision Two of AMD's architecture is where they rework things for greater efficiency. Steamroller taught us that R3 is where they put in some features that should have been present since R1 :(

What is tragic is that the potential brilliance of AMD's resource-sharing CPU architecture hasn't been realized "en masse" as Microsoft hasn't done any improvements past its "hotfix" and the widely-used Intel compiler still hasn't (for obvious reasons) been made to optimize code for Bulldozer/Piledriver/Steamroller.

Bulldozer is the "Sega Dreamcast" of CPUs: it was before its time. Today's software just doesn't understand its brilliance. ;)

LatiosXT

Er. To take from Anandtech:

"They’re wasting space and power if not fed, the crossbar to connect all of them is not particularly cheap on a power or area basis, and there is additional scheduling overhead from having to coordinate the actions of those warp schedulers"

And if you're going to blame Microsoft, blame the Linux Foundation as well. Benchmarks for AMD's parts are as consistent on Linux as they are on Windows, and I'm sure the Linux Foundation doesn't use Intel's compiler (which isn't the only x86 compiler in the world anyway).

vrmlbasic

...you're applying Anandtech's casual comments about shared resources in the context of the architecture of an Nvidia GPU to the architecture of an AMD CPU?!

Point remains that Windows doesn't "understand" Bulldozer, nor does most software. Windows was quickly adapted to "understand" hyperthreading.

Considering that MPC just posted an article stating that Google has only now gotten around to employing multithreading in Chrome (in a way that seems shockingly obvious), I can only come to the conclusion that Bulldozer, while far from perfect, was born into a world not yet ready for it, a world that hasn't really accepted threading.

'tis the Sega Dreamcast of CPUs, Bulldozer.

LatiosXT

*facepalm*

The article simply stated that the compilation task was offloaded onto another thread. That doesn't mean Google suddenly embraced the "wonders" of Bulldozer; it just means Google finally realized it could move a task onto another thread.

Multi-threading also isn't something that can happen overnight. Do you understand the concepts of multi-threaded programming? The dangers? The pitfalls? Then you'd understand why a lot of programs we use don't implement it by default unless there's a really good reason to do so.

Besides, the fundamental flaw with Bulldozer is that each integer core was neutered from Phenom II. Phenom II had three AGUs and ALUs per integer core, Bulldozer only has two. Even if you fed tasks correctly, the throughput for each integer unit is theoretically lower.

Either way, even when programs and Windows were patched to correctly use Hyperthreading, it only offered gains similar to AMD's hotfix, or made programs perform worse.

vrmlbasic

The irony here is that you're accusing me of lacking reading comprehension when it is you who lack it. Or you're just creating a sort of straw man, which wouldn't be as humorous and would be disingenuous.

Programmers are finally embracing threading, eschewing the "threading is evil" paradigm that was prevalent for so long, but the going is slow. Software has always lagged behind hardware, and AMD put too much faith in developers rising to the occasion. Ahead of its time, Bulldozer/Piledriver/Steamroller/Excavator is.

MaximumPC's benchmarks over the years have shown that Hyperthreading, now that Windows and software are made for it, can offer gains far in excess of what the Windows hotfix offers. Seriously.

Innomasta

Because mobile is the future :P

AFDozerman

One thing that I think some people are missing is that, even as low-power as it is, it's as efficient as GCN in LTC mining in terms of hashes per watt. This could have some serious implications for AMD's current crypto-fueled selling spree.

vrmlbasic

So...no gamers will be able to get GPUs as both companies will have their GPUs marked up in price & out of stock due to the miners?

If ASIC makers don't make a product for hashing these newer cryptocurrencies then we might have to turn "GPU" into "HPU", "hash processing unit" (unfortunately Cryptocoin processing unit wouldn't work lol).

wolfing

Hey, this is my price range for video cards. Wonder if I should replace my Zotac GTX 560 1GB with this.

bjoswald

I wouldn't. Not yet, anyway.

Neufeldt2002

Yes, yes you should.

The Mac

Maxell, huh? Is it powered by old '80s blank tapes?

lol

Rift2

I remember those. Good to see this finally out...

I'll wait for the GTX 800 cards.