
Maximum PC

 Post subject: Graphics Card Myths Debunked
PostPosted: Mon Feb 10, 2014 9:18 am 
Smithfield

Joined: Sun Jun 18, 2006 7:37 pm
Posts: 5469
Tom's Hardware posted an article about various graphics card myths floating around the internet and why they hold no weight. Mod note: I was gonna put this in Nuts and Bolts, but I think the gamers here would care more about it.

The tl;dr version:
Myth 1: Frame rate is the absolute indicator of performance
You also have to take into account something that's been getting a lot of attention recently: frame time variance. If your average frame rate is 60 FPS but it swings wildly between low and high extremes, that's a very different experience than a constant 60 FPS.
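If you want to see why the average hides so much, here's a quick Python sketch of how a frame time log gets summarized. This is my own illustration, not from the article, and the sample numbers are made up:

Code:
# Sketch: judging smoothness from frame times, not just average FPS.
# Hypothetical per-frame render times in milliseconds.
frame_times = [16.7, 16.5, 16.9, 45.0, 16.6, 16.8, 40.2, 16.7]

avg_fps = 1000 / (sum(frame_times) / len(frame_times))

# 99th percentile frame time: the slow frames you actually feel as stutter.
worst = sorted(frame_times)[int(len(frame_times) * 0.99)]

print(f"Average FPS: {avg_fps:.1f}")
print(f"99th percentile frame time: {worst:.1f} ms ({1000 / worst:.0f} FPS equivalent)")

Two captures can report the same average FPS while only one of them has those 40+ ms spikes, and the spikes are exactly what you feel.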

There was also talk about V-Sync: leave it off if you play twitch games competitively, if the graphics card has trouble rendering at or above the refresh rate (when the frame rate falls below it, you get stutter), if you have input lag issues, or if you're benchmarking.
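The stutter case is easy to see with a little arithmetic: with V-Sync on a 60Hz display, a frame that misses the ~16.7ms refresh window is held for a whole extra refresh. A toy sketch of that quantization (my numbers, not the article's):

Code:
# Sketch: with V-Sync on a 60 Hz display, presentation snaps to ~16.7 ms steps.
import math

refresh_ms = 1000 / 60  # one refresh interval

for render_ms in (10.0, 16.0, 17.0, 25.0):
    # A finished frame waits for the next refresh before being shown.
    displayed_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
    print(f"render {render_ms:4.1f} ms -> shown every {displayed_ms:.1f} ms "
          f"({1000 / displayed_ms:.0f} FPS)")

Note the cliff: 16ms of render time gets you 60 FPS, but 17ms drops you straight to 30.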

Myth 2: Graphics cards affect input lag
Sort of, by way of frame rates. Otherwise there's a laundry list of other things that affect input lag. Notably though, some things made no difference: using a USB versus a PS/2 peripheral, using a wired versus a wireless network, and using SLI/CrossFire.
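To put the GPU's share in perspective, here's a rough back-of-the-envelope lag budget. Every figure below is a ballpark assumption on my part, not a measurement from the article:

Code:
# Sketch: rough end-to-end input lag budget (all values are ballpark guesses).
budget_ms = {
    "USB polling (1000 Hz mouse)": 1.0,
    "Game simulation (one tick)": 8.0,
    "GPU render (60 FPS frame time)": 16.7,
    "Display processing + pixel response": 10.0,
}

for stage, ms in budget_ms.items():
    print(f"{stage:38s} {ms:5.1f} ms")
print(f"{'Total':38s} {sum(budget_ms.values()):5.1f} ms")

The GPU's line item is just the frame time, which is why graphics cards only matter for lag through the frame rate.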

Myth 3: Graphics cards with more memory are faster
Some of us should see this and laugh, because we've seen cases where lower-end cards ship with more memory than higher-end cards, yet the higher-end ones obviously win out. The only time more memory helps greatly is if you're playing at higher resolutions, using high-quality textures, or using antialiasing methods other than FXAA, MLAA, or their derivatives (i.e., post-processing antialiasing). Of note, if you're not using MSAA, FSAA, CSAA, or the like, then even a supposed heavy hitter like Skyrim with the High Resolution Texture Pack only uses about 1.2GB of VRAM at 4K. Hitman: Absolution was the heaviest they tested, at 1.7GB at 4K. So basically, if you're using a 1080p display, 2GB of VRAM is plenty for most of the games out there.
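The reason MSAA-style methods chew through memory while FXAA/MLAA barely register is that multisampling stores extra color and depth samples per pixel, while post-processing AA just filters the finished image. A rough sketch of the framebuffer math (render targets only; textures and geometry come on top of this, and real drivers allocate differently):

Code:
# Sketch: framebuffer memory for color + depth, with and without MSAA.
# Assumes 4 bytes/sample color (RGBA8) + 4 bytes/sample depth/stencil.
BYTES_PER_SAMPLE = 4 + 4

def framebuffer_mib(width, height, samples):
    return width * height * samples * BYTES_PER_SAMPLE / 2**20

for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for samples in (1, 4, 8):
        aa = "no MSAA" if samples == 1 else f"{samples}x MSAA"
        print(f"{label} {aa:7s}: {framebuffer_mib(w, h, samples):6.1f} MiB")

At 1080p even 8x MSAA only needs on the order of a hundred MiB of render targets, which is why 2GB goes such a long way there.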

They also did some Windows GUI tests and found that, at 1080p, Windows 7 Basic uses 99MB, Aero uses ~140MB, and Windows 8.1 uses ~200MB. At 4K this shoots up to ~330MB for Aero and ~410MB for 8.1, with Basic barely budging.

Temperatures, fan noise, and overclocking
This wasn't really a myth they were trying to debunk, more a set of data points they gathered.

However, I do know that temperature by itself does not affect the performance of a part until it starts hitting thermal limits. If you're not overclocking, then a GPU running at full speed at 30C isn't going to perform any better than a GPU running at full speed at 50C. But if the GPU is constantly hitting its thermal limit at, say, 90C, then yes, it'll start throttling.
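Put another way: below the thermal limit the clock curve is flat, and it only bends once you hit the limit. A toy model with made-up numbers, assuming a 90C limit and a 1000MHz boost clock:

Code:
# Sketch: toy model of GPU clock speed vs. temperature.
BOOST_MHZ = 1000
THERMAL_LIMIT_C = 90

def effective_clock(temp_c):
    if temp_c < THERMAL_LIMIT_C:
        return BOOST_MHZ  # 30C and 50C perform identically
    # Hypothetical throttle curve: shed clocks to pull temperature back down.
    return max(BOOST_MHZ - (temp_c - THERMAL_LIMIT_C + 1) * 50, 300)

for temp in (30, 50, 89, 90, 95):
    print(f"{temp}C -> {effective_clock(temp)} MHz")

Real boost algorithms (GPU Boost, PowerTune) are fancier than this, but the shape is the same: flat until the limit, then downhill.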



