WHQL-Certified GTX 680 Drivers Limit Sandy Bridge-E To PCI-E 2.0 Speeds

19 Comments

biggiebob12345

Nothing maxes out PCI-e 2.0 x16, so why would I care whether or not the 680 supports 3.0?
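
For reference, a rough back-of-the-envelope sketch of the theoretical one-way bandwidth of each link (theory only; real-world throughput is lower after protocol overhead):

    # Theoretical one-way PCIe bandwidth (sketch, not benchmark data).
    # PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (80% efficient).
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding (~98.5% efficient).
    def pcie_gbs(gt_per_s, payload_bits, total_bits, lanes):
        bits_per_s = gt_per_s * 1e9 * payload_bits / total_bits * lanes
        return bits_per_s / 8 / 1e9  # bits -> bytes -> GB/s

    print("PCIe 2.0 x16: %.1f GB/s" % pcie_gbs(5, 8, 10, 16))     # 8.0
    print("PCIe 3.0 x16: %.2f GB/s" % pcie_gbs(8, 128, 130, 16))  # 15.75

So 3.0 roughly doubles the x16 ceiling; the question is whether any single card actually fills the old one.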

JohnP

Are there any benchmarks showing any benefit to using PCIe 3.0 on this latest set of graphics cards?
EDIT: Mikey in this post found that there is indeed NO difference between the two (at least for the AMD 7970).

davidthemaster30

I prefer AnandTech's article:

http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa

It tests PCIe 3.0 at x2, x4, x8, and x16 lane widths, which also shows the difference between PCIe 2.0 and PCIe 3.0!

JohnP

Heh, it still shows no improvement, but thank you for the link...

EthicSlave

I believe 0.2-2 fps is all the difference when you push the 30 fps limitations of certain games. Sure, at 120 fps it makes no difference, and you also can't see much past 60-70 fps. However, you can notice a difference between 60 and 120 fps; that's because more frames are syncing with your eyes' actual refresh rate... What's the big deal, you say? Well, at 30 frames and below your eyes will start to notice stuttering and flickering. Is that extra 1 fps enough to stop that? Yes, by all means it is!
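
To put rough numbers on that claim (a quick frame-time sketch, assuming perfectly even frame pacing):

    # Frame time in milliseconds at a given frame rate (even pacing assumed).
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (30, 31, 60, 120):
        print("%3d fps -> %5.2f ms/frame" % (fps, frame_time_ms(fps)))
    # 30 fps -> 33.33 ms, 31 fps -> 32.26 ms: about 1 ms saved per frame.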

Baer

This is the first generation I am not upgrading to in quite some time. The reason is SKYRIM (and a few other new games). SKYRIM does not support widescreen gaming and is a console port, so my present pair of GTX 580s handles it with maxed-out settings just fine. Since the newer games seem to be dumbed down some and are easily handled by the present generation of GPUs, why upgrade as often?

SilverSurferNHS

News Flash: Skyrim is only slightly more GPU-heavy than FONV. Also, UWS is supported - by the community. If you're gonna bitch about "console port" then you obviously belong there if you can't get it to run properly.

Troll somewhere else; flaming games people enjoy makes you a dick.

Baer

Idiot, my point was: why should I spend $1400 on a new pair of top-end GPUs when my favorite games do not even tax my present setup? Now if SKYRIM etc. had supported Surround or Eyefinity and taxed my GPUs, then I would have been on the pre-order list.
That's it, that's all, no other meaning (dolt!). You might also write more and text less to improve your grammar.

aferrara50

Run at 7680x1600 or 7680x3200 resolution and come back and say you don't need more GPU power. Skyrim works perfectly fine on both my Surround and Eyefinity setups, with both 3 and 6 monitors. I have no idea what you're talking about.

SilverSurferNHS

Thank you - I gave up tryin' ta tell 'em.

SilverSurferNHS

I'm telling you, it does run in Eyefinity and Surround. There are community fixes for the common problems.

Baer

HaYDeN has stopped providing fixes, and the last patch by Bethesda broke Helifax's latest fix, although he has promised a new one today. The next patch from Bethesda will probably again break whatever fix the WSG community comes up with.
The key point is: why can't Bethesda just include the menu and map fix in its own code? They are obviously pandering to their main volume audience, which is console gamers who do not need WSG compatibility. So, if they are mainly developing games for consoles, and those games do not really need powerful hardware, why should we buy that powerful and expensive hardware? I KNOW this is a concern of the GPU guys, and I am just pointing this out. High-end computer gaming has been one of the key drivers of computer technology for decades, and without that driver the economics of the thing say that progress will slow.
As the kids say, just saying.

mikeyfree

Take it easy, you sound like an AMD support forum megaposter. It didn't sound like he was "bitchin'", just stating his point of view. Back on topic.

Maybe a testbed of otherwise identical hardware/software would show whether the newer PCIe 3.0 is an improvement and how much of an improvement it is.

Edit: I've found a forum thread that's interesting.
http://www.overclock.net/t/1188376/hardwarecanucks-hd-7970-pci-e-3-0-vs-pci-e-2-0-comparison

JohnP

Mikey, thanks for the post. Just what I was wondering. Yep, no difference between PCIe 3.0 and PCIe 2.0 for the AMD 7970...
Perhaps when the dual-chip cards come out...

Gezzer

I don't even know if the dual-GPU cards would need PCIe 3.0.
I'm running an 875K on an MSI P55-GD65 and will be setting up a 3-monitor setup (3x HP LP2475w 1920x1200), so I was really worried about using a PCI-E 2.0 motherboard with either a dual-GPU card or SLI/CrossFire, and I ran across a couple of videos.

http://www.youtube.com/watch?NR=1&feature=endscreen&v=NFMzRZqFh-w
http://www.youtube.com/watch?v=rSfifE2Domo

It seems that you can run a 6990 at x4 and get almost the same score in 3DMark. So I think 3.0 is pretty much more of a bullet-point feature than a must-have at this stage.
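
That squares with the theoretical numbers: even a narrow PCIe 2.0 link has headroom most GPU workloads never use (a sketch of one-way bandwidth by lane count, theory only):

    # Theoretical one-way PCIe 2.0 bandwidth by lane count (sketch).
    # 5 GT/s per lane, 8b/10b encoding -> 500 MB/s per lane.
    for lanes in (4, 8, 16):
        print("PCIe 2.0 x%-2d -> %.1f GB/s" % (lanes, 5e9 * 0.8 * lanes / 8 / 1e9))
    # x4 -> 2.0 GB/s, x8 -> 4.0 GB/s, x16 -> 8.0 GB/s

Even x4's 2 GB/s each way is apparently enough for a 6990 in 3DMark, per those videos.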

DU00

Wouldn't the GPU itself have to be designed to take advantage of PCIe 3.0? Or are the newer ones PCIe 3.0 compatible and I missed the news?
