The Most Graphically Demanding PC Games

91 Comments

vrmlbasic

Sounds as though you should jump to a current-gen AMD GPU so you can hop on the Mantle bandwagon, since you've already experienced being CPU-bound, and Mantle is supposed to save us from that.

...My 6870/6850 CrossFire setup can't really handle 2560x1440 as well as I'd like, and 1440p is still too low a resolution. My 23" 1080p is unbearable; it's like looking through a screen door. Also, I still play some old games where the greater the res, the more of the field I see (as opposed to seeing the field in more detail).


maleficarus™

Sure you don't work for AMD? All you ever go on about is Mantle this and Mantle that lol....


vrmlbasic

When is Crysis finally going to make an appearance on something like GOG so I can finally play it without having to worry about the oppressive, in-addition-to-Steam DRM in the Steam version?

:(

Richardbs

If you're gaming at 4K, you don't really need AA and whatnot. The res is so high you shouldn't really see any jagged edges, and the performance hit wouldn't justify the minimal graphical gain.


mynamewasstolen

I wonder where Bioshock Infinite would fall in this list.

maleficarus™

I get 80 FPS in Tomb Raider (2013) with TressFX off and no AA. I am using an OC'd GTX 760...

noobstix

I remember when Crysis was the ultimate benchmark for PCs. Then Metro 2033 came out and stomped CryEngine into the ground. Even now, it seems Metro doesn't fail to bog down decent hardware.

legionera

It seems that most of the interest is in trying to make these games run smoothly rather than in the games themselves.

vrmlbasic

It reveals the sorry state of PC ports that quite a few of these games were designed to run on the inferior consoles (and still look it on PC), yet can be considered "demanding" on PC, as they rely on raw hardware power to cover their developers' lazy coding.

It sounds as though some of these games might be made less demanding if they were Mantle-enabled, as Mantle promises to let the CPU better handle physics-heavy effects like exploding barrels, no?

rjwagner

Never thought I'd see a game that would leave the men and women at Maximum PC hard-pressed. I mean, I am still shocked to see how much it takes to run Metro 2033 at max settings. Just wow. Nvidia has some very talented people working for them. And I thought Crysis 3 was a difficult game to run on a PC at high settings while still getting decent FPS; Metro Last Light... just wow. But then again, different computers and configurations. Like "jimmthang" said: BTW, impressive list, Max PC.

automaticman

If you're going to count Ubersampling and other names for SSAA as reasons Witcher 2 and Tomb Raider get such low performance, then you need to turn it on for Battlefield 4 as well. They just call it resolution scale and gave it a slider that defaults to 100%, but it does the same thing as any other name for SSAA. It's not really accurate to turn it on for some games but leave it off for others.

Personally, I'd rather just leave it off for all the games to even the playing field a bit. It's such a drastic performance hit for something that is done so much more efficiently by other methods, I suspect very few people actually use it.

chriszele

Resolution scale was set to 100% in BF4; I recently double-checked the result by re-benchmarking it.

automaticman

But that's my point. Setting resolution scale to 100% is effectively "SSAA off". Increasing the resolution scale to 200% effectively renders the image at a higher resolution, then downsamples back to your native resolution. This is exactly what SSAA does; resolution scale at 200% should be the same as 4x SSAA (as it renders 4x as many pixels as standard).

The point I was originally trying to make was that, if you are going to turn on Ubersampling for Witcher and SSAA for Tomb Raider, then you should set Resolution Scale to 200% for BF4.

Again, though, my personal preference is just to leave it off for all three games.
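The arithmetic behind that claim is easy to sketch. This is just illustrative geometry, not Battlefield 4's actual renderer code (which may round scaled dimensions differently):

```python
# Scaling both axes by a factor s multiplies the pixel count by s**2,
# which is why a 200% resolution scale matches 4x SSAA's pixel load.
def rendered_pixels(width, height, scale):
    """Pixels rendered at a given resolution-scale factor (1.0 = 100%)."""
    return int(width * scale) * int(height * scale)

native = rendered_pixels(1920, 1080, 1.0)  # 2,073,600 pixels
scaled = rendered_pixels(1920, 1080, 2.0)  # 8,294,400 pixels
print(scaled / native)  # 4.0 -- four times the pixels, like 4x SSAA
```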


poee

Do any of these titles that use SSAA also offer MSAA (or FXAA or any other kind of edge-smoothing aside from super-sampling)?

automaticman

I'm not sure about the others, but I know Battlefield 4 offers independent controls for SSAA (called Resolution Scale in the game), deferred AA like MSAA, and post-process AA like FXAA. The SSAA option makes nice screenshots, but I doubt many people actually play with it, as it absolutely crushes performance (and as I pointed out in another comment, it was left turned off for this list).

The Mac

That's what RadeonPro or GeForce Experience are for.

Who cares if it's in-game?

Morete

Crysis 3.
I use an old modded HP Pavilion OEM: AMD Phenom II X4 830 @ 2.80 GHz, Nvidia 560 Ti o/c'd, 8 gigs of RAM, 10K RPM HD. I run the game with all settings maxed out on "Very High". Motion blur is set next to lowest, and MSAA at Medium (4X). There are options in the settings for SSAA and FXAA, but I never use them because MSAA offers the best eye candy on my pretty HP 1080p GLOSSY screen (yes, I did say glossy, for all you glossy haters). Frame rates are smooth: no jitter, tearing, or noticeable lag.

Tman450

No. Just no.

No, you do NOT. That is complete BS.


RUSENSITIVESWEETNESS

Yeah, I maintain 60 fps on my Etch-A-Sketch. I don't know what the fuck you guys are doing....

trog69

2600K @ 4.2GHz, and no added OC on the 680 FTW card, so basically the same. Thanks for showing me why I need to upgrade now, with a 780 at minimum, since I'm gaming at 1440p.

JosephColt

Since you're gaming at 1440p, you shouldn't need to worry about AA as much, so not really.

Ninjawithagun

...actually, trog69 is correct. He/she will need to upgrade not just to a GTX 780, but to at least two of them running in SLI. I play all my games at 2560x1440 and have a 3930K OC'd to 4.2GHz with two Gigabyte GTX 780 GHz Edition cards in SLI. With all graphics settings maxed in-game and dynamic vertical sync enabled, I can tell you now that both Crysis 3 and Metro Last Light (PhysX disabled, though) still drop below 30 FPS in certain parts of the game.

In Metro Last Light, I have only been able to play with PhysX enabled when running three GTX 780s (and only after several Nvidia driver updates, may I add). I am no longer running three of them, as there was just one game that needed a third card, and that was Metro Last Light. The heat and power consumption of the third GTX 780 wasn't worth it, so I removed it and put it into my second gaming system, which is now also running two GTX 780s in SLI. Metro Last Light is easily playable with PhysX enabled at 1920x1080 with two GTX 780s, as is every other game listed here. Remember that 2560x1440 has about 78% more pixels to process vs. 1920x1080 ;-)
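That last pixel-count comparison is plain arithmetic and easy to verify; it comes out to roughly 78% more pixels:

```python
# Pixel counts of 2560x1440 (QHD) versus 1920x1080 (full HD).
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
extra = (qhd / fhd - 1) * 100
print(f"1440p has {extra:.0f}% more pixels than 1080p")  # about 78%
```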

JosephColt

You dropped to 30 FPS in Crysis 3 at 1440p?

I played Crysis 3 at 1440p with 2x 670s not that long ago and never dropped below 60 FPS unless there was an overkill of explosions. I wasn't using AA, though, because at 1440p AA becomes a lot less noticeable.

For most gaming, his 680 will be fine at 1440p until the Nvidia 800 series releases, unless he really wants the absolute max possible graphics.

maleficarus™

That begs the question: why play at such a high resolution if you are struggling with your frame rates even with SLI?


The Mac

Native resolution.

Scaling impacts visual quality.

Chizmad

Damn, I must have a good rig if I played Last Light on max settings with 4X SSAA and got mostly 50-60 FPS.

I would have thought Crysis 3 would have been more graphics-intensive.


Ninjawithagun

...but you didn't have the in-game physics turned on :P

Biceps

I would argue that Crysis 1 still produces low frame rates, just like GTA IV, because of relatively poor implementation, not because of all the eye candy. There are better-looking games that run at much higher frame rates, so the answer is simply that it was fairly poorly put together.

I really hope the PC version of GTA V is done better than GTA IV; I am holding onto my money until I see that it is.

Ninjawithagun

Technically, what you are speaking of is poor software coding that fails to optimize the game's execution for specific PC hardware, the operating system, and the graphics API (DirectX, graphics drivers). Console-ported games are at the top of the guilty list. A majority of the time, game designers are just lazy (or just sloppy) and release the PC version of the game as fast as possible, in as little time as possible.

Crysis is the exception. The Crysis engine was designed from the beginning NOT to be optimized for any particular hardware architecture (i.e., Nvidia vs. AMD); CryEngine was written to scale with raw hardware processing power instead. Thus, it is still very demanding to play at higher graphics settings, even on current-generation graphics cards. Step up the resolution and playable frame rates disappear. This is where SLI and CrossFire have to fill the gap if you really want to play the game at resolutions above 1920x1080. And don't even think 4K gaming with all graphics settings maxed out will be feasible without running at least two top-end graphics cards. For now, two or more GTX 780 Ti cards or two or more R9 290X cards will be your only 4K gaming solution. Now, take note before jumping off into making comments: I said ALL graphics settings maxed. That includes shadows, motion blur, and in-game physics engines :D

RUSENSITIVESWEETNESS

It won't be. Rockstar doesn't give a shit about the PC. Going back to GTA 3, their PC ports have all run like ass or had crippling 30 FPS caps. Yes, you could turn vsync off, but then half the scripting wouldn't work and the games would lock up at random.

jimmthang

We mentioned that GTA V is poorly optimized, but optimization happens at a granular level anyhow, and graphical beauty can be subjective. We primarily wanted to get down to brass tacks and see which games made our hardware struggle the most for this story.

Biceps

@jimmthang: you did mention it, and I thank you for it. Not trying to give you all a hard time, just pointing out that Crysis was in the same boat. A lot of people think Crysis is such a beast because it is so pretty (and it is a gorgeous game), when spending a bit more time fleshing out the engine could have mitigated a lot of the workload problems.


kiaghi7

"We mentioned that GTA V is poorly optimized"

MaxPC exclusive copy of GTA5 in house!

The Rockstar FBI and "Men in Black" will be here shortly to scrub the scene and erase all evidence!


jimmthang

You got me. Just don't tell anyone we've got early Steam codes of GTA5, mmkay? Pinky swear now! :P


RUSENSITIVESWEETNESS

It wasn't optimized. At. All.


vonalka

How about show the results with an AMD card as well?

X2brute

I agree, and a high-end Nvidia one as well. Seems like I got double the Tomb Raider FPS with a 2500K and an HD 6950, then went to Nvidia and my frames plummeted (only on that game, and only with TressFX on, though).


jimmthang

We may consider that. 

Thanks for the feedback.

- Jimmy

kandress86

You should consider it, purely because not everyone has an Intel and Nvidia combo.

I personally have an all-AMD rig. Not everyone can afford an Intel, or even likes Intel; e.g., me. I don't like Intel and their vicious upgrade cycle.

jimmthang

I hear ya. It's challenging to accommodate scenarios like this because there are so many different configurations out there: AMD CPU + Nvidia GPU, Intel CPU + Nvidia GPU, AMD CPU + AMD GPU, CrossFire, enthusiast builds, budget builds, etc.

However, because games are mostly GPU-intensive, I'll see about updating this article to include an AMD GPU.



vrmlbasic

If you put an R9 290 in there then I might just dislike those darned miners even more. :)