AMD Delays Battlefield 4 Mantle Patch Until January

18 Comments

maleficarus™

From what I read and what I know about PC gaming, this is basically GLIDE with a new name. You can put up all the smoke and mirrors you want, but at the end of the day it is supposed to improve AMD GPU performance by upwards of 20%. Or, to put it in easy terms, about 10 frames give or take (20% of a 50 FPS baseline).

What this really means, to anyone with even a little bit of PC know-how, is that anyone who is already past 60+ FPS on ultra in BF4 will see ZERO, and I repeat ZERO, difference in performance using Mantle. Yes, we all know there are the human-robot players who can magically see past 100 FPS, but all of us with an IQ higher than our shoe size know this to be utter bull anyway!

Having said that, for me at least Mantle means jack and shit. I have a GPU that gives me well over 90 FPS in BF4 as it is, so why would Mantle mean anything to me?

For everyone who has been following the PC industry since 3dfx was king, we all know Mantle is trying to be GLIDE, thus trying to give AMD an edge over NVIDIA. Now the way I look at all of this is this: NVIDIA has G-Sync, ShadowPlay, and game streaming with Shield, plus PhysX. AMD is trying with Mantle, and if history repeats like it always does, it will pass just like GLIDE did with 3dfx...

Commodore 64

60 fps is no different from 100 fps on a 60 Hz monitor, and for a non-panning image 60 fps is no different from 100 fps even on a 120 Hz monitor. But in some games we tend to have to look left and right quite a bit. Consider panning quickly for one second: how many frames would be best to see during that second? More than 60-120, I think. Personally I'd like 500+ fps in that instance if it were possible on my hardware.
Motion blur is a very poor substitute.
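
For what it's worth, a rough sketch of the panning math behind that point (the pan speed, FOV, and resolution below are assumed round numbers, not measurements from any game): the higher the frame rate, the smaller the jump the image makes between consecutive frames during a fast pan.

```c
/* Back-of-the-envelope numbers for the panning argument above.
 * All figures are illustrative assumptions, not measurements:
 * a 180 deg/s pan, a 90 deg horizontal FOV, a 1920-px-wide screen. */
#include <stdio.h>

int main(void) {
    const double pan_deg_per_sec = 180.0;     /* fast 1-second half-turn */
    const double px_per_deg = 1920.0 / 90.0;  /* screen width / FOV */
    const double fps[] = { 60.0, 120.0, 500.0 };

    for (int i = 0; i < 3; i++) {
        double px_jump = (pan_deg_per_sec / fps[i]) * px_per_deg;
        printf("%5.0f fps: image jumps %5.1f px between frames\n",
               fps[i], px_jump);
    }
    return 0;
}
```

Under those assumptions the image skips 64 px per frame at 60 fps but only about 8 px at 500 fps, which is the gap the commenter is pointing at.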

Xenite

Congratulations, you have zero clue about what you are talking about. Feel proud.

AFDozerman

No, this is an open API, originally proposed by EA, not AMD, that allows developers to target the architecture itself for deeper optimization than DX and OGL allow. Nvidia and Intel were both approached by EA, as was AMD. AMD was the only one that took it up, but anyone is welcome to join in.

It's not AMD's fault that Nvidia passed up a good thing.

maleficarus™

But the problem is, if it is not used by game makers, then it has about as much impact as EAX has for sound today, right? Let us assume that 25% of upcoming game titles use Mantle; that still leaves 75% without it. That sounds like PhysX to me, right? I hope it works well for them, I really do.

In all my PC gaming years I have never supported AMD. But today I finally did: I bought my family a Wii U Deluxe, and AMD supplied the GPU! So I guess I can say I am an AMD fan now too!

vrmlbasic

Comparing Mantle to PhysX isn't right as PhysX is designed to enhance some visual effects, not give us an order-of-magnitude performance increase in games.

Nvidia could make PhysX run on Intel iGPUs and AMD GPUs, and perhaps optimize it further for running on a CPU, but they elect not to. AMD can only make Mantle work on their own hardware.

maleficarus™

"order-of-magnitude performance increase in games"

LOL! That was funny, thank you for the laugh. You think up to 20%, plus or minus, is an order of magnitude? Jeez, do you work for AMD?

Why would NVIDIA want to allow AMD to use PhysX when NVIDIA paid for the technology? Would you buy a sports car and then let everyone else drive it on a regular basis?

vrmlbasic

AMD claimed that Mantle would ultimately increase the number of draw calls per second by 10 to 30+ times in games that use it. That is an order of magnitude, and at the top of that range "an order of magnitude" is underselling it.
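
To put that claim on a scale, a minimal sketch assuming a round-number baseline of 3,000 draw calls per frame at 60 fps (an illustrative figure, not a benchmark); a 10x multiplier is one order of magnitude by definition, and 30x is pushing toward a second:

```c
/* Illustrative arithmetic for AMD's claimed 10x-30x draw-call uplift.
 * The 3,000 calls/frame baseline is an assumed round number. */
#include <stdio.h>

int main(void) {
    const long baseline = 3000L * 60L;  /* calls/frame * fps = 180,000/s */

    for (int mult = 10; mult <= 30; mult += 10)
        printf("%2dx uplift: %9ld draw calls/s vs baseline %ld\n",
               mult, baseline * mult, baseline);
    return 0;
}
```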

We know that Nvidia won't let AMD use PhysX. We're pretty sure they haven't optimized it to run on a CPU either (kinda sketchy). We know that the game devs and AMD gave Nvidia the option to get on board with Mantle, and Nvidia gave them "the finger" as a reply.

RUSENSITIVESWEETNESS

You seem to confuse being a gamer with being a game designer. One requires a couch and a fat ass; the other requires an education.

The Mac

There are more than six game makers already on board, and more signing up every day.

As much as people hate EA, they are the biggest publisher out there, and Frostbite is going to power most of their next-gen games.

RUSENSITIVESWEETNESS

I got a spam filter for you.

RUSENSITIVESWEETNESS

Read up on the Hawaii conference, where Nixxes, the house responsible for the shitty port of Deus Ex: Human Revolution, blames DirectX for the game's lousy PC performance. I'd argue it has more to do with their lack of ability or budget.

Seriously, how many shops are able to produce games with frame rates above the teens? Micro-stutter is common in modern games (anything Gamebryo), but it's often resolved with a user-made mod written by someone more skilled or determined than the original programmers.

vrmlbasic

I still can't believe that the company that made that awful DX:HR port put their name to it proudly. I would have been ashamed to have published that.

Maybe they couldn't have gotten high-res textures, upped the model details so every character had Jensen-level facial detail, or removed all the unnecessary-on-PC mid-level load screens, but they could have at least given us FPS that didn't suck :(

Innomasta

It's good to see that they're devoting time and energy to doing the right thing. Shoulda been priority one from the get-go though :/

I'm confused about Mantle. Does this mean AMD cards will have a marked performance edge over comparably priced Nvidia cards in Mantle-optimized games? Like, will a 7870 match a 780 in in-game performance with the new API? Will Nvidia be able to utilize Mantle in some way?

USraging

My understanding of it is that it allows games to talk directly to AMD GPUs instead of going through a software layer that talks to the hardware. It also sounds like AMD is making the technology proprietary, but I don't know if they are going to license it out, or if Nvidia has plans to make their own version.

The Mac

It's not proprietary; it will be open source. The game houses have been asking for this for years due to the poor memory management, thread management, and draw-call budgets DirectX forces on them.

All those fancy bells and whistles on modern hardware go mostly unused due to the high abstraction overhead of DX. This is AMD's solution.

However, the initial SDK will obviously be made for GCN hardware, since it's AMD's baby.

Anyone can adapt it for their own hardware and OS if they wish.

Nvidia hates open standards, so it remains to be seen whether they will use it.

Basically it will be a wash for high-end hardware, but it will allow lower-end cards to perform at a much higher level.
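
A minimal conceptual sketch of that abstraction-overhead point, written as a made-up pseudo-driver in C (none of these names are the actual Mantle or DirectX API, and the draw count is an assumed round number): the thick path pays per-draw validation every frame, while the thin path records a command buffer once and replays it cheaply.

```c
/* Conceptual sketch of the overhead argument -- hypothetical pseudo-driver
 * code, NOT the real Mantle or DirectX API. The "thick" path re-validates
 * every draw each frame; the "thin" path validates once at record time. */
#include <stdio.h>

#define DRAWS_PER_FRAME 3000  /* assumed round number, not a benchmark */

static long validations;      /* counts simulated per-draw driver work */

static void validate_and_translate(int draw_id) {
    (void)draw_id;
    validations++;            /* stand-in for state checks + HW translation */
}

/* Thick, DX11-style path: CPU cost scales with draws x frames. */
static void frame_thick(void) {
    for (int d = 0; d < DRAWS_PER_FRAME; d++)
        validate_and_translate(d);
}

/* Thin, Mantle-style path: draws are validated once when the command
 * buffer is recorded; replaying it each frame costs almost nothing. */
static int recorded = 0;

static void frame_thin(void) {
    if (!recorded) {
        for (int d = 0; d < DRAWS_PER_FRAME; d++)
            validate_and_translate(d);
        recorded = 1;
    }
    /* submit prebuilt command buffer: near-constant CPU cost */
}

int main(void) {
    validations = 0;
    for (int f = 0; f < 60; f++) frame_thick();
    printf("thick path: %ld validations for 60 frames\n", validations);

    validations = 0;
    for (int f = 0; f < 60; f++) frame_thin();
    printf("thin path:  %ld validations for 60 frames\n", validations);
    return 0;
}
```

Under these assumptions the thick path does 180,000 validations per second of frames versus 3,000 once for the thin path, which is the kind of CPU headroom the comment says lower-end cards would gain.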

trog69

I have been waiting for the hardware manufacturers and game developers to find ways to let mid-level PC owners enjoy the same graphics as those of us with high-end gear. I remember when I had to play with most of the eye candy turned down, and I didn't like it. Unlike most others, though, I am very fortunate in being able to afford my gear, and I hope this is the start of getting everyone on board.

maleficarus™

They have been doing this for years already! Matter of fact, right now I am using a GTX 760, which is considered a mid-range GPU, and I can play everything at high quality. The only difference between high end and mid-range is the max resolution you can play at. So your point is?

The real issue is that a lot of PC gamers don't really understand what to buy in the first place. From what I have seen, a lot are fooled by marketing or follow the masses, who are usually misinformed in the first place.