AMD Mantle Interview with Oxide Games' Dan Baker

26 Comments
methuselah

Doesn't Mantle only apply to AMD/ATI cards?

Or can Nvidia owners take advantage of this as well?


Zee530

It can basically be programmed for any card.


wumpus

If that were really true, I'd expect some sort of schedule for AMD VLIW4/VLIW5 GPUs/ALUs (basically everything except the latest cards).

On the other hand, this seems to have been designed more to aid AMD CPUs, especially in any situation that might be CPU limited (which seems less likely on older cards).


The Mac

No, anything prior to GCN is not hardware compatible with Mantle; there's too much fixed-function hardware.

One of the basic requirements is dynamically allocatable shader units.

That doesn't exist prior to the 7xxx series.


bwat47

Nvidia would have to add support in their driver, though, and it could potentially require hardware changes as well (similar to how new versions of DirectX sometimes require updated graphics hardware for certain new features). I recall reading somewhere, though, that Nvidia's Kepler should theoretically support the hardware features Mantle requires (I think it was from an interview with BF4's rendering architect).


Apoc1138

The one question I would have liked to see answered is: why is the deferred contexts option disabled by default, when enabling it gives Nvidia users a boost similar to Mantle's?

I have both AMD and Nvidia rigs, and my Nvidia rig actually gets higher performance with DC enabled than my AMD rig gets with Mantle enabled. Go figure.


LatiosXT

I found slides from GDC '13 that describe DC. And guess who made them? NVIDIA: https://developer.nvidia.com/sites/default/files/akamai/gamedev/docs/GDC_2013_DUDASH_DeferredContexts.pdf

According to them, it saw a ~50% performance increase in Civ 5 and a ~24% increase in Assassin's Creed 3, on a Core i7-930 ("2.93GHz Nehalem"). By the way, while the presenter was NVIDIA-affiliated, the slides read as vendor-neutral, particularly the one that says to test these methods on both NVIDIA and AMD.

The presenter also talked a lot about "threading", which makes me think AMD wants to turn their GPUs more into GPGPUs. I guess that's cool and all, but what's stopping Realtek, say, from implementing proper headphone virtualization as a standard feature on their chipsets?


Apoc1138

Yes, it was "invented" as such by NVIDIA; however, it is a standard part of DirectX. AMD cards get some benefit from it and would get more if AMD added full support to their drivers.

Don't get me wrong, Mantle has much more long-term potential (though that isn't reflected in the implementations we've seen so far), but it puzzles me why a developer would obviously spend a fair amount of time implementing DX optimizations that clearly work, then turn them off and not document them... The cynic in me thinks Oxide really wanted to "showcase" Mantle at any cost. Neither my CPU cores nor my GPU run at 100% in the demo, so I'm struggling to work out what the bottleneck is. Their explanation that DX multithreads poorly doesn't really stack up, as all my cores see some activity but none of them reach 100%.

I'm also puzzled as to why their motion blur, which doesn't even look that good, causes a 50-70% reduction in frame rate.


LatiosXT

I'm wondering how you enabled DC, since all my searches point to DC being something that must be implemented in the game engine itself, not a switch that you can toggle.

Also more ammo against Mantle because why not? http://www.overclock.net/t/1463351/steam-star-swarm-benchmark/130


Apoc1138

Go into the "assets" folder in the Star Swarm demo folder, open up Settings_Extreme.ini, and change "deferred contexts=0" to "=1".

Run.

Enjoy higher FPS.

This is what I mean: Star Swarm HAS been coded to take advantage of deferred contexts, but it's disabled by default and you have to hack an ini file to turn it back on.

E.g. I get 35fps on my AMD machine, 50fps with Mantle turned on.
On my Nvidia machine (same CPU) I get 30fps without DC and 55fps with DC, so a bigger gain percentage-wise and a higher average FPS on NV.

Interestingly, BF3 favoured Nvidia hardware because of DC, but DICE removed DC from BF4 because they claim they saw LOWER performance with it. However, every other game using DC gets a big boost on Nvidia hardware.
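[Ed. note: to spell out the tweak described above, this is the change in Settings_Extreme.ini. The key name is as quoted in the comment; we haven't verified it against other versions of the demo.]

```ini
; StarSwarm\assets\Settings_Extreme.ini
; DirectX 11 deferred contexts: 0 = off (the shipping default), 1 = on
deferred contexts=1
```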


Damnlogin

This is a big step in the right direction. I hope it catches on...FAST!


yammerpickle2

Mantle may not be the greatest thing since sliced bread, but at least AMD is trying to more effectively leverage the power of modern GPUs. MS has been busy trying to make money on the Xbox, and DirectX is very dated. With 4K displays and VR, however, we need a revamped modern API optimized for modern hardware and new display technology.


John Pombrio

And yet 35% of computers are still running WinXP, and 10% of Steam users are running Intel integrated graphics. Throwing another API at these old machines is not going to affect a whole lot of people. We are the exception, not the rule.


vrmlbasic

...over 11% of Steam users are using Intel graphics.

Those GPUs can't game, plain and simple. Sandy Bridge's iGPU has issues playing even indie 2D games. Nothing can save those Intel-hamstrung machines, so the gaming community should just ignore them.

XP users might be able to get something out of Mantle, if AMD still supports XP with Hawaii, as it could (in addition to the performance boosts) give them DX11-class features despite the fact that M$ won't let them use anything > DX9.


LatiosXT

>Those GPUs can't game, plain and simple

I ran Mirror's Edge, albeit at the lowest settings, on the iGPU in an Ivy Bridge-based Pentium, which is worse than what's in the i5/i7 parts.

You can play games. You just can't play them to the standard PC gamers want.

And OpenGL also skirts the whole DX11-features-on-NT6 issue. But yeah, nobody really does OpenGL these days.


xRadeon

I could see it becoming mainstream if others can support it side by side with the other APIs as easily as this guy says.

As for hardware vendors, I don't think Nvidia would jump on it any time soon, but I could see Intel doing so.

Anyhoo, I've tested it and am pretty impressed with the results. So I'm game, so long as more future games support it! :D


maleficarus™

I have a really interesting story to tell about this Mantle hype. Today I was out buying an entry-level card for my girlfriend's son. He has a dual-core Pentium E5800 3.2GHz with onboard Intel graphics. I already knew what to buy, but my struggle was a budget of 65.00. I did my research and narrowed the choices down to either an Asus GT630 DDR3 or a Sapphire Radeon 7730 GDDR5. So anyway, I walked into the PC store and told them I wanted the 7730. The kid behind the desk asked me if there was any reason I wanted that card. I replied that the card was going to be paired with a dual core and a low-resolution monitor. He came back at me saying this card would be so much faster than NVIDIA because of Mantle.

Anyhow, I said: on what game? He gave me the deer-in-the-headlights look. Anyway, I walked out with the 7730 GDDR5 card for 65 bucks and said to my GF in the car that he had no clue how much I know about hardware LOL.

My point to all of this is that PC store guys push certain cards on potentially false hope, not on what would actually, reliably, help performance. I picked the Radeon 7730 not because of Mantle, but because it runs on GDDR5 rather than slower DDR3. What I hate about what AMD is doing now is adding another layer of confusion, this time in the form of the term Mantle.


John Pombrio

Hype is a wonderful thing for selling things. Astronomy and Sky & Telescope magazines ran HUGE ads for binoculars and scopes to see the wondrous comet ISON! "Comet of the century?" was on the cover. My, how that fizzled out: I was still getting stories and ads like that in the mags AFTER the comet had already been completely broken up by its close passage to the sun.


vrmlbasic

At least you made the right choice, and for some of the right reasons.

Though a dual-core Pentium?! That poor child had better hope that Mantle comes through, as otherwise he's screwed. There's no way that sad dual-core chip and its ancient accoutrements can sate that GPU. Even with Mantle performing to AMD's promises it still might not, as that ancient chip has a mere 800MHz FSB. :)


maleficarus™

This is a tough one for me; my brain is pulling in two different directions at once. On one hand, this could be a really good thing that AMD is trying to do for PC gaming. On the other hand, this could be AMD's attempt to gain an edge over NVIDIA in performance by releasing an API that is inherently going to run better on GCN-capable hardware.

Then there's the other side of the coin: hasn't this been done before, with Glide? I remember Glide and how 3dfx cards ran the Unreal Tournament games so much better than OpenGL. I remember Tribes, and the only way to play it was with a Glide-capable card. That gave 3dfx the edge over ATI and NVIDIA. At the time, Intel could not have cared less, as they were too focused on the Pentium brand. So, having said this, Mantle just seems to me like history repeating.

I am shockingly surprised that the guys here at MPC haven't even made that connection yet. If it looks like a duck and it acts like a duck, is it a duck?


The Mac

AMD itself has said that, in the end, they don't care whether Mantle gets adopted, as long as something "like it" is adopted.

They are trying to push the industry: for their own bottom line, of course, but for everyone else's as well.

There isn't a connection to make.


John Pombrio

If Mantle is to be adopted in any meaningful sense, it will have to be widely adopted. It will have to support multiple AMD graphics cards, including older cards, Intel integrated graphics, Nvidia, and AMD integrated graphics. Since AMD does not even make the list of the top 10 graphics cards used by Steam users, and Intel holds the top two slots, that bodes ill for wide adoption.
http://store.steampowered.com/hwsurvey/videocard/

The other thing is that the API will only show large performance gains on computers that are on the ragged edge of obsolescence. With more powerful GPUs, it simply will not make much of a difference.

The last thing is: how many games will be able to take advantage of Mantle? Like PhysX, is it going to be a niche product?


The Mac

It will NEVER support older cards, as the functions necessary for it to perform better than DX are fixed-function prior to GCN.

An example AMD itself used is dynamically allocating shaders to render units. That's not possible on older cards.

It depends on your definition of obsolescence, and of gains.

On my 2500K OCed to 4.5GHz, with a 290 pro, I get 100% gains in Star Swarm and close to 20% in BF4.

The whole thing is less about getting wholesale adoption of Mantle and more about lighting a fire under the industry's collective butt to better leverage the massively underutilized GPU power of today's hardware.

If in the meantime it means I get better performance out of my new GPU, that's fine with me too.


vrmlbasic

I don't know who is more to blame: Intel, for passing off its shoddy iGPU as capable; GPU makers, for not reminding computer makers that even their weakest GPUs beat Intel's best iGPUs; or the computer makers, for settling for Intel's horrid excuse for a GPU, effectively hamstringing thousands of computers and at least 11.13% of Steam users as of January 2014.

"The other thing is that the API will only show large performance gains on computers that are on the ragged edge of obsolescence. With more powerful GPUs, there simply will not make much of a difference."
^ proof that AMD really needs to do a better job of getting the word out as to what Mantle is.


AFDozerman

Only 3-4,000 lines of code is actually pretty surprising, considering this is a completely new API that is lower-level than most, depending on just how verbose it is. Come to think of it, what does Mantle even look like as an API? Has documentation even been released?


The Mac

Not yet.

Only developers have the SDK.

AMD is supposedly going to release documentation this year.