AMD: R9 290X Will Be "Much Faster Than Titan in Battlefield 4"

brucek2

What I don't get about this claim is what it implies: either virtually no other card will be able to play the game, or, if that's not the case, what is all that "much faster" speed going to be used for?

If the game needs this special Mantle-mode speed, then no Nvidia card, nor any of the previous and lesser AMD cards, is going to be able to play it. I.e., essentially no one can play.

But of course that's market suicide. So the game must include some mode that looks good on regular cards. Which leaves the question, just how much better is it really going to look on a special mode which only one specific graphics card can deliver? How much effort is a developer really going to put into specialized assets/effects that 95+% of their players can't see?

bluetorino1973

Microsoft is no longer going to update DirectX, so DirectX 11.1 is it. The DirectX division has been broken up and reassigned elsewhere. So companies like AMD and NVIDIA have to make their own APIs if they want more performance out of their gear, because Microsoft stopped doing updates for DirectX. It's been three years since DirectX's last major update. Companies could go with OpenGL. This card had better have more up its sleeve than just the Mantle API to compete with NVIDIA.

Freebar

I've been an AMD fanboy for many years. When the long-delayed Half-Life 2 came out, there was the ATI Radeon 9700 and a card from Nvidia. That Nvidia card was probably the worst mistake Nvidia ever made, and it ultimately led to a public apology from Nvidia's president. Nvidia had optimized that card for Half-Life 2, and Gabe Newell called them out on it. It was also much weaker in all other games. And it was huge, power hungry, and ran hot. Does anyone remember this? I hope AMD isn't making the same mistake.

Ghost XFX

Has anyone else seen the XFX version of this card? That card x2 and the RIVE BE, that's the kind of techno-porn you'd have to peel hot sweaty t-shirts off to continue viewing...

Oh yeah, I'm planning my build as of this writing, based on those XFX cards and that board. Just for the pure sexy factor (9.9). I can't wait to start my next rig!

Igor! Cover me in thermal paste!!

Uber_Roy

It looks like SteamOS and OpenGL are where gaming is heading for us PC bunch, and AMD can have their consoles. BF4 is only going to be the same game BF3 was, just with a better single-player campaign and new content, but basically the same, so it will turn to rubbish like CoD. How do I know? Well, EA owns it now, and that's enough evidence right there. Fool me once, EA/DICE. Fool me twice, shame on me and you.

criss969

You read an article about this yet-to-be-released Linux-based OS and just agree that Microsoft is down and out? OpenGL requires a much more complex coding process than DirectX, which means the chance of OpenGL dominating the gaming scene in the near future is close to none. As for EA and DICE, I agree with you 100%. I was disappointed with the direction they took with Battlefield 3, and now the Battlefield 4 beta is shaping up as more of a run-'n'-gun style of gameplay, sort of like COD. If this is EA's new plan, they can kiss my money goodbye. Whatever happened to caring about your customers?

pithargas

Well, the article says it's for the PC and uses the Mantle API as an alternative to OpenGL and DirectX.

Fyrwulf

I'm trying to figure out whether people are being blatantly obtuse or just spreading FUD. Mantle operates at a lower level than DirectX, so a game will probably do a simple system check upon installation to find out whether to use Mantle or DirectX. Yes, Mantle-enabled games are going to run faster on machines that run straight AMD, because they will bypass the APIs that sit between the game and the hardware. Oh well, shit happens; the competition is creatively braindead because they'd rather buy up small companies with great ideas than develop their own tech in-house.
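That "simple system check" could be sketched like this (all names here, `Backend`, `pick_backend`, and the vendor strings, are invented for illustration; a real game would query the driver or OS rather than match a string):

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch: pick a render backend once, at install/startup,
// based on the detected GPU vendor.
enum class Backend { Mantle, DirectX };

Backend pick_backend(const std::string& gpu_vendor) {
    // Mantle only targets AMD GCN hardware, so everyone else takes the
    // generic DirectX path; nobody is locked out, AMD just runs leaner.
    if (gpu_vendor == "AMD") {
        return Backend::Mantle;
    }
    return Backend::DirectX;
}
```

The point of the sketch is that nothing breaks for non-AMD owners; the faster path is simply taken when the hardware supports it.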

As for the argument of AMD vs. everybody else: I've had three computers in my life. My first one came with an AMD Socket A Athlon (800 MHz) and a Riva TNT2 Ultra. I liked the Ultra, but when it died after three years of excellent service, I went ATI and never doubted my choice. My second computer was an Alienware that came with an AMD Athlon 64 3400+ and an ATI X800 PRO; an excellent computer that I never had problems with and that served me well for eight years, including running games it had no business even starting. And now there's the POS I'm presently using: an Intel laptop, barely four years old, that chokes on browser-based games and that I had to install Linux on just to keep it operating somewhat smoothly.

In short, y'all can stick with your Monopoly Tech Corp crap and I'll stick with what works best for me. AMD makes beast hardware, and the only people who disagree are Intel/Nvidia fanboys who can't perform a simple driver install because they're too busy fapping to screens of their latest synthetic bench runs. Those of us who do real things with real computers, and have tasted the water on both sides of the border, find AMD to be oh so sweet.

Ghost XFX

You used an Intel LAPTOP to compare to a couple of AMD PCs?

Personally, I couldn't care less about AMD or Intel; I just want the best possible hardware for my needs. I've been with both companies for my PC needs.

But the bottom line is, while I support AMD and want them to do well, they need a swift kick in the can and need to get a move on before they're left in the dust. << That's your TL;DR for this one...

The PC I'm using right now was supposed to have an FX chip installed; instead, I have an old Phenom II 970BE. I've been looking elsewhere because they bungled the Bulldozer launch and tried to cover it up. If it weren't for my prior knowledge of what happened with the original Phenoms (you remember those?), I wouldn't have known to wait it out for the Vishera chips.

And while they're markedly better than the Bulldozers, they still fall short of what their goal should have been. These should have been the Bulldozers, and the Visheras should have been the upcoming Steamrollers. But again, that's how far behind AMD is falling.

So look at the GPU side of things. They bought out ATI, which is now essentially a division of AMD anyhow. What amazes me is when ATI consistently churns out a great product and AMD struggles to get on their horse and keep up, with an in-house product that should be tailor-made for their chips.

This isn't AMD's first dance, and right now they're the basic wallflower while Intel is busting moves on the dance floor with AMD's main squeeze, the ATI division. This is worse than any drama you could ever watch on TV. We have Intel chips bumping and grinding with AMD's GPUs (Nvidia doesn't mind, because Intel will always come back to them...), and the AMD chips are basically cuckolds! Only the 8350 and the 8320 have the guts to make a stand against Intel...

And true, GPUs are different from CPUs by design. But the fact remains: the GPU side knows what they're doing, while the CPU side is grasping at straws to make it work. This isn't a match made in heaven; it's showing us all just how pathetic AMD is looking in the eyes of onlookers.

So, as for AMD, I challenge AMD fans to take a good look at the i7-4820K. Imagine if that were AMD's chip. Imagine that chip as AMD's top end, with the typical AMD MSRP. It would kick ass, and very few would disagree.

It's kind of rhetorical at this point, but Intel is competing with itself right now. And while AMD has no chance of direct competition, they could at least do better with their innovations than they have. When you innovate, you better tidy up the loose ends, because everyone will ridicule it til they can't anymore. Until they do that, expect more of the same from all sides.

EJS1980

Thank you, sir, for your anecdotal brandism. I had no idea that people who prefer Intel/Nvidia over AMD due to their objective performance advantages were "fanboys who can't perform a simple driver install because they're too busy fapping to screens of their latest synthetic bench runs".

So, just so we understand your thinking here: a fanboy is someone who requires faster performance, better power efficiency, and consistently more reliable driver/software support over marginal price differentials and/or the allure of championing underdog status?

Intel and Nvidia have a leg up on AMD in almost every regard save for pricing, which they can justify since they have the performance advantages to back it up. So anyone who wants to argue AMD's price advantage is almost wholly justified in doing so. However, once you start falsely presenting AMD as anything other than the cheaper option, you yourself run the risk of blatant fanboyism.

AMD does make "beast hardware", but Intel/Nvidia make even beast(ier) hardware, and this fact can be seen not just in "synthetic benches", but in every real world application that we use these components for in the first place.

John Pombrio

Wow, a lot of replies! I had issues with AMD drivers in the past and have been using Nvidia cards for a long time. Right now, with a 27-inch monitor and a GTX 780, I should be good until Nvidia hopscotches over AMD next year.
Raw numbers and benchmarks are not that big an issue for me anymore. I'm not going to go 4K for at least a couple of years, and a 27-inch monitor is a nice sweet spot for me. Without more pixel real estate, I SHOULD be set for a while. Of course, I always say that until the next BIG thing comes along, heh.

EJS1980

Just curious, what res are you running?

I myself have a 27" 1440p panel that can be overclocked to 120Hz, so the dual 780s I have are justified in keeping the framerate and settings up while pushing that resolution and refresh rate.

Baer

Let's see the reviews AFTER it is out. Let's see if the drivers will work not only for BF4 but for everything else.

AFDozerman

It's the new Mantle API. I have a feeling this is going to be happening more often with a specific set of games.

carage

I have a feeling that specific set of games is pretty much all EA products, 'cuz EA has a tendency to use the Frostbite engine for everything nowadays...

kiaghi7

I get so tired of long claims and short results...

The next line of Radeons may well be awesome, but they've got absolutely no room to talk until it's a reality, not a claim.

It's no different than a kid in grade-school saying "well my dad can beat your dad!"

AMD has needed to get their act together for going on three some odd years now, and it's not going to be turned around with meaningless claims and a puffed up chest.

nick779

I can understand being bitter about PhysX, but PhysX isn't game-breaking. If the company purposely makes the game run slower on Nvidia cards, they're hurting the entire game and everyone who buys it.

I'm all for competition in the name of performance, but coding to beat someone instead of actually having the GPU horsepower to beat a rival is a crappy way of doing business.

As a customer who's been failed by AMD multiple times across multiple generations of GPUs and drivers, I can say this isn't the way to get a player base back, and after this, I'm even less inclined to touch AMD.

Ilander

That's not exactly what Mantle does; it gives developers better-performing code if they have the staff (and inclination) to use it. And we need this pressure on DirectX, which Microsoft has been dragging its feet on for years.

The conspiracy theorists out there say Microsoft has done this to push console sales by limiting the "awesomeness" of PC gaming. I think it's more the case that they have no financial incentive to actually develop a more efficient version of DirectX, partly because of the consoles, but also because they see the hardware always getting faster and figure that, in the long run, bloat doesn't matter.

The problem, though, is that the particular kind of bloat DirectX has, CPU overhead for each draw request, hasn't really gone away, because draw requests have not leveled off; we're still adding resolution. That MIGHT change with 4K, which could be the final 2D graphics standard for PCs and televisions.

Finally, I bet Nvidia, a savvy corporation that also tried to get its graphics into the next-gen consoles, will be able to respond to this with its Maxwell launch. They might not take 100% of the sting out of Mantle, but they can probably take half of it out without much trouble.

LatiosXT

The problem with DirectX is that it has to be generic, because developers don't know whether you're using an AMD, NVIDIA, or some other GPU. OpenGL isn't any different. The only way to get more efficient is to drop down to a lower level and talk directly to hardware, which you can't do when you don't know what's really there.

Mantle solves this problem just for AMD, because the render path knows it's using an AMD GPU. The way I see it is...

Renderer -> DX -> Driver -> Hardware vs. Renderer -> Mantle -> Hardware.
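A toy way to see why the shorter chain matters (the numbers and the function name are invented, purely illustrative): if every software layer a draw call crosses adds some fixed CPU cost, then the per-frame overhead scales with both the draw-call count and the depth of the chain.

```cpp
#include <cassert>

// Toy model (all numbers invented): each software layer a draw call
// crosses adds a fixed CPU cost, so total per-frame overhead grows with
// both the draw-call count and the depth of the call chain.
double frame_overhead_us(int draw_calls, int layers, double cost_per_layer_us) {
    return draw_calls * layers * cost_per_layer_us;
}

// Renderer -> DX -> Driver -> Hardware crosses 3 software boundaries;
// Renderer -> Mantle -> Hardware crosses only 2, so the saving compounds
// as scenes push more draw calls per frame.
```

Under these made-up numbers, dropping one layer cuts a third of the overhead at any draw-call count, which is the intuition behind "lower level = faster" here.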

chicagoguye

Vice versa: developers have been doing that for years to make Nvidia cards run better than AMD/ATI in most of their games. I can remember DOOM 3 being one of the worst for ATI; the game was developed for Nvidia cards, and even John Carmack said so. But the game sold very well, so I don't see how this is bad for the gaming community.

zaphodbeeblebrox 42

BUT WHERE'S THE MOBILE?!?!?!

Seriously, the only reason I waited until the 25th was to hopefully see some new laptop cards...

methuselah

What good is a screaming video card when you're fighting to get drivers to work?

Army Of One

A comment that was valid 5 years ago...

Have you read the BF3 forums over the last 4 months, with TONS of Nvidia owners complaining about driver issues lately?

Both companies have had their ups and downs, but driver issues haven't been one of AMD's for a long time.

aferrara50

It's still valid today. There's a reason I sold my 3x 7970s for 2x 690s, and then 3x Titans. Crossfire is broken on AMD's cards, and quad SLI is broken for Nvidia. I went through hell getting TriFire to work consistently.

methuselah

Worthy 5 years ago and still worthy today.

I can't tell you how many friends have had to root through the registry just to do an uninstall after a failed driver install.

I'm not referring to performance issues, or more specifically to BF3's "microstuttering," which both AMD and Nvidia are now working to correct.
I'm talking about overall problems with driver install/uninstall.

Nacelle

Wrong, sir! Try running Crossfire with Eyefinity. There's a good article on the whole thing at pcper.com.

nick779

That's a complete and utter lie. I own a laptop with a 6770M and used to own an HD 6770 and a 7970 GHz Edition, and I had numerous issues. Every new driver brought more problems, even with fresh Windows installs.

I broke down and bought a GTX 770, installed the driver that came with it, and had bad artifacting to the point of being unplayable; then, a week or two later, 320.49 was released, and I've been problem-free ever since.

AFDozerman

Funny. I haven't had any problems with my 7k series card. None whatsoever.

aferrara50

Run 3 of them and report back. It's hell

SirGCal

I also went from a pair of 7970s to a 690 and am much happier for it, ESPECIALLY for >60Hz setups. AMD has serious driver problems but doesn't care about them for the "elitist few," as they put it.

I actually loved my AMD cards until I went beyond 60Hz. Then they fell apart, with lots of card crashes and snow lockups, and all hardware acceleration became useless. Nvidia doesn't have any of those problems. (And I tried each card individually before switching; same results.) I used to work for AMD for many years, so I held on a long time out of loyalty. But I finally moved on to an Intel core and now Nvidia GPUs. We'll see what the next one brings.

But I have a visual issue where <60Hz actually gives me extreme migraine headaches, so I need the speed for physical reasons alone. Sure, it does look smoother, but mainly it keeps me from having headaches that put me completely down. And for whatever reason, AMD doesn't seem to care about any of that. Getting it to work was a pain, and then it was never stable, especially in Crossfire with its horrible frame latency and microstutter. I'd move to Titans, but... $...

LatiosXT

The only thing this greatly benefits is the PS4 and Xbox One, where you need to squeeze out every bit of performance you can get; if bypassing the high-level API gets you that performance, so be it.

For PC? You'll be alienating a good chunk of your customer base if you code solely for Mantle. If http://store.steampowered.com/hwsurvey is any indication, you screw over 50% of your potential customers if you say "we're only coding for Mantle!" and give them shoddy DX/OGL support.

lacky32

Or maybe the purpose is to get people to roll over to the Red Team once the benchmarks start rolling in and people see that all these games play better on Radeon? That's assuming people pay attention to that sort of thing and aren't just fanboys one way or the other.

bpstone

I am tired of dealing with shoddy drivers from AMD. Maxwell is on my shopping list.

AFDozerman

Not looking forward to an era of gaming where games are only optimized for one card vendor and you can barely play a game from the other vendor on your card.

misterz100

You're about a decade late.

CaptainFabulous

Too late. *cough*physx*cough*

USraging

I switched from AMD about 2.5 years ago; I had way too many problems with their drivers and with micro stutter. Now I'm sporting a GTX 690 and am very happy with the dual-GPU card. I really don't have a reason to upgrade yet.

NotYetRated

Ditto. I switched about a year ago, will not be going back. The AMD cards were loud, energy hungry and had awful driver support. Nvidia has been good to me in terms of performance per watt, noise and having good driver support.

Ninjawithagun

It's all smoke and mirrors until the final consumer version of the card is released and hardware reviewers have had time to test it out. The one thing we have learned is that, with that fan design, AMD is going to punish its 290X owners with a ton of fan noise. And I need to point out the absolutely craptastic AMD graphics driver development team...

Xenite

The 3GB 280X was listed at $299; I can't see the 290X being in the $600 range. A $300+ difference with no cards in between is a huge pricing gap.

lacky32

You're forgetting about the 290.

Ghost XFX

Ok, question for the almighty sages of AMD, and perhaps, somebody with some inside knowledge on the matter....

How in the hell is it possible for AMD to make such great GPUs, and yet their processors drag up the rear like a 500lb fat boy running a marathon?! What the hell is wrong with you, AMD?

Don't get me wrong, I'm glad the GPUs coming out of AMD are up to speed and priced very well to boot. But coupling these great GPUs with these processors is like putting a 4-cylinder in a quad-cab half-ton truck!

If they could make a processor that performs as well as their GPUs, Intel would be up to their chins in **manure**! They couldn't possibly justify the prices they seek for anything they put out on the market.

I'm about to write AMD an open letter; it's time to light a fire under that SB. Give the people a processor they can be proud of, or get ready to hit the bricks.

Upyourbucket

Great post; I totally agree with you. SO... AMD, if you are listening: step your game up on the CPU side of things. You've got the console market locked in. Time to put the pedal to the floor and churn out some whoop-ass processors that stomp on Intel.

CaptainFabulous

Because making an x86 chip isn't the same as making a GPU? Two different divisions (remember, the GPU division used to be ATI and was bought by AMD), two different teams, two very different architectures.

@Xenite: you're delirious. In games that are lightly multithreaded and rely heavily on single-core performance (translation: most games), AMD chips get stomped by Intel ones. This isn't speculation; it's fact. Now, if you're talking about general computing or things that are highly multithreaded, then the difference becomes much less noticeable. But on per-core performance, AMD chips simply can't touch Intel, and that's what you need for good gaming performance in current titles.

misterz100

Honestly, he is right about one thing: a high-end CPU isn't needed as much as it used to be, since most games are GPU-heavy. My 1st-gen i7 can still keep up with any demand I put on it. I upgraded my dad's and brother's PCs with Ivy Bridge i5 CPUs, and they run games perfectly.

CaptainFabulous

It depends on the game. Multiplayer games and MMOs are very CPU-dependent; single-player games and FPSs, not so much.

Ghost XFX

That depends on the developers in question. Take Ghost Recon: Online, for example: Intel machines are stomping at a furious pace in that title. The majority of the top players have Intel rigs; a few have AMD rigs. Ubisoft hasn't even worked out all the bugs yet, but that hasn't stopped players on Intel rigs from putting a good beatdown on everyone else.

And it's known that GR:O plays better with multithreaded procs. It does matter what kind of chip you have and whether you're optimizing it for the best performance possible.

But the point I was getting to: if you build a proc to match up with the performance of your GPUs, as AMD does with platforms like Scorpius, you should at least try to make sure your processor has the juice to take advantage of the GPUs' abilities.

Otherwise, what's the point? We've seen the benchmarks between these procs, and over the last 5+ years Intel has dominated, for the most part, in the benchmarks that revolve around gaming. They can use the exact same GPU, and the results usually end up in the Intel rig's favor regardless.

If I were asked what I wanted out of the next AMD processor, I'd hand them the i7-4820K and tell them: stronger, faster, competently efficient performance. And look, you don't even have to move your TDP; it's there already. Focus on that much and make it your top of the line; that's not asking for much. And price it the way only AMD can. If they ever do that much with their next proc, I'd tattoo AMD on my left ass cheek.

RUSENSITIVESWEETNESS

I suspect the lack of competition in the performance arena is the result of underhanded manipulation of AMD, similar to Microsoft's gutting of Nokia.

Xenite

In real-world use, AMD's processors are just fine. I have a full AMD gaming rig, and I can run any game out there maxed out without an issue. And I do it for a hell of a lot less than an Intel rig would cost.

Most benchmarks are skewed in favor of Intel's architecture anyway. In most benchmarks, Bulldozer way underperformed because they didn't take advantage of its unique multi-core design.

PCMark 2005 was notorious for scoring Intel-identified chips 30-40% above where they should have been. SysMark was intentionally revised to downplay the strengths of AMD chips.

When you consider that many people use those numbers to decide what to purchase, it's clear Intel has a vested interest in making sure their chips perform well in them.

Ninjawithagun

Uhhhh... what do you consider "a hell of a lot cheaper"? The best AMD Bulldozer CPU vs. a comparable Intel chip will cost you only about $100 more. The AMD FX-8350 costs $199 right now, whereas the Intel 4670K costs $240 and the 4770K costs around $340. What I get in return for that extra money ($40 more, or $140 more) is a superior processor: more powerful, overclocks way better, uses less power, and executes everything I throw at it faster than the AMD Bulldozer CPU, PERIOD. I don't want to rain on AMD's parade, but facts are facts. To me it's worth it to invest in an Intel CPU, whereas others like yourself think it's not worth the extra cost. Nevertheless, I do agree that whichever CPU you decide to go with, you will still end up with a great gaming rig :)