Nvidia GeForce GTX 780 Ti Benchmarks

104 Comments

Vesuvius

Love all these nerds saying their card is good enough to run their games and this stuff is overpriced. Try running top games on a 2560x1600 monitor maxed out. Nerds, you spend 2 to 6 hours a day on your gaming computer year after year. Spending an extra grand on a quality monitor and card works out to about a dollar a day. Since you're not getting laid, going on dates, or doing anything else social, what a bargain. Can't wait to get my king-of-the-hill 780 Ti in a Bolt 2 from Digital Storm with a 500GB Samsung SSD, and play with my hot Colombian 21-year-old gf on my 30-inch high-res Dell monitor. I have the best of both worlds!!

limitbreaker

Lol, you're a loser! While I agree that if you spend a lot of time on something you should spend some money on it, you sound much worse yourself than a "nerd"... And that 21-year-old kid is not your girlfriend if you need a screen to play with her, lol. Girls on cam from porn sites do not equal a girlfriend.
Lol, loser.

PCLinuxguy

Not only is he a loser and a troll, but a hypocrite as well.

PCWolf

Can't wait until the price of the new GTX 780 Ti hits $500, and can't wait to see AMD's response. This battle between them excites me! Still wish AMD didn't suck at writing drivers.

limitbreaker

AMD drivers have improved a lot and apparently aren't any worse than Nvidia's. I've yet to get driver support for Final Fantasy XIV: ARR for my GTX 780.

PhuxAche

Until the day a GPU can render games that look like what Pixar can do for graphics, and at a lightning-fast frame rate, I won't get a hard on. Until then, this card is a money milker from Nvidia! Man, that company is really starting to piss me off with their crap!

PCWolf

Graphics cards that can render games in real time that look like a Pixar animated movie are a good 20-25 years away. By that time, the graphics used by Pixar & DreamWorks will be so realistic that real-life actors might be out of a job.

PhuxAche

I'm curious... how much do you get paid by Nvidia for typing this crap? How mundane and tediously BORING does it get to hear "...is now the most powerful GPU in the world, beating the 'other' brand by X amount"? Fuck knows why I bother reading this shit every time a new bit of plastic, metal, compressed paper, and resistors comes out claiming the throne again, costing a bank loan or some credit increase on one's credit card! It's like hardware has turned into some sort of X Factor glitz or some shit! Fuckin' 'ell....

PCLinuxguy

"butt-hurt" because they report what they get/or that they aren't fanboys of the brands you like? Granted there is bias in MPC, but to rant and complain while still following them sounds as moronic as the lemmings that bitch about X/Y/Z social network while still using it and recommending it. Don't like it? Then go elsewhere. There are tons of places online that offer up to date reviews and etc. Hell, I've got several in my RSS.

aarcane

I would, just once in the near future, like to see a true landslide victory. I want nVidia to produce a product that will blow the previous generation out of the water by an order of magnitude, rather than a minor incremental improvement. Any incremental improvement is minor if it's linear.

Next time you bring out something earth-shattering (nVidia, AMD, Intel... MaximumPC), don't raise the constant. Increase the exponent. Change the growth curve. Demolish the old records in ways that will leave them incomparable. Otherwise, it's just a minor bump, and hardly worth the price increase.
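To put numbers on the difference (a toy Python sketch with made-up figures, just to show the shape of the two curves):

    # Hypothetical performance index per generation: additive bump vs. compounding growth.
    base = 100.0  # arbitrary index for generation 0

    additive = [base + 15 * n for n in range(5)]     # "raise the constant": +15 each gen
    compound = [base * 1.5 ** n for n in range(5)]   # "change the curve": x1.5 each gen

    for n, (a, c) in enumerate(zip(additive, compound)):
        print(f"gen {n}: additive {a:6.1f}   compound {c:6.2f}")

By generation 4 the additive card is 60% faster than generation 0; the compounding one is about 5x faster.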

PCWolf

I'm sure that they could build a graphics card right now that would make even the GTX Titan look like an integrated graphics card from 2004. But only people with very deep pockets would be able to afford it, as it would cost more than a car.

PCLinuxguy

Much like the $3,000+ workstation cards. I've priced a few and yeah, I could put a down payment on a car for what they're priced at.

PhuxAche

Hah! Dream on! Until people with more money than sense stop squandering it on a completely new card - with retro plastic fascia - for a measly 10-15 fps that still struggles with 8x MSAA on a 1920x1080 display, Nvidia will continue to milk, milk, milk. So the answer to your comment is "it will never happen." The only way it will happen is if you use your current card for, say, 4 years or more, then buy whatever card is out after that time period; there is your answer.

Until then, Nvidia and all the other greedy hardware corporations will ensure you chase the illusion of "better" for a stupid amount of cash!

Alpha-Zulu

I'm not entirely sold on this GTX 780 Ti. Now, before the "ATI vs. Nvidia" fanboy argument comes out, I've had numerous Nvidia cards in the past. My previous card was a GTX 280, a 7800 GT before that, a 6800 GT before that, and a Ti 4600. See a trend here? With that being said, I still don't see the GTX 780 Ti as a smart buy. Why? Well, look at the price. It's abysmally high for the performance. I could spend half as much as the cost of a Titan and be basically right there as far as performance is concerned. Hell, if I wanted to be really cheap, I could also go for the 290, which is still close to the 290X.

In an economy that isn't doing so great, I don't see how one can justify the cost for the performance just to say you have the "most powerful single-GPU card" in the world. Honestly, I think the author put WAAAY too much emphasis on this whole "who's the king of the single GPU" business. And it's pretty obvious he's biased towards Nvidia over ATI. But that's not surprising; who isn't biased these days?

I'll stick to my 6870 for now and hope for a deal on the 290/290X, or maybe even get the ASUS Matrix 7870 that takes up three slots in your case, lol.

PhuxAche

I went from an 8800 GTX (2006) to a GTX 480 (2009) and saw a good leap in performance. Then I went from the GTX 480 to a GTX 680 (2012), and slapped myself hard on the forehead for doing so! My GTX 680 will now last me another 3 or 4 years, then I will get the next stupidly, ridiculously overpriced card and actually see a massive performance increase. But by that time the consoles will be outdated compared to what the PC can do, and we'll have to play substandard games until the XboX E=MC2 and PS5 come along... what a vicious cycle!

limitbreaker

You're absolutely right. I bought a GTX 780 to feel good about that "most powerful single GPU" (well, almost) title, and I got screwed. Not because the 290X came out and dropped the price of my card, but because Nvidia should have shipped a more powerful card to begin with.

The Titan should have had the full 2880 cores, and the GTX 780 2688 cores, from the start. ANYONE who has bought the Nvidia "most powerful card" got screwed shortly after, and I learned my lesson.

Baer

Like them or not, they just appeared as available on the EVGA site. The reference 780 Ti units are available now, and the ACX-cooled and Hydro units will be available soon.

RUSENSITIVESWEETNESS

My experience has been that most titles show significant video tearing without vsync enabled, so I'd suggest we do use it, even though we'd rather not.
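(For the curious: in most windowing libraries vsync is a one-line toggle. A minimal sketch using the Python glfw bindings, assuming pyGLFW and a working GL driver are installed:)

    import glfw

    glfw.init()
    window = glfw.create_window(1280, 720, "vsync demo", None, None)
    glfw.make_context_current(window)
    glfw.swap_interval(1)  # 1 = sync swaps to the display refresh (vsync on); 0 = off, may tear

    while not glfw.window_should_close(window):
        # ... draw the frame here ...
        glfw.swap_buffers(window)  # with vsync on, this blocks until the next vertical blank
        glfw.poll_events()
    glfw.terminate()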

PhuxAche

I hate tearing, and I love vsync. BUT in this day and age of technology, tearing should have been eliminated already. Tearing should be a '90s/noughties pastime!

LG: let's bring out a 4230p TV, but push aside stuff like tearing, 'cos that's sorted with a vsync app...

Nimrod

"Nvidia has snatched the single-GPU performance crown back from the clutches of the recently launched Radeon R9 290X, and not just by a small margin either, but by a landslide."

Josh, this is complete bullshit. It's not a landslide. In fact, it's not even an across-the-board win in every single category.

At 4K the 780 Ti can't even be called a winner, because in most of the games where it's faster it only holds a 3-5 fps lead. How much did Nvidia pay you to write this? Or maybe they just gave you a script and you didn't even write it yourself.

Moreover, why the FUCK did you run all these tests with 4xAA turned on? AA performance is its own benchmark, not a baseline. If the 780 Ti is a little bit faster at AA than the R9 290X, then you have completely ruined all of the control in your tests and totally biased the results immediately.

An extra $150 ain't worth no 5 fps in BF4. Not when ASUS is going to release a 290X with a better cooler that will give me back that 5 fps.

MaximumBS

PhuxAche

Lol! I LIKEY!

kixofmyg0t

It's MaximumPC, man; you know that Nvidia cards are always going to be faster here. If it's 1-2 fps faster, it "completely demolishes the AMD card". Look elsewhere if you want unbiased reviews.

I'm not about to defend the reference 290X, though. Its cooler is a horrible design. But elsewhere on the web, people have thrown an aftermarket cooler that was made for Fermi cards onto a non-X 290, and it blows away the 290X with the right cooling.

Once non-reference 290s and 290Xs start hitting the market, it will be a different ballgame.

maleficarus™

If all you want is "AMD rules" reviews, just head over to [H]ardOCP! They are the most AMD-biased review site on the internet!! I called them on it a few years back and Kyle banned me for it, LOL.

limitbreaker

I think the smart man would rather have a fully unbiased review.

joshnorem

I appreciate the feedback. For cards like this we typically test only at 2560x1600. That is the resolution people buy these cards for - not 1080p, and not 4K, yet at least. So though I did the 4K tests to show you what the card can do, they're not a terribly influential piece of the puzzle in my opinion. At 2560 the 780 Ti dominates, and that is why I wrote what I did.

That said, I am looking at the 4K tests once again and will update the article tomorrow once I have done some more testing. We barely got the card in time to meet the embargo date, so I had only just finished testing when I had to write things up. I am redoing the tests to check all my numbers.

Nimrod

I think 4K testing is very relevant, really. If the industry is going to make the move to 4K, it might need a lot of pressure in that direction. All the vendors want to talk about their 4K performance, so it deserves to be scrutinized.

I wonder how long it will be before we see another monitor price-fixing lawsuit with the new wave of 4K screens. I don't understand the technical justification for charging $5,000 for something that has only 4x the pixels of something that can be had for $250. It's a 20x markup for 4x the resolution.
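Spell it out (Python, using the prices quoted above and standard 4K vs. 1080p panel sizes):

    # The arithmetic behind the complaint above.
    pixels_4k = 3840 * 2160       # ~8.3 million pixels
    pixels_1080p = 1920 * 1080    # ~2.1 million pixels

    print(pixels_4k / pixels_1080p)   # 4.0  -> 4x the pixels
    print(5000 / 250)                 # 20.0 -> 20x the price
    print(20 / 4)                     # 5.0  -> 5x the cost per pixel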

kiaghi7

YEAH! why did he use 4xAA... Like people are actually going to use a feature they paid good money to take advantage of!

That's preposterous!

The next thing you know, they are going to want to live in the house they bought and drive the car they paid for! It's anarchy I tell you!

And how dare he post the facts that one card, regardless of which, is better or worse in various categories...

Clearly it's a conspiracy! Yeah!

It's got to be ALIENS! That's it!

The 780 Ti edged out your preferred card... It's a simple fact of life; technology marches on...

You throw out that you can get a 290X with a better cooler... Well, that settles it, because it's impossible to get another cooler on a 780 Ti, right? Or overclock it? Or both?

Regardless of which one edges out the other, both are top-end cards, and few if any, will ever really notice the completely marginal difference(s) in capabilities, so why are you freaking out?

It's not like someone from MaxPC marched into your house, while you were in the basement, errrr "your apartment", and slapped your mother or something...

By your own words, the margins are tiny and hardly anything to get worked up over... So the question becomes, why are you getting so worked up over them? (Rhetorically posed, mind you.)

Nimrod

Are you fucking retarded? You don't bench with AA turned on, everyone knows that.

The 780 Ti is only 5 fps faster in BF4 according to this test. In many other tests it's within the 2-3 fps margin of error. Hardly the crusher MaxPC makes it out to be. At 4K the margin is even tighter, likely due to the larger VRAM on the 290X.

Maybe you are one of those idiot COD kiddies, or maybe you play a lot of the single-player Far Cry 3 game. If that's the case, then buy this card, because those are the only games it delivers a crushing blow in. The entire premise of this article is that the 780 Ti kills the R9 290X across the board.

The reason it upsets me is that this reeks of the phony, paid-for journalism that is rampant in this industry. Not what I want to see from MaxPC.

DarkStarX

Nimrod,

It's sort of ironic that you say kiaghi7 must be a COD kiddie, when your post reads like it was written by a COD kiddie responding to someone daring to say the BF series is better.

Will ASUS release a 290X with a better cooler that is faster than the reference card? Sure they will, just as they will for the 780 Ti, so your rant further down is meaningless.

I also find it telling that you rant about testing with AA, but then go on to tout how the benchmarks weren't a landslide when tested at 4K. Can you say "cherry-picking"?

Now, if you look at the results objectively, you will see that at 2560x1600 the 780 Ti beat the 290X, and even the Titan, in every single test, so yes, it was a landslide. Yes, it was only 5 fps faster in BF4, but it's also worth seeing that this is 10% faster than the 290X. Also, in COD: Ghosts it shows a 19 fps advantage, which is roughly 30% faster.
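(Sanity-checking those percentages in Python; the baselines are back-derived from the deltas and percentages quoted above, so treat them as rough:)

    # Baseline fps implied by "5 fps = 10%" and "19 fps = 30%".
    print(5 / 0.10)    # 50.0  -> implied 290X baseline in BF4
    print(19 / 0.30)   # ~63.3 -> implied 290X baseline in COD: Ghosts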

Other reviews I've seen, with various settings, vary on how much faster the 780 Ti is, but in general all show it to be faster across the board, even than the Titan, which of course the 290X wasn't even able to beat across the board. Yes, it's more expensive, but I don't need MaxPC or any other site to tell me if it's worth the added cost, as I am more than capable of deciding that myself.

Now with that said, I don't rely solely on MaxPC to draw my opinion from, but instead compare their findings and opinions to other sites, such as [H].

Finally, because you wet yourself calling me an Nvidia fanboy, let me point out that not only is my current rig running an AMD GPU, but two of my last three GPUs before this one were from AMD/ATI. Simply put, I go with whatever is best for gaming at the time I do the build, not what might be best 3, 6, 9, or 12 months later.

kiaghi7

Ahhh, well, the profanity and hyperbolic ranting clearly show how level-headed you are...

Yes, you -DO- bench with AA turned on. Innumerable tests on this very site, as well as many others, routinely benchmark with 4xAA or higher turned on, and you know why they do that? Because the END USER will use that feature; that is precisely what more powerful cards allow for: more capable rendering of what would otherwise be stressful for the low-end and mid-range cards.

I'm not disputing that the margins are small, but the margins are there. If the 290X, the card you CLEARLY favor, were ahead by the same modest margins, you'd be breaking your arm to pat yourself on the back for your preference in cards, but because it's an Nvidia alternative it's therefore reason to lose your everloving mind?

Not to mention going on a curse-laced rant about what you insist is unimportant, but which is clearly important enough to you to foam at the mouth over... So it's unimportantly super important!

I'm not buying either card; I have no need of either. I haven't played COD games since they went to the "modern warfare" setting... I much prefer the WW2 setting personally, but that's beside the point. I have played Far Cry 3, and enjoyed it quite a bit for what it was, but I'm still not quite sure what that has to do with the entire point of your mental-Chernobyl meltdown...

So you can stop with the inane personal attacks, all it shows is how impotent your position is and how ludicrous you're acting over what you have already started out saying isn't very important... If you want to try to demean my intelligence, you should first work on demonstrating that you have some small measure of it yourself with which to compare.

If you insist it's fake journalism, then that is a matter in and of itself; I neither defend nor condemn MaxPC's article or their writing. I have taken exception to many of the things they've written, but I don't go into a flaming nerd-rage profanity fit because I may disagree with something. Wouldn't it be more constructive to thoughtfully explain your point of contention than to cuss them out for testing under realistic usage?

Seriously, are you honestly going to pretend to suggest that you (or anyone) are going to buy either a 290X or a 780 Ti and -NOT- use 4xAA (or higher) on every single thing? If you even feign for a second that you would, then I hereby call you a damn liar. It's utterly asinine to purposely rig the testing to an artificially low threshold because the AMD/ATI card might do better under less strain...

That's akin to saying "well I could lift more weight if the weights just didn't weigh so much".

Let's also do a Notepad test to see how fast each can render text!
Followed by a 48 hour no-holds-barred marathon of MINESWEEPER!
That'll really put them through their paces!

Again, if you disagree with the testing methodology, and you clearly do, why not state your case and actually -prove- your point rather than resorting to things like:

"you're wrong $%@#$@$ ^@#$% (^&(* you dirty ^@#$$#@%^"

You are also of course able to go to other sites that may test differently, or you can even set up your own test of your preferred 290X against a printout of the 780 Ti, so it can't compete at all and will be utterly crushed by INFINITY PERCENT more! That'll show 'em!

At the end of the day, if you calm yourself down a little, and just look at the numbers, you can readily see that both cards are more or less the same darn thing, and frankly if I had to make a choice based upon performance for the dollar, the 290X wins hands down. It will do everything the 780Ti does within a margin entirely unnoticeable to the human eye, for less...

So where's the beef?

You just know there will be some "ZUPUR KLOK'D 290XXX" coming out in a month that will just edge out the 780Ti.

At that point, I expect you to be right back here, filling your tighty-whities with fuming vitriol that now the margins are tiny in the other direction.

Nimrod

Holy fucking wall of text, Batman. I'm going to touch on the two main points that make you sound ignorant or new to the PC world.

I'm not saying that people don't play with AA, and I never implied that either. You clearly do not understand proper testing methodology when it comes to benchmarking and understanding graphics cards and their performance. You are likely a newcomer to the PC world if this is the case.

Testing with AA turned on is a test OF AA performance, not the base performance of a card. AA performance is incredibly inconsistent across different cards, games, and individual driver updates throughout the life of the card, and performance sometimes fluctuates wildly. This is critical because NO, not everyone who buys these cards does so to run with AA. Many of them just want more fps on a 120/144Hz screen and therefore more kills.

AA testing has ALWAYS been done AFTER a test without it first and then compared. This is something anyone who has been on the scene for a while already knows.

In this case it sounds like the author just didn't have enough time to run all the benchies that he wanted to, which explains why it is the way it is. I think he was caught between a hard deadline for the mag and restricted access to the card.

Trying to make it look like I am an AMD fanboy or a COD fanboy just proves that you lack the proper computer knowledge to understand why I said what I did.

Next, if you had actually analyzed the benchmarks, you wouldn't have to ask why I mentioned Far Cry 3.
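If the point isn't obvious, here's the idea as a sketch (a hypothetical Python harness with made-up numbers, illustration only):

    # Isolate AA cost by benching the same scene twice: once without AA, once with 4xAA.
    def aa_overhead(fps_no_aa: float, fps_4x_aa: float) -> float:
        """Fraction of baseline performance lost when 4xAA is enabled."""
        return 1.0 - fps_4x_aa / fps_no_aa

    # Two cards with identical AA-off speed can still diverge once AA is on.
    print(aa_overhead(100.0, 78.0))  # 0.22 -> card A loses 22% to 4xAA
    print(aa_overhead(100.0, 90.0))  # 0.10 -> card B loses only 10%

That's why you run the baseline first: otherwise the AA cost and the base speed are tangled together in one number.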

kiaghi7

"You clearly do not understand proper testing methodology when it comes to benchmarking and understanding graphics cards and there performance. You are likely a new comer to the PC world if this is the case."

Yeah... I'm just going to place this right here:

http://www.maximumpc.com/amd_radeon_r9_290_benchmarks
http://www.expertreviews.co.uk/graphics-cards/1303510/amd-radeon-r9-290
http://www.extremetech.com/gaming/167309-radeon-r9-290x-hardware-specs-and-benchmarks-of-amds-titan-killer-leaked
http://www.techpowerup.com/191768/radeon-r9-290x-clock-speeds-surface-benchmarked.html

Yeah... I'm noticing many many many MANY MANY MANY reviews WITH at least 4XAA on, and the stark minority -WITHOUT- AA on... Even among sites that test both with and without AA enabled.

You like to keep saying that reviews aren't done that way, but both here (even on the 290X review) and on site after site I've checked, testing and benchmarking are routinely done with 4xAA on. If anything, it is YOU that is the exception in that regard.

If you do not like that statistic, I completely understand, and if you choose not to use it as a means of measure or comparison that is entirely your choice and preference to do so. MaxPC and many other reviewers however do indeed choose to use that variable, so it is you that is incorrect in the assertion that it is not used in benchmarking.

As to your continued personal attacks on me, and assertions of my being ignorant. I feel no need to prove myself to you, I've merely demonstrated a point, and questioned your frothing-at-the-mouth rantings over a methodology which is clearly quite common throughout the industry. You are free to disagree with the methodology, but that in no way makes anything I've said wrong/ignorant, particularly when all the evidence points toward you being incorrect in your repeated claim that it's not done that way when it clearly is being done that way quite commonly.

YOU may not choose to do it that way, but YOU don't necessarily get to control how others do things, which is likely why you've resorted to insults and cursing since you don't think you can get your way. I guess stomping your feet and having a temper tantrum at your house isn't enough, you had to digitize it here... Hence all the incoherent cussing rather than trying to actually prove your point.

"Trying to make it look like I am an AMD fanboy or a COD fanboy just proves that you lack the proper computer knowledge to understand why i said what i did."

I don't think you were addressing me, but I'll mention just the same that I've never said anything whatsoever about you in regards to Call of Duty; I merely mentioned that I wasn't interested in the modern-warfare setting of recent years and that I preferred the WW2 COD games. As to your feelings for/against AMD, I genuinely do not know, but you seem awfully invested in the defense of the 290X, and deeply offended by the pixels on your screen showing the results.

I've used Nvidia, ATi, AMD/ATi, 3DFx, and Matrox cards over the years for various purposes, I have no preference in producer so long as the product does what I need it to do. My only "dog in this race" is getting to see technology march ever forward and getting better so the end result to the consumer is better. Frankly I'm quite pleased that the results between AMD and Nvidia are getting closer so that they will both get hungry and competitive again!

Yet again I reiterate, for what you initially claimed was not important, you seem to feel it's exceedingly important enough to perpetually insult me and everything you disagree with solely because you don't like how the results came out.

Nimrod

You are wasting your breath, and here is why, quoted from my last post: "AA testing has ALWAYS been done AFTER a test without it first and then compared."

This is the SECOND time you have taken something I have said out of context and written a dissertation on it. I never said people don't test AA at all, or that they shouldn't.

And if you want to spam links, why don't you have a look at this one:

http://images.anandtech.com/graphs/graph7492/59670.png

Read the rest of that review if you want. The benchmarks conclusively show that the 290X CF is faster than the 780 Ti SLI in almost everything. Your logic about AA testing applies to SLI/CF just the same, does it not?

kiaghi7

I have not taken you out of context; in fact, it is you that is routinely guilty of that, and you still manage to be incorrect in your assertion(s). I merely use your own words against you like a weapon to show you how incorrect you are. At this point you're literally arguing with yourself; I merely need to keep you waving your own tail in front of your nose, and off you go in a circle of trolling.

"Moreover, why the FUCK did you run all these tests with 4xAA turned on?"

"You dont bench with AA turned on, everyone knows that."

Both quoted verbatim from your posts... If you have to resort to lying to pretend like you haven't been wrong this entire time, be better at it.

They ran one of the tests WITH 4xAA (just like numerous other reviewers do) and the other test WITHOUT any AA on (also just like numerous other reviewers do).

So you're just grasping at straws now. You can't defend your original curse-laden rants anymore because you realise how preposterous you've been acting, and now you can't even defend how obscenely you've acted toward others, because your uninformed, indignant diatribe was called out and your position is exposed as indefensible.

So now you just resort to the troll stand-by, if you can't win the argument, then you must therefore attack the person who showed you up.

As to your link, did you not read it? It clearly shows that the 780Ti is actually superior. Both in SLI/Crossfire modes and singularly... Perhaps in your nerdrage, you were blinded to the different entries for the 780 and the 780Ti, but the 290x is indeed behind the 780Ti in your very own link.

So while I appreciate you proving my case, YET AGAIN, it's unnecessary, I've been the only one pointing out facts this entire time.

Nimrod

Sometimes an idiot like you just needs to be called an idiot. You clearly don't respond to real-world facts, and you just make believe whatever suits you the most. No point in doing anything other than calling you what you are.

AFDozerman

TL;DR

Mediziner

You really should, it's preeeetty entertaining

The Mac

seriously....

borkbork

Well that's one of the longest rants I've seen in the comments here.

Nimrod does have a point; a lot of places bench AA separately. MaxPC did rush this out, though, so I'm sure we can cut them a little slack.

limitbreaker

Although I would have worded it a lot more civilly, I'm completely with you on this, Nimrod. I can't believe MaxPC would allow such terrible reviews to go up on the site. I personally buy Nvidia cards because I love the 3D Vision support, but I want to see fair and objective reviews on supposedly reputable sites like MaxPC, not this horrible stuff that's being pushed. Did you read the review for the 290X? It was even worse than this!

I think I'm going to unsubscribe from the magazine now...

PCLinuxguy

Still rocking a GTX 690 on my IB i7-3770K.

joshnorem

It's still a great card :)

PCLinuxguy

It sure is. I would love a Titan, but that's too much for me to spend, since the 690 is still not horribly old and runs everything I can punish it with. Looking forward to your next round of card comparisons.

Lonestar166

There is a new build on my horizon, and two of these puppies are going in it. Makes me proud to be a PC gamer. Thanks for the info, Maximum PC; awesome job.

joshnorem

Two of them would be able to play pretty much anything for the next few years, we would imagine. Congrats on the build!

Baer

I am doing a new build and am waiting for the ASUS Rampage IV Black Edition. I held off getting the GPUs as I did not need them yet, and I am glad I did. My plan is a pair of these driving surround (3 x 24" 1920x1200 @ 5760x1200; HATE 1080p :-) ).
For once the timing worked out well for me.

AFDozerman

Time for a 290X Boost edition.

EDIT: I think I just realized something: this explains Mantle. AMD saw this coming a long way off when Nvidia announced GK110. They took a look at it and decided they needed to do more than just release a faster version of GCN (2.0 was probably halfway through the works then), so they started throwing together a new API to let devs squeeze more performance out of the same GPU.

If true, AMD probably breathed a sigh of relief when the Titan came out as a gimped version of the real thing. It gave them a while to really put the spit-shine on it.

John Pombrio

I really think Mantle is a non-starter. Unless AMD releases a version for Nvidia cards, any game a developer makes would leave 60% or more of users unable to run it, or hobbled in some way. Why on Earth would anyone do that? Even on next-gen consoles it would be iffy. Plus, it has yet to be shown to be as "blazingly fast" as AMD is hyping it to be.

AFDozerman

Well, Mantle IS an open standard, although they weren't too clear about in what way it is open. If it is truly open, Nvidia might possibly pick it up and create their own side of things, although I don't see that ever happening. As far as consoles go, it has been confirmed that they do not use Mantle.

Honestly, if only one thing succeeds, I imagine it would be the new AMD-specific OpenGL extensions. They offer near-Mantle performance for much less coding. If you're already going with OpenGL, an AMD-specific codepath is a no-brainer.
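(Roughly what gating a vendor-specific codepath looks like; a sketch in Python with PyOpenGL, assuming a current GL context already exists, and the extension name is only an example:)

    from OpenGL.GL import glGetString, GL_VENDOR, GL_EXTENSIONS

    # Requires a live GL context (e.g. one created via glfw).
    vendor = glGetString(GL_VENDOR).decode()
    extensions = set(glGetString(GL_EXTENSIONS).decode().split())

    # "GL_AMD_pinned_memory" is just one example of a vendor extension to key off.
    use_amd_path = (("AMD" in vendor or "ATI" in vendor)
                    and "GL_AMD_pinned_memory" in extensions)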

The Mac

Well, since at least six developers have already announced they currently have it implemented in their engines, I guess we'll see.