AMD Radeon HD 7970 Specs Leaked to the Web

28 Comments


Ghost XFX

I think I'll hold on to my two 6870s. Until they can find a way to utilize 100+ fps, there's no need to upgrade right now.


Rift2

I'm going Nvidia Kepler or Bust =)

I couldn't even get a good frame rate with an ATI card at Rift's launch.

Don't play Rift anymore.


Carlidan

Interesting read.


Gezzer

These cards can't come out too soon for me. I already have my three sweet 24" monitors for Eyefinity.

But for some reason my 4870 512MB just doesn't work with them.

<presses face up to glass> Open, open, open.......


kixofmyg0t

Wait, what happened to the 7900 series using XDR2? Last I was tracking it, the 7800s (mmmm, good old GeForce days right there, for those who remember) would use GDDR5 while the 7900s would get the massive upgrade to XDR2.

I'm holding off on getting a new video card until AMD confirms they'll use XDR2; if that happens I'll break the bank to get it.


Gezzer

Heard AMD changed their mind. Decided GDDR5 still had some life left. Though the fact that it was Rambus tech might have had something to do with it.


kixofmyg0t

I think AMD probably chose GDDR5 because of price.

On equal bus widths, XDR2 destroys GDDR5. 256-bit XDR2 still delivers considerably more throughput than even 384-bit GDDR5... but it's a hair more expensive.

I guess it's overkill, really. XDR2 isn't needed yet.
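A rough back-of-the-envelope check of that claim, with the per-pin rates as assumptions (roughly 5.5 Gbps for shipping GDDR5, and Rambus's advertised ~12.8 Gbps for XDR2):

def bandwidth_gb_s(bus_width_bits, per_pin_gbps):
    # Peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
    return bus_width_bits * per_pin_gbps / 8

print(bandwidth_gb_s(384, 5.5))   # 384-bit GDDR5 @ ~5.5 Gbps/pin  -> 264.0 GB/s
print(bandwidth_gb_s(256, 12.8))  # 256-bit XDR2  @ ~12.8 Gbps/pin -> 409.6 GB/s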

 


CHR15x94

AMD's also invested tons of money into GDDR5. Not to mention, there's still differential GDDR5, which from what I understand would provide effectively double the bandwidth of the GDDR5 currently used.

Does anyone even make XDR2? I know Rambus designs it but have any companies even shown interest in producing XDR2 chips?

Either way, as long as the cards are fast and work well, I'm happy. My 6950 doesn't work properly (thank you, XFX; love your lack of advance RMA and wonderful quality control), so hopefully I can get a 7950 to replace it. Just hope it performs well and doesn't cost me an arm and a leg...


Gezzer

You might be right. Considering that AMD farms out the actual production of the cards to its partners, said partners might have applied some pressure to reduce component costs.

Well, hopefully this means AMD gets a bit better margins on the cards. I mean, I don't want to get hosed, but AMD's been running with tight margins for so long that I'd hate to see the company go under because it's just not profitable enough. Then it'd be an Intel and Nvidia win, with us being the losers in the long run.


kixofmyg0t

From what I read, AMD designed Tahiti to be able to use either GDDR5 or XDR2, so they can actually upgrade it down the line if they need to.

512-bit XDR2 = stupid amounts of bandwidth. That would be awesome.


blackdog

Radeons had a 512-bit bus some years back, but it didn't work out so well. Hope they get 384-bit right.


kixofmyg0t

You're right, they did... with GDDR3. AMD (then ATI) experimented with various memory types and bus widths back in the day to find an edge.


praetor_alpha

The Radeon 2900 had a 512-bit bus. The GeForce 280 and 285 had 512-bit buses too, but for some reason Nvidia decided to scale back the bus width for the newer Fermi chips; maybe for Kepler too, but no one knows.


Holly Golightly

The design is alright, I suppose. I prefer the design of the 6000 series, to be honest with you. It's more angular and has these bad-ass stripes along the sides. Still, the key is performance. For me to buy, it must come with a backplate. I don't like the fact that graphics cards hang upside down in ATX motherboards, so a backplate gives the illusion that the card is right side up. Looks great in windowed rigs.


noobstix

Heh, after being stuck on a 256-bit bus for all these years, they're finally going to increase the bus width to something Nvidia has used for years already (AMD probably reached the limit of what they could do with a 256-bit bus). Not only that, it still only requires an 8-pin + 6-pin instead of 2x 8-pin.
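For context, a quick sketch of the power-budget arithmetic behind that connector choice, using the PCI Express spec's limits (75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin connector):

# Board-power ceilings implied by the PCIe connector layout (watts)
slot, six_pin, eight_pin = 75, 75, 150

print(slot + eight_pin + six_pin)  # 8-pin + 6-pin -> 300 W ceiling
print(slot + 2 * eight_pin)        # 2x 8-pin      -> 375 W ceiling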


kixofmyg0t

AMD wasn't "stuck" on 256-bit. They've used 512-bit before. Heck, AMD (then ATI) was the FIRST to use GDDR5. They went back to GDDR3 because at the time it gave no advantage and jacked up the price of their cards.

AMD chose to stay with 256-bit simply because they didn't NEED anything wider to compete with nVidia.

Now AMD has a new process, a new architecture, and a wider bus, which should help cement their lead over Kepler.


Khoiboi

I just bought the 6970 earlier this year and already something else trumps it. =/ Forget it, I'm getting a Mac.


vig1lant3

Sure, why not get a Mac?  If you can't upgrade the gear, you'll never need to worry about getting the latest and greatest components.  Yeah...good idea...


Gezzer

Good idea. Limited hardware choices for an extra premium is where it's at.

At least you won't have buyer's remorse. Well, that is, if you drink the iKoolaid.


scoop6274

Screw Macs, you might as well buy a Commodore 64! No worries about having to upgrade the video card on that!


Baer

So that will ensure you keep older hardware until you buy a new Mac, instead of having the choice to upgrade with whatever you want, whenever you want.


Veedek

Doesn't a 6970 play anything that you throw at it? Do you buy graphics cards for bragging or for gaming?


KenLV

Does for me with a stock 2700K...
BF3 at maxed settings (all ultra) @ 1920 x 1080 on a 27" and smooth as butta'.
Same for MW3 (extras and all else maxed).
As for Rage, well, the problems with Rage are, IMO, with Rage.


praetor_alpha

It should, but then there's games like Rage...


Recidivist

Wha....What?


Archtard

Yeah, because no new hardware will ever trump a Mac. Wait, my year-old PC ($1,800) trumps a new $3K Mac.


Gutzx

Lol, because Macs don't use outdated hardware...

For only an arm and a leg you can now have your very own TOP OF THE LINE ATI Radeon HD 5770 with 1GB GDDR5!!!

http://store.apple.com/us/browse/home/shop_mac/family/mac_pro

Welcome to the elite inner circle, Khoiboi.


vig1lant3

Heh! Let's not forget about the two-year-old Xeon CPUs. Cutting-edge architecture in 2009!! Definitely worth $4000 today, right? Hahaha... I wonder what it feels like to get fleeced by a tech giant and not even realize it?

 
