Rumor: AMD Readies HD 5870 and 5850 for September 22nd Launch

11 Comments


AleceHelix

Why the all capitals? 


RAMBO

AMD / ATI CAN KEEP THEIR GRAPHICS CARDS! UNTIL THEY MAKE FULL USE OF GDDR5 WITH A WIDER MEMORY BUS, I WILL ONLY BUY NVIDIA. I JUST READ ON THIS SITE THAT THE NEW 5000 SERIES IS A 128-BIT PATH; HELL, THEY JUST CUT THAT DOWN TO HALF OF WHAT IT WAS ON THE 4000 SERIES CARDS. AMD COULD BEAT NVIDIA IN PERFORMANCE EASILY IF AMD WOULD GET THEIR HEADS OUT OF THEIR ASSES AND DO SOMETHING RIGHT FOR A CHANGE. I ALSO SEE THAT THEY CAME OUT WITH A QUAD CORE FOR 99 BUCKS. SOUNDS GREAT, HUH! UNTIL YOU SEE IT HAS NO CACHE ON THE DIE. HELL, I WOULD PAY $120 FOR IT AS LONG AS IT HAD A DECENT AMOUNT OF CACHE. AMD NEEDS TO FIRE WHOEVER IS MAKING THESE BONEHEADED DECISIONS TO RELEASE UNDERPERFORMING PRODUCTS, OR THERE WILL SOON BE NO ONE TO GO AGAINST INTEL, AND WE WILL BE FORCED TO GIVE UP AN ARM OR A LEG JUST TO BUY A CPU OR GRAPHICS CARD. WAKE UP, AMD, OR IT WILL BE R.I.P.


gendoikari1

Well, AMD never (officially) made an HD 4890 X2 part, so I wonder if an HD 5890 X2 part will come. AMD (unlike Nvidia) seems to care about DX11; Nvidia hasn't said anything about its future cards.


Ogdin

Sorry you got suckered by the Nvidia naming scheme, but a 9800 GTX+ is a two-year-old card, a rebadged 8800 GTX. It's actually slower in some benchmarks. If you had done a little reading before you bought that computer, you'd know it didn't stand a chance of maxing out Crysis.

And I agree with you: not coded very well.


kevinkrg

OK, here is a consumer perspective.

 

I built my gaming computer for $500 six months ago so I could play Crysis. Since the game was a year and a half old at the time, I thought my new gaming computer could run it maxed. I couldn't even run it well at medium.

I bought a 9800 GTX+ OC with my build. Every other game was fine. And this card was top of the line at the time (I mean, the GTX series had just come out, so it was near top of the line). I am still waiting to play Crysis even on high and get 40 fps. 15 fps is just crap.

 

If I pay $300 for the 5850, I hope it is safe to assume I can get 40 fps on very high. If not, well, then Crysis is officially programmed terribly.


bingojubes

It was probably poorly coded, but I think most of the problem was that it wasn't really culled, so the game wanted to render trees that you couldn't even see. With all the foliage in the first level, that kind of sucked too, because it wanted to render every single tree even if you weren't looking at it.

I found that if I took a Humvee and chopped all the trees down, I could gain 1-2 FPS. Kind of time-consuming, but it helped a bit while I was in that particular area.
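What bingojubes is describing is what engines call view-frustum culling: skipping draw calls for objects the camera can't see. A minimal 2D sketch of the idea, with entirely made-up function names, numbers, and tree positions (not from Crysis or any real engine):

```python
import math

def is_visible(cam_pos, cam_dir, obj_pos, fov_deg=90.0, max_dist=500.0):
    """Return True if obj_pos is within the camera's draw distance and
    field of view. cam_dir must be a unit vector."""
    dx, dy = obj_pos[0] - cam_pos[0], obj_pos[1] - cam_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return True            # object at the camera: always draw
    if dist > max_dist:
        return False           # beyond the draw distance
    # Cosine of the angle between the camera's facing and the object
    cos_angle = (dx * cam_dir[0] + dy * cam_dir[1]) / dist
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

# Camera at the origin, facing +x with a 90-degree field of view:
trees = [(10, 0), (0, 200), (-50, -50), (600, 0)]
visible = [t for t in trees if is_visible((0, 0), (1, 0), t)]
print(visible)  # only (10, 0) survives the cull
```

An engine that renders every tree in `trees` instead of only `visible` does roughly the redundant work the comment complains about.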


vulchan

I think I'll wait it out this time, either to see what Nvidia has or to just go with two 5890s.


Ogdin

So because you only run settings that require a 4870, they should stop making new cards? Pull your head outta your hind end.


bikerbub

I know the guy mentality. I am part of the madness. I upgraded to 8GB of Mushkin DDR2 clocked at 1200MHz. (I'm pretty sure I would have just gotten 4GB if I'd had to pay for it :D, it was a gift.) I just don't understand why people have to be so ridiculous. I know that the more advanced the GPU and the faster the graphics memory, the more pixels of shaded, animated, and perfectly rounded game you can squeeze onto your monitor. The only time it's really necessary is when you are playing on something like a 46" TV.

I also recently read about the 4-way XL-ATX board. WTF?! Really? Sure, it's freakin' awesome, but it's like having a few TBs in your rig: just flat-out unnecessary unless you back everything up a few times over.

My stance is that unless programmers start to take advantage of the possibility of a gamer having an extremely pro graphics card (or cards), my P5Q-E with a Core 2 Duo E7300 and HD 4870 should suffice for now. It's the same with processors: unless you NEED to multitask excessively, or are a video-editing guru, a speedy dual core should do the trick.


hiimchris3

@bikerbub 

Sure, the upgrade may make no sense to a person in your position, but there are people out there who are enthusiasts about this stuff. If they have the money and are willing to spend it, why stop them? Also, if you knew anything about displays, and video cards for that matter, you'd know that 46" TVs run at 1920x1080, which is equal to or less than the resolution of 22-24 inch monitors. That being said, you can't assume a 46" HDTV needs more GPU power than a mainstream monitor used for gaming. Your 4870 may be good enough for you right now, but there are people out there who want more performance.

You also claim that programmers cannot take advantage of "extremely pro graphics cards." This should be obvious... The better your GPU, the faster calculations are processed, and the better the game runs. Get your facts straight and stop making ridiculous claims.

 


bikerbub

Alright, that post wasn't great. What I meant to say is: more power to the people who want to upgrade to the newest and the best, but at some point the old performs so well that the difference is almost unnoticeable. I don't have the stuffed computer budget that some lucky people have, so I take a more reserved stance. And by the way, whether or not the monitor is HD, the 46" would require about twice as many pixels as a 22" monitor to look equally clear. In a comparison of standard definition to HD (480p to 1080p), 480p has about 350k pixels, while 1080p has somewhere around 2 million pixels.
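The pixel counts above check out if you take 480p as 720x480 and 1080p as 1920x1080 (a common convention; other 480p widths exist):

```python
# Pixel-count comparison for the resolutions mentioned above.
sd_pixels = 720 * 480      # 345,600 -> "about 350k"
hd_pixels = 1920 * 1080    # 2,073,600 -> "about 2 million"
ratio = hd_pixels / sd_pixels
print(sd_pixels, hd_pixels, ratio)  # 345600 2073600 6.0
```

So 1080p pushes exactly six times as many pixels as 720x480.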

And think about this for a second. Let's say you can play Crysis at 100fps because you're running two pairs of GTX 295s. If you have a regular 22-inch LCD monitor, the refresh rate is probably only around 60Hz, so at most you see 60fps. And 60fps is crystal clear; for example, Halo 3 runs at 30fps.
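The point about the monitor capping what you see can be stated as one line of arithmetic: the displayed frame rate is the minimum of the render rate and the panel's refresh rate (ignoring tearing and frame-pacing effects):

```python
# Displayed frames per second is bounded by the monitor's refresh rate:
# a 60 Hz panel can show at most 60 distinct frames per second,
# no matter how fast the GPU renders.
def displayed_fps(render_fps, refresh_hz):
    return min(render_fps, refresh_hz)

print(displayed_fps(100, 60))  # 60 -> the quad-GTX 295 example above
print(displayed_fps(30, 60))   # 30 -> e.g. a title locked to 30 fps
```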
