The 20 Most Important Moments in the History of ATI





It's important to note that Commodore did not use ATI chips in the Commodore 64, but in their line of IBM compatibles. The PC series enjoyed very modest success. It was the C64 that claimed "best selling single model computer of all time."



Goodbye, ATI. Don't screw your new line of graphic cards up, whenever you may make them.




several thousand "discrete graphics aware" respondents spread out across the U.S., U.K., Germany, China, Japan, Brazil, and Russia. According to John Volkmann, AMD's VP of global corporate marketing, "the Radeon brand and the ATI brand are equally strong with respect to conveying our graphics processor offering."


This statement is pure, utter horseshit. They just wanted to claim the advances and reputational achievements for themselves, and seeing as the ATI brand name ECLIPSED AMD's status as a graphics producer, they're eliminating that problem.


Never buying another AMD product again for JUST this reason, as I know people who were engineers at ATI in Newmarket, Ontario who busted their asses for that company and to make it what it became.

AMD can blow me.




Have fun spending 999 dollars on a hexa-core CPU, dude. You're overanalyzing the situation, even more so than AMD. ATI has had a huge role in AMD's success as of late, but I don't think there was some huge agenda behind the name change.



The TV input part crapped out after one year of ownership.
I paid 350 for it in 2004.
Another thing I hated was the loose proprietary connector on the back of it.
That was my last ATI video card.

On a side note, I can't believe that a major technical website has such major spam problems despite spam protection that pisses off its regular users.



Of course they're going to be more likely to get spammed than a less popular website.



No dipshit, they don't think that at all. And in fact, it has nothing to do with AMD's products, even though they are fine chips that Intel fanboys feel threatened by (hence your derogatory post). Now, the concepts of business might be a little much for you, little fanboy, but supporting and maintaining a brand name gets more difficult and expensive as it grows, and it makes perfect fiscal sense for AMD to consolidate the two into a single entity.

You still there? Or have you gone back to watching RayWilliamJohnson and listening to the black eyed peas?



You don't have to be an asshole with your explanation. Perhaps he didn't think of that aspect when trying to figure it out. He made a simple comment and you fired back with a rude paragraph. Douche.



I was sad when they were bought out by AMD, and now I'm even sadder. This marks the death of a great brand, regardless of how they may or may not have been backed by AMD for the past couple of years.

Having been an ATI fan for ages, I was disappointed to find myself swapping my 5850 recently for 2 GTX 460's in SLI. I originally intended to run 2 x 5850's in Crossfire, but buggy drivers (see cursor glitch in SC2) and poor minimum framerates, combined with AMD's second-rate customer service and being unable to find another reference 5850 anywhere save for a used one... The list goes on and on.. Oh and did I mention the ridiculous prices of the 5850 until they recently started to trickle down? Sure it was cheap when it launched, and that's how it should have stayed - at the $250 price point, not the painful $350 I paid for mine at the time.

Alas, my GTX 460 1GB models are wiping the floor with the ref / non-ref 5850 xfire combo I had tested for a short time before returning the non-ref card (it was just fugly; way too long for my case and completely dashed my plans of using a liquid cooled solution down the road). SLI scales better, hell even my first GTX 460 by itself was outperforming my 5850 in the games I played thanks to improved drivers and more game companies jumping on the Green bandwagon to optimize their code for NV gpu's.

On a side note, ATI chips were being used in some gamecube models before the Wii was ever released, so to say that the Wii was the first Nintendo product to incorporate an ATI gpu isn't entirely accurate.

Very sad to see yet another great brand die out in the PC industry. 3dfx, BFG, ATI, who's next? Yes, I know you could argue that ATI lives on through the AMD brand, but given my experience these past few months, not so.



Excellent article!  I still have my 9700pro.



My very first Radeon GPU was the 9200SE (128 MB). At the time I had it, it was able to handle some games better than my S3 ViRGE card. It wasn't until I realized that there was more than just the amount of memory on a video card that factored into gaming performance that I convinced my dad to shell out a little more on a video card than normal. I got myself a Radeon X1650 Pro, which really satisfied me for a few years, since some games I was able to run on high detail while other games I could run on medium detail. In these past 5 years, I've had to move on from that X1650 Pro in order to get something with more juice to play the more recent games. That's when my brother gave me his Radeon HD3650, which was definitely much faster (I think about double the speed). Now I'm rolling with my Radeon HD5770 and will be proud that I have one of the last generations of video cards with the ATI name on it. I think this video card will last me a few more years before I pass it down to my dad or oldest brother to make way for whatever AMD series comes up (probably the 7000 or 8000 series or whatever they'll call it).

Radeon 9200SE (ATI) --> Radeon X1650 Pro (ATI) --> Radeon HD3650 (Diamond) --> Radeon HD5770 (Diamond)



Really enjoyed this article Paul!



This is one of the most enjoyable articles that I have read in a long time.


Thanks Paul, and keep up the good work.




" and viewing the entry-level and mid-range markets as almost an afterthought all led to 3dfx's eventual demise."

No, actually it was the opposite. 3dfx management always said the big money was on the motherboard, and buying motherboard manufacturer STB was what sunk them. Voodoo 4/5 was also greatly delayed by everyone working on the Banshee mid-range part. If they'd stuck to designing high-end parts, they would have survived at least a few years longer than they did.

This is recollected quite clearly by a stockholder who suffered big losses.



First DX7 card in 2007? Crap, the PS3/360 were waaaay ahead of their time.  :P

Simple typo, awesome article!



Minor mistake:


On p. 3, you mention "The very first Radeon card debuted in 2007 and was a DirectX 7 part." The year is obviously off.



Nice catch, the correct year is 2000. Fixed!

-Paul Lilly



Well, it was a good run, and I'll miss the ATI name. They put out some good products (just not-so-good drivers). I think I still have my old All-In-Wonder and Rage II cards in a box somewhere too; think I'll keep them for sentimental reasons.



"So I will keep it always!" syndrome. That leads to a closet full of old computer crap that is worthless. Folks are going to die and their kin are going to say "what the hell is this thing?" and (hopefully) recycle it. Why not save them the aggravation and recycle it yourself.

 I have a "one year rule". Have I touched it in a year? If not, do I really need to keep it?



D.I.P.R.I.P now! Blasted spammers, we don't need their kind here...



Wrong place
