The 20 Most Important Moments in the History of ATI

Amber Bouman

It's the end of an era, folks. In the coming months, AMD will retire the ATI brand, letting the ATI name ride off into the sunset after a remarkable 25-year run, presumably never to be seen again. Don't mistake that to mean AMD is getting out of the graphics business -- it isn't -- but once the brand is dropped, you won't see the ATI name attached to any new Radeon, FirePro, or EyeFinity products.

The decision came after AMD sent out surveys to several thousand "discrete graphics aware" respondents spread out across the U.S., U.K., Germany, China, Japan, Brazil, and Russia. According to John Volkmann, AMD's VP of global corporate marketing, "the Radeon brand and the ATI brand are equally strong with respect to conveying our graphics processor offering." That might be so, but it doesn't tell the full story behind ATI and its 25-year tenure in the graphics business, one that includes witnessing the rise and fall of 3dfx, and continued participation in what's largely become a two-horse race in the discrete graphics space.

Join us as we take a look back at some of the most important periods and events in ATI's history, starting with when it was formed in 1985.

Array Technology Industry Formed

About 25 years ago, several things happened. Super Mario Bros. for the original Nintendo Entertainment System was released in Japan, Michael J. Fox jeopardized the space-time continuum by traveling willy-nilly through time, Coca-Cola inexplicably changed its formula and released New Coke, and Array Technology Industry, the company that would later be known as ATI Technologies, was formed by Lee Ka Lau, Benny Lau, and Kwok Yuen Ho.

Between them, the trio had saved up $300,000 -- not enough to start a computer firm, but enough capital to form a graphics company. And that's what they did, kicking things off with a staff of six. As a small upstart in Canada, ATI found PC makers apprehensive about using its parts at first, and in just four months, that $300,000 well had run dry. Luckily for ATI -- and for the rest of us -- a bank in Singapore funneled $1.5 million into ATI in the form of a business loan, keeping the company afloat long enough to find its footing.

Fun Fact: The "Array" in ATI refers to the "gate arrays" that were used in making custom ICs, a method that was later replaced by ASIC (application-specific integrated circuit) technology.

ATI Produces Its First Graphics Board

You always remember your first, and for ATI, that was the "Small Wonder." Built using ASIC technology, the Small Wonder lived up to its name by supporting all the standards, monitors, and systems shipping at the time.

Image Credit: VGA Legacy

It used an 8-bit ISA bus, came with up to 64KB of video memory, and supported resolutions up to 640x200 in Monochrome or CGA glory. Don't laugh, this was cutting-edge stuff back then!

Commodore Signs On

Without support from computer manufacturers, that $1.5 million loan would have only delayed the inevitable for ATI, buying the company only a year of breathing room before it would be time to close shop. But that never happened, because before ATI ran out of money, the company secured Commodore as a customer in 1986, supplying the world's most popular PC maker of all time with 7,000 graphics chips a week. Going from rags to riches, ATI grew from a company owing $1.5 million to pulling in $10 million in revenue in its first year alone. Equally important, landing Commodore showed the computing world that ATI had the moxie to be a major player in the graphics market.

Fun Fact: With sales estimated at anywhere from 17 million to 30 million units worldwide, the Guinness Book of World Records recognizes the Commodore 64 as the best-selling PC of all time. Pretty impressive when you consider the C64 sold for around $600 at launch, which, if you account for inflation, is roughly $9 zillion today.

EGA Wonder/VGA Wonder

In 1986, color PCs had begun to appear, and the timing couldn't have been better. Having tasted the sweet nectar of success and with a pocket full of revenue, ATI kicked out its first graphics card under its own brand in 1987, the EGA Wonder. Built around IBM's "Enhanced Graphics Adapter" standard, the EGA Wonder delivered 16 colors at a resolution up to 640x350. A second card -- the EGA Wonder 800 -- would kick things up a notch by supporting an 800x600 resolution.

Image Credit: pctuning.tyden.cz

Later on, ATI would follow up with the VGA Wonder. This was a 16-bit card that also included a two-button mouse port, which came as a boon to anyone who had already tapped out their available serial ports.

In addition to being ATI's first graphics cards under its own brand, the EGA Wonder and VGA Wonder brought into existence the "Wonder" trademark. ATI would later use the Wonder nomenclature to describe its graphics cards that came with a built-in TV tuner.

Mach8 Videocard

Image Credit: Wikipedia

The Mach8 wasn't just an important milestone for ATI, it ranks as a noteworthy blip in the world of PCs, and graphics in general. With the release of Windows 3.0 in 1990, Microsoft's OS had begun to grow in popularity, and it created a need for 2D acceleration. ATI answered the call with the Mach8, the first ATI product to process graphics independently of the CPU. It was also the first -- and one of the few -- graphics cards to support IBM's 8514/A display standard. But perhaps most importantly, the Mach8 erased any doubt that the graphics card business was here to stay.

From a hardware standpoint, the Mach8 came with 512KB or 1MB of DRAM or VRAM and could support up to a 1024x768 screen resolution. The Mach8 chip was also used on a handful of other cards, such as the 8514 Ultra, 8514 Vantage, and VGA Wonder GT.

ATI Goes Public

Image Credit: Flickr joseph a

After eight years in the graphics business, ATI went public with stock listed on the Toronto Stock Exchange. By that time, ATI's annual sales had ballooned to around CAD$230 million, but it wasn't smooth sailing after becoming a public company. For the fiscal year ended August 31, 1994, ATI posted its first loss, a CAD$2.7 million shortfall on sales of CAD$232.3 million. The stock subsequently took a nosedive from around CAD$20 to less than CAD$5. What's more, ATI was getting left behind in the world of chip design, which had been transitioning from 32-bit to 64-bit. A saving grace was on the horizon, however, in the form of the Mach64.

Nvidia Enters the Scene

The year was 1993 when Jen-Hsun Huang, Chris Malachowsky, and Curtis Priem formed a little company called Nvidia. All three men brought prior experience in the industry to their new venture, including building processors for AMD (Huang). Nvidia started off as a fabless company, meaning it didn't have its own manufacturing plant to kick out wafers and integrated circuits.

That would soon change, setting the stage for a fierce rivalry with ATI that would rage on for years, even as other graphics makers fell by the wayside and/or out of prominence. Today the discrete graphics market is almost totally owned by AMD (which now owns ATI) and Nvidia.

Mach64 Videocard Puts ATI Back on Track

Released in 1994, the Mach64 was ATI's first graphics card to boast full-motion video acceleration. Along with S3's Trio, these fancy accelerators capable of offloading video tasks put the squeeze on companies like Oak Technology and Cirrus Logic, both of which began losing market share to the competition.

Image Credit: hattix.co.uk

The Mach64 shipped with up to 8MB of video memory on a 64-bit bus. ATI's Mach64 architecture also flexed its muscle in the professional graphics market with cards such as the Graphics Pro Turbo. Not for the faint of wallet, a Graphics Pro Turbo equipped with 2MB of VRAM listed for $600, while a 4MB model sold for $900. A veritable bargain compared to what professional graphics cards sell for today.

1995 Happened

In 1995, ATI did something no other graphics vendor had ever done: it released a videocard for the Mac platform, becoming the first company in the world to cater to both the PC (as in, Windows-based) and the Mac. It was called the XCLAIM GA, an OEM part for Apple's PowerPC products.

"ATI is bringing affordable performance to the Power Macintosh market with a graphics accelerator board specifically deigned for Macintosh design and publishing professionals," AIT announced at the time. "Our PCI-based accelerators are priced at a third of what users would pay for traditional NuBus graphics cards."

The XCLAIM GA used ATI's Mach64 graphics controller and supported resolutions up to 1600x1200 with a 75Hz refresh rate. It was available in both 2MB and 4MB versions with MSRPs of $450 and $650, respectively.

Equally important, supplying graphics cards to Apple played a role in ATI's return to profitability in 1995. Later in the year, ATI, United Microelectronics, and a few other partners entered into an agreement to build a semiconductor plant in Taiwan, giving ATI the foundry capacity it needed to remain a major player in the graphics market.

ATI Releases World's First 3D Graphics Card

By the time 1996 rolled around, the graphics market was no longer new, but it still hadn't seen its first 3D accelerator. That changed when ATI launched its 3D Rage chip, which was also called the Mach64 GT because it mashed 3D technology with the Mach64's 2D capabilities. The 3D Rage powered most of ATI's graphics solutions and would later provide a compelling alternative to 3dfx's mighty Voodoo chipset. The advance into 3D territory helped ATI sell over a million chips in 1996.

Birth of the All-in-Wonder Series

A graphics card with a built-in TV tuner? Brilliant! That was the idea behind ATI's now-famous All-in-Wonder line, which first launched on November 11, 1996. The original All-in-Wonder was built around the 3D Rage II+ engine and slid into any available PCI slot (it would be another year before AGP emerged).

Combining TV functionality with computer graphics proved popular, and several AIW cards would follow. In almost every case, these cards ran a little slower than the 3D cards they were built around, but it was a trade-off HTPC enthusiasts were willing to make. The AIW line lived on for over a decade until ATI unofficially retired the series with the All-in-Wonder HD in 2008, the last AIW card to date.

First to Fully Support AGP

By the late 1990s, it was no longer a question of whether a market for 3D graphics hardware existed, but how best to cater to it. There were two problems with the PCI bus. First, it was simply too slow. Compounding the problem, any device plugged into a PCI slot had to share bandwidth with other devices.

In 1997, the Accelerated Graphics Port (AGP) was born. Built specifically for graphics cards, the AGP slot solved both problems, but early graphics cards failed to take full advantage of the new spec and were nothing more than bridged solutions. With the release of the 3D Rage Pro, ATI went down in history as the first company to serve up full support for AGP (AGP 2X).

ATI Goes on a Buying Spree

Someone over at ATI got the memo that you have to spend money to make money, and so in 1998, ATI spent $3 million scooping up nearly all of the graphics design assets from Tseng Labs, a struggling graphics company. Among other things, the deal included Tseng's facilities, proprietary designs, and team of 40 engineers. Later that same year, ATI would acquire Chromatic Research, Inc. for $67 million. At the time, Chromatic was involved with producing System-on-a-Chip (SoC) technology for set-top boxes and other CE devices.

The acquisitions helped ATI double its sales from the previous year to CAD$1.15 billion. Moreover, ATI was now the No. 1 graphics supplier in the galaxy.

ATI would make some other acquisitions in the years to follow, including the buyout of FireGL Graphics, making official the company's foray into professional graphics. But its biggest acquisition might have been ArtX, maker of the "Flipper" graphics processing chip used in Nintendo's GameCube. This acquisition would play a role in ATI landing its graphics chip in Microsoft's Xbox 360 console.

3dfx Goes Belly Up

The bigger they are, the harder they fall, and 3dfx was at one time big. Really big. The company was heralded as a market leader and pioneer in gaming-grade graphics, and quite frankly, we get a little misty just thinking about it. 3dfx truly revolutionized 3D graphics technology, and had it been managed better, it might have been bigger than both ATI and Nvidia today. But mismanagement of funds, lengthy production schedules, and treating the entry-level and mid-range markets as almost an afterthought all led to 3dfx's eventual demise.

Had 3dfx not gone bankrupt, it's hard to tell what might have happened in the 3D graphics market. But when Nvidia acquired 3dfx's intellectual property in 2000, one of ATI's biggest competitors was no more. At the same time, ATI's main rival, Nvidia, received an influx of talented employees and valuable resources.

Birth of a Radeon

Image Credit: pcgameshardware.com

Congratulations, it's a videocard! It's hard to believe, but the Radeon name has been around for a decade. The very first Radeon card debuted in 2000 and was a DirectX 7 part. It also supported OpenGL 1.3. The Radeon brand signaled ATI's ascent back into the high-end graphics market and gave the company a competitive product line to go up against Nvidia, which by 2000 had become the fastest growing graphics company in the world.

Along with a fresh new name came a fresh new way of doing business. After getting pummeled by Nvidia, whose market share was almost twice that of ATI's, ATI brought in some new faces for upper management and began licensing its technology to Taiwanese board suppliers rather than making all its own add-in boards. This gave ATI a bigger budget to work with, and also the ability to create two chip-design teams, the upshot being that ATI could tighten its release schedule for new products.

Going forward, once AMD retires the ATI name completely, the Radeon brand will continue to be used to designate new graphics cards.

9700 Pro

A case can be made that nearly all of ATI's products were significant in some way, but some rank higher up on the charts than others. The 9700 Pro, for example, was not only ATI's first DirectX 9 part, it was the first DX9 card in the world. So what's the big deal? Microsoft's DX9 API was one of the biggest things to happen to graphics in roughly a decade.

Image Credit: ibxtlabs.com

The 9700 Pro was also the first major product to come from ATI's newly acquired ArtX design group. A beast of a GPU, the 9700 Pro ran circles around every other graphics solution on the planet and leapfrogged ATI to the front of the performance pack ahead of Nvidia.

Fun Fact: The 9700 Pro was the first videocard to utilize a flip-chip package, whereby the die is flipped over and pressed directly against the cooler's heatsink.

Console Makers Come Calling

Do you own an Xbox 360 console? If you do, you're also the owner of ATI graphics hardware. Microsoft took an interest in ATI's acquisition of ArtX, the same company behind Nintendo's GameCube graphics, and inked an agreement with ATI to produce GPUs for the Xbox 360 console. ATI calls the part "Xenos," a 500MHz chip with 337 million transistors, 48 shader pipelines, and 16 texture filtering units.

Around the same time, Nintendo also tapped ATI to produce the graphics chips for its upcoming consoles. The first of these is the currently shipping Wii, which uses ATI's "Hollywood" GPU. In 2009, AMD would ship its 50 millionth Hollywood chip, making it the most successful AMD technology-based game console chip to date in terms of unit sales.

ATI Added to the NASDAQ 100

The NASDAQ 100 comprises the 100 largest non-financial stocks on the NASDAQ Stock Market, and at the tail end of 2003, ATI made the list.

"This achievement is a testament to the hard work of the people at ATI who have build a world-leading visual processor company. Through our teams' efforts, our visual processors have become the products of choice for every platform, from the PC to the digital consumer markets of color cell phones, DTV, and game consoles," Ky Ho, Chairman and CEO of ATI, said in a statement at the time.

Indeed, ATI's multi-pronged approach was paying off. Having secured contracts with both Microsoft and Nintendo, ATI goosed its revenues to almost $1.4 billion in 2003, and more importantly, returned to profitability on net income of $35.2 million. And to think, ATI began with just $300,000 in capital!

AMD Acquires ATI Technologies

After a remarkable 21-year run, ATI was bought out by AMD on July 24, 2006, for about $5.4 billion ($4.2 billion in cash and $1.2 billion in stock). At the time, analysts viewed the merger with a barrel full of skepticism, wondering how AMD would be able to juggle a graphics business when it was already struggling financially as a processor company. Ironically, ATI stock surged following news of the deal, while AMD stock dropped as investors questioned whether AMD had overpaid for ATI.

Concerns aside, it was the end of a more than two-decade run for ATI as an independent company, but not the end of ATI as a brand, at least not yet.

Sobering Fact: The average marriage lasts 24 years. The ATI brand has been alive for 25 years.

ATI 5000 Series -- The Last ATI Cards Ever?

While the world waited for Fermi -- and waited and waited and waited -- ATI, now owned by AMD, catered to gamers with its HD 5000 "Evergreen" series of GPUs. These awesome chips put AMD back on the performance throne and delivered DirectX 11 gaming to enthusiasts of nearly any budget. Best of all, ATI accomplished a new level of performance and support without turning PCs into Easy-Bake ovens.

To this day, the dual-GPU ATI HD 5970 ranks as the fastest single graphics card around. And now that AMD has announced plans to retire the ATI name, the 5000 series might be the last cards ever to carry the familiar logo of a graphics company that started way back in 1985.
