Every year when we're building the new Dream Machine, it's hard not to feel a little nostalgic. We've built 15 of them in the past, after all, and each one was an experience (read: harrowing ordeal) all its own. We thought we'd tap into a little of that nostalgia and bring you a Dream Machine retrospective. We asked current and former Maximum PC editors to share their thoughts on the Dream Machine and their experiences putting each one together. Read on!
It's the end of an era, folks. In the coming months, AMD will retire the ATI brand, letting the ATI name ride off into the sunset after a remarkable 25-year run, presumably never to be seen again. Don't mistake that to mean AMD is getting out of the graphics business -- it isn't -- but once the brand is dropped, you won't see the ATI name attached to any new Radeon, FirePro, or EyeFinity products.
The decision came after AMD sent out surveys to several thousand "discrete graphics aware" respondents spread out across the U.S., U.K., Germany, China, Japan, Brazil, and Russia. According to John Volkmann, AMD's VP of global corporate marketing, "the Radeon brand and the ATI brand are equally strong with respect to conveying our graphics processor offering." That might be so, but it doesn't tell the full story behind ATI and its 25-year tenure in the graphics business, one that includes witnessing the rise and fall of 3dfx, and continued participation in what's largely become a two-horse race in the discrete graphics space.
Join us as we take a look back at some of the most important periods and events in ATI's history, starting with when it was formed in 1985.
Last month, we took you on a tour of computing's most venerated classic PCs. In our classic PC hardware retrospective, we highlighted the computers that deployed the innovations we take for granted today. But just as a car without gas is just a roadblock, computer hardware without software is essentially a paperweight. And while it's true that the hardware is the visually sexier component of a system, the software is equally important and often more challenging to create. Today, we take a look at the history of early computer software, from the first character-based interfaces to the last pre-32-bit OSes (yes, Mac OS included). We also spotlight the notable programs that ran on these various platforms, including the first productivity and design applications. And because we're avid gamers, we couldn't neglect video gaming's contribution to the software world -- we included the firsts of each gaming genre.
The soul of any computer is its operating system. This software layer is the basic interface between the hardware -- or the hardware's BIOS (Basic Input/Output System) -- and the rest of the software. It provides basic capabilities such as user interaction, storage management, and communications.
Early operating systems were fairly primitive, with text-based interfaces, limited I/O, few storage options, and marginal expandability. This was appropriate for the limited hardware of their day (who wants a 32K operating system on a 48K system?), but on today's seemingly unlimited platforms we're looking for more power. The evolution between there and here has been fast, furious, and interesting.
Imagine a world in which all cars are like the Toyota Prius: four-door midsize hybrids. Sure, they aren’t bad cars, you can paint them any way you want and even modify some parts, but in the end you still just have a generic Toyota with a funky paint job.
That’s the world of personal computing today. It doesn’t matter if you’re running Windows, Mac OS, or Linux. Your machine is almost certainly using Intel chips at its core and almost everything else is fairly generic—even the world’s greatest case mod with water-cooled dual-Xeons and quad-SLI graphics is just a really fast PC.
This was definitely not the case 35 years ago. A quick tour of the Computer History Museum in Mountain View, CA, reveals machines that were as varied and unique as the companies that made them.
The microprocessor, if there even was one, was supplied by Intel, MOS, Zilog, RCA, or any number of other companies. Memory was static, dynamic, or shift-register. And without the Internet, programs were loaded from paper tape, punched cards, cassette tape, floppy disks, or cartridges -- or even switched in by hand.
In the following pages, we take a close look at some of the most influential personal computers of the past 40 years. From pre-microprocessor machines to the venerated IBM PC, each of these systems contributed in some way to the modern personal computing era.
Try to imagine where 3D gaming would be today if not for the graphics processing unit, or GPU. Without it, you wouldn't be trudging through the jungles of Crysis in all its visual splendor, nor would you be fending off endless hordes of fast-moving zombies at high resolutions. It takes a highly specialized chip designed for parallel processing to pull off the kinds of games you see today, the same ones that wouldn't be possible on a CPU alone. Going forward, GPU makers will try to extend the reliance on videocards to also include physics processing, video encoding/decoding, and other tasks that were once handled by the CPU.
It's pretty amazing when you think about how far graphics technology has come. To help you do that, we're going to take a look back at every major GPU release since the infancy of 3D graphics. Join us as we travel back in time and relive releases like 3dfx's Voodoo3 and S3's ViRGE lineup. This is one nostalgic ride you don't want to miss!