The development of PC display technologies over the last 30 years has taken us through many chapters: from IBM, the creator of the IBM PC, pioneering color display technologies (and later ceding development to third parties ATI, 3dfx, and nVidia); to the quest to provide both sharp text and colorful graphics; through the ever-increasing size of displays; to LCD flat panels overtaking TV-style CRTs; and on to 3D graphics rendering and, currently, 3D viewing. Here's a brief history of these and other milestones in PC graphics.
Graphics cards have gotten faster and added more features. So we have to ask: is it really worth adding a second GPU to your system? Will you get enough of a performance boost to justify the extra power draw and added cost? The answer is more complex than a simple yes or no. It all depends on which games you're running, how much you dial up features like anti-aliasing, whether you've dived into the world of stereoscopic 3D, and what monitor you're using.
Perhaps the most important factor in the decision is display resolution. If you're running a 22-inch, 1680x1050 display, a single midrange or high-end card will get the job done; adding a second GPU is overkill. If you've got a 30-inch, 2560x1600 display and want to crank up the AA and postprocessing features, then that second GPU can be a big help.