From Monochrome to Multitouch: A History of PC Displays

Marcus_Soperus

The development of PC display technologies over the last 30 years has taken us through many chapters: from IBM, creator of the IBM PC, pioneering color display technologies (and ceding development to third parties ATI, 3dfx, and NVIDIA); to the quest to provide both sharp text and colorful graphics; through the ever-increasing size of displays; to LCD flat panels overtaking TV-style CRTs; to the move to 3D graphics rendering; and, currently, to 3D viewing. Here's a brief history of these and other milestones in PC graphics history.

PC Display Pioneers: 1981-1986

The IBM PC and other early PCs provided three display choices. The first, the monochrome display adapter (MDA), provided crisp 720x350 text but no graphics at all, and your monitor contained green, amber, or (if you were lucky) white phosphors. The second option, the color graphics adapter (CGA), offered a choice of two-color 640x200 or four-color 320x200 graphics (such as this screen shot from the original F-15 Strike Eagle game) as well as text. For even fuzzier results, you could use the composite video output on the IBM CGA adapter instead of the normal 9-pin digital port (the same physical port was used by MDA and CGA).

Image Source: Glenn's Guides

For monochrome users who longed for graphics, 1982 brought a third-party solution: the Hercules Graphics Card (HGC), featuring MDA-resolution text and 720x348 graphics (see Flight Simulator, shown below). Some versions of the card and its many clones also included a parallel (LPT) port.

Image source: You-Hobby-Videoblog

You could run both HGC and CGA cards in the same computer, or buy an IBM PC compatible such as the Leading Edge Model D, which featured HGC- and CGA-compatible graphics on the motherboard.

Image Source: Wikipedia

In 1984, IBM introduced its first high-resolution color graphics standard, the Enhanced Graphics Adapter (EGA), with 16-color 640x350 graphics and CGA emulation. It required a different display than CGA did, and both the card and display were much more expensive. As this game screen comparison from Sierra's Space Quest shows, EGA wasn't nearly as good for gaming as its follow-up, VGA.

Image Source: ATMachine's House of LucasArts and Sierra Oddities

Two third-party products introduced in 1985 helped unify fragmented display standards:

ATI's Graphics Solution series of cards supported the CGA, MDA, and HGC standards in a single slot, and later versions also supported EGA. The use of very large scale integration (VLSI) enabled ATI's Small Wonder Graphics Solution to be much smaller and more versatile than the original IBM EGA card. These cards used the original 8-bit ISA bus.

Image Source: VGA Legacy

The original NEC MultiSync CRT enabled a single monitor to handle both CGA and EGA signals, and it was so versatile it could also be adapted to VGA. Unlike older monitors, which usually featured 12-inch to 13-inch diagonal screens, the MultiSync was one of the first displays with a 14-inch diagonal measurement.

The Analog Revolution – 1987 and beyond

Although IBM's MicroChannel bus, introduced in 1987, was a short-lived flop, the Video Graphics Array (VGA) standard IBM launched at the same time has endured to the present day. Even the most powerful graphics cards still support VGA standard resolutions (640x480 graphics, 720x400 text). VGA's analog signaling let the original version display 256 simultaneous colors (in 320x200 mode) from a total palette of 262,144 (256K) colors, and because VGA also supported the earlier CGA and EGA standards, it quickly replaced EGA.
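
Those color figures fall directly out of VGA's 6-bit-per-channel RAMDAC and 8-bit palette indexes. Here's a quick back-of-the-envelope sketch of the arithmetic in Python (the variable names are ours, for illustration only):

```python
# VGA's RAMDAC accepts 6 bits each for red, green, and blue,
# so the full palette holds 2^(6*3) distinct colors.
BITS_PER_CHANNEL = 6
total_palette = 2 ** (BITS_PER_CHANNEL * 3)  # 262,144 colors ("256K")

# Mode 13h stores one 8-bit palette index per pixel, so only
# 2^8 of those colors can appear on screen at the same time.
simultaneous = 2 ** 8  # 256 colors

print(f"palette: {total_palette:,} colors, {simultaneous} on screen at once")
```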

ATI launched its VGA Wonder “Any Monitor, Any Software, Any Time” card in 1988 to help users transition to VGA. It used a DB9F connector for the older display standards and a DB15F connector for VGA; this author used a card like the one shown to move to VGA in stages, first pairing the card with an EGA monitor and later switching to a VGA monitor. This card used the 16-bit version of the ISA bus.

Image Source: VGA Legacy

The next big jump in resolution was 1990's XGA (Extended Graphics Array), which supported up to 1024x768 resolution. This is also the recommended minimum resolution for Windows 7.

Bus Wars: PCI, VL-Bus, AGP, and PCI Express

In 1993, Intel began replacing the ISA bus with version 2 of the Peripheral Component Interconnect (PCI) bus. PCI moves 32-bit data, twice the width of 16-bit ISA, at 33MHz, for a theoretical peak of 133MB/s. PCI was a particularly good match for the first Pentium processors, also introduced in 1993.

The rival VESA local bus (VL-Bus) was a 32-bit extension of ISA used for video cards as well as ATA/IDE hard disk interfaces on 486 and, later, some Pentium computers. VL-Bus's reliability and timing shortcomings gave it a short life. VL-Bus cards were much longer than PCI cards because of the extra connector, as in this comparison of an ATI Mach64 card using VL-Bus and a Rage 128 GL card using the PCI bus.

Image Source: Wikipedia

Bus Wars Part II began in mid-1997, when Intel rolled out the Accelerated Graphics Port (AGP). AGP quickly became the preferred video card slot type for systems with Intel or AMD processors. AGP 1x was twice as fast as PCI, and the interface eventually ran at effective speeds of up to 533MHz (AGP 8x).

Image Source: author's hardware collection

Bus Wars Part III began in 2004, when the first PCI Express (PCIe) x16 video cards arrived. PCIe v1 provides about twice the bandwidth of AGP 8x (4GB/s versus 2.13GB/s), and PCIe v2, introduced in 2007, doubles the bandwidth of PCIe v1. By 2006, most high-performance PCs included PCIe x16 support.

Image Source: Maximum PC
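
The bandwidth figures quoted in this section come from simple width-times-clock arithmetic (published theoretical peaks, not real-world throughput). Here's a quick sketch of the math in Python; the function and labels are ours, for illustration:

```python
# Theoretical peak bandwidth for a parallel bus: width (bytes) x clock (MHz),
# giving MB/s. These are published peaks, not real-world throughput.
def parallel_mb_s(width_bits, effective_mhz):
    return (width_bits / 8) * effective_mhz

buses = {
    "PCI (32-bit, 33MHz)":     parallel_mb_s(32, 33.3),  # ~133 MB/s
    "AGP 1x (32-bit, 66MHz)":  parallel_mb_s(32, 66.6),  # ~266 MB/s
    "AGP 8x (32-bit, 533MHz)": parallel_mb_s(32, 533),   # ~2,133 MB/s
}

# PCIe is serial: v1 signals at 2.5GT/s per lane with 8b/10b encoding,
# so each lane carries 2.5 * 8/10 = 2Gb/s = 250MB/s; an x16 slot has 16 lanes.
buses["PCIe x16 v1"] = 16 * 250  # 4,000 MB/s per direction

for name, mb_s in buses.items():
    print(f"{name:26s} {mb_s:8,.0f} MB/s")
```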

The 3D Revolution

For the first nine years of the VGA era, graphics were strictly a 2D affair. 3dfx's Voodoo changed that in 1996. By connecting to an existing PCI graphics card and handling 3D rendering, it made 2D-only graphics cards seem rather...flat. Its successor, the Voodoo 2, supported a new feature called Scan-Line Interleave (SLI), which used two Voodoo 2 cards to alternately render the even- and odd-numbered scan lines of a single display. As this diagram shows, a Voodoo 2 SLI setup used three cards: one PCI or AGP card for 2D, and two Voodoo 2 cards for 3D.

Image Source: The 3dfx Help Page
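
To make the interleaving idea concrete, here's a minimal toy model in Python; the function names are ours, not 3dfx's, and a real card rasterizes in hardware rather than in a loop like this:

```python
# Toy model of 3dfx-style scan-line interleave: two cards split one frame,
# with card 0 drawing the even-numbered rows and card 1 the odd-numbered rows.
WIDTH, HEIGHT = 640, 480

def render_line(card_id, y):
    # Stand-in for a card rasterizing scan line y; returns WIDTH pixels
    # tagged with the card that produced them.
    return [card_id] * WIDTH

def render_frame():
    frame = []
    for y in range(HEIGHT):
        card = y % 2  # alternate cards line by line
        frame.append(render_line(card, y))
    return frame

frame = render_frame()
print(frame[0][0], frame[1][0], frame[2][0])  # 0 1 0 -> alternating per line
```

Because each card fills only half the scan lines, each carries roughly half the per-frame workload, which is where the speedup came from.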

Although Voodoo 2's SLI was praised for smooth 3D, users wanted a single-card solution for 2D and 3D operations. Early single-card 3D models from 3dfx included the Voodoo Banshee and the Voodoo 3, with ATI countering with its Rage Pro and NVIDIA offering its RIVA TNT and TNT2. The subsequent NVIDIA GeForce 256 provided much better performance than 3dfx's Voodoo 3, and in late 2000 NVIDIA agreed to acquire much of 3dfx's intellectual property as 3dfx slid toward bankruptcy.

NVIDIA versus ATI

NVIDIA and ATI (now part of AMD) have battled for 3D mastery for over a decade, introducing new GPUs as new versions of Microsoft's DirectX 3D API were introduced. Some of the rival products from 1999-2004 included:

DirectX 7: NVIDIA GeForce 256 and ATI Radeon 7xxx series

DirectX 8: NVIDIA GeForce 3, GeForce 4 Ti and ATI Radeon 8500/9000 (DirectX 8.1)

DirectX 9: NVIDIA GeForce FX and ATI Radeon 9700/9800

Multi-GPU Wars: NVIDIA SLI versus ATI CrossFire

In 2004, NVIDIA brought back the SLI acronym for a new method of dual-GPU 3D rendering it called Scalable Link Interface. In 2005, ATI introduced its own dual-GPU 3D rendering technology, called CrossFire. Initially, SLI provided a cleaner connection (see photo) than CrossFire's bulky external cables, although ATI now uses its own type of bridge board in more recent CrossFire and all CrossFireX implementations. NVIDIA SLI now supports up to three GPUs, while ATI's newest Eyefinity surround technology supports up to six displays.

Image Source: asisupport.com

Display Wars

From 1981 until the early 2000s, the preferred display technology for desktops was the cathode-ray tube (CRT), which ranged in size from 12 inches to 21 inches of diagonal measure.

The first monochrome liquid crystal displays (LCDs) for computers were built into mid-1980s laptop computers such as the HP-110 shown below. The first laptops with color LCDs were introduced in 1989 but ran at low-resolution CGA (640x200) or double-scan CGA (640x400); the first color LCD screens with VGA resolution appeared in 1992.

Image Source: Hewlett-Packard

Early color LCDs used slow passive-matrix panels with washed-out colors, but by the mid-1990s, active-matrix displays (which use a transistor for each pixel) provided improved color and faster response. Starting in the early 2000s, LCDs began to show up on desktops, and by 2003, LCDs outsold CRTs. A KDS RAD-5 similar to the one shown below was the author's first desktop LCD panel and is still going strong after nine years.

Image Source: ii.alatest.com

Starting in about 2004, widescreen LCDs in various sizes began to push aside 4:3 aspect ratio models. Most recent 19-inch or larger displays support 720p HDTV, and most recent 22-inch or larger displays support full 1080p HDTV. The current size champion is the 30-inch LCD, with resolutions up to 2560x1600, such as this HP ZR30w (chosen for the 2010 Dream Machine).

'Greening' LCD Panels

The latest trend in LCD panel design is the use of LEDs for backlighting, which provides lower power usage and better black levels while eliminating the need for mercury in the panel's construction (mercury, a hazardous element, is a component of the fluorescent tubes and CFLs used in older backlights).

3D Viewing

If you want to see the 3D action on your display in true 3D, you need special glasses and a monitor or HDTV capable of a 120Hz refresh rate. NVIDIA has you covered right now with its GeForce 3D Vision kit, and its 3D Vision Surround now supports 3D viewing on three displays, as tested recently by our own Alan Fackler.

AMD supports 3D graphics, too, but you'll need to acquire 3D glasses from third-party manufacturers such as Sapphire. With more data going to the display(s) than ever before, the venerable VGA port isn't up to the challenge.

Display Interface Wars - DVI, HDMI, and DisplayPort

LCDs, unlike CRTs, use digital signaling, so when an LCD is plugged into a VGA (analog) port, an extra conversion step is required. The Digital Visual Interface (DVI) standard, developed in 1999, supports either pure digital connections (DVI-D) or connections to both digital and analog displays (DVI-I, which can feed an analog monitor through an adapter). DVI-I ports are found on most graphics cards and some systems with integrated video.

Image Source: author's hardware collection

Many recent graphics cards and systems with integrated video now also include an HDMI port. HDMI carries HD-quality video and audio signals, enabling direct connection to HDTVs and modern home theater systems.

Image Source: author's hardware collection

DisplayPort and ATI Eyefinity

Unlike DVI and HDMI, the DisplayPort 1.2 standard, developed by the VESA trade group in 2009, enables daisy-chaining and supports up to six displays from a single output. The ASUS Matrix HD5850 card shown includes an HDMI port (top), a DisplayPort connector (middle), and a DVI-I port (bottom).

Image Source: Maximum PC

DisplayPort can use powered adapters to drive displays with DVI or VGA ports, it can group multiple displays into a single logical unit, and it can daisy-chain displays. AMD takes advantage of these features in its Eyefinity surround gaming technology. Eyefinity is supported by AMD's Radeon HD 5400-series (and higher) GPUs and can display a single game across six independent monitors.
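
As a rough sketch of the "single logical unit" idea, here's how a hypothetical 3x2 wall of 1920x1080 monitors would appear to a game, in Python (the layout, numbering, and names are our assumptions, not AMD's API):

```python
# A display group presents a grid of monitors to the OS and the game
# as one large logical surface. Hypothetical 3x2 wall of 1080p panels:
COLS, ROWS = 3, 2
MON_W, MON_H = 1920, 1080  # per-monitor resolution

logical_w, logical_h = COLS * MON_W, ROWS * MON_H  # 5760 x 2160
print(f"logical surface: {logical_w}x{logical_h} "
      f"({logical_w * logical_h:,} pixels)")

def monitor_for(x, y):
    # Map a pixel on the logical surface back to its physical monitor,
    # numbering monitors 0..5 left-to-right, top-to-bottom.
    return (y // MON_H) * COLS + (x // MON_W)

print(monitor_for(4000, 1500))  # -> 5 (bottom-right monitor)
```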

Touchscreens

Touchscreen technologies have been around for years in such roles as information kiosks, but they didn't become common in PCs until the development of tablet-based computers starting in the mid-2000s. The arrival of Windows 7, with its support for multitouch (see the Microsoft Surface sample below), is enabling touchscreen portables to break out of traditional medical and business niches into mainstream use.

Image Source: Microsoft

On the Horizon

Since the first IBM PC and its monochrome green phosphor display rolled out of the IBM Boca Raton facility in 1981, display technologies for PCs have gone through almost unimaginable changes. Who would have believed then that computer displays could be used as completely satisfactory replacements for TVs? That 3D viewing would be possible in the living room? That computers could be used to replace darkrooms and slide projectors for photo processing and viewing? The advances in PC display technology have made all of these uses common.

What's next on the display horizon? Put on your wizard's cap, fire up your crystal ball, click Comment, and tell us what you see in the future of PC display technology. And, while you're at it, share your favorite stories of the best (and worst) of PC displays and graphics cards past.
