We love the excitement of being on the cutting edge, but we also have to acknowledge the risks of being early hardware adopters. There have been numerous occasions where tech enthusiasts put their faith in the seemingly fastest or most innovative piece of technology, only to be burned months or years later when that tech was revealed to have a serious design flaw or fell victim to sudden obsolescence. In this roundup, we spotlight some of the most memorable PC parts and computing gadgets that showed huge promise but just didn't deliver in the end. Whether it was high defect rates, underperformance, or bad launch timing, these products were poised to be market leaders if not for their spectacular failures.
Back in 1999, a company called Zen Research collaborated with Kenwood to develop a revolutionary optical drive technology called TrueX. Instead of using a single laser to illuminate and read digital data off a CD, TrueX took a multi-beam approach, illuminating and detecting multiple tracks of data on a CD at the same time using a diffracted laser beam. A normal red laser diode is sent through a diffraction grating that splits the beam into seven parts, each theoretically illuminating a separate data track. The trick that allowed TrueX drives to actually process these parallel beams was a multi-beam detector array that picked up all seven at once. In practice, this allowed Kenwood's TrueX series optical drives to achieve launch speeds of 40X (40 times the one-speed rate of 150KB/s) and even reach 72X while only spinning at the rate of a 10X drive. Reads of 6-8MB/s were revolutionary at a time when DVD technology was still in its infancy, and the Kenwood drives were must-have components for enthusiast builders.
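The speed claims above are easy to sanity-check with the standard CD-ROM "X" rating math. A quick sketch using only the figures from the text (1X = 150KB/s, seven beams, roughly 10X spindle speed):

```python
# Back-of-the-envelope math for TrueX's speed claims, using the figures
# in the text. These are theoretical peaks, not sustained real-world rates.

BASE_KBPS = 150  # the "one-speed" CD-ROM transfer rate in KB/s

def x_rating_to_kbps(x):
    """Theoretical transfer rate in KB/s for a given X rating."""
    return x * BASE_KBPS

# Seven parallel beams reading a disc spun at roughly 10X gets you into
# 70X territory without the noise and vibration of a true 70X spindle.
beams, spindle_x = 7, 10
print(beams * spindle_x)            # 70 -- close to the claimed 72X
print(x_rating_to_kbps(40) / 1024)  # ~5.9 MB/s for the 40X launch model
```

That 40X figure lands right at the bottom of the 6-8MB/s range claimed for the line, which is part of why the real-world shortfall stung buyers so much.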
Unfortunately for Kenwood (and for many people who bought the TrueX drives), the mainstream 40X model suffered a high failure rate and failed to deliver on the hyped-up speed promises of this new technology. In fact, Kenwood eventually had to settle a class-action suit filed by disgruntled 40X CD-ROM owners who complained about lackluster speeds, poor media compatibility, and even failing hardware. A design defect was discovered in the 40X models that caused the drives to completely fail to read discs. The drives eventually faded into obscurity when other manufacturers opted not to license Zen's laser-splitting idea, even though the company had plans to bring its technology to DVD drives. Both Zen Research and Kenwood's TrueX series disappeared soon after the debacle.
Anyone building a computer at the turn of the millennium surely remembers the supreme failure of IBM's Deskstar 75GXP. At the time of its release in 2001, the 75GXP was the fastest ATA/100 hard drive on the market, boasting a density of 15.3GB per platter, a 2MB cache, and a fast 7200RPM spindle speed -- still the standard for most consumer hard drives eight years later. Its performance under optimal conditions reflected these impressive specs. The 75GXP bested its rivals from Maxtor and Western Digital in all speed tests, breaking IDE drive records. Its price also reflected its performance; depending on capacity, these drives sold for north of $200, which would easily get you a spacious 1.5TB drive today.
It's too bad, then, that adopters of this read/write marvel were caught off guard by the incredibly high failure rates of the 75GXPs. A closer examination of the hardware revealed that the drives suffered from two distinct problems. The magnetic read/write heads used in these drives were prone to failure, disrupting the way the head interacted with the disk platters. Data corruption from faulty heads led to the infamous "click of death" scenario, where users could actually hear the scratchy death rattle coming from a drive as it was about to fail. Additionally, the 75GXPs were equipped with firmware that led to the failure of the NV-RAM chip on the controller board. Worse yet, the symptoms of a failed NV-RAM chip were similar to those of the "click of death," leading many users to misdiagnose their drive failures and hindering data recovery efforts. A class action lawsuit was filed against IBM for manufacturing defects that led to drive crashes, for which IBM never accepted responsibility even after settling the case in 2005.
The turn of the century was marked by many significant technological advances that changed the way we used personal computers, not the least of which was the growing adoption of broadband connectivity in consumer households. But while broadband penetration was still in its infancy, the 56K dial-up modem was how the vast majority of users connected to the internet. In 1998, Diamond announced a technology called Shotgun that promised to double the speed of your internet connection by bonding two dial-up connections, using either two modems or one of its special SupraSonic II dual-port modem cards.
Bonding, as it turned out, was a protocol that had been around for years in ISDN lines (BONDING is actually an acronym for Bandwidth On Demand Interoperability Group) that let modems pick up extra data from a second connection as bandwidth was needed. Effectively, this meant you could get double the speed of 56K (up to 112Kbits/sec) through regular phone lines -- almost the speed of ISDN and early ADSL lines! And since this was just a software protocol, it could be applied to basically any modem. Users could theoretically bond any combination of 28-56K modems for added speed.
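The "bandwidth on demand" idea is simple enough to model in a few lines. This is a toy illustration of the concept described above, not the actual BONDING protocol: a second line only kicks in once demand exceeds what one modem can carry, and the combined speed caps out at the sum of the two lines (56 + 56 = 112Kbits/sec, per the text).

```python
# Toy model of bandwidth-on-demand bonding: extra dial-up lines are
# brought up only when demand saturates the lines already in use.
# Speeds in Kbit/s; a simplified sketch, not the real BONDING protocol.

LINE_SPEED = 56  # one 56K modem

def effective_speed(demand_kbps, max_lines=2):
    """Kbit/s actually delivered once lines are added to meet demand."""
    lines_needed = -(-demand_kbps // LINE_SPEED)  # ceiling division
    lines_used = min(max_lines, max(1, lines_needed))
    return min(demand_kbps, lines_used * LINE_SPEED)

print(effective_speed(40))   # 40  -- one line is enough
print(effective_speed(100))  # 100 -- the second line picks up the slack
print(effective_speed(150))  # 112 -- capped at 2 x 56K
```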
Shotgun modems failed for the same reason we aren't using 56K modems today: broadband killed dial-up. Diamond's Shotgun modems came out right before DSL and cable connections became a practical option due to expanded service areas and competitive pricing. Additionally, Shotgun required that users pay for two phone lines and two dial-up accounts (or a special higher-priced single account), and it only worked with a few ISPs.
Diamond actually continues to sell dial-up modems. As for bonding technology, the idea lives on in Network Load Balancing, otherwise known as dual-WAN routing. Dual-WAN routers will balance traffic from two WAN sources (like two cable lines) to give you the combined speed of two separate connections. Network redundancy is another perk of running two WAN lines, though the service is primarily used by businesses.
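The dual-WAN routing mentioned above usually distributes traffic with a simple scheduling policy. Here's a minimal round-robin sketch; the interface names are hypothetical, and a real router balances per-flow (and handles failover when a WAN drops), not just in strict rotation.

```python
import itertools

# Minimal sketch of round-robin dual-WAN load balancing, the modern
# descendant of modem bonding. Interface names ("wan0", "wan1") are
# illustrative; real routers also track link health for failover.

class DualWanBalancer:
    def __init__(self, wan_links):
        self._links = itertools.cycle(wan_links)

    def route(self, flow):
        """Assign each new outbound flow to the next WAN in rotation."""
        return (flow, next(self._links))

balancer = DualWanBalancer(["wan0", "wan1"])
for flow in ["http", "ftp", "ssh", "dns"]:
    print(balancer.route(flow))
# Flows alternate between wan0 and wan1, roughly doubling aggregate
# bandwidth across many connections -- though any single download is
# still limited to one line's speed.
```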
How many of you knew that Intel had once designed and sold a discrete graphics card? In 1998, the CPU maker surprised everyone with the announcement and release of the Real3D Starfighter, a card based on its independently developed i740 GPU. The i740 shook up the industry as one of the first graphics chips to take advantage of the AGP slot at a time when market leader 3dfx was still using the PCI bus. Some analysts predicted that within a year, Intel could claim more than 20% of the 3D accelerator market.
Despite a big marketing push, the i740 could never muster enough horsepower to compete with the best cards from Nvidia and 3dfx. The decision to use onboard memory exclusively for the frame buffer hurt performance, since the card had to use system memory to hold texture data. That meant fetching textures over the AGP bus while competing with the CPU for system memory bandwidth. However, its respectable 2D speeds made it an ideal companion card for the 3D-only Voodoo2, and many OEMs bundled the i740 with Voodoos. Intel eventually lowered its sights to the budget market, incorporating the i740's architecture into its GMA line of integrated graphics. And while the i740 remains a sore point in Intel's history, the company's upcoming Larrabee architecture -- due for release in 2010 -- will be its second attempt at competing in the discrete graphics market.
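Some rough numbers show why texturing over AGP started at a deficit. The figures below are representative assumptions for the era (AGP 2X at 66MHz with two transfers per clock over a 32-bit bus; a modest 100MHz 64-bit local SDRAM bus), not published i740 specs, and the AGP number was further shared with CPU traffic.

```python
# Rough peak-bandwidth arithmetic behind the i740's texturing bottleneck.
# Clock speeds and bus widths here are representative era figures, used
# purely for illustration.

def bus_bandwidth_mbs(clock_mhz, bus_width_bytes, transfers_per_clock=1):
    """Theoretical peak bandwidth in MB/s."""
    return clock_mhz * bus_width_bytes * transfers_per_clock

agp_2x = bus_bandwidth_mbs(66, 4, 2)   # ~528 MB/s, shared with the CPU
local_mem = bus_bandwidth_mbs(100, 8)  # 800 MB/s for 100MHz 64-bit SDRAM

print(agp_2x, local_mem)  # textures over AGP vs. a rival's local memory
```

Competitors that kept textures in dedicated local memory had both more bandwidth and no contention with the CPU, which is the gap no marketing push could close.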
Back when 3.5" floppy drives were the dominant portable media format, Iomega wowed the tech world in 1994 with its seemingly revolutionary Zip Drive. The promise of 100MB of storage in a rewritable format that was only slightly bigger than the floppy disk (which maxed out at 1.44MB) was almost irresistible, and consumers flocked to this new medium for backups. The relatively low cost of these drives and disks -- especially when compared to the radically expensive CD-R drives -- also helped it gain popularity. As prices dropped and other suppliers licensed Iomega's technology, Zip drives nearly became a ubiquitous storage format.
Unfortunately, Zip drives faced obstacles on three fronts by the end of the '90s. First, the falling price of CD burning hardware and media, which boasted more than six times the capacity of the smallest Zip disks, led consumers away from the proprietary format. The rise of the DVD didn't help, either. Second, hard drive capacity was growing at an amazing rate, giving users gigabytes of cheap storage to handle their backup needs. Finally, a small percentage of Zip drives were plagued by a hardware defect that led to the infamous "click of death" condition when the drive's read head became misaligned. The click of death didn't just disable the drives, it also ruined perfectly good Zip disks. Iomega also released a SCSI-interface Jaz drive that held 1GB of data and didn't suffer from the click of death, but it never sold as well as the original Zip.
Here's a technology that fell victim to the speed and versatility of CPUs. Back when DVD-ROM drives were first becoming popular for PC use, software playback of movie DVDs paled in comparison to what could be achieved with a discrete decoder. MPEG-2 decoder cards from Sigma Designs and Creative Labs bore the load of processing DVD video playback and did a better job of it than CPUs. These cards let you play non-interlaced full-motion video in a resizable window at high resolutions, and also took care of Dolby surround sound processing.
Eventually, faster CPUs were released that could handle running multiple tasks while decoding DVDs, and GPUs also started incorporating support for accelerated video decoding. Even MPEG-2 or MPEG-4 encoding is practical with today's multi-core CPUs.
From our GPU retrospective: As good as the Voodoo1 was at the time, 3dfx found out not everyone was willing to invest in a two-card solution for 2D and 3D graphics. To remedy the Voodoo1's shortcoming, 3dfx released the Voodoo Rush in 1997, which added a 2D chip to the original graphics board, either as an integrated chip or a daughtercard. Gamers no longer had to fiddle with daisy-chaining multiple videocards, but at the expense of performance. A kludgy solution at best, some estimates put the 3D performance hit at up to 20 percent, a direct result of sharing bandwidth between chips. Making matters worse, the Rush suffered from poor 2D performance and instability, making it one of the few forgettable cards in 3dfx's storied history.
After the Voodoo Rush, 3dfx attempted to release a similar product based on the Voodoo2 chip. Much less menacing than its name implies, the Voodoo Banshee was more about 3dfx proving to the public that it could design a single-chip videocard capable of both 2D and 3D rendering, just like the competition had been doing. With faster clock rates than the Voodoo2, the 128-bit Banshee was poised to be the fastest, most flexible videocard on the market, and that presented a problem for 3dfx, who feared the Banshee would cut into sales of the Voodoo2 released just weeks earlier. To prevent that from happening, 3dfx designed the Banshee with only one texturing unit, taking away its ability to support multitexturing.
Before Rambus RDRAM ever appeared in PCs, it was a component of the Nintendo 64, which used 4MB of RDRAM running at 500MHz. Two years later, PC builders got a taste of this proprietary RAM when Rambus and Intel entered into a license contract to use RDRAM exclusively with Intel CPU chipsets. That meant that if you wanted to build a Pentium 4 system with an Intel motherboard, you had to use RDRAM. Intel even planned a half-billion-dollar investment in Micron fabs to boost the adoption of RDRAM. Intel figured that the success of RDRAM could generate a lot of licensing revenue from Rambus, and it was confident in the superiority of RDRAM over the alternatives. At the time, neither PC-133 nor even DDR SDRAM was nearly as fast as the 400MHz (PC800) Rambus offering, and it wasn't until dual-channel DDR 400 arrived that RDRAM was dethroned.
Even with its speed advantage, memory vendors strongly resisted the adoption of RDRAM due to high manufacturing costs and expensive licensing fees. And as SDRAM prices continued to fall, motherboard manufacturers opted for SDRAM-supporting chipsets from makers like VIA rather than Intel. Even Intel's subsidizing of RDRAM by bundling RIMMs with Pentium 4 CPUs didn't help the standard. Today, Rambus has ceased development of RDRAM and is pursuing lawsuits against other RAM makers that it alleges conspired to depress the price of DDR memory and kill off RDRAM -- successfully, as it turned out.
The first "internet appliance," the 3Com Audrey, was poised to revolutionize the way we interacted with computers in the household. Released in 2000 for a modest $500, this touchscreen PC was marketed as a way for families to take the PC out of the office and into the family living space. It came bundled with a dial-up modem, two USB ports, and a CompactFlash slot, meaning you could email or surf the internet for dinner recipes without leaving the kitchen. It even synced up to Palm handhelds (though those didn't fare so well in the long run, either). The Audrey's creators envisioned the device as the hub of the connected household.
Even though the Audrey attracted lots of media buzz and a devoted following, not enough consumers were convinced that they needed a simplified computing device for their living rooms or kitchens. The dot-com crash didn't help matters, and 3Com pulled the plug on the Audrey just a few months after its launch. The open-source community quickly adopted the device to hack its operating system, and has since developed custom internet browsers, MP3 players, and photo viewing software for the Audrey.
Despite the Audrey's failure, the concept was sound, and it paved the way for current all-in-one PCs like HP's TouchSmart line and Dell's Studio One computers. In fact, one of our current favorite gadgets, the Chumby, bears a striking resemblance to the Audrey.
Intel's NetBurst architecture, used in the Pentium 4, can be spun two ways: as a success that sold tens of millions, if not hundreds of millions, of parts over the years, or, from the enthusiast point of view, as a failure. During its life, it lost various battles to the Pentium III it was supposed to replace, AMD's Athlon and Athlon XP, as well as the Athlon 64.
As we’ve said before, the Pentium 4 could never close the deal and coffee, as we know, is for closers. That’s not to say there weren’t successes. The Extreme Edition variants were indeed fast as were the initial “Northwood” derivatives. Near the end of its life, the Cedar Mill versions also had good reps as overclocking parts. Still, for a company that had a multi-decade track record of smoking the competition, the Pentium 4 / NetBurst era is nothing to call a success. It didn’t help that the first Pentium 4s were tied to the uber-expensive Direct RDRAM memory that ignited a rebellion among RAM vendors against Intel (see above).
The Pentium 4 even faced an internal rebellion. With the realization that the hotter-than-hell design would never make a true mobile CPU, a splinter development effort started working on a Pentium III derivative codenamed Banias. Banias eventually turned into the Pentium M, which morphed into the Core Duo and finally Core 2. More than AMD, it was Core 2 that put the final bullets into the Pentium 4 and sent it to languish in the stomach of the Sarlacc for a thousand years.
Did you buy into these or any other technologies that just didn't deliver? Share your nostalgia in the comments below!