Supposedly, the wild popularity of smartphones, tablets, e-readers, smart TVs, and hand-held videogames has brought us the “post-PC era.” To hear some folks talk, PCs are not only in decline, but are almost as doomed as dinosaurs. For proof, they point to slipping PC sales and to troubled PC vendors like Hewlett-Packard.
Note: This column originally appeared in the February 2013 issue of the magazine.
Every girl crazy about a sharp dressed...computer chip
Long ago, all men’s suits were handmade by tailors. Then mass production made off-the-rack garments more affordable, and now only the wealthy or fastidious buy fully tailored suits. A similar trend has transformed the semiconductor industry, making custom microprocessors a luxury only for well-heeled companies.
No one escapes the deflating experience of suddenly feeling old upon seeing something they once used now exhibited in a museum. (“Hey, I used to have a rotary landline telephone just like that!”) To bring this discomfort to younger folks than ever before, some enthusiasts in Silicon Valley are founding a Digital Game Museum.
Microsoft is the latest slow-moving behemoth to realize that the gravitational center of personal computing is moving from desktops to pockets. Intel sensed the shift about four years ago and developed the lower-power Atom processor. Now, Microsoft is porting the next generation of Windows to run on low-power processors based on the ARM architecture.
Last month, I talked about the growing need for radio-frequency (RF) spectrum to support Internet services on smartphones and other mobile computing devices. Some experts say we’ll need 700–800 MHz of additional spectrum—none of which is available now.
We can’t manufacture RF spectrum. It’s a finite resource, and only some of it has the range and penetration required to blanket a region. Clever coding and compression conserve spectrum, but information theory imposes a hard ceiling (the Shannon limit) on how many bits a channel of a given bandwidth and signal-to-noise ratio can carry without errors. Today’s communications standards already approach that limit.
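To put a rough number on that ceiling: the Shannon–Hartley theorem gives the maximum error-free data rate as C = B · log2(1 + S/N). Here's a quick back-of-the-envelope sketch in Python; the 20 MHz channel and 20 dB signal-to-noise ratio below are illustrative assumptions, not figures from any particular standard.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Maximum error-free bit rate of a noisy channel
    (Shannon-Hartley theorem): C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a 20 MHz channel at 20 dB SNR tops out
# near 6.7 bits per second per hertz, or about 133 Mbit/s total.
capacity = shannon_capacity_bps(20e6, 20)
print(f"{capacity / 1e6:.0f} Mbit/s")
```

Because modern radio standards already operate within a small factor of this bound, squeezing more bits out of existing spectrum yields diminishing returns; the remaining option is more spectrum.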
The telecommunications industry wants to grab more spectrum from TV broadcasters, who surrendered a big chunk of airspace in the recent transition from analog to digital TV. The telecoms want UHF channels 40 to 51, or even 20 to 51. Some people want to end terrestrial TV broadcasting altogether—which would still free less than half the spectrum we supposedly need.
Every new version of Windows promises faster booting, but PCs still take too long to boot. Despite faster processors, hard drives, memory, graphics, etc., we still waste a few minutes watching the machine come to life.
Indeed, many PCs seem to never stop booting. Years ago, we measured boot times by clicking a stopwatch while pressing the power button and waiting until the disk activity light stopped flickering. Nowadays, background tasks (antivirus scanners, software updaters, incremental defraggers, application preloaders, and various other daemons) awaken at startup and can stay busy for hours.
We might say the system has finished booting when the Windows desktop appears and we can launch apps and start working. But performance can be sluggish as the machine struggles to finish its startup chores.
I’ve had fun shopping for graphics cards, especially when a power user is within earshot. I’ll innocently ask the salesperson, “What’s your slowest graphics card?” The reaction is precious.
As I’ve confessed before, I’m not a gamer. Years ago I edited a videogame magazine and didn’t realize how weary I had become of games until the magazine unexpectedly folded. I stopped playing that day and haven’t resumed since. That’s why I don’t need fast graphics. Playing a YouTube clip is the most taxing graphics workload demanded of my computer.
Often, I won’t even buy a graphics card. I’ll scrounge a hand-me-down from a friend or cannibalize a junked PC. My oldest computer in regular use contains a discarded engineering sample of an Nvidia GeForce4 Ti-4200 from 2002.
Are you cringing yet? Mock me no more, power users. I’m reconsidering my wayward ways.
After decades of fitful progress, parallel processing is suddenly hot and will soon be commonplace on ordinary PCs. For applications rich in data-level parallelism, performance is improving by leaps and bounds.
Multicore CPUs from Intel and AMD are all good, but the game-changers are the next-gen GPUs from Nvidia and AMD/ATI. These chips are evolving from highly specialized 3D-graphics processors for games into broader computing engines for nongame software. Nvidia is leading the charge with a new GPU architecture that, for the first time, supports general-purpose computing as strongly as it supports graphics.
Nvidia’s new Fermi GPUs will support error-correcting code (ECC) memory, up to one terabyte of addressable memory, concurrent kernels, and faster double-precision floating-point math. These features are largely unnecessary for 3D graphics but vital for high-performance general-purpose computing. (In fact, ECC slows down graphics processing, which is why it can be disabled in Fermi chips sold for the consumer market.)
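Data-level parallelism simply means applying the same operation independently to every element of a large array, which is exactly the pattern a GPU exploits with thousands of lightweight threads. Here's a rough CPU-side sketch in Python using the classic SAXPY kernel; the chunking scheme and worker count are purely illustrative, not how any GPU driver actually schedules work.

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(a, x, y):
    """The classic data-parallel kernel, a*x + y: each output
    element depends only on the matching input elements."""
    return [a * xi + yi for xi, yi in zip(x, y)]

def saxpy_parallel(a, x, y, workers=4):
    # Split the arrays into independent chunks and process them
    # concurrently -- a stand-in for the thousands of GPU threads
    # that would each handle a single element.
    size = len(x)
    step = (size + workers - 1) // workers
    chunks = [(x[i:i + step], y[i:i + step]) for i in range(0, size, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda c: saxpy(a, c[0], c[1]), chunks)
    return [v for part in parts for v in part]
```

Because no chunk depends on any other, the work scales with the number of processing elements you throw at it; that independence is what lets GPU performance soar on this class of problem.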
In August, Nikon introduced the world’s first digicam with a built-in video projector. The Coolpix S1000pj has a tiny projector—called a picoprojector—that can display photos and videos at 640×480-pixel resolution. In a dark room, projected images are visible up to six feet away, up to 40 inches wide.
Although picoprojector technology has been appearing in small video projectors and a few other devices, the S1000pj moves this revolutionary technology into a mainstream consumer product. Soon, “embedded” picoprojectors will be everywhere.
An embedded picoprojector is one that’s built into a device other than a stand-alone video projector. Digital cameras, video camcorders, and camera-equipped cell phones are obvious candidates. Embedded picoprojectors will probably become as common as webcams in notebook computers. Hand-held videogames, media players, portable TVs, and ebook readers are additional possibilities. Picoprojectors will be used for advertising displays, vehicle entertainment systems, heads-up control panels, and other applications that can benefit from their space-saving properties.
If there were such a thing as post-traumatic stress disorder for weary veterans of OS wars, I’d have it. Frightening flashbacks of MS-DOS vs. CP/M... Windows 3.0 vs. Apple System 6... OS/2 vs. Windows NT... Windows vs. Mac again... then Linux vs. Windows vs. Mac. And that’s not counting the smaller conflicts that engaged OS-9, CP/M-86, AmigaDOS, and others too numerous to mention.
Now Google’s Chrome OS is challenging Windows? Please.
Look, I’ve railed at Microsoft as much as anyone, sometimes in these very pages. And my other computer is an iMac. But one thing I’ve learned is that a new OS needs a strategic advantage before it can defeat a deeply entrenched OS.