No one should escape the deflating experience of suddenly feeling old upon seeing something they once used exhibited in a museum. (“Hey, I used to have a rotary landline telephone just like that!”) To bring this discomfort to ever-younger folks, some enthusiasts in Silicon Valley are founding a Digital Game Museum.
Last month, I talked about the growing need for radio-frequency (RF) spectrum to support Internet services on smartphones and other mobile computing devices. Some experts say we’ll need 700–800 MHz of additional spectrum, none of which is available now.
We can’t manufacture RF spectrum. It’s a finite resource, and only some of it has the range and penetration required to blanket a region. Data compression conserves spectrum, but Shannon’s law imposes a mathematical limit beyond which data can’t be squeezed further without losing integrity, and today’s communications standards already operate close to that limit.
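As a rough illustration of that limit, here is the Shannon–Hartley form of the theorem (the symbols are the standard textbook ones, not anything from this column):

    C = B \log_2\left(1 + \frac{S}{N}\right)

where C is a channel’s maximum error-free data rate in bits per second, B is its bandwidth in hertz, and S/N is the signal-to-noise ratio. The logarithm is what hurts: doubling the data rate within the same bandwidth requires roughly squaring the signal-to-noise ratio, which is why engineers can’t simply code their way out of a spectrum shortage.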
The telecommunications industry wants to grab more spectrum from TV broadcasters, who surrendered a big chunk of the airwaves in the recent transition from analog to digital TV. The telecoms want UHF channels 40 to 51, or even 20 to 51. Some people want to end terrestrial TV broadcasting altogether, and even that would free less than half the spectrum we supposedly need.
Every new version of Windows promises faster booting, but PCs still take too long to boot. Despite faster processors, hard drives, memory, graphics, etc., we still waste a few minutes watching the machine come to life.
Indeed, many PCs seem to never stop booting. Years ago, we measured boot times by clicking a stopwatch while pressing the power button and waiting until the disk activity light stopped flickering. Nowadays, background tasks (antivirus scanners, software updaters, incremental defraggers, application preloaders, and various other daemons) awaken at startup and can stay busy for hours.
We might say the system has finished booting when the Windows desktop appears and we can launch apps and start working. But performance can be sluggish as the machine struggles to finish its startup chores.
I’ve had fun shopping for graphics cards, especially when a power user is within earshot. I’ll innocently ask the salesperson, “What’s your slowest graphics card?” The reaction is precious.
As I’ve confessed before, I’m not a gamer. Years ago I edited a videogame magazine and didn’t realize how weary I had become of games until the magazine unexpectedly folded. I stopped playing that day and haven’t resumed since. That’s why I don’t need fast graphics. Playing a YouTube clip is the most taxing graphics workload demanded of my computer.
Often, I won’t even buy a graphics card. I’ll scrounge a hand-me-down from a friend or cannibalize a junked PC. My oldest computer in regular use contains a discarded engineering sample of an Nvidia GeForce4 Ti-4200 from 2002.
Are you cringing yet? Mock me no more, power users. I’m reconsidering my wayward ways.
After decades of fitful progress, parallel processing is suddenly hot and will soon be commonplace on ordinary PCs. For applications rich in data-level parallelism, performance is improving by leaps and bounds.
Multicore CPUs from Intel and AMD are all well and good, but the game-changers are the next-gen GPUs from Nvidia and AMD/ATI. These chips are evolving from highly specialized 3D-graphics processors for games into broader computing engines for nongame software. Nvidia is leading the charge with a new GPU architecture that, for the first time, supports general-purpose computing as strongly as it supports graphics.
Nvidia’s new Fermi GPUs will support error-correction codes (ECC), one terabyte of memory, concurrent kernels, and faster double-precision floating-point math. These features are largely unnecessary for 3D graphics but vital for high-performance general-purpose computing. (In fact, ECC slows down graphics processing, which is why it can be disabled in Fermi chips sold for the consumer market.)
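To make “concurrent kernels” concrete, here is a minimal CUDA sketch; the kernel names and array sizes are hypothetical, invented for illustration. Two independent kernels are launched into separate streams, which a Fermi-class (or later) GPU may run simultaneously rather than back to back. Each thread touches one array element, exactly the kind of data-level parallelism these chips feast on.

    #include <cuda_runtime.h>

    // Two independent, trivial kernels. (Names and workloads are
    // illustrative only, not from the column or Nvidia's documentation.)
    __global__ void scaleKernel(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    __global__ void offsetKernel(float *data, float offset, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] += offset;
    }

    int main() {
        const int n = 1 << 20;                    // one million floats apiece
        float *a, *b;
        cudaMalloc(&a, n * sizeof(float));
        cudaMalloc(&b, n * sizeof(float));
        cudaMemset(a, 0, n * sizeof(float));
        cudaMemset(b, 0, n * sizeof(float));

        // Launching into different streams tells the GPU the kernels are
        // independent. Fermi and later chips may overlap their execution;
        // earlier GPUs simply run them one after the other.
        cudaStream_t s1, s2;
        cudaStreamCreate(&s1);
        cudaStreamCreate(&s2);

        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        scaleKernel<<<blocks, threads, 0, s1>>>(a, 2.0f, n);
        offsetKernel<<<blocks, threads, 0, s2>>>(b, 1.0f, n);

        cudaDeviceSynchronize();                  // wait for both streams

        cudaStreamDestroy(s1);
        cudaStreamDestroy(s2);
        cudaFree(a);
        cudaFree(b);
        return 0;
    }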
In August, Nikon introduced the world’s first digicam with a built-in video projector. The Coolpix S1000pj has a tiny projector, called a picoprojector, that can display photos and videos at 640x480-pixel resolution. In a dark room, it can project an image up to 40 inches wide onto a surface as far as six feet away.
Although picoprojector technology has been appearing in small video projectors and a few other devices, the S1000pj brings this revolutionary technology to a mainstream consumer product. Soon, “embedded” picoprojectors will be everywhere.
An embedded picoprojector is one that’s built into a device other than a stand-alone video projector. Digital cameras, video camcorders, and camera-equipped cell phones are obvious candidates. Embedded picoprojectors will probably become as common as webcams in notebook computers. Hand-held videogames, media players, portable TVs, and ebook readers are additional possibilities. Picoprojectors will be used for advertising displays, vehicle entertainment systems, heads-up control panels, and other applications that can benefit from their space-saving properties.
If there were such a thing as post-traumatic stress disorder for weary veterans of OS wars, I’d have it. Frightening flashbacks of MS-DOS vs. CP/M... Windows 3.0 vs. Apple System 6... OS/2 vs. Windows NT... Windows vs. Mac again... then Linux vs. Windows vs. Mac. And that’s not counting the smaller conflicts that engaged OS-9, CP/M-86, AmigaDOS, and others too numerous to mention.
Now Google’s Chrome OS is challenging Windows? Please.
Look, I’ve railed at Microsoft as much as anyone, sometimes in these very pages. And my other computer is an iMac. But one thing I’ve learned is that a new OS needs a strategic advantage before it can defeat a deeply entrenched OS.
It’s getting almost impossible to be a fully equipped techie. There’s always another new gadget threatening to leave you behind, even if you’ve already got a desktop PC, laptop, netbook, home WLAN, game console, e-book reader, smart phone, iPod, GPS, portable DVD, digicam, DSLR, HDTV, HD camcorder, Blu-ray, DVR, dish, and surround-sound home theater.
What’s next? Media phones.
Nope, they’re not smart phones. We’ve already got those. Media phones are next-gen landline phones tethered to broadband Internet service in a home or office. Typically, they have cordless handsets for voice calls and a fairly large (8-inch or so) touch screen. Built-in DSL or Wi-Fi provides fast, always-on Internet access, and VoIP can provide cheap long-distance calling. Like conventional phones, media phones needn’t be booted or shut down.
Many people still think of Apple as a relatively small computer company, even though it’s a large consumer-electronics company. Those folks were surprised by recent reports that Apple is hiring more chip designers. They question the wisdom of plunging deeper into the risky and costly venture of designing custom chips.
But Apple’s moves are a logical response to current events. We are witnessing a fundamental shift in computing, as important as the debut of personal computers in the 1970s.
Desktop PCs, and to a lesser extent notebook PCs, are the old wave. The new wave integrates mobile computing and communications with ubiquitous Internet access. Notebook PCs can ride this wave, but they are the largest and least mobile species of the new breed. Netbooks are better examples. Best of the new breed are the Apple iPhone, RIM BlackBerry, and Palm Pre. More are coming.
In a rare example of limb-crawling, Intel’s technical marketing manager recently made 10 predictions for the next 10 years. But he didn’t crawl very far. Most predictions were boring references to long-standing development projects at Intel and elsewhere.
“Realistic computer-generated images.” (Hey, Intel, we’ve already got that.) “New classes of portable devices with 10 times more battery life.” (Who else saw that coming?) “Personal Internet devices will be truly personal.” (Like I’ve been saying for years.) “Low-cost silicon photonics for faster, more reliable data transmission.” (Intel and many others have been working on that technology forever.)
Nevertheless, two predictions are interesting. The boldest was “Malware will become a thing of the past.” The idea is that microprocessors will incorporate security features to stop malicious software from attacking the operating system and application software. It’ll be like a Roach Motel for malware—bugs crawl in, but they won’t crawl out.