One of the problems with our accelerating technological progress is that the evolutionary path is strewn with dead formats. Remember cassettes? VHS? Betamax? Laserdiscs? I was reminded of this again when I got involved in some serious de-cluttering. I found multiple boxes of SVHS-C cassettes left over from ten and twenty years ago. Many of them hold treasured memories, so I decided to dub them to DVD with the eventual goal of importing the footage into Avid for editing.
For dubbing purposes, I picked up a Sony VRD-MC6, which Sony calls a “multi-function DVD recorder.” It’s a convenient little box for burning DVDs from various other sources. It has a small screen to show you what’s being burned to the DVD and it can write to single and double-layer discs. Perfect for my needs.
Working my way through ten years of recorded videos was both joyous and frustrating. Read on for some of the lessons I’ve learned from several decades of shooting personal videos and candid stills.
I believe America’s greatest strength has been its ability to cultivate the most profitable crop in human history—geniuses. This country is the way it is because of men and women with genuine vision and the ability to move that vision into the realm of accomplishment.
The great strength of Apple computers was always the commitment of Steve Jobs to “make it better.” Jobs’ return to Apple was the smartest move the shareholders ever made. (Apple’s darkest days occurred during the reign of whatshisname, the soda salesman. Whatever experience he had managing a company that made its profits from selling carbonated sugar water, it wasn’t the kind of visionary experience that a computer company needs.) So the loss of Steve Jobs now could be as critical a moment for the company as it was when he was forced out in 1985. A visionary company needs a visionary leader.
In fact, our current economic woes may very well be due to a failure to invest in the next generation’s crop of geniuses. We have spent too many years failing to nurture vision and innovation. Industry has made the near-fatal mistake of thinking that “make it cheaper” is an acceptable substitute for “make it better.” The evidence says that it is not.
(This was written before Steve Jobs died, and it was never intended to be disrespectful, only slyly satirical. Because of publishing schedules, it is only appearing now. I admired Jobs and I will sincerely miss his presence in the consumer electronics industry. His influence went far beyond his own company. He was a human catalyst accelerating the pace of computer evolution to warp speed.)
1984 was and still is a year forever tainted by George Orwell’s novel of the same name. Orwell, “Big Brother”, and even the year itself have become shorthand terms for totalitarianism or anything that even hints of it, whether it’s a security camera or a political philosophy you disagree with or Microsoft’s Windows validation software. “Orwellian” is a way of saying “like the Nazis, but without Godwin’s Law.”
During the 1984 Super Bowl broadcast, Apple showed one of the most memorable commercials ever filmed. If you’ve never seen it, you can probably find it on YouTube. Directed by Blade Runner’s Ridley Scott, the commercial shows a woman in a track suit running through a totalitarian environment. She dashes past all the drone-like people sitting on benches and hurls a hammer at a huge screen that represents the Big Brother of George Orwell’s novel, 1984.
I was minding my own business, happily writing a novel, not thinking beyond the needs of the story, when the following sentence suddenly occurred: “The Baby Cooper Dollar Bill, for example, was only fifty years old….”
I stared at the sentence for 15 seconds. I knew what it meant. The entire anecdote had flashed into my head simultaneous with the creation of that first ominous sentence. I typed, “The short version:” and began. 1741 words later, I had the longest paragraph I’d ever written.
And one of the most terrifying predictions I have ever written:
All the other articles list the top ten Windows Annoyances. I’m going to list the bottom ten. These are things that work, but they’re sloppy.
Maybe the programmers thought good enough was good enough. It isn’t. Maybe the programmers forgot to stress-test their work. They should have. Maybe they didn’t think about the actual work environment where their software would be running. Oops.
And perhaps, some of these behaviors are my fault—things that are particular to my machine, quirks that have developed over time as the detritus of heavy use piles up like scree at the bottom of a cliff. Whatever the case, they’re still annoying.
The problem with predicting the future is that there’s so much of it. You can predict some pieces of it because some trends are obvious, but you can’t predict how all the pieces are going to fit together, and even more difficult, you cannot predict what human beings will do with all those different pieces once they have put them together.
The smartphone is a great example. Robert A. Heinlein predicted cell phones in The Star Beast, first published in 1954. Other writers predicted tablets as well. But nobody predicted Twitter or sexting. Those were surprises.
We’re on the threshold of another leap forward in the punctuated evolution of computing technology and the first pieces are starting to appear. I think it’s inevitable that some of these pieces are going to mate, mutate, and evolve into something new.
First of all, it is pronounced noo-klee-ar. Not noo-koo-lur.
Please. If we accomplish nothing else in the next twelve hundred words, could we at least stop mispronouncing it?
Without fail, every August anniversary of the first atomic war (Hiroshima and Nagasaki), the commentariat trots out the usual Monday morning afterthoughts about the rightness or wrongness of President Truman’s 1945 decision to use nuclear weapons.
Regardless of which side of the argument you take today, we also have to consider the circumstances under which the decision was made and the thinking of the moment. With the victory in Europe secured, Americans wanted the war in the Pacific to end as well. The nation was emotionally exhausted.
The prospect of an invasion of Japan was daunting. Some military planners estimated a half million casualties or more. Soldiers who had fought their way across Europe were already being shipped to the Pacific theater. Marines who had island-hopped all the way from Guadalcanal to Iwo Jima knew how ferocious the Japanese soldiers were, and many did not believe they would survive an assault on the home islands of Japan.
From Truman’s perspective, the decision to use the bomb was dictated by circumstances.
For more about the impact of the atom bomb, and how it relates to technology, read on.
James Burke, who made the marvelous TV shows Connections and The Day The Universe Changed (worth buying or renting!), once demonstrated that one of the most important inventions in the history of information technology was the vertical array of storage shelves—the filing cabinet. Why? Because it allowed for a visual system of organization. It was the first database. It made it possible to access information a lot more quickly than spelunking through a stack of scrolls or books.
The computer, of course, makes it possible to have far more complex databases than will fit on a single wall, and provides near-instantaneous information retrieval. One of the first and most important (and possibly the most overlooked or taken for granted) uses for personal computers—after word processors and spreadsheets—was database handling.
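To make that concrete, here is a minimal sketch in Python using the standard library’s sqlite3 module; the table, labels, and contents are purely hypothetical, invented for illustration. The point is only the shape of the idea: a keyed lookup goes straight to the right record, instead of reading through everything in sequence the way you would with a shelf of scrolls.

```python
import sqlite3

# A throwaway in-memory database: the electronic descendant of the
# filing cabinet. Table and field names here are purely illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE folders (label TEXT PRIMARY KEY, contents TEXT)")
con.executemany(
    "INSERT INTO folders VALUES (?, ?)",
    [("invoices", "1984 receipts"),
     ("letters", "correspondence"),
     ("manuals", "VCR documentation")],
)

# A keyed lookup: the database jumps straight to the right "drawer"
# instead of scanning every record in order.
row = con.execute(
    "SELECT contents FROM folders WHERE label = ?", ("manuals",)
).fetchone()
print(row[0])  # -> VCR documentation
con.close()
```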
The “maximum” in MaximumPC doesn’t just mean the fastest speed or the highest ratings—it means more than best. It means pushing the envelope to be the best possible.
As geeks and nerds, we are always striving for the best possible, because we’re never satisfied with where we are or what we have. We want more. That’s everything you need to know about the forward thrust of technology—the unsatisfied human desire to have more, better, and different. In the long stumbling, bumbling, fumbling history of our weird little species, we have invented so many marvelous tools to expand the power of our muscles, but only one tool to expand the power of our brains—the computer.
As a species, for the first time in history, we have the opportunity to be more accurately informed and make wiser decisions than ever before, assuming we use our technology wisely.
Too often, we forget that the most important component in any system is the user. We forget that we are the authors of our own choices. Even worse, we forget that we actually have a choice.
If that’s true, then the Internet is a serious pummeling by an unruly mob, with an occasional mugging mixed in.
The architects of this beating are web designers. The best evidence of this can be found at Vincent Flanders’ website, http://www.webpagesthatsuck.com/. The theory behind Web Pages That Suck is that you can learn a lot about good design by looking at bad design. Flanders also has two similarly titled books on the subject, and his website and his books ought to be mandatory reading for anyone designing, building, or even maintaining a website.
Read on for my three rules for creating a web page that doesn't suck.