Back in the pre-Cambrian era of programming for the personal computer, there were only two options: assembly language and BASIC.
Assembly language was assembled into object code, the machine’s native language, but it was hard to learn and writing good code was a time-consuming process. Debugging it was even harder. BASIC was an interpreted language. It was easy to learn, but because each line of code had to be interpreted on the fly, it was slow. And it wasn’t a structured language with functions and procedures; all you had were subroutines, so you ended up with a lot of spaghetti code.
Then one day Turbo Pascal crawled up out of the primordial ocean and triggered a Cambrian explosion of software evolution.
I have a holiday tradition. Every December, I buy myself a present. That way I guarantee I will get at least one gift I actually want. (I have a closet full of sweaters and shirts that other relatives thought were “just perfect” for me. No. Just no.)
This year, I ended up standing in the aisles of Fry’s Electronics, debating with myself. I really wanted a new camera—the Sony DSC-F828 is 7 years old and I’ve been studying the specs on various high-end DSLRs. But there’s also this great new game from Disney called Epic Mickey. I don’t have a Wii, but Epic Mickey looks so good that I was tempted to buy a Wii just for Epic Mickey alone.
But … I hesitated.
One of my nephews has a Wii, and he got Epic Mickey for his birthday early in November. So I invited myself for dinner one Sunday and afterward he set up the game for me. He showed me where all the controls were and walked me through the first few moments of the game.
I’m a big Disney fan and Epic Mickey opens up with some great references to classic Disneyana: Mickey Through The Looking Glass, The Sorcerer’s Apprentice, all those Disney anthologies with the animated paint brush creating whole landscapes with a stroke, early Disneyland, Night On Bald Mountain (and/or The Blot), and a calendar that acknowledges Mickey’s early films as the pages fall off.
After a spectacular setup, the game reveals Mickey in a place called The Wasteland. We meet Oswald the Rabbit, whom Walt created before Mickey, in the laboratory of a mad scientist. This is an introductory level, a teaching level for the mechanics of the game. Your job, playing as Mickey, is to destroy one control console by spinning (shake the controller), then leap across a broken platform and destroy a second control console. You leap by holding down the A button.
Like all platform and jumping games, it requires fast reflexes.
Call me a grinch, but I’m not a big fan of Christmas.
What should be a celebration of the Prince Of Peace has turned into an International Capitalist Feeding Frenzy, with too many businesses and even whole economies depending on a single month of consumer spending to survive the rest of the year. I don’t consider that a sustainable model for a personal budget, and I doubt that it is any more sustainable when it is multiplied by hundreds of millions. It is a global potlatch that serves nothing but greed. In my not-so-humble opinion.
The last time I launched into this particular bit of curmudgeonry, That Pesky Dan Goodman asked me what I think we should do instead. Without pause, I replied, “Kindness Day.”
No, not the occasional random act of kindness just because you feel like it, but conscious deliberate kindness, methodically planned and executed, whether you feel like it or not. An act of generosity that makes a difference for someone else.
My first thought was August 17th, the most nondescript day of the year. August doesn’t have any real holidays of its own and 17 is a number totally without significance in any logic system that I know of. Then I decided that one day a year isn’t enough, so I have declared a personal Kindness Day of my own on the 17th of every month. (What you do on the 17th is your business. What I do on the 17th will be an act of deliberate unselfishness.) (If I can think of one.) (It might be a stretch.)
To kick it off, I want to do a tribute to one of the most under-appreciated men in both computing and science fiction. (I’m not sure if this will be published on the 17th; if it isn’t, I’ll have to do another act of kindness too….)
Not too long ago, I was sitting with a group of friends, schmoozing about computer games and our experiences with StarCraft II.
There’s a decision point in the single-player game where you have to choose whether to go with Tosh or abandon him and go with Nova. That’s the place where I got stuck and stopped playing. Why? Because I’ve been writing the script for the next StarCraft: Ghost Academy manga, which deals with the backstory of both these characters. (No spoilers here, but I know why they hate each other so much. Neither of them is a villain; it’s deeper than that.) But having written about their training at the Ghost Academy, I’ve fallen so in love with these characters that I cannot choose one over the other. Eventually, I will, but not without considerable regret about the path not taken.
In the meantime, I’ve been playing the custom maps on battle.net. Some are from Blizzard, some have been written by talented enthusiasts. One of my favorite 4v4 maps is a seductive little exercise called Nexus Wars. Imagine two parallel lanes accessing a base at each end. You play on a team of four, defending one of the bases. Your goal is to destroy the opposing base by sending warriors down your lane. You and a teammate play one lane; your other two teammates play the other lane—but you can all help each other, of course. This is a game where teamwork is essential.
You control an SCV and place buildings strategically to defend your base and access your lane of attack. You get income at timed intervals. The more structures you place, the more income you get. The installations automatically generate units at timed intervals: marines, roaches, zealots, marauders, stalkers, queens, thors, hydralisks, colossi—depending on the structures, whatever you can afford to build. As the various units are generated, they proceed across the lane toward the other base, attacking whatever opposing units they come in contact with. Your team wins by generating an overwhelming army of units to counter the army of units headed your way. There’s a lot of back-and-forth pushing, because every unit has a strength as well as a weakness. You need to counter ground and air units appropriately. In addition, every player has one nuke to use if the other side threatens to overrun your buildings.
Nexus Wars is fairly easy to learn. And while it looks like it’s a game of strategy, it’s actually a game of logistics. The team that better understands the strengths and weaknesses of the various units will always have the advantage. While most games are over in thirty minutes or less, if you get well-matched players who know what they’re doing, a game can go a lot longer. At the forty-minute mark, all units start doing 300% damage to prevent stalemates, but even with that deadline in play, I was once in a game that lasted longer than an hour.
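The underlying economy is simple enough to sketch in code. Below is a toy simulation of that income-and-spawn loop; every interval, cost, and damage number is a made-up placeholder (the map's real internals aren't published anywhere I know of), but it shows the logistics lesson: income compounds with every structure you place, and the 300% rule puts a hard deadline on any stalemate.

```python
# A toy model of the Nexus Wars economy loop. Every number below is a
# made-up placeholder; the real map's income rates, spawn intervals,
# and unit stats are whatever its author chose, not these.

BASE_INCOME     = 5         # minerals per tick before any structures
INCOME_INTERVAL = 10        # seconds between income ticks (hypothetical)
SPAWN_INTERVAL  = 30        # seconds between unit waves (hypothetical)
BOOST_TIME      = 40 * 60   # after forty minutes, damage jumps to 300%

# structure name -> (cost, extra income per tick, damage of spawned unit)
STRUCTURES = {
    "barracks": (100, 2, 6),
    "gateway":  (150, 3, 8),
}

def simulate(build_order, duration=45 * 60):
    """Walk the clock forward; return unspent minerals and waves sent."""
    minerals, placed, waves = 0, [], []
    plan = iter(build_order)
    pending = next(plan, None)
    for t in range(duration):
        if t % INCOME_INTERVAL == 0:
            # Income compounds: every structure adds to each tick.
            minerals += BASE_INCOME + sum(STRUCTURES[s][1] for s in placed)
        if pending and minerals >= STRUCTURES[pending][0]:
            minerals -= STRUCTURES[pending][0]
            placed.append(pending)
            pending = next(plan, None)
        if t % SPAWN_INTERVAL == 0 and placed:
            # The anti-stalemate rule: triple damage after the deadline.
            boost = 3.0 if t >= BOOST_TIME else 1.0
            waves.append([STRUCTURES[s][2] * boost for s in placed])
    return minerals, waves

minerals, waves = simulate(["barracks", "barracks", "gateway"])
print(f"unspent minerals: {minerals}, waves sent: {len(waves)}")
```

Run two build orders through a loop like this and you can see why the better logistician wins: a cheaper structure placed earlier compounds its income advantage for the entire rest of the game.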
Eventually our discussion went from strategy and tactics to specific experiences. And that’s when it got especially interesting. Nexus Wars—like all of the team-player maps—is at its most fun when the members of each team actually behave like a team. It stops being fun when one or more players start acting like bullies, whether they’re on your side or the other. All of the multi-player games on battle.net have a chat mode for players to interact, plan strategies, give advice, bemoan the occasional lag—and sometimes behave very badly.
Bullying has been in the news a lot lately—particularly cyber-bullying. In some of the most horrific situations, cyber-bullying has even pushed teens to suicide.
The first time I saw a floppy disk, the person holding it was wearing bell-bottom jeans and a tie-dyed T-shirt. (Yes, I am that old.)
The 8-inch floppy was designed by Alan Shugart and David Noble at IBM. It was introduced to the market in 1971 as part of the System/370. Prior to that, data had been stored either on tape or punch cards. (“Do not fold, spindle, or mutilate.”) The first floppy disk came to market in a black protective sleeve 8 inches across—it stored an astonishing 80 kilobytes of read-only data.
When Shugart left IBM in ‘72 and went to work at Memorex, he and his team developed the first read/write floppy. They were called ‘floppy’ because they really were. Inside that stiff protective sleeve was a very thin, floppy sheet of Mylar, covered with brown industrial rust—essentially magnetic tape that could be spun in a circle. When spun fast enough, the disk presented a somewhat rigid surface to the read/write head.
The first personal computers showed up in 1975. IMSAI and Altair marketed S-100 systems. You got them to work by toggling instructions in, one bit at a time. 8-inch disk drives still cost more than the computer. But in 1976, Wang Laboratories proposed a 5¼-inch ‘mini-floppy’ and within two years it was the standard for “microcomputers.” (That’s what we called personal computers then—microcomputers.)
Hugo Gernsback wrote a novel in 1911 called Ralph 124C41+. In it, he predicted things like night baseball and motorized roller skates and all kinds of other things that we take for granted today. He established science fiction as a literature of prediction. For the next three decades, much of science fiction was about the coming age of technology. Science fiction writers pondered death rays, nuclear war, 3D television, super-computers, and even the preposterous idea that someday men would fly to the moon. Asimov predicted robots and Clarke predicted communication satellites. Heinlein predicted cellphones and CAD. Leinster predicted the internet.
And of course there were a lot more predictions that were flat out wrong. The problem with prediction is that most of the time we’re predicting the future as a function of the past—more of the same, only bigger and faster. We forget the black swans, the law of unintended consequences, and synergy. Especially synergy.
Some predictions are obvious. Some are not.
At the beginning of last century, you could predict automobiles, but could you also predict all of the social effects they would produce? In the twenties, being able to hop in a car and drive twenty miles to a roadhouse was an escape from the watching eyes of nosy neighbors—you could drink and dance and gamble and go upstairs to a private room. In the fifties, motels provided even easier places for couples to couple. Drive-in movies were teenage passion pits. So you could argue that the automobile shifted our cultural perception of moral behavior. But could you predict it?
I was planning to wait for the Zune phone to ship, but the Samsung Vibrant was already here. It’s a state-of-the-art Android phone with a four-inch AMOLED screen so bright it’s startling. It has a five-megapixel camera and shoots excellent 720p video. The touch screen is responsive and even has haptic feedback. The Vibrant has 16GB of internal memory and it’ll take another 32GB on a microSD card. Of course, it has GPS and Bluetooth and all of the other stuff you expect in a top-of-the-line unit. And it comes pre-loaded with Avatar, just to demonstrate how effective it can be for watching videos. It’s an impressive piece of gear and it demonstrates that the smartphone has become a mature technology.
Because it’s an Android phone, there are hundreds of apps either preinstalled or ready for download—Kindle, Twitter, Facebook, Google Earth, AccuWeather, Telenav, Google Maps with traffic and satellite layers, news updates, locations of banks and ATMs and other points of interest. You can use it as an MP3 player or for watching videos. You can read books on it. You can add more apps to access movie reviews and theater locations. There are restaurant finders, calorie counters, stock managers, checkbook programs, bar code scanners for quick product information, web browsers, lost-phone locators, currency converters, tip calculators, lots of games and ringtones, and even a tricorder app that uses the device’s internal sensors for local scanning of gravity, magnetism, acoustics, and more. The Android marketplace is filled with useful apps you didn’t know you needed until you installed them.
Oh, and you can make phone calls on a smartphone too.
I was two and a half years old before I ever saw my mother without a telephone held up to her ear. The sight so terrified me that I ran and hid and refused to come out from under my bed until my dad dragged me out by my feet. This particular trauma may also have something to do with the fact that I did not start talking until I was three and a half.
Okay, I exaggerate, but only a little.
The telephone was my enemy; it was not just a ferocious rival for my mother’s attention—it was clearly the undisputed victor. My mother could happily chatter into that strange black device for hours on end. And no one could interrupt her. Even a dirty diaper had to wait before she would change it. After I became ambulatory, she assumed I needed even less attention. Communication with her was possible only through the telephone, and I didn’t have one. I couldn’t even ask permission to go outside and play while she was on the phone. She simply ignored me or waved me off, her way of letting me know that I would always be less important than the disembodied voice on the other end of the line. The Mah Jong club and the PTA girls were more important than her own experiment in genetic recombination.
I did finally get some small revenge. Years later, when I finally had a house of my own, a phone of my own, and an answering machine of my own—I never picked up when she called. I always made her leave a message. And if it was an angry one, I didn’t return the call.
Eventually, I figured out that it wasn’t my mom I was angry at; she had a heart as big as her mouth. No, it was the telephone itself I despised, and the effect it has on human relationships.
Last year, IBM announced that it had built a computer that exceeds the neural capacity of the cortex of a cat.
My first thought on hearing this news was that the world does not need a computer that is snotty, stubborn, and coughs up hairballs on the couch. (I already have a computer like that, including the hairballs—one of these days, I just gotta clean the fan.) But fortunately, that was not IBM’s goal.
That same press release went on to say that IBM eventually wants to build a computer that simulates and emulates the abilities of a human brain for sensation, perception, action, interaction and cognition.
And once they accomplish that, why stop there? If you can build a machine that matches the cortical ability of a human, why not keep going and build a machine that exceeds that by ten times, or a hundred, or as far as you can go before the limitations of the physical universe kick in?
The column before last, I wrote about vinyl records and how amazing the technology for analog sound really is—because you’re fighting the obstinacy of the physical universe throughout the whole signal path.
During the seventies and well into the eighties, I invested quite a bit of time and money in my own sound system, and I remember fondly playing with all kinds of electronic devices designed to remove clicks and pops, minimize tape hiss, expand musical peaks for more dynamic impact, and even add an extra octave of bass at the bottom. I also added an equalizer to compensate for sonic peaks and valleys in my living room.
Bob Carver’s Sonic Hologram did a kind of electronic signal cancellation, so you wouldn’t hear the left speaker at your right ear or the right speaker at your left ear. That was a pretty astonishing effect, which has since evolved into all kinds of digital ‘spatializer’ enhancements. You could also add two speakers at the back of your listening room to extract out-of-phase information from the stereo signal and give yourself a quadraphonic experience.
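Today you could rough out that trick in a few lines of DSP code. What follows is a minimal sketch of first-order crosstalk cancellation: each speaker also carries an inverted, delayed, attenuated copy of the opposite channel, timed to cancel the leakage arriving at the wrong ear. Carver did this in analog circuitry; the delay and gain values below are illustrative guesses, not his actual circuit constants.

```python
import numpy as np

def crosstalk_cancel(left, right, rate=44100, delay_us=200, gain=0.6):
    """First-order crosstalk cancellation, the idea behind Carver's
    Sonic Hologram (his was analog; this is a naive digital sketch).

    The right ear hears the left speaker slightly later and quieter
    than the left ear does. Injecting an inverted, delayed, attenuated
    copy of each channel into the opposite speaker cancels much of
    that leakage. delay_us and gain are guesses, not Carver's values.
    """
    d = max(1, int(rate * delay_us / 1_000_000))  # delay in samples
    out_l, out_r = left.copy(), right.copy()
    out_l[d:] -= gain * right[:-d]  # cancel right-speaker leak at the left ear
    out_r[d:] -= gain * left[:-d]   # cancel left-speaker leak at the right ear
    return out_l, out_r

# Quick smoke test on a second of stereo noise:
rng = np.random.default_rng(0)
out_l, out_r = crosstalk_cancel(rng.standard_normal(44100),
                                rng.standard_normal(44100))
```

A real canceller also has to handle the second-order terms (the cancellation signal itself leaks to the wrong ear) and the filtering effect of the listener's head, which is part of why pulling it off in analog hardware was such an impressive piece of engineering.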
A lot of the various equalizers and signal-processors were extraordinary devices for the time, genuinely pushing the envelope of sonic manipulation and enhancement. And remember, all of this was done in the analog domain. Occasionally, I still see some of these devices showing up as techno-props on crime-investigation TV episodes where some nerdy-genius forensic expert is magically extracting a remarkably clear audio signal from an overwhelming hash of noise. (If only….)