Some of my favorite early Internet memories came from visiting chat rooms: I started out using Microsoft Comic Chat and graduated to AOL chat rooms early on. When the Internet was young (and there wasn’t as much to do), it was pretty easy to become entranced by the number of random topics one could instantly discuss in real time. It was all honest fun, but I won’t lie, there was definitely that underlying sense of “OMG, I can totally lie about who I am, and no one will be the wiser” -- a choice, of course, that many Internet users make to this day.
Having outgrown AOL, I moved on to vanilla IRC, where everything changed. Finally, an actual sense of community (and that desire to please the channel ops for some mod privileges). Yet somewhere along the line, ICQ and AOL Instant Messenger came along and usurped IRC by simply establishing presence: if you wanted to talk to someone, you just sent them an IM -- no more waiting around in the chat room to see if they’d pop in.
But the chat culture we once knew and loved hasn’t disappeared completely, although its shape has changed significantly. IRC is still widely used, but these days it tends to be too raw a tool for use outside the geek set, where it’s frequently employed in conference “back-channels” or listener discussions for podcasts such as This Week in Tech. IM has become a de facto mode of communication among friends and co-workers, so ubiquitous that Google has begun to merge it with email (first with Gchat, and now with Google Wave). But for that random and serendipitous sense of discovery, where can the chat-hungry turn?
You may not have heard of it before, but “augmented reality” is coming, and it’s more than just cool tech—it will change the world.
Augmented reality has been a Hollywood staple for the last 30 years -- although it’s more commonly associated with robots and cyborgs than with people or PC enthusiasts. Put simply, it’s a technology that overlays a real-world scene with relevant contextual information fed directly from a computer. In Robocop and Terminator, augmented reality was used by the movies’ eponymous characters to overlay friend-or-foe info. In Minority Report, it was used to display targeted ads, unique to each individual, as they walked through a city landscape.
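If you’re curious what that overlay boils down to in code, here’s a minimal sketch in Python using the OpenCV library. It just stamps a hard-coded friend-or-foe label onto a live webcam feed; a real AR system would first have to recognize what’s in the scene, so the label text and position here are purely illustrative assumptions.

    import cv2  # OpenCV (pip install opencv-python)

    # Toy AR overlay: draw contextual text on top of a live camera feed.
    # A real system would detect and identify objects in the scene first;
    # the label and its position below are hard-coded for illustration.
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.putText(frame, "FRIEND: John Smith", (40, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("Augmented view", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()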
Long ago, I came to the conclusion that The Sims was designed for Someone Else. I don’t know who. Hottentots, perhaps.
I played through The Sims 3 with awe, respect…and profound boredom. It’s a brilliant piece of work, and if God is kind I’ll never have to play it again this side of Purgatory.
Meanwhile, I’ve been returning to Prototype. I like Prototype. I also liked it when it was called Spider-Man 2 and Hulk: Ultimate Destruction. If a game is worth playing once, it’s worth playing two more times with different character models.
Games are all about wish-fulfillment and power fantasies. Some people are content to wield their mighty power to get three gems in a row. Others would prefer to jump 10 stories in the air and punch a helicopter out of the sky. If you have the opportunity to do the latter, I have no idea why you’d choose to do the former, but people are strange.
It’s getting almost impossible to be a fully equipped techie. There’s always another new gadget threatening to leave you behind, even if you’ve already got a desktop PC, laptop, netbook, home WLAN, game console, e-book reader, smart phone, iPod, GPS, portable DVD, digicam, DSLR, HDTV, HD camcorder, Blu-ray, DVR, dish, and surround-sound home theater.
What’s next? Media phones.
Nope, they’re not smart phones. We’ve already got those. Media phones are next-gen landline phones tethered to broadband Internet service in a home or office. Typically, they have cordless handsets for voice calls and a fairly large (8-inch or so) touch screen. Built-in DSL or Wi-Fi provides fast, always-on Internet access. VoIP can provide cheap long-distance calling. And like conventional phones, media phones needn’t be booted or shut down.
As the summer wanes, the days get shorter, and the wind starts hinting of fall, you’ll naturally ask: what’s hawt in curriculum this year? Forget sex ed and intelligent design -- the latest educational brawl is copyright!
Curriculums are being shipped to thousands of schools across America to teach our children all about intellectual property—every lesson plan authored by a lobbying group or industry association. It’s even legally required now in California’s famously overfunded schools.
I’m pretty into this copyright thing, but I still try to drop by the real world on occasion, just to see how it’s going. In real life, schools are struggling with larger classes and fewer resources. Now, instead of music or art (or my favorite elective, ninjutsu), we’re going to have our overworked teachers indoctrinating children in one side or the other of the copyfight? Great.
It’s been nearly four years since the Xbox 360 helped consoles get their graphical groove back, which – of course – kicked off the current console generation. Time flies, doesn’t it? The Xbox 360, then -- if we’re going by Tech Standard Time (TST) -- should now be on its last legs. A dinosaur on its deathbed, facing extinction by the meteoric approach of a new “next-gen” Microsoft console. But it’s not. In fact, if Microsoft and Sony have things their way, the current console generation will keep on chugging along for another five years.
Not long ago, for us PC gamers and our beefy, ever-evolving rigs, this would have been a moot point – or even a nice bit of superiority to hold over console gamers’ heads. “Our graphics are prettier than yours! Neener-neener-neener!” But times have changed. PC exclusives are few and far between, and many are only one mediocre first week of sales away from being ported to consoles (*cough*Crysis*cough*). The vast majority of games are unable to take full advantage of PC hardware, because consoles and their aging innards are holding everyone else back. Sorry state of affairs, ain’t it?
It's so hip and fresh. Open-source singlehandedly represents the latest and greatest thinking in the modern-day technological movement. Drop it into a conversation and you're suddenly talking like a futurist. Throw it into a company's strategic roadmap and suddenly we've created innovation and depth. Suggest that virus-makers are embracing open-source, and you've got the attention (and clicks) of Web geeks worldwide.
Wait a minute. Open-source viruses? How does that work?
So AMD’s ATI graphics division has got something in the works that supports up to six monitors.
If you’ve ever navigated even two displays with a mouse, you may have realized something: multiple high-resolution displays may be outstripping the mouse’s capability as a primary user-interface tool. Now toss in six 30-inch monitors – 24 whopping megapixels in all – and you’ve got a real problem. Even if you drop that to six more affordable 1920x1080 displays, that’s still over 12 megapixels you need to navigate. Just visually tracking the mouse cursor becomes problematic.
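For the curious, the back-of-the-envelope math looks like this (assuming the 30-inch panels run at their native 2560x1600, which is the figure the 24-megapixel total implies):

    # Total pixels across a six-display setup.
    # Assumes 30-inch panels at a native 2560x1600 resolution.
    mp_30_inch = 6 * 2560 * 1600 / 1e6   # ~24.6 megapixels
    mp_1080p   = 6 * 1920 * 1080 / 1e6   # ~12.4 megapixels
    print(f"Six 30-inch displays: {mp_30_inch:.1f} MP")
    print(f"Six 1080p displays:   {mp_1080p:.1f} MP")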
Still, it's a setup I’d love to have.
What’s needed to navigate these huge pixel counts is multi-touch. Windows 7 now incorporates a genuinely useful multi-touch capability, but it’s currently relegated to a few multi-touch all-in-one PCs, a handful of laptops, and the expensive (at $12,500 a pop) Microsoft Surface. Still, multi-touch isn’t perfect.
I suffered a loss recently: My trusty, first-generation iPhone’s touch screen gave up the ghost. On a sunny day in early June, it shuffled off this mortal coil. And, like every other piece of technology I’ve ever owned, the touch screen stopped responding at the worst possible moment—as I was in a cab on my way to the first leg of a two-week trip.
Upon landing in Los Angeles, my first stop was an Apple store, where one of the Apple-proclaimed “geniuses” explained my options. The first was to get a replacement phone for a mere $200 (I hadn’t bothered to buy the extended warranty). The other was simply to pound sand. I took my busted phone and bid the Apple store and its smug “geniuses” farewell, vowing never to buy another iPhone.
Next stop was AT&T to purchase a new, non-iPhone phone. I put my name on the we’ll-help-you-when-we’re-good-and-damn-well-ready list and started looking at phones. After an hour or so of waiting, I walked out of the building with a new BlackBerry Bold and considered my mission accomplished.
Videogames have taken us everywhere. Space, the Wild West, the Oregon Trail, the future, heaven, hell, purgatory (ever played Big Rigs? Yeah), World War II, the apocalypse, the post-apocalypse, and World War II again. You name it, and gamers have probably been there, done that, and gone to Hot Topic to pick up the T-shirt. So, what’s left? Where are we to boldly go without even a walkthrough to guide us? Well, if you’re asking me, I’d say we should forget the rest of our well-trod universe and try picking our own brains. Yep, it’s time for a bit of good old-fashioned psychology.
At this point, I imagine many of you are remembering simpler times, when tales of Rorschach inkblot tests, salivating dogs, and men who loved their mothers lulled you to sleep in your public educational institution of choice. And a few of you might be thinking of Psychonauts – to which I say “good!” We’ll get to that in a little while.
Anyway, games obviously aren’t the domain of stuffy old guys with fancy degrees and fancier couches. However, that doesn’t mean some of the more universal psychological themes can’t find their way into videogames. Case in point: Batman: Arkham Asylum.
While Arkham may be known foremost as the only Gotham lockup less effective than a wet paper bag, it is – in actuality – more of a psychiatric institution than a prison. The game, then, portrays Arkham’s staff members as hard-working ladies and gents who are trying their darndest to crack classic nutcases like the Joker, the Riddler, Scarecrow, and Killer Croc. The player, as Batman, stumbles upon evidence of these therapy sessions in the form of taped interviews focusing on different villains.