Last week was just full of surprises. (RIP, all.) Thankfully, though, one shining, heroic force swooped in to save the world from snowballing into complete unpredictability. That final bastion of normalcy – that conqueror of chaos – was, of course, Transformers 2: Revenge of the Fallen.
The film – which starred explosions, Shia LaBeouf, and explosions (but unfortunately, not Shia LaBeouf exploding) – defiantly dodged negative reviews, negative word of mouth, and a near-negative Metacritic score to gross $112 million in its opening weekend. Yep – nothing like a vapid, needless summer blockbuster to restore your faith in the world by destroying your faith in humanity. The movie’s success, though? Not surprising in the least. It’s a loud, action-packed film with more carnage than meaningful dialogue. It’s simple, easily digested cheese. People eat that stuff up.
But then, no one expected Transformers to tug at our heartstrings and revolutionize storytelling as we know it. That’d just be silly; I mean, it’s a movie about robots fighting. Clearly, all eyes here are focused on the action – no time to roll them at the plot.
So then, how come we often expect tear-jerking, thought-provoking tales from big-budget videogames with premises nearly as dramatically inhospitable as Transformers? Why do we expect triple-A videogames – which, at this point, are quickly sneaking into movie territory in terms of development costs – to mold angry men, gunfire, and shrapnel into spellbinding tales when our prior buying tastes (see, for instance: Transformers) have shown that all we want is a loose thread to hold the action together? Especially when other story genres (you know, anything that's not action) lend themselves far better to interesting plots, untethered by the need for a five-minute shootout every six minutes?
PC gaming began on mainframes and research computers. It moved to personal computers when independent developers put their games on floppy disks, sealed them in Ziploc bags with Xeroxed art, and sold them in hobby stores. If it is going to have a future that is not yoked to console design paradigms, we are going to have to recapture those roots and start paying closer attention to the small developers who are designing with us, and not 14-year-old console gamers, as their primary market.
Imagine having your car serviced and finding 100 unexplained miles on the odometer, plus evidence that burglary tools had been stashed in the trunk. Would you be pissed? I was.
Except it was my computer, not my car, that a repair shop messed with.
We’re so focused on threats coming from the Internet that it’s easy to forget the hazards closer to home. The best antivirus software, firewalls, and spyware scanners are worthless when someone violates a trusted relationship. Maybe you can learn from my experience.
High blood pressure. Teeth marks in keyboards. Keyboard marks in monitors. Millions dead. These are only a few of the symptoms typically associated with gamer rage, but as with any potent malady, thousands of talented men and women are racing to find a cure. Recently, however, two groups picked up the pace and sprinted to the head of the frustration-fighting pack. Their names are Bethesda and Nintendo.
Both companies are currently developing games that, in a manner of speaking, play themselves. They are – to put it in cynical, crotchety, “back in my day” terms – finally handing players a Win Button. Bethesda has applied the name “SMART A.I.” to its get-out-of-frustration-free card, but it merely gives you the option of taking a breather while the A.I. controls your character’s movement toward a specific location. In other words, encounter anything with an itchy trigger finger and you’re S.O.L.
This is nowhere near as extreme as New Super Mario Bros. Wii’s approach, which will – according to Mario creator Shigeru Miyamoto – do everything for you at your behest. Free from the slippery grasp of your feeble fingers, Mario will cut a swath through Bowser and his cohorts as though possessed by the reptile-battling soul of Steve Irwin. He will have his revenge. But will you enjoy it?
There was something different about this year’s E3. OK, aside from the 30,000 or so extra people and the occasional chuckle-inducing swine flu masks strapped to the faces of germophobic show-goers. It was something subtle – invisible, even – but it happened with a great deal of frequency.
It was cheering.
Clapping, laughter, excitement. By and large, at this year’s show, people really, really liked what they saw. This should be a good thing, but in my cranky, cantankerous opinion, it’s not. Why? Because every last cheer, whistle, and imitation air horn blast sounded in raucous approval of the status quo. Another FPS. Another God of War clone. “Our game is a lot like Half-Life, but mixed with Halo,” developers would cheerily exclaim, bathing in the glow of audience members’ beaming smiles.
From me, however, E3’s flood of samey shooters and risk-free sequels elicited only one reaction: a quiet cry of “Down with the hardcore.” Allow me to explain.
As I mentioned earlier, most every big-ticket title at this year’s E3 was some sort of rehash, sequel, or clone. Here’s a quick list of particularly obvious offenders: Modern Warfare 2, BioShock 2, Left 4 Dead 2, Halo 3: ODST, Halo: Reach, Dante’s Inferno, Metal Gear Solid: Rising, Assassin’s Creed 2, Crysis 2, Mass Effect 2, Alpha Protocol, etc., etc., etc. That’s not to say that my fanboy froth isn’t overflowing for many of those games; it is. I came away from E3 jumpy (though that might’ve been the fault of LA’s less-than-friendly neighborhoods) and as excited as could be. But I’m excited for me. Right now. I’m not excited for the future of the gaming industry.
What drives a perfectly sane person to become a videogame company's public relations manager? I can't quite be sure, but I'm willing to bet that whatever it is, it isn't pretty. The mission that – again – they choose to accept seems simple enough: deliver information into the eager hands of journalists and laygamers alike, in hopes of eventually building your game's hype tower up to stratospheric levels. What's so wrong with that? Well, nothing, actually. But all it takes is one quick slip-up at the intersection of mission intention and mission execution to turn that colossal hype tower back into splinters and dust. Splinters and dust, for the uninitiated, do not typically mix well with the copious amounts of blood, sweat, and tears that go into game development.
Thus, toward the end of a game's hype cycle, we see little fiascos like the one well-respected journalist Tom Chick encountered with Sony's latest second-party effort, inFamous. Chick had received an early copy of the game for review purposes and chose to divide his criticism into two separate lists: one praising the game's pioneering efforts in the field of electrically charged superheroics, and the other (gently) reaming the game for pilfering from the plot of the Kids' WB cartoon Static Shock, among other things. No review scores were assigned to either of Chick's lists, but his somewhat brutal – though justified – honesty was enough to send the PR machine into a tizzy. As a result, Sony canceled an interview between Chick and the game's developers.
The stunning plot twist? Chick reported Sony's little gaffe, as journalists occasionally do, and readers weren't too pleased with the publisher's give-and-take-back antics. The site's comment section rang with cries of "Gerstmann-gate," a reference to the PR explosion between website GameSpot and publisher Eidos that resulted in the firing of Jeff Gerstmann, one of GameSpot's senior review staff, for assigning Eidos title Kane & Lynch: Dead Men a 6 out of 10 review score.
I’ve been playing Peggle lately, and – confession time – I love it.
Despite the attached “casual timesink” stigma, and even though the game’s central conceit is essentially as complex as watching a Slinky bounce down a staircase, I can’t get enough of it. On top of that, it serves as a perfect contrast to the other stigma-prone game I’m currently loving in that can’t-let-the-family-find-out sort of way: Mirror’s Edge. Why the wariness? Well, Mirror’s Edge was supposed to lead EA’s innovation charge, but the game’s over-reliance on frustrating trial-and-error gameplay caused it to fall slightly short of its lofty goal.
As with Peggle, though, that “controversial” gameplay conceit is my main reason for loving it so much. So, to sum up: Peggle is simple and fun, while Mirror’s Edge is brutal but still enjoyable. Playing one when I’m fed up with the other makes them perfect complements. End of story, right?
But this complementary contrast isn’t without a point. See, typically, the ridicule Peggle receives is purely in jest. The game’s casual and addictive, so – obviously – you’re putting your hardcore gamer cred on the line by playing it. “Oh, that Nathan! Giving [Big Name Game X] the cold shoulder for Peggle? What a loon!” And then hilarity ensues. Etc. But the truth is, Peggle’s a fantastic game, and most will acknowledge that.
Mirror’s Edge’s jump-die-jump-die-???-profit shtick, though? That’s the kind of thing that inspires gamepad-shaped holes in the wall and strings of cursing that’d make Q*bert blush. Lower-than-expected review scores and a general air of disappointment shortly after the game’s release reflect that. As a result, I’d wager that the type of gameplay Mirror’s Edge took so many verbal blows for is on its way out. Which is a shame, because I think it still has a place in today’s gaming climate.
When they strap me to the chair, I won’t fight it.
The man was frail and frightened. All he could do was drop to the floor and beg for a quick death from his much more physically imposing enemy. And I gladly obliged. His name, when highlighted by my cursor, was red, after all. He was one of the bad guys, right? Right?
The above scenario occurred while I was playing through Fallout 3’s Broken Steel DLC, and it would’ve been just another day in the Wasteland if not for a few key factors. First up, according to my Pip-Boy, I’m Wasteland Jesus, doer of all things selfless and just, hands sparkly clean and free of innocent blood. Second, my enemy – a scientist – wasn’t the violent type. He ran without giving me any sort of trouble, yet I gave chase. I was the schoolyard bully, and he the undeserving nerd. Sure, his red name tag told me that perforating his fancy future lab coat wouldn’t yield any karmic consequences, but I had no way of knowing whether he was actually evil. Still, I killed him, and, to be perfectly honest, I wasn’t the least bit sorry.
Really, what does such a scenario even say about the habits videogames foster in us? Sensationalists would, of course, say that this is just another example of the big, mean gaming industry’s trivialization of death, regarded by many as the de facto Serious Topic. To which I respectfully reply: You’re dumb.
If you take a few moments to sift through gaming’s ever-expanding walk of fame, you’ll quickly notice that many of our hobby’s biggest, most memorable stars and starlets are, well, dead. SPOILERS. Aeris (or Aerith, or whatever Square’s calling her these days) from Final Fantasy VII. The dog from Fable II. The baby metroid from Super Metroid. And my personal, though lesser-known, favorite: the random helicopter pilot from Resident Evil 4. Players mourned many of these characters, and even tried – for the most part, unsuccessfully – to bring them back to life. Gamers still experience death like everyone else. Game designers know that, and they use it to make their games more emotionally affecting.
So why, then, are we still capable of callously capping “enemies” that can’t or won’t fight back? My guess? It’s that darn good vs. evil meter doodad so many newfangled games present us with these days.
I normally stay out of the Linux conversations because it's like placing oneself between two packs of rabid fanboy wolves. Not that being enthusiastic about your operating system of choice is a bad thing. It's just a lot of flame for one meager columnist to handle.
That said, I couldn't help but notice a number of articles passing around the Web this week, praising Linux for pushing past the one-percent adoption rate for desktop operating systems. Huh? One percent? That's like throwing a ticker-tape parade for a one-year-old. I mean, kudos to Linux for making it this far and all, but I think that people are selectively focusing on the "concept" of the number a bit too much. Because when you dig a little bit deeper into the statistics, you'll find that Linux's big "Achievement Unlocked" isn't really that big of a deal at all.
As I’ve noted before, when you’re not playing action games, the killer GPU in your PC is basically a case heater. For the most part, it uselessly sucks power and radiates heat as you perform mundane computing tasks: web browsing, word processing, spreadsheet calculations, MP3 playback. GPUs are the most underutilized resource in PCs.
Finally, that’s changing. AMD now bundles its ATI Stream parallel-processing software in the latest ATI Catalyst graphics drivers. As users download and install these free drivers, they automatically prep their systems to run ATI Stream programs that leverage the GPU as a massively parallel processor. Before, users had to download ATI Stream separately. AMD is following Nvidia, which began bundling its CUDA parallel-processing software with display drivers in 2007.
Although ATI Stream and CUDA are for programmers, anyone can use the application software written for these platforms. When you install and run an ATI Stream or CUDA application, it automatically executes on the x86 CPU and on the GPU, which does the heavy lifting. Most people won’t notice anything different—except better performance.
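For the curious, here’s roughly what that division of labor looks like from a programmer’s chair. Below is a minimal, hypothetical CUDA sketch – not anything AMD or Nvidia actually ships, and ATI Stream uses its own tooling – showing ordinary CPU code staging the data while a tiny “kernel” fans the math out across thousands of GPU threads:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Runs on the GPU: each thread scales a single element, thousands in parallel.
__global__ void scale(const float *in, float *out, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * factor;
}

int main() {
    const int n = 1 << 20;                  // a million floats
    const size_t bytes = n * sizeof(float);

    float *h_in = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (float)i;

    // The CPU's share of the work: shuttle data into video memory...
    float *d_in, *d_out;
    cudaMalloc((void **)&d_in, bytes);
    cudaMalloc((void **)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // ...and launch the kernel. The GPU does the heavy lifting.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    scale<<<blocks, threads>>>(d_in, d_out, 2.0f, n);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[42] = %.1f\n", h_out[42]);  // prints 84.0

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}

The arithmetic itself is beside the point; the shape is what matters. A few lines of housekeeping run on the x86 CPU, and the embarrassingly parallel part gets handed wholesale to the graphics card – which is exactly why all you notice is better performance.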