No need to rub your eyes--it's true. The No BS Podcast has finally come back after a long winter hiatus. The first podcast episode of 2010 is also a bittersweet one, as it marks Will's last as host of the show (though he may return occasionally as a special guest). But life goes on, and we'll be bringing other familiar staff voices onto the podcast, including Nathan and Alex. In this episode, however, we discuss CES, Google butting heads with China, and our thoughts on this year's new technology trends.
And don't worry, we haven't forgotten about Gordon's annual year-end Rant-a-thon show. We'll be posting that in the coming week!
Do you have a tech question? A comment? A tale of technological triumph? Just need to get something off your chest? A secret to share? Email us at email@example.com or call our 24-hour No BS Podcast hotline at 877.404.1337 x1337--operators are standing by.
After 119 monthly issues and roughly nine and a half years (3,474 days, according to Wolfram Alpha), this is my last issue as Editor-in-Chief of Maximum PC. I’d like to pretend it’s been grueling work—from the crazy costumes to our intern-torturing escapades to the great smoke alarm incident of 2000—but, I can assure you, it’s been a blast.
Most importantly, I’ve enjoyed working for you guys over the last decade—a decade that’s been chock-full of amazing technological triumphs, as we accelerate ever faster toward the Singularity. To put that decade in perspective, here are the four achievements (presented in no particular order) that I think have made the biggest impact on the world during my time at Maximum PC.
Sure, losing weight and spending more time with my family would both be great New Year’s resolutions, but let’s face reality: I’m not going to do either of those things. Instead, I’ve made four tech resolutions—I call them “techolutions”—that I earnestly pledge to follow in 2010 (or at least until the next time I have something more fun to do).
Back Up All of My Data
Right now, with the Windows Home Server I’m rocking at home, I have a pretty reliable, idiot-proof way to back up all the PCs in my house. But these PCs don’t hold all my data. I have gigabytes of stuff stored on computers that are beyond my home server’s reach, in the cloud and on my work PC. This year, I resolve to back up everything at least once a month—this includes everything from my Outlook archive at work to the contents of my Dropbox folder. Just in case.
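A resolution like this is easier to keep if the backup is automated. As a minimal sketch (the folder names, dated layout, and `monthly_backup` helper here are all hypothetical illustrations, not how Windows Home Server or Dropbox actually work), a scheduled script could copy everything into a month-stamped folder:

```python
import shutil
from datetime import date
from pathlib import Path

def monthly_backup(sources, backup_root):
    """Copy each source directory into a dated folder under backup_root.

    Returns the path of this month's backup folder.
    """
    # One folder per backup run, named by year and month (e.g. 2010-01)
    dest = Path(backup_root) / date.today().strftime("%Y-%m")
    for src in map(Path, sources):
        # dirs_exist_ok lets a re-run in the same month refresh the copy
        shutil.copytree(src, dest / src.name, dirs_exist_ok=True)
    return dest
```

Pointed at, say, an exported Outlook archive and a Dropbox folder, and kicked off by Windows Task Scheduler (or cron) on the first of each month, something along these lines would cover the once-a-month pledge.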
Take Better Care of My Batteries
Battery maintenance should be much easier, but sadly, it isn’t. This year, I pledge to keep a partial charge on all my lithium-ion-powered devices—never overcharging, never draining completely, and always unhooking my batteries when I’m not going to use a device for a while. I resolve to do everything reasonable and within my power to extend the life span of my batteries.
Read the rest of Will's New Year's resolutions after the jump.
It all started with a phone call from my mom. While she’s not a regular Maximum PC reader, she read my Windows 7 review online, and called me because she was worried about the, umm, “colorful” comments. I told her not to sweat that feedback—that those folks are fanboys, people who suffer an excess of product-focused enthusiasm.
The conversation got me thinking, though. When I posted my positive review of Win7, I expected a strong response from the fanboy contingent. I expected people to accuse me of being a fanboy (that happened, check), and I expected my critics to attack my opinions (checkerino), expertise (Chekov), and moral turpitude (ditto).
I wasn’t surprised by the Windows XP fanboys, who let me know that their intractable world lacks a place for any new versions of Windows. Also not shocking? That the Apple fanboys are convinced that Snow Leopard is faster, better, and cheaper than Windows 7. And I would have been disappointed if the Linux fanboys didn’t tell me that I’m a dumbass for paying for an inferior, closed-source OS. What I didn’t expect? Well, what I couldn’t prepare myself for was the Windows Vista fanboy.
You may not have heard of it before, but “augmented reality” is coming, and it’s more than just cool tech—it will change the world.
Augmented reality has been a Hollywood staple for the last 30 years—although it’s more commonly associated with robots and cyborgs than people or PC enthusiasts. Put simply, it’s a technology that overlays a real-world scene with relevant contextual information, directly from a computer. In Robocop and Terminator, augmented reality was used by the movies’ eponymous characters to overlay friend-or-foe info. In Minority Report, it was used to display targeted ads, unique to each individual, as they walked through a city landscape.
I suffered a loss recently: My trusty, first-generation iPhone’s touch screen gave up the ghost. On a sunny day in early June, it shuffled off this mortal coil. And, like every other piece of technology I’ve ever owned, the touch screen stopped responding at the worst possible moment—as I was in a cab on my way to the first leg of a two-week trip.
Upon landing in Los Angeles, my first stop was an Apple store, where one of the Apple-proclaimed “geniuses” explained my options. My first choice was to get a replacement phone for a mere $200 (I hadn’t bothered to buy the extended warranty). My other option was simply to pound sand. I took my busted phone and bid the Apple store and its smug “geniuses” farewell, vowing to never buy another iPhone.
Next stop was AT&T to purchase a new, non-iPhone phone. I put my name on the we’ll-help-you-when-we’re-good-and-damn-well-ready list and started looking at phones. After an hour or so of waiting, I walked out of the building with a new BlackBerry Bold and considered my mission accomplished.
Our own Will Smith uses Twitter to announce new articles and content on Maximum PC, my wife and I use Twitter to keep track of our kids and their friends, and "Britney Spears" uses it to entertain and inform her fans. Why the quote marks? A weekend article in The New York Times reveals what CNET says "we all sort of knew already": Twitter is full of ghostwritten entries.
Some of the sports figures, celebrities, and politicians who use ghostwriters on Twitter and other Web 2.0 social network sites include Britney Spears (although her staff is now signing their own entries), 50 Cent, Candidate/President Barack Obama, Kanye West, Ron Paul, and others. However, the Times also gives credit where it's due to celebrities who write their own tweets, like Shaquille O'Neal and Lance Armstrong (who one-handed a recent tweet about breaking his collarbone).
Join us after the jump to sound off about celebrity social-network ghostwriting.
In our March 2009 issue, we dressed our illustrious Editor-in-Chief up as one of the ravenous antagonists from our Game of the Year, Valve’s Left 4 Dead.
The transformation from living human to decaying dead took almost two hours, though in the end it made for an amusing, but slightly horrifying, photo shoot. Read on to find out how we managed to turn this famed zombie slayer into one of his victims, or follow along to attempt your own zombie transformation.
Nvidia stands at a crossroads, with two closed, proprietary APIs that have mainstream potential: the general-purpose computing CUDA API, and the PhysX physics-acceleration API, which sits on top of CUDA. These are both promising technologies, but only owners of Nvidia hardware can harness their power. Meanwhile, there are two emerging open standards that mirror what Nvidia is doing with its proprietary development. One is OpenCL 1.0, and the other is DirectCompute, the general-purpose GPU computing API that Microsoft will include in DirectX 11. A relatively small number of consumer applications use CUDA, PhysX, or OpenCL right now, but the possible applications for the tech are endless—grossly simplified, these APIs let graphics chips perform CPU-like functions.
The question Nvidia needs to be asking is simple: Will developers write their general-purpose GPU computing apps using a proprietary API that works on only a subset of PCs—those stuffed with Nvidia hardware—or will they use an open API that will work on every PC on the market?
I just returned from a special theater screening of WarGames—quite possibly the only good film Hollywood has ever produced about computers, computer nerds, or hacker culture. Shockingly, the movie, which was first released in 1983, holds up quite well, despite the use of archaic hardware (acoustic couplers and vocoder boxes), a laughable sentient military supercomputer, and an occasional lapse into typical Hollywood lingo.
The abundance of 8-inch floppy disks also gave people in the theater a laugh, as did the fact that characters were practically chain-smoking throughout the entire movie. But none of the showing’s pervasive air of yestertech could take away from the fact that WarGames remains awesome.