What? The decade’s over? Oh my. That means it must be time for some good old-fashioned retrospection. For our first look-back story, Maximum PC asked me to identify the biggest tech goofs and missteps. I decided to take the task personally. I make no apologies, and no attempt at a comprehensive survey of industry failures: this is my personal look at the events, products, and non-products that conspired to crush my eternally sunny and optimistic nature.
When you’re done contemplating these grotesques, please make sure you chime in with your take. What products, events, or developments disappointed you the most over the last decade? Hit us up in the (hopefully spam-free) comments section below.
Before Will Wright shipped Spore, he could do no wrong. Game developers and gaming geeks followed his every utterance with reverence and bated breath.
Now it’s like, “Will who?”
Okay, I’m exaggerating. A little. But there’s no denying that Spore was a massive disappointment, partly because the game that shipped was very different from the game that was initially talked up and marketed. If you want to build cuddly, ugly little creatures in an editor and send them off to kill other ugly little creatures, it’s one of the best games around.
But as a game about evolution, or as a strategy game about building a new civilization based on the principles of evolution, Spore proved to be a huge disappointment. Gameplay was often impenetrable, and, perhaps most damning, it was neither fun nor absorbing.
Will Wright has had failures in the past, though. (Hello, SimCopter?) And he tends to bounce back from them: after SimAnt, he came out with The Sims, which was a huge turnaround. So maybe the next title from Will will once again be something truly great.
I’m ever the optimist.
Ah, Google, how I once loved thee. “Don’t be evil.” Who wouldn’t like a corporation that aspired to that?
But between the human rights and privacy controversies in China, the Google Street View fiasco, and a slew of start-up acquisitions, it’s starting to feel like Google is just another company, albeit one that does cool stuff sometimes and generally executes well.
Sorry, Google. You’re just not special in the way you seemed to be in the early days. Incredible revenue growth and international reach will do that to you, I suppose.
I’ve owned three different netbooks, and I’ve returned or resold all of them. Even the ones with bigger screens and better keyboards just suck. The problem is that I expect my PCs to be at least somewhat responsive, but every netbook I’ve owned or reviewed exhibits distracting UI lag whenever anything else is going on in the background. So I’ve changed gears. These days, my go-to device for quick and casual email and web browsing is an iPad. It’s responsive and has great battery life.
It appears the public is starting to feel the same way: netbook sales have fallen off as iPad sales have climbed. It’s great that Microsoft and Google want to compete with Apple in the tablet market. If they want to succeed, I have two recommendations: responsiveness and battery life.
I have a Zune.
I like my Zune. I like the user interface of the software, and I take my Zune Flash when I work out.
Overall, though, the Zune has been less than successful in the market. I listen to a number of different podcasts, and more often than not I’m manually entering URLs into the Zune software to subscribe to one. Every podcast in the known universe, on the other hand, is on iTunes.
Microsoft also seems loath to ship multiple new models. When the Zune HD came out, most of the other models weren’t updated, even though you can still buy them. Meanwhile, Apple has shipped two generations of the iPod Nano, and the new, nearly watch-sized model is almost too cool for words.
So yeah, Zune. I liked it, but I’m also disappointed. Maybe the series will see new life as it migrates onto the Xbox 360 and Windows Phone devices. Here’s hoping.
I’ve never been much of a fan of stereoscopic 3D, since it tends to give me headaches. More recent iterations, like Nvidia’s 3D Vision products, are much better, but they require a 120Hz refresh rate, which most displays don’t support.
For me personally, stereoscopic 3D works better in a darkened movie theater. I’ve been pretty impressed at some 3D implementations in movie theaters, like Avatar and the recently released Tron Legacy. Most of the time, though, I find 3D in movies to be more of a distraction than an artistic improvement. And I can never quite forget that I’m wearing bulky glasses.
As for putting it in my living room: probably not. I may end up with a 3D-capable flat panel, because many of them are simply good flat panels. But I have no real desire to watch 3D movies in my family room, given the shallow viewing depth and wide field of view of my own setup.
Let’s turn the wayback machine back to 2003, when Nvidia launched a new GPU at CES. The FX series offered a new architecture that was more programmable than any GPU before it. The FX 5800 also demonstrated that a GPU architecture that didn’t map efficiently onto Microsoft’s DirectX API was never going to be a raging success.
We won’t get into the painful details; suffice it to say that FX series GPUs left some performance on the table when running DirectX 9 games. On top of that, Nvidia’s marketing team thought it would be cool to build a gigantic (for the day) cooler onto the card, paint it radioactive green, and make it really, really loud. The card rapidly became known as the “Dustbuster,” since it sounded like a small vacuum cleaner. Interestingly, this was by design: someone at Nvidia thought performance geeks would love a fan that audibly spun up, showing off just how cool they were for owning an FX 5800. Uh, no.
As Intel’s Core 2 CPUs became the darling of PC performance geeks everywhere, AMD was left standing on the sidelines, wondering why she’d been dumped by the ardent fans who had been extolling her virtues just a few months prior. Such is the fickleness of the everyday geek: what have you done for me lately?
AMD’s comeback was supposed to be its true quad-core Phenom. AMD would show Intel, which could only manage to glue two dual-core dies together to make a quad-core CPU. In fact, Phenom was a big disappointment. I remember running the first set of benchmarks and thinking: “This is it? It’s a good thing AMD is making good GPUs.”
Now AMD is pinning its CPU hopes on Fusion. It looks good, but I’m naturally wary, as past AMD efforts to keep up with Intel have been less than robust.
At first, the Pentium 4 seemed great. Then, as frequencies ramped up, so did heat generation. Then AMD came along with the Athlon 64 and demonstrated you could not only build a CPU that ran as fast or faster while running at lower clock frequencies, but one that could run 64-bit code as well.
Intel’s solution was supposed to be Prescott. With Prescott, Netburst would achieve its full glory and show those AMD upstarts just how good an Intel CPU could be. And Prescott turned out to be very good at one thing: generating heat. So Intel finally did some corporate soul-searching, turned the boat around, and developed Core 2. Most PC users were much happier, and Intel stockholders breathed a sigh of relief. But ouch, what a disappointment.
I bought an HD DVD player, and I’m not embarrassed to admit it. At the time, HD DVD looked like the better high-definition disc format; Blu-ray played catch-up on features for a good 12 to 18 months before achieving parity, and even then it was more expensive and more confusing (thanks to the different Blu-ray implementation versions) than HD DVD.
What HD DVD’s failures illustrate is that when it comes to consumer electronics hardware, content is king. Like Betamax before it, which also had a passionate following, HD DVD was doomed to failure as more and more studios lined up behind Blu-ray.
I knew well ahead of time that the PS3 would be a disappointment. Really, it couldn’t be anything but a disappointment. That’s because of the ludicrous pronouncements emerging from the mouths and press releases of various Sony executives and PR people. By the time the console actually shipped, if the PS3 didn’t cure cancer and solve world hunger, it would be considered a disappointment.
It says something about Sony’s design of the PS3 and Sony’s marketing efforts that the console managed to disappoint us well beyond the inevitable overhype backlash. The PS3 has been a failure, though not the complete crash-and-burn, train-wreck failure of something like Microsoft Bob. Every time Sony made a move to make the PS3 more attractive, Microsoft, of all companies, danced around them like Muhammad Ali stinging George Foreman.
Toss in Nintendo’s Wii, which came along with an innovative control scheme at an affordable price, and the PS3 steadily lost traction. The recent success of Kinect versus PlayStation Move just exacerbated that, and now the Xbox 360 is outselling the PS3 in North America almost 2:1. It’s nowhere close to the dominance of the PlayStation 2, which sold almost 150 million units.
I’m going to take a stand and say that stealing digital goods is a bad thing, and it’s good that theft of intellectual property is illegal. (I use the word “theft” specifically. Even open source software can be stolen, if the thief doesn’t adhere to the open source license.)
But it’s amazing how few implementers of DRM actually understand how people use digital media. In my ideal world, any given copy of a piece of digital media should be usable by anybody, just not by more than one person at the same time. Implementing that is difficult, but solvable on a case-by-case basis. Witness Amazon’s recent announcement that Kindle books can be loaned out for two weeks. (Yes, I know, Amazon was just following in Barnes & Noble’s footsteps.)
Witness the difference between games sold through Valve’s Steam service and Ubisoft’s draconian DRM model for its PC games. Both aim at the same goal, but I’ve had numerous occasions when a Ubisoft game just wouldn’t work because the servers were too busy, or offline, or just in a snit about something. Steam works much more smoothly.
HDMI, and the DRM associated with Blu-ray and DVD, doesn’t bother me the way it does some people. As long as I can take out my Blu-ray disc and play it in another player, or loan it to another person, I’m fine with it. Netflix’s DRM on its streaming service is a little more onerous: you can’t loan a Netflix Watch Instantly movie the way you can a DVD. But you can use your account on many, many devices, as long as you can connect to broadband.
Of course, the best DRM is no DRM, as we’ve seen with Apple’s recent removal of DRM from iTunes and Amazon’s mostly DRM-free MP3 store.
At the time BTX was announced, the aging ATX motherboard form factor looked like it needed an update. BTX came along to solve the problem of increasing heat inside PCs, and as an engineering solution to that problem, it was actually pretty good.
As it turned out, BTX was a flop, because it was trying to solve a problem Intel had inflicted on itself: Netburst CPUs needed ever-higher clock frequencies to increase performance, and the result was PCs that doubled as space heaters. (Anyone remember Intel’s pronouncements that CPUs would hit 10GHz in a few years?) BTX addressed a problem that was better solved by building more efficient CPUs.
While it’s true that the PC case industry didn’t want to absorb the added cost of building BTX designs, cooler and case designers developed clever ways to cool hot PCs, routing airflow through the case much as BTX did, only without the added cost. But it was Intel’s Core 2 architecture, which delivered much better performance at lower clock frequencies, that put the final nail in BTX’s coffin.
This last one is a big one, for me at least.
Mind you, I was one of the five or six people outside of Microsoft who actually liked Vista, and I still found it disappointing. The early bugs around PC gaming were the catalyst that fueled my disappointment, yet for some time afterward I was something of a Vista apologist. I wish I could say it was because the boys in Redmond paid me off; at least then I’d have a bigger bank account. Instead, it was the classic “there must be a pony under all this horse crap somewhere” mentality.
Windows 7 is better. Much better.
To all those whom I argued with about Vista: I apologize. You were right.
Okay, let’s hear it: What are your biggest tech disappointments of the past decade?