Decade in Review: the 25 Most Important Tech Moments of 2000 - 2010

Amber Bouman

A lot can happen in ten years.

When we sat down to try to list the 25 most important tech events of the decade, we began by listing, well, events. And it’s true that certain key events shaped the decade in tech. But it’s a moving target; there are also movements and trends that change how we view and use technology over the years.

So instead of trying to create a list of discrete events, I’ve mixed them all up. Some relate to specific companies, some point to general trends and a few… well, a few are just odd quirks of our own.

The truth is that, while we have these ordered based on our own notions of relative importance (and a trend you might notice), these are all seminal moments, each in their own way. What might be the most important to us might be the least for you.

So contemplate our list, and then let us know yours.

Also, in case you missed it, please make sure you see our other year-end stories, including 13 Biggest Tech Blunders of the Decade.

Let’s get into it.

25. We are All Authors

Whether it’s Facebook, blogs, online photo sharing, Wikis or any of a host of related activities, the ubiquity of broadband connections throughout the world has enabled a vast array of people – talented, untalented and in-between – to express themselves. There’s always someone in the world who knows more than you or I about a specific topic, and being able to tap into that vast knowledge base enriches all of us.

Also, everyone can create digital media. We all have cameras, camcorders and digital audio recorders, even if it’s just our cell phones. User-created content is everywhere, and it’s not just text. Trying to figure out what content is good? Now that’s a different story.

24. Curated versus open content

So we’re all creating content. That’s great, but who decides what content is great and what’s not? Or, on a more sinister note, who decides what’s appropriate and what isn’t?

When we entered the 21st century ten years ago, there was a great hope among the Technorati that the Internet would become the great, unfettered open world of information, with the best and most useful info bubbling up to the top largely on its own merits. Know-it-all editors would be banished forever.

Uh-huh. Sure.

Some of that has happened, to be sure. But whenever there’s a lack of control, someone wants to step in and impose it. Whether it’s Apple deciding what’s appropriate for the App Store, the Great Firewall of China, the DMCA or a host of seemingly unrelated organizations, commercial or public, control is being imposed. And freedoms are being restricted.

As in the physical world, though, for every action there’s an opposite reaction, as we’ve seen with Wikileaks.

23. The rise of social networking

You can find me on Twitter as loydcase. Or on Facebook. Or on Quora or on… well, you get the picture. A vast array of people are now connected across a variety of social media platforms. At the same time, social media capabilities are steadily being integrated into more traditional applications, whether it’s gaming, photography or even our work lives.

Social media enables us to connect or re-connect with long-lost friends, or stay in touch with people we’ll never, ever meet in person. It also enables us to throw out questions we have to a wider, often more knowledgeable audience, or help out by answering those questions. And remember the problem of trying to figure out what user-generated content is actually good? Social media is actually pretty effective at helping the good stuff bubble up to the top.

Love it or hate it, social media will only become more pervasive as time goes on. We’ll all have to learn to either manage our privacy well, or become comfortable with every nuance of our lives being very public.

22. Everything is a Game

Checked into Foursquare today? Lined up another Xbox achievement? Played your Panera slot machine card?

The tenets of gaming are beginning to penetrate everyday life. We’re not entirely convinced that these are all good, however. After all, do we all want to live our lives in a world that’s a collection of giant Skinner boxes? Probably not.

On the other hand, we might see games that have the power to change the world, as World Without Oil tries to do, or see more game concepts get folded into training and education.

This much is clear, however: Play and play techniques that are built into the heart of modern board and electronic games will attract and shape a wide range of users. How that capability is implemented determines whether gaming will, in the long term, be a force for positive change or not.

21. DIY Culture

The broad reach of the Internet, the nature of Internet communities and the ease of access to a wide range of digital tools mean we’re seeing more people roll their own products. Even the open source movement has its roots in DIY culture, with groups of developers creating their own OS distributions. Gaming has seen a vast array of indie developers, and even complex activities like launching a camera into space have become DIY projects.

DIY has become a huge phenomenon, spawning creations like the indie gaming hit Minecraft or the steampunk video series Riese: Kingdom Falling. So whether you simply build your own PC or create something more complex, you’re part of a broader movement.

20. The rise of broadband

Underlying the preceding trends is the increasing ubiquity of broadband. A decade ago, cable modems were relatively new, and rates ran around $90 a month for a 2 megabit connection. Now the same fee will get you an unlimited 40 megabits down and 8 megabits up.

Broadband isn’t everywhere, and there are still a lot of people who don’t have it. But even people who may not have a fat pipe running into their homes may use it at school, the local coffee shop or on the train.

It’s broadband that enables us to share photos and video. It’s broadband that enables multiplayer gaming. In the next 10 years, broadband will supplant cable as the way we get TV, movies, sports, and more.

19. Wireless

I’m talking about wireless in the general sense, as in not needing wires. Whether it’s Wi-Fi, mobile broadband, Bluetooth or the emerging 60GHz technologies, wireless is already shaping up to be to the next decade what wide broadband availability was to the previous one. Coupled with smartphones and mobile PC technologies, wireless will enable us to be more connected than ever – for good or ill.

18. Media Electronica

Slowly, but surely, all media is becoming digital. The obvious forms, of course, are music and video. Movies are also increasingly becoming digital, though many are still shot on film.

Perhaps the most dramatic transformation, though, is how fast books are going digital. Google’s initiative to digitize every book ever written, along with ebook readers like Amazon’s Kindle line, is rapidly changing the way we read books, how they’re distributed, and how writers and publishers are paid. To be honest, we’re surprised at how fast this is happening.

Purely digital forms of art and media are cropping up too, ranging from electronic games to machinima. Even the creation of music is moving towards a purely digital incarnation. Query: In the coming decade, will we see an art masterpiece, created digitally and distributed digitally, that sells for millions of dollars?

17. Digital Distribution

With all media going digital, it’s natural that we’re simultaneously witnessing digital distribution become the norm. And since digital distribution is so disruptive, it’s no surprise that technology-focused companies, not media companies, have grabbed the leadership role in this evolution.

Apple’s iTunes is the big kahuna for music distribution, though Amazon is gaining traction as Zune fades into the background. Steam seems to be the leader in digital distribution of PC games, with Impulse, Direct2Drive and GamersGate nipping at Valve’s heels. Netflix began as a company renting DVDs by mail, but has become the movie and TV show streaming king. The big game console companies offer online distribution of indie and some tier-one titles. And this doesn’t even take into account the advent of streaming services such as Pandora, Last.fm, or Rdio.

Whether you read, listen, watch or play, it’s a sure bet that some part of that will be distributed digitally.

16. Green Energy

The Chevy Volt won Motor Trend’s Car of the Year Award. That, in itself, says much about how green power has captured the imagination of businesses and individuals.

While most power generation is still derived from fossil fuels, we’re starting to see greater emphasis on alternative power sources. Solar power seems to be getting most of the attention, whether it’s mega-generation facilities built in the Mojave desert or increasing numbers of consumers installing panels on their homes. Even so, other interesting alternatives are seeing the light of day, including tidal power generation and more wind farms.

Cars like the Prius have paved the way for automobiles with a stronger focus on plug-in capability, including the Volt, the Nissan Leaf and offerings from Tesla Motors. The next decade will be a seminal one for alternative power.

15. iPhone

Contrary to popular thinking, smartphones existed before Apple shipped its first iPhone in 2007. Microsoft’s Pocket PC OS, which eventually morphed into Windows Mobile, powered a number of different smartphones. Almost all were clunky and had user interface issues. RIM had been shipping BlackBerry phones for years before the iPhone, and had its own dedicated following of “Crackberry” devotees, but the BlackBerry was really focused on corporate email and scheduling, and lacked a wider audience.

The iPhone not only made smartphones sexy, but the App Store opened up a huge new business for smartphone applications. Other smartphone makers have tried to emulate that with limited success – though Android seems to be gaining significant traction.

14. The Cloud & the Return of Big Iron

Back in the 1960s and 1970s, mainframe computers were the mainstay of computing, with minicomputers providing departmental access to computing resources. As the PC moved to the forefront, the era of big iron seemed to be over. But trends tend to spiral back, rather than move in simple straight lines, and now large-scale servers are back in fashion.

Big iron is back, albeit in a different form than the mainframes of past eras. The reason is the increasing use of “The Cloud” – storage and compute resources that exist on the Internet, rather than locally on your PC. The cloud has major benefits, the biggest being easy access to data from any platform or location. Now we’re starting to see interesting experiments like OnLive, which is trying to deliver a robust gaming experience from the cloud using very limited hardware on the client end.

Valid concerns exist regarding data and apps on the cloud. If Google goes down, all your Google Docs are inaccessible. But the cloud is here to stay, and will likely shape how we use computing resources in the future.

13. Android & the Rise of Google

Google was once just a search engine, generating revenue with an advertising-oriented model. As it amassed a gargantuan war chest from all those ad dollars, the company began branching out. Some of these experiments proved highly successful, like Chrome and Gmail. Others were failures, like Google Wave. Honestly, even the failures were interesting.

It’s looking like Google’s biggest success after search will be Android. Taken as a whole, this open-source mobile operating system has surpassed Apple’s iOS as the biggest smartphone OS, though the iPhone still outsells any single Android phone. Android is shaping up to be the Windows of the phone world, while iOS is, well, the Apple of the phone world.

12. PCs become commodities and the death of chipsets

We love our PCs. We build all our own desktops, and tend to mod laptops with larger amounts of storage and RAM. Even so, we know in our hearts that the PC is really just another commodity. When one of the primary selling points of many laptops is their appearance, we’ve moved beyond PCs being technological icons.

As PCs have become increasingly commoditized, only a few large companies can really stay in the PC business. At the start of the decade, we saw a half-dozen companies developing and selling core logic. Now we have Intel building chipsets for Intel platforms and AMD creating them for AMD platforms. Nvidia is out of the desktop chipset business, and its mobile chipsets are pretty much restricted to Ion. Via is only doing chipsets for its own Centaur-designed CPUs, and SiS seems to have given up, at least in North America and Europe.

As CPU manufacturing processes have shrunk, more core logic functionality is being built onto the CPU. All modern CPUs now have on-die memory controllers, and soon on-die graphics will be mainstream. Only I/O functionality, which changes rapidly, will remain. The net result: lower costs for PC components and even more commoditization.

So while PCs are still essential, they’re now just part of the larger technology ecosystem that’s part of our digital lives. It’s easy to speculate that devices like smartphones and tablets may take over that spot in the next decade.

11. Windows 7

I mention Windows 7 not so much because it’s a great desktop OS. It is. In fact, you could make a case that Windows 7 is really the OS that Vista should have been.

But Windows 7 also represents a renewed and reinvigorated Microsoft. After Vista, the company was widely perceived as a technological also-ran, a dinosaur doomed to eventually fade into irrelevance. Now, Microsoft is looked upon as an underdog. From monolith to irrelevance to underdog in 10 years is a monumental sea change, and even though companies like Google and Apple still get more attention, Microsoft appears to be embracing its underdog role.

Additionally, Windows Phone 7 looks to be a bigger success than its detractors predicted, though it’s still not in the same category as Android or iOS.

The real question is how corporate and technological leadership will shape up at Microsoft in the post-Bill Gates era. If Microsoft has more Steve Sinofskys waiting in the wings, the company will be in good shape.

10. Connected CE Devices

Most of the Maximum PC crew use Netflix Watch Instantly to stream movies and older TV shows to our HDTVs. Most of us do not use PCs for this. Some of us use game consoles, but several use Blu-ray players and TVs that are connected to the Internet with Netflix built-in.

We’re seeing more and more of these web-connected consumer electronics devices arriving on the market. You can control a Dish ViP-722k satellite DVR with an iPad via the 722k’s web connection. You can now do the same with Comcast’s newest cable boxes. An increasing number of flat panel TVs offer apps, widgets and simple web connectivity. It’s a gated experience, to be sure, with limitations. But being able to use a Blu-ray player, which only uses 18W when turned on, instead of a PC or Xbox 360, also saves power.

Of course, that opens up the possibility of really weird device connections, like IP-enabled toasters. Overall, it’s a good trend. Our only real worry is security. Already, security issues have been uncovered with some HP web-connected printers. Will my refrigerator someday become part of a botnet, without my knowledge? The crystal ball is cloudy.

9. Digital TV

If you read some commentators a few years ago, the changeover to digital television was shaping up to be a consumer electronics apocalypse. Viewers everywhere would go into fits of rage when their analog TVs stopped working.

**Yawn**

As it turned out, the DTV transition, though delayed a few months, went more smoothly than anyone anticipated. Part of the reason was the legwork that all the broadcast stations had done, making sure all their broadcasts were digitally enabled. In addition, many people subscribed to cable or satellite, which had already gone digital in their set-top boxes. The sexiness of flat panel TVs has had a big impact, too, since all flat panel HDTVs are digital in nature.

8. Consoles Take Gaming Multiplatform

At the start of the 2000s, PCs were pretty much the bleeding edge of gaming. Sure, console gaming was a big deal, but all the nifty new games that took advantage of new graphics features and pushed the state of the art when it came to raw technology ran on PCs. This culminated with the 2007 release of Crysis, a PC game that still hammers high end gaming systems.

Now, though, it’s a multiplatform world. New, innovative, and absorbing games are first targeted at one of the big three consoles. For a few years, it looked like PCs had become completely irrelevant for mainstream gaming, unless you were an MMO addict.

The pendulum has swung back a bit. For one thing, the console hardware cycle has lengthened, so we’re not seeing a new console every five years. Instead, the companies are iterating, with additions to existing platforms, like Microsoft’s recent Kinect add-on for the Xbox 360.

Game companies now maintain dedicated PC teams for large-scale titles, which work alongside console dev teams to make the PC version of a multiplatform game feel like a true PC title, and not a cheap port. Cool new indie games and digital distribution have helped, too.

We’re unlikely to see many PC-exclusive AAA titles ship, though there are exceptions like StarCraft II. For the most part, it’s now a multiplatform gaming world, but what that means is that PC gamers mostly won’t feel left out. (Though I’m still looking for that PC port of Drake’s Fortune.)

7. The eternal battle: AMD versus Intel

AMD has proven more resilient than anyone expected, although its current CPU line is no longer competitive at the high end – or really, even the midrange. Still, the company is doing some really neat stuff, like building DirectX 11 programmable graphics onto the CPU die with its Fusion effort. Buying ATI seemed like a dubious move at the time, but it’s paid off in spades, giving AMD renewed energy. Spinning off its fabs seems to have helped, too.

In the early part of the decade, Intel appeared moribund. Netburst looked like a dead end, and the dual core versions of Netburst – the Pentium D – could have doubled as miniature space heaters. The Prescott generation was supposed to fix all that – but proved to be just as hot. Netburst had hit a wall, and future projects were cancelled. Intel put all its efforts behind an architecture originally developed for mobile PCs by its Israeli design team, known simply as Core. Eventually, Intel delivered the CPU internally called Conroe. The result was the fastest turnaround in a large technology company we’ve ever seen.

The Core 2 line, and its later iterations, reshaped the desktop PC. While AMD had been pushing the IPC efficiency mantra for several years, the Core 2 line legitimized that approach. Now Intel is all about performance per watt, and subsequent generations have improved performance without pushing up power draw. This effort also allowed Intel to push down into lower power CPUs, like Atom. A company that had seemed to be slowly drifting into the backwater of tech firms that couldn’t keep up was once again at the forefront.

6. The GPU Wars: AMD (ATI) Versus Nvidia

The battle for the heart and soul of graphics processing has been a back-and-forth between AMD (formerly ATI) and Nvidia for years. On the surface, it’s always looked like an uneven fight, with Nvidia’s rapid growth giving it vastly more resources than AMD’s graphics group, including some of the leading graphics architects.

AMD had some cards of its own, including the architecture team that came with ATI’s acquisition of ArtX, which designed the GPU for the Nintendo GameCube. AMD’s graphics chips have been reflections of the company that built them – lean, spare designs that made the most of the resources available and were power efficient. AMD GPUs have even captured the raw performance crown for brief periods, as with the original Radeon 9700 and the more recent Radeon HD 5870.

On the other hand, you won’t find a more competitive corporate culture than Nvidia’s, and the company has constantly come back with interesting, if brute-force, designs, though it has only recently embraced the performance-per-watt mantra. Nvidia has held the raw performance crown more often than not, but has stumbled in other areas on occasion, as its issues with mobile GPUs in 2008 demonstrated.

It’s always been an interesting battle to watch, but as the Internet and even desktop apps become more visual, or more able to take advantage of parallel GPU compute resources, it’s also going to have a bigger impact going forward.

5. The Death of High Fidelity

Many of us have 5.1 or 7.1 channel audio systems in our TV rooms. We embrace high fidelity audio at Maximum PC, but we’re beginning to feel like this is becoming more of a rarity.

The iPod and related MP3 players have pretty much killed high fidelity. Highly compressed audio is the norm, and some studies have shown that the new generation of music listeners actually prefers the sound of compressed music to uncompressed audio streams. That should be no surprise – you like what you hear most often.

Recently, an Audio Advisor catalog, which focuses on audiophile gear, arrived in the mail. We were amazed at just how many products were really pricey DACs and tube-based iPod docks that purported to make compressed audio streams sound better. Sigh—Hi-Fi is truly dead.

We do hold out some minimal hope, mostly because of Blu-ray. We’ve recently seen Blu-ray concert discs that use uncompressed PCM audio for 7.1 channel playback. It’s hard to see this ever becoming mainstream, however.

4. The GPU Achieves Parity

The CPU was the heart of the PC. If it had been up to Intel, that would have been the case forever and ever. Thanks almost single-handedly to Nvidia’s efforts initially, with AMD stepping up in a supporting role more recently, the GPU is finally being appreciated as an equal partner inside the PC. Don’t believe me? Consider Intel’s new Sandy Bridge CPU, coming out early in 2011. Intel has been busy touting how effective its dedicated video block is for decoding and encoding video – without even touching the CPU cores. Sure, video is just a baby step, but it’s a pretty big concession from the Guys in Blue.

It helps, I suppose, that Intel and AMD are integrating GPU functionality into CPU cores. But really, it’s the applications that are proving the point. Whether it’s high-end apps like Premiere Pro CS5 or math packages like Mathematica 8, GPU computing is gaining traction.

3. Space Exploration goes Private

As pressure on the NASA budget has increased, private companies have formed to build rockets, with the eventual goal of making near-earth-orbit space travel mainstream within a few decades. While a lot of attention has been focused on Virgin Galactic, whose design efforts are headed by Burt Rutan, most of the successful efforts have involved using existing launch vehicles to deliver commercial payloads into orbit.

The next decade should see increased activity, as these companies ramp up to deliver on promises of human travel into near-earth orbit. Who knows, maybe for the price of a typical luxury cruise, you’ll be able to pay a visit to an orbiting hotel before the next decade is out.

2. The Internet Bubble

We can’t really talk about the last decade in tech without acknowledging the Internet bubble of 2000-2001. Large numbers of startups crashed and burned as hugely overvalued companies discovered that investors really do care about making profits. We’ll have a moment of silence for companies like Webvan and Pets.com.

Was it a case of too much, too soon? If you look at some of the companies that sputtered out during the Internet bubble, you’ll find similar companies doing business in similar spaces today. What you won’t find is ridiculously inflated valuations for companies without business models geared towards making money.

Except for Twitter, of course.

1. The Fight for the Internet’s Soul

We’ll close with this last one, which should give us all food for thought. As we write this, Apple has just announced it’s pulling the Wikileaks app from the iTunes App Store. Amazon’s hosting arm showed Wikileaks the door. And you can’t send money to Wikileaks via Bank of America or PayPal.

The Wikileaks saga is just one highly visible element in an ongoing battle for the heart and soul of the Internet. The Internet started out as a government-funded research project geared towards helping researchers more effectively communicate.

Since the advent of the Web – initially driven by Tim Berners-Lee, himself a researcher – the Internet has mushroomed into a gigantic hodgepodge of information wells and connected communities. It’s also become fertile ground for corporations, new and old, looking at the Internet as a vehicle for making money. We have no problems with making money on the web. But we do have problems when making money takes precedence over the free flow of information that was the hallmark of the early web.

This will only get worse as more governments get more involved. Given the huge array of different governments, conflicting laws and wide range of world views, there’s a real fear that we’ll end up with a lowest-common-denominator Internet – a passive medium carrying only the most vanilla, least threatening and least offensive information flows, all metered, regulated and paid for in microtransactions for each bit of data that moves.

The real battle in that arena has only just begun. And it won’t be won by hackers launching cyberattacks on commercial and government websites and servers, that’s for sure. It will only be won if all of us who have a vested interest in unfettered information access make our voices heard by our governments and by corporate entities.

This is just the beginning. Make yourself heard.
