For more than a year, Apple’s iPhone has garnered the lion’s share of press and remained a must-have device for gadget junkies. In an industry in which $300 products quickly become free incentives for signing a contract, the iPhone has managed to remain relevant. This is due in part to Apple’s marketing savvy, which made many people—consumers and journalists alike—look past the device’s shortcomings, but also to the iPhone’s innovative interface and full web browser, which gave consumers something truly new.
Now, handset-maker HTC, T-Mobile, and Google hope to get some of the attention the iPhone has received by releasing the G1, the first mobile phone to use Google’s mobile OS, Android.
Vic McGuire found a diamond in the rough when he set out to build his latest mod. While browsing through a computer store, he found a custom case with chrome-plated front air grills in the junk pile and an idea came to mind. After arduously sanding the rust off the grills, Vic had the basis for the HawgWild U.S.A.
In one second, the nuclear fusion process taking place inside the sun produces enough energy to satisfy the needs of the earth’s population for nearly 500,000 years. Photovoltaic cells are capable of capturing some of that energy and converting it into usable electricity; unfortunately, today’s technology can’t do this very efficiently.
French physicist Edmond Becquerel first described the photovoltaic effect in 1839. He discovered that some materials were capable of producing small amounts of electricity when exposed to sunlight. The first photovoltaic cell, however, wasn’t created until 1883, and more than 70 years passed before the next major scientific advance took place, when researchers at Bell Labs developed the first crystalline silicon photovoltaic cell in 1954.
It’s the final hour—the last stretch in your race to freedom. Paper footballs litter your desk and paper basketballs surround the trash can. Yet even after these sporting events have ended, the little hand continues to hold a grudge against the 5. It’s high time you find a more efficient—and less obvious—way to pass the time.
Consider this your go-to guide against workplace stagnation. We’ve spent dozens of hours scouring the Internet in search of the most enjoyable and alt-tabbable browser-based games. They require no installation and, best of all, are 100 percent free. When the boss man walks by, you can easily switch to that budget report for accounts payable—he can’t fault you for grinning like a fool at a spreadsheet!
Grin like a fool at totally work-related stuff after the jump.
Take note, Rainier Wolfcastle, because these goggles may actually do something. Nvidia’s latest visual computing venture is a serious foray into stereoscopic 3D, a technology that has not found success among mainstream consumers (or even enthusiasts) in recent history. 3D movies and gaming at home have always been seen as gimmicky, a perception that can largely be attributed to the goofy glasses you have to wear to experience the effect. Past iterations of stereoscopic 3D technology (including efforts by the now-defunct ELSA) were especially troublesome because they required bulky headgear that had to be tethered to your PC and tended to give gamers headaches after just a few minutes of use. Nvidia wants to reinvigorate the stereoscopic 3D market by developing its own glasses hardware and driver software, which it hopes will avoid the pitfalls of previous efforts.
Do we have the technology to make stereoscopic 3D practical? And more importantly, is this something that, as a gamer, you’d be willing to embrace?
We invariably refer to the video memory in modern videocards as GDDR, differentiating it only by version (GDDR2, GDDR3, GDDR4, and now GDDR5), but the technology’s full acronym is actually GDDR SDRAM, which stands for Graphics Double Data Rate Synchronous Dynamic Random Access Memory.
“Double data rate” describes the memory’s capacity for double-pumping data: Transfers occur on both the rising and falling edges of the clock signal. This endows memory clocked at 800MHz with an effective data-transfer rate of 1.6GHz. “Synchronous” refers to the memory’s ability to operate in time with the computer’s system bus. This allows the memory to accept a new instruction without having to wait for a previous instruction to be processed, a practice known as instruction pipelining.
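To put numbers on that, here’s a quick back-of-the-envelope sketch in Python. The 256-bit bus width is our own hypothetical example, not a figure from any particular card; the double-pumping math works the same regardless.

# Back-of-the-envelope GDDR bandwidth math (illustrative only).
base_clock_mhz = 800           # memory clock from the example above
bus_width_bits = 256           # assumed bus width, purely for illustration

# Double data rate: one transfer on the rising edge of the clock, one on the falling edge.
effective_rate_mts = base_clock_mhz * 2    # 1,600 MT/s, the "1.6GHz effective" figure

# Peak bandwidth = transfers per second x bytes moved per transfer.
peak_bandwidth_gbs = effective_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Effective data rate: {effective_rate_mts} MT/s")
print(f"Peak bandwidth: {peak_bandwidth_gbs:.1f} GB/s")   # 51.2 GB/s with these assumptions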
The ubiquitous Zip compression format has been a staple for PC users since it debuted as PKZIP in 1989. Back then, the size limitations of floppy disks and painfully slow dial-up connections made file compression a necessity. The Zip format today, while still popular, has largely been eclipsed by RAR compression, which offers slightly better compression at the cost of archiving speed. That’s why we were so surprised to hear that WinZip 12, which launched yesterday, boasts an unbelievable 25 percent compression ratio for JPEG images without sacrificing image quality. Ever the skeptics, we put the new software to the test and grilled WinZip’s VP of Development about how this new algorithm works.
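If you want to see why that figure raised our eyebrows, here’s a minimal sketch in Python using the standard zipfile module. The filename is a placeholder, and this obviously doesn’t reproduce WinZip’s proprietary JPEG codec; it just shows the baseline that conventional Zip (deflate) manages on an already-compressed JPEG.

import os
import zipfile

src = "photo.jpg"   # placeholder: any JPEG you have lying around
with zipfile.ZipFile("test.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write(src)

original = os.path.getsize(src)
zipped = os.path.getsize("test.zip")
print(f"Deflate saved {100 * (1 - zipped / original):.1f}%")
# Expect a number near zero: JPEG data already looks random to deflate,
# which is what makes a lossless 25 percent reduction such a bold claim.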
Any bets as to whether WinZip's claims are justified?
Face it, activation is a failure. For power users who frequently upgrade their PCs, dialing in to reactivate the OS is beyond irritating. Instead, Microsoft must come up with a novel way to punish pirates without annoying its paying customers. (May we suggest displaying massive popup ads in pirate copies of Windows?) For legitimate customers, a realistic home-licensing program—buy one copy at full price, get four more upgrades for $50 to $100 each—would go a long way toward creating goodwill.
We sat down with Microsoft to hear the company’s side of the Vista story. What lessons have been learned following the worst Windows launch in the company’s history? Is Microsoft doing enough to regain PC users’ faith?
Way back in January 2007, after years of hype and anticipation, Microsoft unveiled Windows Vista to a decidedly lukewarm reception by the PC community, IT pros, and tech journalists alike. Instead of a revolutionary next-generation OS that was chock-full of new features, the Windows community got an underwhelming rehash with very little going for it. Oh, and Vista was plagued with performance and incompatibility problems to boot.
Since then, the PC community has taken the idea that Vista is underwhelming and turned it into a mantra. We’ve all heard about Vista’s poor network transfer speeds, low frame rates in games, and driver issues—shoot, we’ve experienced the problems ourselves. But over the last 18 months, Vista has undergone myriad changes, including the release of Service Pack 1, making the OS worth a second look. It’s time we determine once and for all whether we should stick with XP for the next 18 months while we wait for Windows 7. But before we answer that question, let’s review exactly what’s wrong with Windows Vista.
Earlier this summer, both Nvidia and ATI hosted press events to unveil their new hardware—and the excitement about GPU-based encoding was palpable. We were promised that our videocards would make Photoshop faster and better and our GPUs would encode video 10 times faster than our CPUs. In fact, someone lacking tech savvy would have left these presentations thinking, "Wow, these GPU things can make common computing tasks run insanely fast, and there are a couple of games that work with them too." Of course, as is typical, the truly big promises (like 10x faster video encodes) were off in the future, when the software was "ready."
Well, the software's nearly ready. Elemental's Badaboom uses Nvidia's CUDA interface to do much of the grunt work of DVD ripping on the GPU instead of your musty old CPU. I've been in the Lab for the last few days putting this app through the wringer. Our test bed for this challenge is an Intel Q6600 quad core, running at a stock 2.4GHz, with 4GB of memory and a GeForce GTX 280 reference board.
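For the curious, here’s a rough idea of the kind of work CUDA offloads. This is our own toy example using the third-party PyCUDA bindings, not anything from Badaboom: a trivial per-pixel kernel, the sort of embarrassingly parallel job a GPU’s hundreds of stream processors chew through far faster than a quad core.

import numpy as np
import pycuda.autoinit              # sets up a CUDA context on the default GPU
import pycuda.driver as drv
from pycuda.compiler import SourceModule

# A trivial per-pixel kernel: brighten an 8-bit image, clamping at white.
mod = SourceModule("""
__global__ void brighten(unsigned char *px, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) px[i] = min(px[i] + 40, 255);
}
""")
brighten = mod.get_function("brighten")

pixels = np.random.randint(0, 256, size=1 << 20, dtype=np.uint8)  # stand-in 1MP frame
threads = 256
blocks = (pixels.size + threads - 1) // threads
brighten(drv.InOut(pixels), np.int32(pixels.size),
         block=(threads, 1, 1), grid=(blocks, 1))

Real video encoding is far more involved, of course, but it decomposes into thousands of independent per-block operations in much the same way, which is exactly what Badaboom exploits.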