Intel is placing its bets on more than just its top-notch fabrication facilities; apparently, the company also has a stake in creating future generations of robot overlords. Less than a month ago, Intel unveiled a research project designed to make technology smart enough to learn its user's personal quirks and adapt accordingly; then, just last week, Intel researchers published a proposal for a new neuromorphic chip design -- hardware that mimics the human brain.
When web surfers aren't busy calling each other Nazis on forums, they're often cracking jokes about greeting their future robotic overlords with open arms. It won't be funny forever; the groundwork for our eventual demise is already being laid by the best minds in the land. IBM announced that it had created prototype cognitive chips modeled after the human brain almost a year ago, and today, Reuters reported that Intel is launching a research project in Israel dedicated to creating smart tech that can learn the habits of its users. (That way, SkyNet will know the best time to strike.)
Over a year ago, I wrote in this space that 3D TV is inevitable in the home theater market. I still feel that way, and I’ll explain why.
I saw my first 3D movie in 1953. It was House Of Wax, starring Vincent Price, Phyllis Kirk, and Carolyn Jones, and featuring a pretty scary newcomer named Charles Bronson. It was directed by Andre de Toth, who, ironically, had only one good eye.
To this day, it remains one of my favorite 3D movies, and I wish Warner Bros. would get off their butts and release it on 3D Blu-ray, perhaps as a double set with Phantom Of The Rue Morgue, starring Karl Malden. I’d also like Universal to release a box set of The Creature From The Black Lagoon, Revenge Of The Creature, The Creature Walks Among Us (not in 3D) and It Came From Outer Space.
Computers are getting smaller. Processors are getting smaller. Why shouldn’t hard drives get smaller, too? Don’t worry – IBM’s working on it. Late last week, the company announced that its researchers had “successfully demonstrated the ability to store information in as few as 12 magnetic atoms.” In comparison, it takes close to a million atoms for current HDDs to store a bit. Apparently, being dense is a good thing!
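To put those figures in perspective, here's a quick back-of-the-envelope calculation (a sketch, not anything from IBM's announcement) comparing the quoted 12 atoms per bit against the roughly one million atoms per bit of a conventional hard drive:

```python
# Back-of-the-envelope storage-density comparison, using the figures
# quoted above. The one-million-atom number is the article's rough
# estimate for current HDDs, not a precise specification.
ATOMS_PER_BIT_PROTOTYPE = 12
ATOMS_PER_BIT_HDD = 1_000_000  # "close to a million"

improvement = ATOMS_PER_BIT_HDD / ATOMS_PER_BIT_PROTOTYPE
print(f"Density improvement: ~{improvement:,.0f}x")  # ~83,333x
```

In other words, bit for bit, the prototype packs information tens of thousands of times more densely than today's drives, at least in terms of raw atom count.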
Somebody had the idea to put a camera into a cellphone. This was a good idea. It was a great idea. What made it even better was including a slot for a microSD card. I have a 32-gigabyte card in my phone and I haven’t run out of storage yet. I can shoot photos or movies wherever I go—and email them immediately. I can read e-books, listen to music or watch videos. (The Samsung Galaxy phone has a great screen.)
The smartphone is a combination of many good ideas and its overall usefulness should be a guide for all manufacturers of portable electronics. So why doesn’t the iPad have a memory card slot? Why doesn’t Amazon’s Kindle Fire have a slot for an SD card? Who knows, but here are some other good tech ideas that need to be implemented ASAP.
After watching Captain Picard solve all those Victorian murder mysteries on the Enterprise’s holodeck, we have to say that staring at a basic flat-panel monitor is sooooo 20th century. Wasn’t the future of television watching supposed to be way cooler than this by now? Yeah, it was, but don’t worry; those spiffy high-tech displays have only been delayed, not scrapped entirely. A veritable army of hard-working engineers has been laboring day and night to bring flexible phones, holograms you can feel, physical 3D interfaces, and touchscreen, well, everything to your living room, car and workplace sometime soon. And hey, we’ve got actual pictures to prove it!
The problem with predicting the future is that there’s so much of it. You can predict some pieces of it because some trends are obvious, but you can’t predict how all the pieces are going to fit together, and, even more difficult, you can’t predict what human beings will do with all those different pieces once they’ve been put together.
The smartphone is a great example. Robert A. Heinlein predicted cell phones in The Star Beast, first published in 1954. Other writers predicted tablets as well. But nobody predicted Twitter or sexting. Those were surprises.
We’re on the threshold of another leap forward in the punctuated evolution of computing technology and the first pieces are starting to appear. I think it’s inevitable that some of these pieces are going to mate, mutate, and evolve into something new.
Every hero is a villain, every villain a hero. The truth is that even the greatest people in history had at least a hint of the dark side within them.
Today we look at an assortment of men inside—or merely tied to—the tech industry. Some are simply controversial; others are clearly of the bad seed variety. But do they deserve their status? How evil are they?
We draw our conclusions about each of them, from Assange to Zuckerberg. Come along for the ride.
The path of human progress is paved with tiny innovations. While most technological progress has been barely perceptible throughout the history of human invention, a handful of breakthroughs have radically changed the way humans live in the world. Here are 30 of the most life-changing technologies of all time.
The electronics revolution is changing the nature of law enforcement. Security cameras, tracking devices, microchips and other anti-theft measures are making it harder than ever to steal things and even harder to profit from that theft.
Meanwhile, expanding technology is giving us near-universal surveillance, making it far easier to detect crimes and apprehend criminals. But will enhanced technology also give rise to more sophisticated criminals and more evolved crimes?