Science and technology have always been close bedfellows, but every so often scientists dream up a new technology that changes everything. A pair of engineers at Harvard have been doing just that, and amazingly, have found a way to store around 704TB of data in a single gram of DNA. I re-read the findings of George Church and Sri Kosuri several times, but it took a while to grasp that the entire contents of my NAS could be stored on the surface area of my pinky finger.
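For a sense of scale, here's my own back-of-envelope sketch (not from the paper) of DNA's theoretical storage density, assuming one bit per nucleotide, as in Church and Kosuri's encoding scheme, and an average single-stranded nucleotide mass of roughly 330 g/mol:

```python
# Back-of-envelope estimate of DNA storage density.
# Assumptions (mine, not figures from the source article):
#   - 1 bit encoded per nucleotide (A/C = 0, G/T = 1, as in Church's scheme)
#   - ~330 g/mol average mass per single-stranded nucleotide

AVOGADRO = 6.022e23          # molecules per mole
GRAMS_PER_MOL_NT = 330.0     # approximate mass of one nucleotide
BITS_PER_NT = 1.0            # one bit per base

nucleotides_per_gram = AVOGADRO / GRAMS_PER_MOL_NT
bits_per_gram = nucleotides_per_gram * BITS_PER_NT
terabytes_per_gram = bits_per_gram / 8 / 1e12

print(f"Theoretical ceiling: ~{terabytes_per_gram:,.0f} TB per gram")
# Prints a figure on the order of 2e8 TB/gram -- the reported 704TB/gram
# reflects real-world overhead (addressing, redundancy, synthesis limits),
# which still leaves enormous headroom.
```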
Ah, the bathroom. Those little bursts of personal time are some of the best moments of the day, an all-too-brief period when screaming kids and jerk coworkers leave you alone and the worries of real life fade away, letting you game on your smartphone in peace. Well, at least until you plunk that smartphone into the toilet, that is. The New York Times R&D Lab is hard at work making sure your Android keeps dry: it's whipped up a "Magic Mirror" designed to help you get a hands-free Interwebs fix in the john.
They always say the best camera is the one you have with you, but if you're like me, you probably have more than one camera phone shot that wasn't worth the effort. Low-light images taken with small sensors come out dull and grainy if you're lucky; more often, the picture serves as a reminder that the pinhole camera you used wasn't really up to the task. "The bigger the camera, the better the picture" has always been the rule of thumb, but a little-known technology called quantum dots could challenge it.
Manufacturer InVisage explains that the quantum dots used in its technology are actually tiny semiconducting crystals, able to absorb various colors thanks to a "doppelganger trick". If the technology lives up to its promise, smaller cameras will be able to capture more light than they do currently, vastly improving the quality of low-light shots. "Placing the quantum dots on top of the electronics means more pixels can be crammed into a given area and less incoming light is lost. Moreover, photodetectors based on quantum dots produce less noisy images, so the picture is sharper even if the number of pixels is not increased."
No products have been announced, and admittedly this is all still theory, but batteries and cameras are two technologies everyone is hoping will get better and smaller in the next few years. Intel can make a million transistors dance on the head of a pin, but my BlackBerry still can't take a picture outdoors after 5 PM. Go figure.
Battery technology has long been considered the Achilles' heel of modern gadgetry, but the smart folks over at Stanford may just be on track to solve the issue. A new battery type called "lithium-sulfur" is in the works, which currently offers an 80 percent improvement in capacity over Li-ion and promises a theoretical increase of up to 400 percent as the technology matures.
The catch (yes, there's always a catch) is that in its current state of development the battery becomes unusable after only 40 to 50 charge cycles. Li-ion, by comparison, can typically handle anywhere from 300 to 500 full discharges. This is a pretty major caveat, and it likely means it will be quite a while before the technology starts getting commercialized.
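To see why cycle life matters as much as per-charge capacity, here's a rough comparison sketch (my own number-plugging, not from the article) of total energy a cell delivers over its lifetime, using the figures quoted above:

```python
# Rough lifetime-throughput comparison using the article's figures.
# The 1.0 baseline capacity unit is arbitrary; only the ratios matter.

LI_ION_CAPACITY = 1.0        # baseline capacity (arbitrary unit)
LI_S_CAPACITY = 1.8          # 80% improvement over Li-ion, per the article

LI_ION_CYCLES = (300, 500)   # typical Li-ion full-discharge cycles
LI_S_CYCLES = (40, 50)       # current lithium-sulfur prototype cycles

def lifetime_energy(capacity, cycles):
    """Total energy delivered before the battery wears out (low, high)."""
    return tuple(capacity * c for c in cycles)

print("Li-ion lifetime energy:", lifetime_energy(LI_ION_CAPACITY, LI_ION_CYCLES))
print("Li-S  lifetime energy:", lifetime_energy(LI_S_CAPACITY, LI_S_CYCLES))
# Li-ion: (300, 500) vs Li-S: (72, 90) -- despite the bigger per-charge
# capacity, today's Li-S cells deliver far less total energy over their
# lifetime, which is exactly the problem to solve before commercialization.
```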
The real breakthrough behind lithium-sulfur is silicon nanowire technology, which Stanford engineers have actually been experimenting with since late 2007. The new batteries are being described not just as more powerful, but safer too. Of course, it's not like batteries kill people, right?
Dell, among other PC OEMs, is always looking for the next big thing in PC form factors, and industrial designer Pauline Carlos may have just the thing. The "Froot" does away with a traditional monitor, as well as the keyboard and mouse. The device itself contains a pair of projectors: one takes care of the user interface, while a second on the back of the unit projects a keyboard.
It's hard to imagine something like this being anything more than a niche product for our audience, but it certainly makes you wonder: will we ever become so comfortable with touch typing that physical keyboards as we know them today become an unnecessary eyesore for the mainstream consumer?
Nokia, the world's largest mobile phone maker, announced on Tuesday plans to cut 220 jobs at research and development units in Japan. The latest cuts represent continued efforts to reduce costs.
"Japanese manufacturers are important partners who play a critical role in Nokia's global supply chain strategy and with whom Nokia continues to develop its world-class logistics operations," the Finnish company said.
Trimming staff is nothing new for Nokia of late. Last week, the mobile handset maker handed out 330 pink slips in Finland and Denmark at its R&D operations, which globally employ 17,000 workers, Businessweek reports.
Nokia said its latest round of cuts won't affect Japanese operations of Nokia Siemens Networks, or Vertu, Nokia's higher-end handset line.
Anyone who follows Intel closely knows that they don't just pump out high-end CPUs; they dedicate entire teams to "pie in the sky" ideas of what future technologies might look like. This could be anything from a cluster of x86 CPUs for rendering video to, in this case, using your brain to control a computer. It may sound farfetched, but it's something Intel and its researchers have been actively studying for some time now.
Currently, scientists are focusing on how the brain reacts when interacting with a computer, then learning ways to interpret that data to execute commands on the machine. The idea is to let your thoughts take over for your mouse and keyboard. Intel believes an implant would make this easier, though I'm not entirely sure how many volunteers they'll get with that idea. "Eventually people may be willing to be more committed... to brain implants," said Intel's vice president of future technology, Andrew Chien. "Imagine being able to surf the Web with the power of your thoughts."
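To make that pipeline a little more concrete, here's a deliberately toy sketch, entirely my own and nothing like Intel's actual research: window the brain signal, pull out a simple spectral feature, and map it to a command.

```python
import numpy as np

# Toy illustration of a brain-computer interface pipeline (illustrative
# only, not Intel's system): window the signal, extract a feature, map
# the feature to a command.

RATE = 256                       # samples per second (a typical EEG rate)

def band_power(window, low, high, rate=RATE):
    """Average spectral power of `window` in the [low, high] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / rate)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def classify(window):
    """Map a one-second signal window to a command via a crude threshold."""
    alpha = band_power(window, 8, 12)    # alpha band: relaxed state
    beta = band_power(window, 13, 30)    # beta band: active concentration
    return "click" if beta > alpha else "idle"

# Simulate one second of "concentrating" signal: a strong 20 Hz component.
t = np.arange(RATE) / RATE
signal = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.randn(RATE)
print(classify(signal))   # -> "click"
```

Real systems face far messier signals than this simulated one, which is precisely why Intel is studying how brains react to computers before anything ships.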
You may have your doubts, and so do we, but it might interest you to know that researchers have already made significant strides in reading brain patterns, and have identified certain words, such as "bear," that cause everyone's brain to react in a similar manner. "I think human beings are remarkably adaptive," said Chien. "If you told people 20 years ago that they would be carrying computers all the time, they would have said, 'I don't want that, I don't need that.' Now you can't get them to stop. There are a lot of things that have to be done first, but I think implanting chips into human brains is well within the scope of possibility." Chien speculates we could be lining up for implants as early as 2020.
Organic light-emitting diodes, or OLEDs, are often touted as the next big thing in display technology, offering brighter colors, true black, lower power consumption, and better off-axis viewing than traditional LCD screens. They've popped up in gadgets from high-concept to mundane: the infamous Optimus Maximus keyboard uses many tiny OLED screens in its programmable, customizable keycaps, and both Sony's new X-series Walkman and Microsoft's new Zune HD have OLED screens. OLED technology has made great strides in the past 10 years, and cheaper, better manufacturing processes mean OLEDs have started appearing in everything from media players to phones to high-definition televisions. But what are OLEDs, exactly?
If you've ever been subjected to a babel of echoing voices during a teleconference, Microsoft Research is working on a solution. As demonstrated (link requires Microsoft Silverlight) at this week's TechFest, the lab's audio spatialization project enables a PC with stereo speakers to spatially separate the different members of a teleconference. Audio spatialization has been used for years in 3D gaming, but Microsoft Research has added a new twist: to make it work for teleconferencing, it has also added echo cancellation. As researcher Zhengyou Zhang puts it:
Audio spatialization uses speakers to create the illusion that call attendees have different locations spatially. This allows you to use the audio sense you already have, that you normally use in conversation, to isolate who you’re talking to, and to associate a location in space with a particular individual... In a conference where there are multiple voices coming out of multiple speakers, it becomes important to eliminate the echoes that might naturally occur.
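As a rough illustration of the spatialization half of the idea (my own minimal sketch, not Microsoft's implementation), each caller's mono stream can be placed at a distinct position in the stereo field with constant-power panning:

```python
import numpy as np

# Minimal constant-power stereo panning sketch: place each mono caller
# at a distinct left-right position so listeners can tell voices apart.
# (Illustrative only -- Microsoft's system also adds echo cancellation.)

def pan(mono, position):
    """Pan a mono signal; position -1.0 = hard left, +1.0 = hard right."""
    angle = (position + 1.0) * np.pi / 4          # map [-1, 1] -> [0, pi/2]
    left = np.cos(angle) * mono                   # constant-power law keeps
    right = np.sin(angle) * mono                  # perceived loudness even
    return np.stack([left, right], axis=1)

def mix_conference(callers):
    """Spread N callers evenly across the stereo field and sum them."""
    positions = np.linspace(-0.8, 0.8, len(callers))
    return sum(pan(m, p) for m, p in zip(callers, positions))

# Three fake one-second "voices" at different pitches, 16 kHz sample rate.
t = np.arange(16000) / 16000.0
callers = [np.sin(2 * np.pi * f * t) for f in (160, 220, 300)]
stereo = mix_conference(callers)                  # shape: (16000, 2)
print(stereo.shape)
```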
Microsoft Research's latest chance to shine is this week's TechFest 2009. Microsoft Research has a long list of innovations, including the Microsoft Surface touch-sensitive interface, the Unwrap Mosaic video editor, the Songsmith music composing utility, Image Composite Editor, and many more. TechFest serves two purposes: it makes sure that everyone at Microsoft can tap into what's being developed at Microsoft Research, and it acts as a sort of high-tech equivalent to an auto show, demonstrating the concepts that might (or might not) make their way into future products from Redmond.
This year's TechFest features projects as varied as combining multiple cell phone videos to create a high-res version; using digitized books on video DVD to create a high-capacity, low-cost library and school resource for developing countries; and ways to create augmented reality, which overlays digital data onto real-world views, to name just a few.
So, how important are Microsoft Research projects to Microsoft's future? As Microsoft Research head Rick Rashid sees it, the investment Microsoft makes in research is "really about an investment in survival." What do you think is the coolest concept at this year's TechFest? Join us after the jump and tell us about your favorites.