Easily the coolest part of today’s TED event was Dr. Pattie Maes’s “Reframe” presentation on new technology interfaces. Maes, a researcher at MIT’s Media Lab, energized the crowd with a demonstration of a $350 piece of technology that her team dubs “the sixth sense.” Maes’s Fluid Interfaces research group collaborates on projects and inventions that augment the interaction between human and machine, including both visual and haptic interfaces that are far more immersive than our traditional keyboard and monitor.
Maes started by discussing the five natural senses that humans have developed over the past million years of evolution. These senses help us make important decisions in everyday life, including how we interact with other individuals and our physical environment. But arguably, the most useful stimulus we come across is information that we don’t have easy access to via these senses, such as large amounts of aggregated data and factual knowledge. Increasingly, all of this knowledge is being stored and made available online.
The question, then, is whether we could develop (either naturally or artificially) a sixth sense that detects the meta-information relevant to our decision-making. A common response is that modern smartphones already cater to this need, with access to databases like Wikipedia and Amazon user reviews. But the cellphone interface is cumbersome because you have to manually search through Google to get to the data you need – the access isn’t immediate.
Maes’s solution: an interface that doesn’t force users to change their behavior as they access this sixth-sense data. Her team built a prototype from a digital webcam, a battery-powered pico projector, a mirror, and a phone, all for under $350, that satisfies that requirement. Maes actually wore the device on stage; it looked like a lanyard necklace with the webcam strapped near the top and the pico projector hanging near the bottom.
A video showed how the device would be used in everyday situations. The user hangs the lanyard around their neck and wears color-coded caps on each index finger and thumb (red, green, blue, and yellow). The camera tracks these colors, allowing the user to make gestures in front of the device to activate certain commands. For example, the demo video showed an MIT researcher framing a picture by connecting his index fingers and thumbs, and the camera automatically took a snapshot and saved it to the attached cellphone.
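The talk didn’t go into how the color tracking works, but the idea is straightforward: find the pixels matching each cap’s color and treat their centroid as that fingertip’s position. Here’s a minimal sketch of that step, assuming simple per-channel RGB thresholding; the reference colors, tolerance, and function names are illustrative, not from Maes’s team.

```python
import numpy as np

# Rough RGB reference colors for the four fingertip caps (assumed values,
# not taken from the actual device).
MARKER_COLORS = {
    "red":    (255, 0, 0),
    "green":  (0, 255, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
}

def find_markers(frame, tol=60, min_pixels=20):
    """Return {color: (row, col)} centroids of detected fingertip caps.

    frame: H x W x 3 uint8 RGB array (one camera frame).
    tol: maximum per-channel distance from the reference color.
    min_pixels: ignore matches smaller than this, to reject noise.
    """
    found = {}
    f = frame.astype(np.int16)  # avoid uint8 wraparound when subtracting
    for name, ref in MARKER_COLORS.items():
        # A pixel matches if every channel is within tol of the reference.
        mask = np.abs(f - np.array(ref)).max(axis=2) <= tol
        if mask.sum() >= min_pixels:
            rows, cols = np.nonzero(mask)
            found[name] = (rows.mean(), cols.mean())
    return found

# Synthetic 120x160 frame: a green patch where an index finger might be.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[40:60, 70:90] = (0, 255, 0)
print(find_markers(frame))  # centroid near row 49.5, col 79.5
```

A real implementation would run this per camera frame and interpret the tracked positions over time as gestures (e.g., two index-thumb pairs forming a frame triggers the snapshot), likely in a more robust color space such as HSV.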
In another example, the user picks up a novel in a bookstore and holds it in front of the camera. The device identifies the book, either through visual recognition or an RFID tag, and projects its Amazon rating onto the cover, along with relevant reader comments. The user can even open the book and have annotated comments projected onto its pages for more detailed information.
Additional possibilities demoed included projecting relevant video clips onto a newspaper as the user reads through news stories, or even casting a word cloud onto the body of someone you’re interacting with to reveal their interests and occupation.
Maes noted that this Minority Report-like device is also similar to Microsoft’s Surface gesture technology, the major difference being that her team’s device is completely portable. Any surface, whether a wall or the palm of your hand, can be turned into an interactive computing display.
The audience at TED was definitely awe-struck by the demo, giving Maes and her associates a standing ovation after the presentation. We’ll be keeping an eye out for further developments on this amazing technology.
You can read more about Dr. Maes and the Fluid Interfaces group on their official website.