Wave your hand and the Rolex materializes on your arm like so much smoke. And then… poof… it’s gone. Open the palm of your hand and suddenly your calendar and phone list overlay your life line. Use your fingers and thumb to create a picture frame and snap a photo. Check the latest book reviews on Amazon and display the results on the pocketbook or newspaper you’re holding at the airport newsstand.
Leave it to students at the MIT Media Lab to develop a wearable computing system that turns any surface into an interactive display screen. They combined an ordinary webcam with a battery-powered 3M projector fitted with a mirror, then connected the rig to an internet-enabled mobile phone. A mere $350 of off-the-shelf components, and suddenly the glass window at Macy’s, your car door, or your arm becomes a computer display. Want to Google the latest Dow Jones, Nasdaq, or S&P 500 returns? No problem, just do a quick search on your shirt sleeve.
MIT’s Media Lab has explored the idea of wearable computing for some time. “Wearable computing hopes to shatter this myth of how a computer should be used,” states the program’s web site. “A person’s computer should be worn, much as eyeglasses or clothing are worn, and interact with the user based on the context of the situation.”
Pattie Maes of the lab’s Fluid Interfaces group goes one step further. As the leader of the team of seven graduate students that developed the system, she characterizes it as something more than a wearable device: a digital “sixth sense.” No, she can’t see dead people. But, as a recent TED demo shows, sans keyboard or monitor, she literally has the Internet cloud on her arm (and her hands, and…).