Using Your Thought-Controlled iPhone to Dial Home
Don your telepathic headset and plug into your iPhone. Want to call your friend? Just look at her picture and think about her.
No, this isn’t sci-fi technology from the latest episode of Caprica (although the smart phones in Caprica look suspiciously like our own). A new iPhone app, described on the MIT Technology Review blog and developed by Tanzeem Choudhury, Rajeev Raizada, Andrew Campbell, and others at Dartmouth College, lets you “wink” or “think dial” a relative or friend when his or her photo appears on an iPhone. Dubbed the “NeuroPhone” (not to be confused with the “Neurophone” developed by Dr. Patrick Flanagan in 1958, a device that converts sound to electrical impulses), it’s a brainwave-phone interface that uses a wireless EEG headset to send signals to an iPhone. A video demo of the NeuroPhone is available.
The “telepathic” headset is actually the Emotiv EPOC, which recently won the 2010 Red Dot Award for Product Design out of a field of 4,252 entries from 1,636 companies in 57 countries. While the wireless Emotiv EPOC headset is being marketed as both a gaming device and as an aid for the disabled, it clearly has the capability for hands-free smart phone use. The headset carries 14 EEG electrodes to monitor brainwaves and a gyroscope to track your head’s position in 3D space; its lithium battery lasts 12 hours and charges via USB. In addition, the EPOC can read brain activity related to facial movements, which software can translate to infer your emotional state and intentions. The EPOC headset can also provide vegetative patients a way to communicate, let users play hands-free video games, and allow robot owners to send instructions to their bots.
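The wink/think-dial interaction can be sketched as a simple polling loop: the app cycles through contact photos and places a call when the headset reports a confident “selection.” Everything in this sketch — the contact names, the threshold, and the `read_headset_signal` stand-in — is a hypothetical illustration, not the NeuroPhone’s actual code or the Emotiv SDK.

```python
import itertools
import random

random.seed(1)

# Hypothetical contact list; the real app shows each contact's photo on screen.
contacts = ["Alice", "Bob", "Carol"]

THRESHOLD = 0.95  # hypothetical confidence level for a "selection" event


def read_headset_signal():
    # Stand-in for the wireless EEG headset's classifier confidence;
    # a real app would poll the headset's SDK here instead.
    return random.random()


def think_dial(max_cycles=1000):
    # Cycle through contacts (as the app cycles photos) and dial
    # whoever is on screen when the signal crosses the threshold.
    for shown in itertools.islice(itertools.cycle(contacts), max_cycles):
        if read_headset_signal() > THRESHOLD:
            return shown  # place the call to this contact
    return None


print(think_dial())
```

The key design point is that the phone never reads a name from your mind; it only needs a binary “yes, this one” signal timed against whatever photo is currently displayed.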
The Dartmouth research that resulted in the NeuroPhone is based on pattern-based fMRI analysis (also referred to as multivoxel pattern analysis, or MVPA), which applies pattern-classification algorithms to analyze the voxels – essentially 3D pixels – in a brain scan. The Dartmouth researchers envision futuristic, somewhat scary, many-to-one mobile applications for the NeuroPhone. In one example, a foreign language teacher is interested in seeing exactly how many students — all wearing EEG headsets — actually understood the last question she asked. She pulls out her smart phone to get up-to-the-second statistics on each of her students. Based on aggregate class statistics, she determines student comprehension in real time… oops, they didn’t quite get that one. Another even more futuristic and somewhat invasive example involves a person entering a bar, club, or meeting room and immediately getting the brainwave “vibe” of the overall emotional state of the humans in the space (happy, tense, frustrated, sad, bored, hostile, and so forth).
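The core idea behind MVPA is to treat each scan as a feature vector of voxel intensities and train a classifier to tell mental states apart. A toy nearest-centroid classifier makes the point; the “voxel” vectors below are synthetic stand-ins with made-up activation templates, not real fMRI data or the Dartmouth team’s actual method.

```python
import random

random.seed(0)

DIMS = 16  # number of "voxels" in our toy feature vector


def make_pattern(template, noise=0.3):
    # A synthetic scan: a class-specific activation template plus Gaussian noise.
    return [template[i] + random.gauss(0, noise) for i in range(DIMS)]


# Two hypothetical mental states, each with its own activation template.
template_a = [1.0 if i % 2 == 0 else 0.0 for i in range(DIMS)]
template_b = [0.0 if i % 2 == 0 else 1.0 for i in range(DIMS)]

train_a = [make_pattern(template_a) for _ in range(20)]
train_b = [make_pattern(template_b) for _ in range(20)]


def centroid(patterns):
    # Mean pattern across all training scans for one class.
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(DIMS)]


cent_a, cent_b = centroid(train_a), centroid(train_b)


def classify(pattern):
    # Nearest-centroid rule: assign to the class whose mean pattern is closest.
    dist_a = sum((x - c) ** 2 for x, c in zip(pattern, cent_a))
    dist_b = sum((x - c) ** 2 for x, c in zip(pattern, cent_b))
    return "A" if dist_a < dist_b else "B"


test_set = [(make_pattern(template_a), "A") for _ in range(10)] + \
           [(make_pattern(template_b), "B") for _ in range(10)]
accuracy = sum(classify(p) == label for p, label in test_set) / len(test_set)
print(f"accuracy: {accuracy:.2f}")
```

Real MVPA uses far higher-dimensional data and stronger classifiers, but the principle is the same: distributed patterns of activity, not single regions, carry the signal that separates one mental state from another.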
If brainwave signals can be plucked out of the air to get the mood of a space, then brainwaves can be hacked. Just as today’s hackers “sniff” packets of information passing between computers on the Internet, tomorrow’s hackers may have the ability to sniff brainwave packets out of the air to reconstruct thoughts. The term “firewall” will definitely take on a new meaning.
While you probably won’t want to walk around wearing an EPOC, you can bet the form factor will look more like a Bluetooth headset in a year or two. The NeuroPhone is an important development because it is simple to engineer using cheap off-the-shelf commercial components. And the Dartmouth iPhone app is exciting for two reasons: it is the first sensible iPhone application for the EPOC headset, and you don’t have to touch the screen to dial the phone. Machine learning algorithms can now interface neural signals with phones to deliver a new mobile computing experience. Soon you’ll be thinking your way through your iPhone or Droid apps.