In the new movie, The Men Who Stare at Goats, reporter Bob Wilton confronts Special Forces operator Lyn Cassady: “I’ve heard that you’re a psychic spy.” Lyn later comments, “We’re Jedi. We don’t fight with our guns, we fight with our minds.” Mind reading – formerly the stuff of science fiction and crystal gazers – is rapidly becoming science fact. A recent CBS 60 Minutes story reports that “technology may soon ‘read’ your mind.”
Toys such as Mattel’s Mindflex™ and the Star Wars Force Trainer™ include brain wave detection technology and are now readily available at your local Target or Walmart. For a younger generation raised on telekinetic X-Men – from Professor Xavier to Magneto – these fascinating mind-over-matter toys offer limitless play time opportunities.
NeuroSky leads the market in creating inexpensive consumer brain-computer interfaces. NeuroSky’s brain-reading headsets – hardware plus software – are being designed for the automotive, health care and education industries. Using their Mindset™ package you can become NeuroBoy™ and use your special telekinetic powers to push, pull, lift, or burn objects in a virtual world – by thought alone.
Emotiv Systems, a San Francisco-based neuroengineering company founded in 2003 by four award-winning scientists, builds EEG-based headsets that pass your brain’s electrical signals to software on your PC, which extracts patterns and translates them. As with the NeuroSky product, you can move objects in virtual worlds on your PC using Emotiv’s EPOC™ Neuroheadset.
In light of a recent announcement at the 2009 Society for Neuroscience conference in Chicago, “mind reading” has taken another scientific leap forward. Researchers can now determine which vowels and consonants a person is thinking of by recording activity from the surface of the brain. An MIT Technology Review editorial reports that Gerwin Schalk and colleagues at the Wadsworth Center in Albany, NY, used a technology called electrocorticography (ECoG), in which a sheet of electrodes is laid directly on the surface of a patient’s brain. Schalk’s team asked patients to say or imagine words flashed on a screen while their brain activity was recorded. The researchers then used specially designed decoder algorithms to predict the vowels and consonants of the word from the pattern of brain activity alone. They found that speaking and imagining the word gave roughly the same level of accuracy – essential if the system is to be used by people so severely paralyzed that they have lost the ability to speak. The system has an accuracy rate of roughly 50 to 70 percent, and it may one day become a neural prosthesis for people with severe paralysis, translating their thoughts into actions on a computer or prosthetic limb.
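To make the decoding idea concrete, here is a deliberately simplified sketch of how a classifier might map recorded brain-activity patterns to vowels. Everything below is invented for illustration – the synthetic data, the channel count, and the nearest-centroid approach – and bears no relation to the actual algorithms used by Schalk’s team:

```python
import numpy as np

# Illustrative sketch only: a nearest-centroid decoder over synthetic
# "ECoG" feature vectors. The real Wadsworth pipeline is far more
# sophisticated; the feature counts and data here are invented.
rng = np.random.default_rng(0)
n_channels = 16            # hypothetical electrode count
classes = ["a", "e", "i"]  # vowels to decode

# Synthetic training data: each vowel gets a distinct mean activity pattern.
means = {c: rng.normal(0, 1, n_channels) for c in classes}

def sample(c, n):
    """Draw n noisy trials around the mean pattern for vowel c."""
    return means[c] + rng.normal(0, 0.5, (n, n_channels))

train = {c: sample(c, 40) for c in classes}
centroids = {c: train[c].mean(axis=0) for c in classes}

def decode(trial):
    """Predict the vowel whose centroid is nearest to this trial's features."""
    return min(classes, key=lambda c: np.linalg.norm(trial - centroids[c]))

# Evaluate on fresh synthetic trials.
correct = total = 0
for c in classes:
    for trial in sample(c, 20):
        correct += (decode(trial) == c)
        total += 1
print(f"decoding accuracy: {correct / total:.0%}")
```

On this cleanly separated toy data the accuracy is high; real neural signals are far noisier, which is one reason the reported system tops out around 50 to 70 percent.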
Advances in research-enabling technologies, such as functional magnetic resonance imaging (fMRI) and computational neuroscience, are yielding techniques that can better assess the neural basis of cognition and allow the visualization of brain processes – as well as thought-directed control of prosthetics. Government-financed projects include neural control of mechanical arms, hands and legs. These intelligent artificial limbs will be controlled by your nervous system and will allow you to pitch a fastball, thread a needle or play a piano as well as you did before the loss of your limb.
These developments are raising concerns about the potential exploitation of "mind reading" technologies by advertisers or oppressive governments. So it’s understandable that researchers are wary of having their work referred to as mind reading. Emphasizing its limitations, they call it neural decoding. Jack Gallant, a leading "neural decoder" at the University of California, Berkeley, has produced some of the field’s most impressive results yet. He and colleague Shinji Nishimoto showed that they could create a crude reproduction of a movie clip that someone was watching just by viewing their brain activity. Other neuroscientists claim that such neural decoding can be used to read memories and future plans and even to diagnose eating disorders.
Toyota is developing an advanced brain-sensing system that controls the movement of a wheelchair by reading a user’s thoughts alone. By detecting and processing brain wave patterns, the system can “propel a wheelchair forward, as well as make turns, with virtually no discernible delay between thought and movement,” according to a recent press release. Rival automaker Honda’s Asimo robot can also be controlled by detected brain signals. Honda is exploring the concept that humanoid robots may one day replace home care nurses.
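The last step of such a system – turning an already-decoded intent into a motor command – can be sketched in a few lines. This is a hypothetical illustration, not Toyota’s design; the command names, speed units, and confidence threshold are all invented, though the fail-safe pattern (stop when the decoder is unsure) is a standard safety idea in brain-controlled devices:

```python
# Illustrative sketch, not Toyota's system: map already-classified
# brain-signal "intents" to wheelchair motor commands, with a simple
# confidence gate so ambiguous readings stop the chair.
COMMANDS = {
    "forward": (1.0, 0.0),   # (speed, turn rate) - invented units
    "left":    (0.3, -1.0),
    "right":   (0.3, 1.0),
}

def wheelchair_command(intent: str, confidence: float, threshold: float = 0.7):
    """Return a (speed, turn) pair, or halt when the decoder is unsure."""
    if confidence < threshold or intent not in COMMANDS:
        return (0.0, 0.0)    # fail-safe: stop on low-confidence decoding
    return COMMANDS[intent]

print(wheelchair_command("forward", 0.9))  # (1.0, 0.0)
print(wheelchair_command("left", 0.4))     # halts: (0.0, 0.0)
```

The press release’s claim of “virtually no discernible delay” is the hard part: the decoding upstream of this mapping must classify brain wave patterns in a fraction of a second.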
What was once speculative fiction – the ability to read minds and to control the movement of objects using thought alone, sometimes called mind over matter – is rapidly becoming neurotechnological fact. The upside of this technology will be more freedom for the physically impaired – imagine wheelchair-bound physicist Stephen Hawking able to control his wheelchair and to capture and communicate his thoughts with a neuroheadset. The obvious downside is the potential dystopian nightmare of “thought police” strapping you to a chair to view the contents of your mind and extract a confession.