The makers of the iBrain, a machine used to collect sleep data, have loftier plans for the EEG recording tool: mind reading. In a set of experiments with Stephen Hawking, a research team led by iBrain inventor Philip Low hasn't exactly reached that goal, but it has gotten the machine to read his thoughts as signals. "The idea is to see if Stephen can use his mind to create a consistent and repeatable pattern that a computer can translate into, say, a word or letter or a command for a computer," Low told The New York Times' David Ewing Duncan.
As of now, the iBrain uses its technology to collect data for those suffering from sleep apnea, depression, and autism. The device consists of a "miniature electronics box attached to a light and flexible elastic head harness and electrodes that can effortlessly be applied to the head during sleep," explains the company's site. It then uses what the company calls the "SPEARS algorithm" to collect and analyze the data more simply than other brain-wave recording machines, the site continues:
By taking a single channel of EEG, SPEARS creates a map of brain activity, indicating the different signatures in waking and sleeping states. SPEARS can represent a night's worth of brain activity in clusters, where every sleep and waking state forms a separate cluster. SPEARS can reliably extract a maximal number of stages in minimal time, using a single channel. This reduces the number of sixteen, eight, or even two channels formerly needed when undergoing an EEG and eliminates the need to visually review data in all those traces. Together, this creates the opportunity for a small, single-channel EEG system that can be performed anywhere, even while driving. Comparison of manual sleep test scoring with automatic scoring from the SPEARS algorithm shows little difference except in the large amount of time and labor saved through the SPEARS technique.
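The company doesn't publish SPEARS's internals, but the process it describes — reducing a single EEG channel to a map of brain activity and grouping epochs into separate sleep and waking clusters — resembles a standard pipeline of windowed feature extraction followed by clustering. A rough sketch under those assumptions (every function name, band boundary, and parameter here is illustrative, not the actual proprietary algorithm):

```python
import numpy as np

def window_features(signal, fs=128, win_sec=30):
    """Split a single EEG channel into 30-second epochs and compute
    simple band-power features (delta, theta, alpha, beta) per epoch.
    Band edges are conventional sleep-EEG ranges, not SPEARS's own."""
    win = fs * win_sec
    n = len(signal) // win
    epochs = signal[: n * win].reshape(n, win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    power = np.abs(np.fft.rfft(epochs, axis=1)) ** 2
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]  # Hz
    feats = np.column_stack([
        power[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
        for lo, hi in bands
    ])
    return np.log(feats + 1e-12)  # log scale stabilizes clustering

def kmeans(X, k=4, iters=50, seed=0):
    """Minimal k-means: each cluster stands in for one sleep/wake
    state, so a night of data collapses into a handful of labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

The appeal of a scheme like this is exactly what the company claims: one channel and a small feature vector per epoch, rather than sixteen traces reviewed by eye.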
It was meant as a neater replacement for all the cumbersome electrodes it generally takes to monitor these sorts of things.
Since the machine already recognizes brain activity, the new research hopes to use the machine's brain-wave collection capability and translate it into action. "We wanted to see if there was any change in the signal," Dr. Low told The Times. Though Hawking, who has amyotrophic lateral sclerosis, or A.L.S., can't move his limbs, he can still imagine moving them. Hawking's thoughts about scrunching his hand into a ball resulted in data spikes, giving the researchers hope that they can take those spikes and somehow turn them into commands for another machine or computer. "Patients want to be able to communicate beyond the yes or no with an eye blink," Dr. Terry Heiman-Patterson, a neurologist and A.L.S. specialist at Drexel University, told Duncan. "They want to send an e-mail, and turn off the light and, even more, to have a meaningful conversation." Collecting or even recognizing brain data is different from getting a computer to understand it, however. When that happens, we'll really have a machine that can read minds.
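At its simplest, turning those data spikes into commands is a detection-and-mapping problem: flag moments where the signal deviates sharply from its baseline, then map each detection to an action. The sketch below shows the idea with a crude z-score threshold; the rule, the names, and the command mapping are all assumptions for illustration, not the research team's method:

```python
import numpy as np

def detect_spikes(signal, z_thresh=4.0):
    """Return indices where the signal deviates from its baseline by
    more than z_thresh standard deviations -- a crude stand-in for
    recognizing an 'imagined movement' event in an EEG trace."""
    z = (signal - signal.mean()) / signal.std()
    return np.flatnonzero(np.abs(z) > z_thresh)

def spikes_to_command(spike_indices, min_spikes=1):
    """Map detections to a single binary command: the yes/no building
    block that richer interfaces (e-mail, lights) would be built on."""
    return "select" if len(spike_indices) >= min_spikes else "idle"
```

The hard part the article points to — getting from a reliable binary signal to a meaningful conversation — sits well beyond a sketch like this, which is exactly the gap between recognizing brain data and understanding it.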