
Technique for letting brain talk to computers now tunes in speech

Date:
April 7, 2011
Source:
Washington University School of Medicine
Summary:
Researchers have used a technique usually associated with identifying epilepsy to show, for the first time, that a computer can listen to our thoughts. The scientists demonstrated that humans can control a cursor on a computer screen using words spoken out loud and in their heads.

Mind reading is usually the stuff of science-fiction movies, but researchers in the United States have used a technique usually associated with identifying epilepsy to show, for the first time, that a computer can listen to our thoughts.

In a new study, scientists from Washington University demonstrated that humans can control a cursor on a computer screen using words spoken out loud and in their heads, a finding with major implications for patients who have lost speech through brain injury or for disabled patients with limited movement.

By connecting the patients' brains directly to a computer, the researchers showed that the cursor could be controlled with up to 90% accuracy, even without prior training.

Patients with a temporary surgical implant have used regions of the brain that control speech to "talk" to a computer for the first time, manipulating a cursor on a computer screen simply by saying or thinking of a particular sound.

"There are many directions we could take this, including development of technology to restore communication for patients who have lost speech due to brain injury or damage to their vocal cords or airway," says author Eric C. Leuthardt, MD, of Washington University School of Medicine in St. Louis.

Scientists have typically programmed the temporary implants, known as brain-computer interfaces, to detect activity in the brain's motor networks, which control muscle movements.

"That makes sense when you're trying to use these devices to restore lost mobility -- the user can potentially engage the implant to move a robotic arm through the same brain areas he or she once used to move an arm disabled by injury," says Leuthardt, assistant professor of neurosurgery, of biomedical engineering and of neurobiology, "But that has the potential to be inefficient for restoration of a loss of communication."

Patients might be able to learn to think about moving their arms in a particular way to say hello via a computer speaker, Leuthardt explains. But it would be much easier if they could say hello by using the same brain areas they once engaged to use their own voices.

The research appears April 7 in the Journal of Neural Engineering.

The devices under study are temporarily installed directly on the surface of the brain in epilepsy patients. Surgeons like Leuthardt use them to identify the source of persistent, medication-resistant seizures and map those regions for surgical removal. Researchers hope one day to install the implants permanently to restore capabilities lost to injury and disease.

Leuthardt and his colleagues have recently revealed that the implants can be used to analyze the frequency of brain wave activity, allowing them to make finer distinctions about what the brain is doing. For the new study, Leuthardt and others applied this technique to detect when patients say or think of four sounds:

  • oo, as in few
  • e, as in see
  • a, as in say
  • a, as in hat

When scientists identified the brainwave patterns that represented these sounds and programmed the interface to recognize them, patients could quickly learn to control a computer cursor by thinking or saying the appropriate sound.
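
To give a rough sense of how such a system might work -- this is an illustrative sketch, not the authors' implementation -- the Python example below computes band-limited power features from ECoG-like signals and trains a simple classifier to tell the four sounds apart. The sampling rate, the high-gamma frequency band (a band commonly reported as informative in ECoG studies), and the choice of classifier are all assumptions made for illustration, and the data here are random placeholders standing in for recorded trials.

```python
# Hypothetical sketch of an ECoG sound-classification pipeline.
# Not the authors' implementation: sampling rate, frequency band,
# and classifier are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 1000          # assumed ECoG sampling rate, Hz
BAND = (70, 170)   # assumed high-gamma band, often informative in ECoG

def band_power_features(trials):
    """trials: (n_trials, n_channels, n_samples) raw ECoG.
    Returns (n_trials, n_channels) log mean power in BAND."""
    feats = []
    for trial in trials:
        freqs, psd = welch(trial, fs=FS, nperseg=256, axis=-1)
        mask = (freqs >= BAND[0]) & (freqs <= BAND[1])
        feats.append(psd[:, mask].mean(axis=-1))
    return np.log(np.array(feats))  # log power stabilizes variance

# Random placeholder data: 4 sound classes ("oo", "ee", "ay", "a")
# x 40 trials each, 16 channels, 1-second windows.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((160, 16, FS))
y = np.repeat(np.arange(4), 40)

X = band_power_features(X_raw)
clf = LinearDiscriminantAnalysis()
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())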

In the future, interfaces could be tuned to listen to just speech networks or both motor and speech networks, Leuthardt says. As an example, he suggests that it might one day be possible to let a disabled patient both use his or her motor regions to control a cursor on a computer screen and imagine saying "click" when he or she wants to click on the screen.
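
Structurally, a hybrid interface like the one Leuthardt describes could be organized as a simple polling loop, with motor-network decoding driving the cursor while a speech-network detector fires the click. The sketch below is purely hypothetical; decode_cursor_velocity and detect_imagined_click are placeholder names invented for illustration, not real BCI library calls.

```python
# Illustrative control loop for a hybrid motor + speech interface.
# The decoder functions are hypothetical placeholders.
import time

def decode_cursor_velocity(ecog_window):
    """Placeholder: map motor-network activity to a (dx, dy) velocity."""
    return (0.0, 0.0)

def detect_imagined_click(ecog_window):
    """Placeholder: return True when the speech network shows the
    pattern associated with an imagined 'click'."""
    return False

def run_interface(acquire_window, move_cursor, click, hz=20):
    """Poll the implant, update the cursor, and fire clicks."""
    while True:
        window = acquire_window()              # latest ECoG samples
        dx, dy = decode_cursor_velocity(window)
        move_cursor(dx, dy)
        if detect_imagined_click(window):
            click()
        time.sleep(1.0 / hz)
```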

"We can distinguish both spoken sounds and the patient imagining saying a sound, so that means we are truly starting to read the language of thought," he says. "This is one of the earliest examples, to a very, very small extent, of what is called 'reading minds' -- detecting what people are saying to themselves in their internal dialogue."

"We want to see if we can not just detect when you're saying dog, tree, tool or some other word, but also learn what the pure idea of that looks like in your mind," he says. "It's exciting and a little scary to think of reading minds, but it has incredible potential for people who can't communicate or are suffering from other disabilities."

The next step, which Leuthardt and his colleagues are working on, is to find ways to distinguish what they call "higher levels of conceptual information."

The study also showed that speech intentions can be detected from a site less than a centimeter wide, meaning a permanent device would require only a small insertion into the brain, greatly reducing the risk of the surgical procedure.


Story Source:

Materials provided by Washington University School of Medicine. Original written by Michael C. Purdy. Note: Content may be edited for style and length.


Journal Reference:

  1. Eric C. Leuthardt, Charles Gaona, Mohit Sharma, Nicholas Szrama, Jarod Roland, Zac Freudenberg, Jamie Solis, Jonathan Breshears, Gerwin Schalk. Using the electrocorticographic speech network to control a brain–computer interface in humans. Journal of Neural Engineering, 2011; DOI: 10.1088/1741-2560/8/3/036004

