NIH Record
Vol. LXV, No. 19
September 13, 2013

New Technology Recognizes Words via Brain Activity Patterns

Dr. Marcel Just and Dr. Tom Mitchell

NIH grantees have taken the first step towards teaching computers to read a person’s mind. Dr. Tom Mitchell and Dr. Marcel Just have been programming computers to interpret images of a person’s brain activity after the person hears a word spoken. In some cases, the computers can actually discern the word a person has heard, based solely on their analysis of the listener’s brain activity patterns.

“This accomplishment will help us break new ground in our understanding of human language,” said Dr. Brett Miller, NICHD health scientist administrator. “There are many potential applications for this technology. If we can understand how the brain processes language, we can understand—and eventually treat—a variety of disorders in which people have problems understanding others or expressing themselves.” Funding for the project has been provided by NICHD, NIMH and the National Science Foundation.

Mitchell designs programs that teach the computer to link images of brain activity to the words or phrases that trigger that activity. The computer creates this link by associating the patterns found in the brain activity with the word those patterns represent.

“I study machine learning, which is the study of step-by-step procedures instructing a computer to recognize the patterns in large amounts of data,” Mitchell explained.

During a typical session, Mitchell and Just present a volunteer with 60 different words, with each word presented multiple times in random order throughout the session. The computer then charts the brain activity images against the spoken words. Based on these images, the computer builds a mathematical model by which it can identify a word from the brain activity patterns associated with it.
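
In machine-learning terms, this is a supervised classification problem: each scan becomes a vector of activation values labeled with the word that evoked it. The sketch below is not the researchers' code; it uses simulated data and an off-the-shelf scikit-learn classifier purely to illustrate the training step, and the word list, array shapes and model choice are all assumptions made for the example.

    # Minimal, illustrative sketch of training a word decoder on brain
    # activity patterns. All data here are simulated; in a real study each
    # row of X_train would hold voxel activation values from one scan.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    words = ["house", "hammer", "dog"]   # stand-ins for the 60 stimulus words
    n_repetitions = 6                    # each word is presented multiple times
    n_voxels = 500                       # activation values recorded per scan

    # One activation vector per presentation, labeled with the word presented.
    X_train = rng.normal(size=(len(words) * n_repetitions, n_voxels))
    y_train = np.repeat(words, n_repetitions)

    # Fit a model that associates activity patterns with the words they represent.
    decoder = LogisticRegression(max_iter=1000)
    decoder.fit(X_train, y_train)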

The researchers then test what the computer has learned by showing it images of brain activity patterns to see if it can identify the words associated with those patterns. The computer can often correctly identify a word based only on the brain activity patterns, not just from the images of the person who took part in the initial scanning session but also from the scans of other people who heard the same word. Moreover, the brain activity patterns remain the same whether a person hears a word, is shown the word in print, or is shown a picture of what the word represents.
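
The test described here amounts to holding out scans the model never saw during training and checking whether it names the right words. The fragment below is again only a hedged illustration with simulated data, not the actual analysis; with random numbers the accuracy will hover around chance, whereas real brain-scan data is where above-chance decoding would appear.

    # Illustrative evaluation: train on some presentations, test on held-out
    # ones, and score how often the predicted word matches the actual word.
    # Simulated data only; cross-subject testing would use another person's scans.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)

    words = ["house", "hammer", "dog"]
    n_repetitions = 6
    n_voxels = 500

    X = rng.normal(size=(len(words) * n_repetitions, n_voxels))
    y = np.repeat(words, n_repetitions)

    # Each fold trains on most presentations and predicts words for the rest.
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=3)
    print(f"Mean decoding accuracy on held-out scans (chance here is about 0.33): {scores.mean():.2f}")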

So far, efforts are most successful at recognizing concrete nouns—such as “house” or “hammer.” In fact, the computer can match concrete nouns to their corresponding activity patterns 90 percent of the time. The computer is also successful at recognizing words that represent feelings, such as “fear” or “love.” However, the researchers have not been successful in getting the computer to recognize words that describe complex concepts, like “democracy” or “justice.” These and other abstract terms fail to evoke consistent brain activity from person to person.

Eventually, Mitchell hopes to refine the technology so that it can be used to diagnose—and perhaps help treat—a number of different speech and language disorders. For example, computers might recognize people with dyslexia (reading disability) based on their patterns of brain activity and refer them for appropriate treatment. Similarly, computers might monitor the brain activity of stroke patients to determine whether they are responding to therapy for a speech impairment, so that another therapy can be substituted if the original treatment isn't helping.

