The brain has a lot to do. Not only does it process input from all five senses, but it also allows us to make sense of the information those inputs carry.
Researchers have long wondered whether the brain processes the information in speech in the same region where it produces speech. Now there is evidence that both happen in the same brain areas.
A new study finds that speaking and understanding speech share the same parts of the brain, with one difference: we don’t need the brain regions that control the movements of lips, teeth, and so on to understand speech.
Most studies of how speech works in the brain focus on comprehension. That’s mostly because it’s easier to image the brains of people who are listening quietly; talking makes the head move, which is a problem when you’re imaging the brain. But the Donders Institute at Radboud University Nijmegen, where the study was conducted, has now developed technology that allows recording from a moving brain.
Laura Menenti, the lead researcher, was initially interested in how the brain produces grammatical sentences and wanted to track the process of producing a sentence in its entirety, looking not only at its grammatical structure but also at its meaning.
“What made this particularly exciting to us was that no one had managed to perform such a study before, meaning that we could explore an almost completely new topic,” says Menenti.
The authors used functional MRI to measure brain activity in people who were either listening to sentences or speaking them. Another challenge in measuring brain activity in people who are speaking is getting them to say the right kind of sentence.
The authors accomplished this with a picture of an action — a man strangling a woman, say — with one person colored green and one colored red to indicate their order in the sentence. This prompted people to say either “The man is strangling the woman” or “The woman is strangled by the man.”
From this, the researchers were able to tell where in the brain three different speech tasks (computing meaning, coming up with the words, and building a grammatical sentence) were taking place. They found that the same areas were activated for each of these tasks in people who were speaking and people who were listening to sentences. However, although some studies have suggested that listeners silently articulate words in order to understand them, the authors found no involvement of motor regions when people were listening.
According to Menenti, though the study was largely designed to answer a specific theoretical question, it also points toward useful avenues for treating people with language-related problems. It suggests that although people with comprehension problems can appear to have intact production, and vice versa, this may not necessarily be the case.
“Our data suggest that these problems would be expected to always at least partly coincide. On the other hand, our data confirm the idea that many different processes in the language system, such as understanding meaning or grammar, can at least partly be damaged independently of each other,” according to Menenti.
The new study appears in the August issue of Psychological Science.
Source: Association for Psychological Science