Monday, 8 December 2014

What happens in the brain when you learn a language?



Scans and neuroscience are helping scientists understand what happens to the brain when you learn a second language.
 
 
Learning a foreign language can increase the size of your brain. This is what Swedish scientists discovered when they used brain scans to monitor what happens when someone learns a second language. The study is part of a growing body of research using brain imaging technologies to better understand the cognitive benefits of language learning. Tools like magnetic resonance imaging (MRI) and electrophysiology, among others, can now tell us not only whether we need knee surgery or have an irregular heartbeat, but also reveal what is happening in our brains when we hear, understand and produce second languages.

The Swedish MRI study showed that learning a foreign language has a visible effect on the brain. Young adult military recruits with a flair for languages learned Arabic, Russian or Dari intensively, while a control group of medical and cognitive science students also studied hard, but not at languages. MRI scans showed that specific parts of the language students’ brains grew in size, whereas the brain structures of the control group remained unchanged. Equally interesting, learners whose growth was in the hippocampus and in areas of the cerebral cortex related to language learning had better language skills than learners for whom the motor region of the cerebral cortex developed more.

In other words, the areas of the brain that grew were linked to how easy the learners found languages, and brain development varied according to performance. As the researchers noted, while it is not completely clear what these changes after three months of intensive language study mean for the long term, brain growth sounds promising.

Looking at functional MRI brain scans can also tell us what parts of the brain are active during a specific learning task. For example, we can see why adult native speakers of a language like Japanese cannot easily hear the difference between the English “r” and “l” sounds (making it difficult for them to distinguish “river” and “liver”, for example). Unlike English, Japanese does not treat “r” and “l” as distinct sounds; instead, a single sound unit (known as a phoneme) covers both.

Brain imaging studies show that when Japanese speakers are presented with English words containing either of these sounds, only a single region of their brain is activated, whereas in English speakers two different areas of activation show up, one for each unique sound.

For Japanese speakers, learning to hear and produce the differences between the two phonemes in English requires a rewiring of certain elements of the brain’s circuitry. What can be done? How can we learn these distinctions?

Early language studies based on brain research have shown that Japanese speakers can learn to hear and produce the difference between “r” and “l” by using a software program that greatly exaggerates the aspects of each sound that make it different from the other. When the sounds were modified and extended by the software, participants were more easily able to hear the difference between them. In one study, after only three 20-minute sessions (just a single hour’s worth), the volunteers learned to distinguish the sounds successfully, even when they were presented as part of normal speech.
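The article does not say exactly how the software exaggerated the sounds, but the best-known /r/-/l/ training research manipulates the third formant (F3), the resonance frequency where the two sounds differ most (low for English /r/, high for /l/). As a purely illustrative sketch under that assumption, the toy Python below builds crude three-tone stand-ins for the two sounds, then “exaggerates” the contrast by lengthening them and pushing their F3 values further apart; every frequency and the scaling factor are made-up illustrative numbers, not values from the study.

```python
# Illustrative sketch only: a toy take on "exaggerate the /r/-/l/ contrast".
# Assumption: the key acoustic difference is the third formant (F3),
# low for English /r/ (~1600 Hz) and high for /l/ (~2600 Hz).
import numpy as np

SR = 16_000  # sample rate in Hz

def synth(f1, f2, f3, dur):
    """Very crude stand-in for a formant synthesizer: three summed sines."""
    t = np.arange(int(SR * dur)) / SR
    return sum(np.sin(2 * np.pi * f * t) for f in (f1, f2, f3))

def exaggerate(f3_a, f3_b, factor=1.5):
    """Push two F3 values further apart from their shared midpoint."""
    mid = (f3_a + f3_b) / 2
    return mid + factor * (f3_a - mid), mid + factor * (f3_b - mid)

# Approximate, illustrative F3 values for /r/ and /l/, then exaggerated.
f3_r, f3_l = 1600.0, 2600.0
f3_r_x, f3_l_x = exaggerate(f3_r, f3_l)

# "Modified and extended" training stimuli: longer duration (0.6 s rather
# than a more natural 0.3 s) plus a wider F3 separation than real speech.
r_stim = synth(350.0, 1200.0, f3_r_x, dur=0.6)
l_stim = synth(400.0, 1100.0, f3_l_x, dur=0.6)
print(f"exaggerated F3: /r/ = {f3_r_x:.0f} Hz, /l/ = {f3_l_x:.0f} Hz")
```

Actual training software would of course work on recordings of natural speech rather than synthetic tones, and plausibly reduce the exaggeration towards normal values as learners improve.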

This sort of research might eventually lead to advances in the use of technology for second-language learning. For example, using ultrasound machines like the ones that show expectant parents the features and movements of their babies in the womb, researchers in articulatory phonetics have been able to teach language learners how to produce particular sounds by showing them visual images of how the tongue, lips, and jaw should move, together with the airstream and the rise and fall of the soft palate.

Ian Wilson, a researcher working in Japan, has published encouraging early reports of studies using these technologies. Of course, researchers aren’t suggesting that ultrasound machines become standard classroom equipment, but savvy software engineers are beginning to capitalise on this new knowledge by incorporating imaging into cutting-edge language learning apps.

Kara Morgan-Short, a professor at the University of Illinois at Chicago, uses electrophysiology to examine the inner workings of the brain. She and her colleagues taught second-language learners to speak an artificial language – a miniature language constructed by linguists to test claims about language learnability in a controlled way.
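To make the idea of an “artificial language” concrete, here is a hypothetical toy sketch in Python; the vocabulary and word-order rule are invented for illustration and are not the grammar Morgan-Short’s team actually used. A handful of made-up words plus one fixed rule can generate as many training sentences as an experimenter needs, while keeping the grammar fully under the researcher’s control.

```python
# Hypothetical toy "artificial language": made-up words, one fixed rule.
# Not the actual grammar from Morgan-Short's experiments.
import random

NOUNS = ["blom", "vode", "pleck"]   # invented nouns
MODIFIERS = ["li", "neu"]           # invented modifiers
VERBS = ["neem", "praz"]            # invented verbs

def noun_phrase():
    """Rule: a modifier always follows its noun."""
    return f"{random.choice(NOUNS)} {random.choice(MODIFIERS)}"

def sentence():
    """Rule: subject phrase, then object phrase, then the verb."""
    return f"{noun_phrase()} {noun_phrase()} {random.choice(VERBS)}"

random.seed(0)
for _ in range(3):
    print(sentence())  # three sample training sentences
```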

In their experiment, one group of volunteers learned through explanations of the rules of the language, while a second group learned by being immersed in the language, similar to how we all learn our native languages. While all of the participants learned, it was the immersed learners whose brain processes were most like those of native speakers. Interestingly, when tested up to six months later, by which time they could not have had any further exposure to the language (being artificial, it exists nowhere outside the lab), these learners still performed well, and their brain processes had become even more native-like.

In a follow-up study, Morgan-Short and her colleagues showed that learners with a particular talent for picking up sequences and patterns learned grammar especially well through immersion. Morgan-Short said: “This brain-based research tells us not only that some adults can learn through immersion, like children, but might enable us to match individual adult learners with the optimal learning contexts for them.”

Brain imaging research may eventually help us tailor language learning methods to our cognitive abilities, telling us whether we learn best from formal instruction that highlights rules, from immersion in the sounds of a language, or perhaps from one followed by the other.

However we learn, this recent brain-based research provides good news. We know that people who speak more than one language fluently have better memories and are more cognitively creative and mentally flexible than monolinguals. Canadian studies suggest that Alzheimer’s disease and the onset of dementia are diagnosed later for bilinguals than for monolinguals, meaning that knowing a second language can help us to stay cognitively healthy well into our later years.

Even more encouraging is that bilingual benefits still hold for those of us who do not learn our second languages as children. Edinburgh University researchers point out that “millions of people across the world acquire their second language later in life: in school, university, or work, or through migration or marriage.” Their results, with 853 participants, clearly show that knowing another language is advantageous, regardless of when you learn it.