Thursday, August 6, 2009

Newborn Brain May Be Wired for Speech

By Faith Hickman Brynie
About Faith Hickman Brynie
July 07, 2008

The long, enthusiastic debate about whether the brain is hardwired for language gets a boost now and then, most recently from a book, released several months ago, claiming that we are hardwired to, among other things, curse. Continuing research suggests that even though newborns cannot speak or understand language, our brains may indeed be built for language from birth or even before.

“From the first weeks of life the human brain is particularly adapted for processing speech,” says French researcher Ghislaine Dehaene-Lambertz, director of the cognitive neuroimaging research center at the Institut National de la Santé de la Recherche Médicale. Infants’ language learning and processing rely largely on the same brain circuits that adults use, she says.

Studies employing optical topography, a technique that assesses oxygen use in the brain, have shown activity in left-hemisphere speech centers in newborns just 2 to 5 days old. Marcela Peña of the International School for Advanced Studies in Italy and colleagues found that left-hemisphere activity was greater when the babies heard normal speech than when they heard silence or speech played backward, according to a study published in the Proceedings of the National Academy of Sciences in 2003.

Other behavioral experiments have demonstrated that days- or weeks-old infants can distinguish the “melody” of their native language from the pitches and rhythms of other languages, and that infants can assess the number of syllables in a word and detect a change in speech sounds (such as ba versus ga), even when they hear different speakers.

In 2002 Dehaene-Lambertz’s team used functional magnetic resonance imaging (fMRI) to monitor brain activity while 3-month-old infants listened to 20-second blocks of speech played forward and backward. With forward speech, the same brain regions that adults use for language were active in the babies, with a strong preference for the left hemisphere.

Additional activation in parts of the right frontal cortex was seen in infants who listened to normal speech. The activity occurred in the same brain areas that become active when adults retrieve verbal information from memory.

The French team also found a significant preference for the native language in the babies’ left angular gyrus, an area with increased activity when adults hear words but not nonsense syllables.

In 2006 Dehaene-Lambertz again used fMRI to measure cerebral activity in 3-month-olds who heard short sentences spoken in their native language.

The infants recognized a repeated sentence even after a 14-second interval of silence. The scans showed adultlike activity in the upper region of the brain’s left temporal lobe. The fastest responses were recorded near the auditory cortex, where sounds are first processed in the brain.

Responses slowed down toward the back of the language-processing region and in Broca’s area in the left hemisphere. Activity in that area increased when a sentence was repeated, suggesting that infants may be using a memory system based on Broca’s area just as adults do. These results, reported in the Proceedings of the National Academy of Sciences, demonstrate that the precursors of adult cortical language areas are already working in infants even before the time when babbling begins, Dehaene-Lambertz says.

She offers two possible explanations for these findings. Perhaps certain brain regions are genetically and developmentally “programmed” for language at birth, or even before. Or perhaps these regions are sensitive only to sound or to any rapidly changing sound.

“We do not know yet whether another structured stimulus, such as music, would activate the same network,” Dehaene-Lambertz says. “However, we can say that the processing abilities of an infant’s brain make it efficiently adapted to the most frequent auditory input: speech.”

Edith Kaan, a linguist at the University of Florida, says that researchers are currently studying whether the developing brain handles speech sounds in a different way from other sounds. They also hope to discover how brain regions specialize as children learn to make and understand words, phrases and sentences.

“Eventually, this research may help us understand what capacities are inborn for learning language,” Kaan says. “We may also learn which functions are unique to language and language development, and which are shared with other cognitive activities such as attention, working memory and pattern recognition.”
