The language loop is found in the left hemisphere in about 90% of right-handed persons and 70% of left-handed persons, language being one of the functions that are performed asymmetrically in the brain. Surprisingly, this loop is also found at the same location in deaf persons who use sign language. (about.com)
“Production of Oral or Written Language. Normal spontaneous speech begins with the intent to communicate followed by the internal organization of the thought, access to the words to be used in expressing the thought or idea and their phonetic representations (word sounds), the initiation of the intention, and finally the actual production (articulation) of speech. Spontaneous writing makes similar demands, except rather than requiring the external articulation of phonemes, the phonemes are converted into written symbols (graphemes). In typical dominance patterns, most of these language functions are mediated primarily by the left hemisphere. Whether the left, right, or both hemispheres are “responsible” for the intent to communicate is unclear. However, the failure to initiate spontaneous communication typically has been associated with left anterior (frontal) lesions.
Language Reproduction. In contrast to language production, language reproduction, in its broadest sense, refers to the ability to reproduce language in either the same or alternate form from which it was perceived. Typically when we think of this aspect of language, we think of the repetition of spoken language or the transcription of spoken or written language. However, reading aloud (as opposed to silent reading for comprehension) also may be considered language reproduction.
Word-finding ability. The ability to associate a “word” with either an internal (thought or recollection) or external (perception) representation of an object or idea is a fundamental function of language. Creating these associations (i.e. words) and then retrieving them, either spontaneously or on cue, appear to be skills relegated to the left hemisphere…
Word recognition. In addition to being able to retrieve a word when needed (verbal expression), linguistic communication also demands that when a word is perceived, either aurally (auditory comprehension) or visually (reading comprehension), its meaning and/or associations are understood (verbal comprehension). Language comprehension may be broken down further into its semantic and syntactic components.
While the left hemisphere clearly is dominant for comprehending both semantics and syntax, again in split-brain studies the right hemisphere has been shown to have some limited semantic capacity and even more limited ability to process syntax independent of the left hemisphere. However, somewhat paradoxically, in the presence of an intact left hemisphere, right hemispheric damage may lead to significant difficulties in appreciating subtle or thematic aspects of communication, especially when metaphors or sarcasm are employed.
Internal use of language. Language not only is used for communicating with others, it also is used internally. It serves as an important base for abstract reasoning and problem solving. While both hemispheres contribute to the development of new and creative insights into the world around us, many of the problems presented to us on a day-to-day basis are represented in verbal terms. Even if not, we often try to assign words to our ideas, motivations, imaginings, and conflicts in order to analyze, manipulate, and weigh their various permutations and potential outcomes. Strictly speaking, what we define as rational thought and abstractive capacities appear to be the application of formal linguistic principles to a particular problem. Again, while the split-brain work has suggested that the right hemisphere certainly is capable of problem solving and decision making (in certain circumstances, apparently even more efficiently than the left hemisphere), it appears that it is the left hemisphere that mediates such thought processes in most individuals.”
Clinical Neuroanatomy by John Mendoza and Anne L. Foundas, p. 346
Functions of different parts of the cortex (according to the Wernicke-Geschwind model)
Reading
Reading aloud. Written language is received by the visual cortex and transmitted to the angular gyrus, which passes it on to Wernicke’s area. The signal is then sent to Broca’s area and the adjacent motor cortex for articulation.
Silent reading involves the visual cortex, the angular gyrus, Wernicke’s area and Broca’s area.
The angular gyrus receives the visual information from the visual cortex and recodes it into auditory form and then transmits it to Wernicke’s area for interpretation.
Speech production. The signal moves from Wernicke’s area to Broca’s area, which then transmits it to the motor cortex. Obviously, spontaneous speech production involves much more than that. This space is reserved for a good explanation.
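To make the routes above easier to compare, here is a minimal Python sketch (mine, not taken from any of the quoted sources) that encodes each task as an ordered list of regions; the dictionary name and the region labels are chosen purely for illustration.

```python
# A minimal sketch (illustrative only, not from the quoted sources): the serial
# routes described above, encoded as ordered lists of regions.
WERNICKE_GESCHWIND_ROUTES = {
    "reading aloud": ["visual cortex", "angular gyrus", "Wernicke's area",
                      "Broca's area", "motor cortex"],
    "silent reading": ["visual cortex", "angular gyrus", "Wernicke's area",
                       "Broca's area"],
    "spontaneous speech": ["Wernicke's area", "Broca's area", "motor cortex"],
}

def describe(task):
    """Return the hypothesized relay sequence for a task as a readable string."""
    return " -> ".join(WERNICKE_GESCHWIND_ROUTES[task])

for task in WERNICKE_GESCHWIND_ROUTES:
    print(f"{task}: {describe(task)}")
```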
Listening
Passive listening.
“During passive listening, activation is almost exclusively limited to the superior temporal areas, possibly due to the fact that no language output (naming) is being required. Sound stimuli that require little or no linguistic analysis, such as noise, pure tones, and passive listening to uninteresting text, produce nearly symmetrical activity in or around the superior temporal gyrus of each hemisphere (Binder et al., 1994). When the task requires listening for comprehension, significant lateralization to the language-dominant hemisphere is present (Schlosser et al., 1998).
Stimuli of higher presentation rates or greater difficulty produce greater activation. When words are presented too slowly, allowing time for the subject to daydream between stimuli, activation is greatly reduced. Tasks that are uninteresting, although “language rich,” may produce activation of primary auditory areas but little activation of language areas. Stimuli that are challenging or interesting produce greater activation.”
Audiology by Ross J. Roeser, Michael Valente, Holly Hosford-Dunn
Active listening
“Brain regions specifically implicated in listening to the spoken word (active listening) have been identified on MRI scans by subtracting the signal from regions (such as auditory cortex) that are engaged when listening to random tones (passive listening) from the total signal produced by listening to speech.
Listening to speech activates:
“Wernicke’s area on the left side, which is thought to permit discrimination of verbal from non-verbal material; the angular gyrus, which identifies phonemes; the middle temporal gyrus (area 21) and area 37, which identify words from phoneme strings and tap into semantic networks located in the left dorsolateral prefrontal cortex (areas 9 and 46) that must be searched to deduce the meaning of speech; and Broca’s area, because when listening to speech we covertly rehearse the articulatory commands needed to pronounce the words, a process referred to as subvocal articulation.”
Neuroscience by Alan Longstaff
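The subtraction logic Longstaff describes (signal during listening to speech minus signal during passive listening to tones) amounts to simple array arithmetic. The sketch below is a toy illustration of that idea: the region labels, activation numbers, and threshold are all invented, not real data.

```python
import numpy as np

# Toy illustration with invented numbers: the subtraction logic described above,
# applied to two fake "activation maps" over a handful of labeled regions.
regions = ["superior temporal (L)", "superior temporal (R)",
           "angular gyrus", "middle temporal gyrus", "Broca's area"]

speech = np.array([0.9, 0.5, 0.7, 0.8, 0.6])   # listening to speech (hypothetical)
tones  = np.array([0.8, 0.7, 0.1, 0.1, 0.1])   # listening to random tones (hypothetical)

difference = speech - tones   # what is attributed to "active listening"
threshold = 0.3               # arbitrary cut-off for this toy example

for name, d in zip(regions, difference):
    label = "speech-specific" if d > threshold else "-"
    print(f"{name:22s} {d:+.2f}  {label}")
```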
Reading
“Clearly reading requires visual processing. Subsequently, in novice readers, the parieto-temporal region (angular gyrus and Wernicke’s area) dismantles words into phonemes so that they can be identified. However, in experienced readers the extra-striate occipito-temporal cortex (area 19) recognizes entire words instantly. Activation of a network that links the supramarginal gyrus (area 40) and area 37 to the anterior part of Broca’s area (area 45), via the insula, allows access to semantic networks in the dorsolateral prefrontal cortex so that the meaning and pronunciation of the words can be retrieved. Finally, either subvocal articulation or reading aloud is accompanied by activation of the whole of Broca’s area, the medial supplementary motor area (area 6), motor areas subserving face and tongue (area 4), and the contralateral cerebellar hemisphere.”
Neuroscience by Alan Longstaff
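The contrast Longstaff draws between novice decoding and instant whole-word recognition can be pictured as two look-up routes. The sight vocabulary and letter-to-sound rules in this sketch are invented for illustration only; they are not part of the quoted text.

```python
# A toy sketch of the novice-versus-experienced contrast described above.
# The vocabulary and letter-to-sound rules are invented for illustration only.
SIGHT_VOCABULARY = {"cat": "/kat/", "dog": "/dog/"}        # instant whole-word route
GRAPHEME_TO_PHONEME = {"c": "k", "a": "a", "t": "t",
                       "d": "d", "o": "o", "g": "g"}       # letter-by-letter route

def read_word(word):
    if word in SIGHT_VOCABULARY:
        # "Experienced reader" route: the whole word is recognized at once.
        return SIGHT_VOCABULARY[word]
    # "Novice reader" route: dismantle the word into phonemes one by one.
    return "/" + "".join(GRAPHEME_TO_PHONEME.get(ch, "?") for ch in word) + "/"

print(read_word("cat"))   # found in the sight vocabulary
print(read_word("tad"))   # unfamiliar word, assembled from letter-sound rules
```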
The Wernicke-Geschwind model
“Norman Geschwind assembled these clues into an explanation of how we use language. When you read aloud, the words (1) register in the visual area, (2) are relayed to the angular gyrus that transforms the words into an auditory code that is (3) received and understood in the nearby Wernicke’s area and (4) sent to Broca’s area, which (5) controls the motor cortex, creating the pronounced word. Damage to the angular gyrus leaves the person able to speak and understand but unable to read. Damage to Wernicke’s area disrupts understanding. (Comment: Reading, both aloud and for comprehension, is usually impaired in Wernicke’s aphasia.) Damage to Broca’s area disrupts speaking (Comment: often also reading aloud).
The general principle bears repeating: complex abilities result from the intricate coordination of many brain areas. Said another way, the brain operates by dividing its mental functions – speaking, perceiving, thinking, remembering – into subfunctions. Our conscious experience seems indivisible, but the brain computes the word’s form, sound, and meaning using different neural networks… To sum up, the mind’s subsystems are localized in particular brain regions, yet the brain acts as a unified whole.”
Psychology, Seventh Edition in Modules by David G. Myers
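The lesion effects Myers lists can be mimicked with a crude "remove a region and see which routes break" sketch. This is only a toy version of the textbook account: real aphasias are not all-or-none (Wernicke's aphasia, for instance, leaves speech fluent though disordered rather than abolishing it), and the routes below are simplified labels of my own choosing.

```python
# Toy lesion sketch (illustrative only): in this simplified model a task "works"
# when every region on its route is intact. Routes follow the sequence quoted above.
ROUTES = {
    "reading aloud": ["visual cortex", "angular gyrus", "Wernicke's area",
                      "Broca's area", "motor cortex"],
    "understanding speech": ["auditory cortex", "Wernicke's area"],
    "speaking": ["Wernicke's area", "Broca's area", "motor cortex"],
}

def works(task, lesioned):
    """A task is 'preserved' only if no region on its route is lesioned."""
    return all(region not in lesioned for region in ROUTES[task])

for lesion in ["angular gyrus", "Wernicke's area", "Broca's area"]:
    print(f"Damage to {lesion}:")
    for task in ROUTES:
        status = "preserved" if works(task, {lesion}) else "impaired"
        print(f"  {task}: {status}")
```

Run as written, the sketch reproduces the pattern in the quote: an angular gyrus lesion spares speaking and understanding but blocks reading aloud, while a Broca's area lesion spares comprehension but blocks speaking and reading aloud.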
A critique of the Wernicke-Geschwind model
“PET studies have revealed that because visual linguistic stimuli are not transformed into an auditory representation, visual and auditory linguistic stimuli are processed independently by modality-specific pathways that have independent access to Broca’s area. Moreover, because the linguistic processing of visual stimuli can bypass Wernicke’s area altogether, other brain regions must be involved with storing the meaning of words (Mayeux & Kandel, 1991, p. 845; also see Kolb & Whishaw, 1990, pp. 582-583). Thus, not only do there seem to be separate -- parallel -- pathways for processing the phonological and semantic aspects of language, but language processing clearly involves a larger number of areas and a more complex set of interconnections than just those identified by the W-G model (Wernicke-Geschwind model) (Mayeux & Kandel, 1991, p. 845). Indeed, the PET studies support the notion that language production and comprehension involve processing along multiple routes, not just one:
No one area of the brain is devoted to a very complex function, such as 'syntax' or 'semantics'. Rather, any task or function utilizes a set of brain areas that form an interconnected, parallel, and distributed hierarchy. Each area within the hierarchy makes a specific contribution to the performance of the task. (Fiez & Petersen, 1993, p. 287).”
Using PET Toward a Naturalized Model of Human Language Processing by Robert S. Stufflebeam
Active reading
“Recall that according to Wernicke both visual and auditory information are transformed into a shared auditory representation of language. This information is then conveyed to Wernicke’s area, where it becomes associated with meaning before being transformed in Broca’s area into output as written or spoken language… Using PET imaging, they determined how individual words are coded in the brain when the words are read or heard. They found that when words are heard, Wernicke’s area becomes active, but when words are seen but not heard or spoken, there is no activation of Wernicke’s area. The visual information from the occipital cortex appears to be conveyed directly to Broca’s area without first being transformed into an auditory representation in the posterior temporal cortex.”
Essentials of Neural Science and Behavior by Eric R. Kandel, James Harris Schwartz, Thomas M. Jessell
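One way to see what the critique and the Kandel excerpt change is to compare how a seen word is routed in the two views. The sketch below is illustrative only; the region lists are simplified labels of my own and are not taken from the sources.

```python
# Illustrative contrast (not from the quoted sources): under the serial model a
# written word is routed through Wernicke's area; under the revised, parallel
# view the visual route has independent access to Broca's area.
SERIAL_MODEL = {
    "heard word": ["auditory cortex", "Wernicke's area", "Broca's area"],
    "seen word":  ["visual cortex", "angular gyrus", "Wernicke's area", "Broca's area"],
}
PARALLEL_MODEL = {
    "heard word": ["auditory cortex", "Wernicke's area", "Broca's area"],
    "seen word":  ["visual cortex", "Broca's area"],   # bypasses Wernicke's area
}

for name, model in [("serial Wernicke-Geschwind", SERIAL_MODEL),
                    ("parallel / revised", PARALLEL_MODEL)]:
    for stimulus, route in model.items():
        via = "yes" if "Wernicke's area" in route else "no"
        print(f"{name:26s} {stimulus:10s} via Wernicke's area: {via}")
```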