NEW RESEARCH INCREASES OUR UNDERSTANDING OF HOW WE LEARN AND UNDERSTAND LANGUAGE AND HOW THIS PROCESS CAN GO AWRY
For immediate release.
NR-11-05 (11/13/05). For more information, please contact Tracey Somers at (202) 462-6688 or tsomers@sfn.org.
WASHINGTON, DC, November 13, 2005 — Researchers continue to uncover new and exciting clues about how we learn and understand language, as well as how this process can go awry. Among the findings are research techniques that can predict, at six months of age, whether children will struggle with language and reading later in life. The findings are already leading to new methods for overcoming language and learning problems.
“The new studies uncover the workings of the brain and may lead to a better understanding, earlier diagnosis, and more effective treatment of developmental and acquired language disorders,” says Guinevere F. Eden, DPhil, an expert on the neural basis of reading disability at Georgetown University.
A mere one-third of 4th graders read proficiently, according to the National Center for Education Statistics, part of the U.S. Department of Education, and reading ability has changed very little in the past 30 years. Dyslexia, one such reading impairment, affects 5 to 17 percent of the population and accounts for the majority of all learning disabilities. New learning and language research may help improve these statistics.
For the first time, it has been demonstrated that rapid auditory processing speed, measured at six months of age, is the single best predictor of language development in young children, according to Paula A. Tallal, PhD, and her colleague April Benasich, PhD, at Rutgers University. Rapid auditory processing speed is the time it takes someone to distinguish the acoustic differences between individual sounds, in the time range important for processing the speech sounds that make up words. Furthermore, their study has demonstrated that slow auditory processing speed, together with gender, correctly predicts over 90 percent of the children who by age three will fall into the “impaired” range in verbal reasoning abilities.
“These findings provide insight into how we might use these cognitive and neuroscientific findings to develop more effective intervention strategies for struggling readers,” says Tallal. This is the first prospective, longitudinal study of its type, in which the perceptual, cognitive, and language development of normally developing as well as “at risk” children has been followed, beginning in infancy. “These studies show that there is a language to literacy continuum,” explains Tallal, “and that early individual differences in rapid auditory processing both precede and predict subsequent language and reading development and disorders.”
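The release does not describe the statistical model behind that 90 percent figure. As a purely hypothetical sketch of the general idea, the code below fits a generic logistic-regression classifier that predicts a categorical outcome from an infant's auditory processing threshold and gender; the data, variable names, and coefficients are all synthetic, not the researchers' actual measures or model.

```python
# Hypothetical sketch only: the release does not describe the researchers'
# statistical model, and all data below are synthetic. This shows, in
# general terms, how a two-predictor classifier (auditory processing
# threshold plus gender) could flag later language risk.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Assumed predictor 1: rapid auditory processing threshold at 6 months
# (ms needed to distinguish two successive tones; higher = slower).
threshold_ms = rng.normal(70.0, 20.0, n).clip(20.0, 150.0)
# Assumed predictor 2: gender coded 0/1.
gender = rng.integers(0, 2, n)

# Synthetic outcome: "impaired" verbal reasoning at age 3, made more
# likely here by slower processing (an assumption for illustration).
p_impaired = 1.0 / (1.0 + np.exp(-(0.06 * (threshold_ms - 90.0) + 0.4 * gender)))
impaired = rng.random(n) < p_impaired

X = np.column_stack([threshold_ms, gender])
model = LogisticRegression().fit(X, impaired)
print(f"training accuracy: {model.score(X, impaired):.0%}")
```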
Additional research indicates that intensive, individually adaptive behavioral training might improve rapid auditory processing speed, language, and reading skills in struggling K–12 readers. Tallal and her colleagues developed a series of computer “exercises,” called Fast ForWord, that incorporate two approaches based on their research. The first trains children to pay attention to and remember sequences of sounds at increasing rates, until they reach the rates typically found in the sound changes that characterize ongoing speech. The second acoustically modifies (sequentially extends and amplifies) the rapidly successive sound changes that occur in ongoing speech; this acoustically modified speech is then used in a series of exercises to intensively cross-train individual components of language and reading over multiple levels. As a rule, as language performance improves, the amount of acoustic modification adaptively decreases until all the exercises can be performed accurately with normal speech.
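A minimal sketch of that adaptive principle, assuming a simple staircase rule: the speech modification is tracked as a time-stretch factor that shrinks toward normal-rate speech as the learner answers correctly. This is not Scientific Learning's actual algorithm; the step size and bounds are invented for illustration.

```python
# Minimal sketch of the adaptive principle described above. This is NOT
# Scientific Learning's actual algorithm: the stretch factor, step size,
# and bounds are invented for illustration.
def update_stretch(stretch: float, correct: bool, step: float = 0.1,
                   floor: float = 1.0, ceiling: float = 2.0) -> float:
    """Return the next time-stretch factor (1.0 = normal-rate speech)."""
    if correct:
        stretch -= step   # performance is good: reduce the modification
    else:
        stretch += step   # performance slipped: make speech easier again
    return min(max(stretch, floor), ceiling)

# Example session: the modification shrinks toward normal speech as the
# learner keeps responding correctly.
stretch = 2.0             # start with heavily extended, amplified speech
for trial_correct in [True, True, False, True, True, True, True]:
    stretch = update_stretch(stretch, trial_correct)
    print(f"stretch factor: {stretch:.1f}")
```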
Fast ForWord is a product of Scientific Learning Corporation of Oakland, CA, a company Tallal co-founded with Michael Merzenich, PhD, of the University of California at San Francisco. Both are directors of the company.
“Results show that most children who adequately complete one or more Fast ForWord language or reading programs make substantially more rapid improvement in a variety of language and reading skills than control children receiving standard language or reading intervention,” says Tallal. These interventions have been studied with struggling readers both in the lab and in schools. Neuroimaging studies using fMRI—a non-invasive imaging tool that can measure activity in the brain while a person is performing a task—as well as electrophysiological recording—another way to obtain brain function data by measuring the brain’s electrical currents—depict brain activity that is associated with the behavioral results, showing that these training techniques “normalize” brain responses in regions important for language and reading in children with language learning impairments.
“This is, to our knowledge, the first neuroscience research successfully translated for broad use in K–12 public education,” notes Tallal. She and her team are currently exploring the potential efficacy of this approach with other populations, including adults learning English as a second language and elderly people experiencing age-associated declines in auditory processing speed and memory, speech processing, and language skills.
Another novel study of language and learning impairments has produced a consensus on which brain regions are involved in reading and how they differ in individuals with developmental dyslexia, according to Guinevere F. Eden, DPhil, Jose Maisog, MD, and their colleagues at Georgetown University. Eden and her team analyzed and summarized the last decade’s functional neuroimaging research on dyslexia—an analysis that had not been done before in the dyslexia field. “Understanding the underlying neurological problem causing dyslexia could aid in the development of effective treatment,” notes Eden. “However, to date, the brain imaging studies addressing this question have not yet reached a definitive consensus.”
Dyslexia is associated with a weakness in phonological processing. Listening to and understanding speech involves identifying the individual sounds that make up words; phonological processing is the ability to identify those sounds and, from them, the words they combine to form.
Eden and her team conducted meta-analyses of published fMRI and positron emission tomography (PET)—another non-invasive tool used to study brain function—studies that compared dyslexic and non-dyslexic readers. The researchers limited the analysis to studies that used tasks involving words, pseudowords, or letters in the subjects’ native alphabetic language, and to studies of healthy, post-pubertal dyslexic or non-dyslexic teens and adults with no history of neurologic or psychiatric disorders who were matched for age and handedness.
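The release does not say how the individual studies were combined. As a generic, hypothetical illustration of meta-analytic pooling, the sketch below computes an inverse-variance weighted average over a handful of invented standardized group differences:

```python
# Generic, hypothetical illustration of meta-analytic pooling; the release
# does not say which method Eden's team used, and the numbers below are
# invented. Each entry: (standardized dyslexic-vs-control difference d,
# variance of d) for one imaginary study.
import math

studies = [(-0.8, 0.10), (-0.5, 0.08), (-0.6, 0.12), (-0.3, 0.09)]

# Fixed-effect inverse-variance weighting: more precise studies count more.
weights = [1.0 / var for _, var in studies]
pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"pooled effect: d = {pooled_d:.2f} +/- {1.96 * pooled_se:.2f} (95% CI)")
```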
The majority of studies showed that typical readers activated left-sided brain areas more than dyslexics did. Specifically, the amount of activity in two posterior regions (involved in word recognition and in phonologically manipulating words, respectively), as well as in left frontal regions involved in the articulatory output of words, is most likely to distinguish good from poor readers. Dyslexics showed few compensatory increases in brain activity, activating only a handful of right-sided areas (the thalamus and insula) more than non-dyslexic readers, according to Eden, Maisog, and their colleagues. This finding disconfirms one currently held hypothesis: that dyslexics compensate for underactivity in posterior regions of the brain with increased activation in left frontal regions.
“Instead, activity in the right anterior insula part of the brain in dyslexics may indicate that dyslexics are experiencing a negative emotional response while reading, since people with reading difficulties may find reading unpleasant,” suggests Eden. “Future use of this meta-analytical approach will help identify brain regions altered in dyslexia during non-language tasks, such as processing involving the senses, and also may serve to advance our understanding of the neural basis of dyslexia in children.”
Other researchers have examined general language development. For the first time, Marco Iacoboni, MD, PhD, and his team of researchers at the University of California, Los Angeles, have demonstrated that spontaneous, natural hand gestures accompanying speech surprisingly activate regions of the brain important for processing language information. “These findings are particularly interesting, because the gestures are hand movements—not language—implying that the activation of language areas in the brain must be due to linguistic information being transmitted through action,” says Iacoboni. The hand gestures also simultaneously activated areas of the brain involved in action representation.
Iacoboni and his team studied the brains of six men and six women while they viewed a 25-minute videotape of someone retelling a vivid storyline from a cartoon show, using accompanying hand gestures. The researchers used fMRI to look at the pattern of blood flow in the subjects’ brains while they were watching the video. The hand gestures in the video included two of the most common types of gestures produced during speech—iconic and beat gestures.
Iconic gestures represent meaningful objects or actions, as when a person gestures how to open an umbrella or shows the shape of the vase he or she is talking about while telling a story. Beat gestures mark themes and provide emphasis and rhythm during spoken language, as when a conference speaker moves his or her hands up and down to emphasize the key points of a presentation.
When watching iconic gestures, subjects showed increased activity in the fronto-parietal area of the left hemisphere of the brain—an area previously implicated in understanding meaningful actions. More strikingly, when viewing beat gestures, subjects showed increased activity in the fronto-temporal area of the brain, which is generally activated by listening to speech. These brain regions also process action-related information, such as during imitation.
The fact that seeing gestures activated regions in the brain relevant to both language and action implies that these brain areas rely on the same cognitive capacities and that, during speech, the language system and the motor system are linked through gesture. “This approach may be applied to further study the theory that language evolved out of a manual system of communication,” notes Iacoboni.
Another group of researchers, who also study typical language learning and brain development, has identified a region of the human brain that is more active when people view letters and words than when they view other types of visual objects, such as line drawings. Furthermore, they found that the strength of activity in this brain region depends on subjects’ prior experience with those letters and words. “These are novel findings that provide insight into the development of the human brain,” explains Chris I. Baker, PhD, at the Massachusetts Institute of Technology (MIT). “These findings are important, because they suggest that experience alone is sufficient to produce brain regions that are selectively responsive to categories of visual objects.”
Baker and his collaborators in Nancy Kanwisher’s lab at MIT used fMRI to monitor the brain activity of 25 people with no knowledge of written Hebrew and of 16 Hebrew readers while they viewed a variety of visual stimuli: English words and consonant strings, line drawings, Hebrew words, Chinese characters, and digit strings.
The researchers found that, in most participants, a small region toward the back of the brain on the left side was more active when subjects viewed letter strings and words than when they viewed other visual objects such as line drawings and digit strings. “This finding shows that there is a region of the human brain that is selectively activated by words and letter objects, similar to selective regions that have been previously identified for natural visual objects such as faces and scenes,” says Baker.
To study the role that experience with letters plays in this brain area, the researchers used fMRI to compare brain activity while participants viewed both familiar and unfamiliar letter strings. Baker and his team found that activity in the word- and letter-selective region when viewing Hebrew words was much greater in readers of Hebrew than in people with no knowledge of Hebrew. “This increased brain activity in subjects familiar with Hebrew shows that experience changes the responsiveness and selectivity of this brain region,” says Baker.
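To make the familiar-versus-unfamiliar comparison concrete, here is a hypothetical sketch, with entirely synthetic numbers, of the kind of two-group contrast the release describes: comparing mean response amplitude in the letter-selective region between the 16 Hebrew readers and the 25 non-readers.

```python
# Illustrative only: a two-group comparison of the kind described above,
# with entirely synthetic numbers. We contrast mean response amplitude to
# Hebrew words in the letter-selective region between Hebrew readers and
# non-readers of Hebrew.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
hebrew_readers = rng.normal(0.9, 0.2, 16)   # 16 Hebrew readers (% signal change)
non_readers = rng.normal(0.4, 0.2, 25)      # 25 people with no written Hebrew

t_stat, p_value = stats.ttest_ind(hebrew_readers, non_readers)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```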
The length of time humans have been reading and writing is thought to be insufficient for the development of an evolutionary, genetic preference for letters and words—in contrast to faces, scenes, and other natural objects. “The existence of this letter and word selective brain region and its increased activity related to letter and word experience suggests that experience alone may be able to account for the presence of category-selective regions in the human brain,” explains Baker.
As a next step, Baker and others in the Kanwisher lab are researching whether short-term experience with novel, visual objects can produce similar changes in the human brain.