Wired for grammar

- May 4, 2010

Aaron Newman
Prof. Aaron Newman with a brain image showing areas that "light up" when activated. (Nick Pearce Photo)

Since the 19th century, scientists have been absorbed by the process of identifying the parts of the brain that are involved in language.

Back in 1861, a French doctor kick-started this fascination by examining the brain of a recently deceased patient who had an unusual disorder: though he could understand language and did not have any motor impairment, he could not speak or express thoughts in writing. Performing an autopsy, Dr. Paul Broca found a substantial lesion in the left inferior frontal cortex; this discovery, augmented with further research, led him to make the famous statement, “We speak with the left hemisphere.”

That idea has been elaborated on ever since; the primary brain regions that process language, or so the thinking goes, are all on the left: the left inferior frontal cortex (now called Broca's area); a region in the posterior temporal lobe called Wernicke's area (after a German neurologist); and a bundle of nerve fibres connecting them called the arcuate fasciculus. It has been speculated that Broca's area evolved specifically to support the grammar found only in human languages.

But a new study by Aaron Newman at Dalhousie University’s Neuroscience Institute and Brain Repair Centre, and colleagues at the University of Rochester in New York state, suggests this picture is too simple – there are multiple brain regions wired for grammar. Their findings were published in a recent issue of the journal Proceedings of the National Academy of Sciences of the U.S.A.

Brain imaging used

Rather than a single specialized brain area handling all of grammar, the study revealed that different areas are active for different aspects of grammar.

Using brain imaging, researchers examined the living brains of deaf American Sign Language (ASL) users. Videos of sentences being signed were played for the experiment's 14 subjects while they lay in a Magnetic Resonance Imaging (MRI) machine, allowing the researchers to monitor which areas of the brain were activated when processing different types of sentences.

“As a brain area gets more active, it gets more oxygen and it ‘lights up,’” explains Dr. Newman, assistant professor in Dalhousie’s Departments of Psychology, Psychiatry, Surgery, and Pediatrics, and Canada Research Chair (Tier II) in Cognitive Neuroscience.

“What we found is that we understand grammar using a combination of more basic cognitive skills – the same brain areas used for things like short-term memory and sequence processing. This suggests that in the evolution of language these basic tools were ‘recycled’ by combining them in new ways to develop grammar.”

He compares it to a carpenter digging through a toolbox to select what’s needed for a specific task. Depending on the type of grammar used in forming a sentence, the brain will activate a certain set of regions to process it.

“It tells us that there is no one brain area that has evolved for language,” he says. “Instead, we use the cognitive machinery we already had.”

American Sign Language

So why study American Sign Language? Until now, says Dr. Newman, the neural underpinnings of language have been difficult to document because the grammars of different languages encode information in different ways. That makes it hard to match stimuli and subjects across languages.

In English, for example, the order of the words in a sentence “encodes the grammatical dependency relationships.” Dr. Newman gives the example “John kissed Mary”: from word order alone, we can deduce who did what to whom.

But in other languages, such as German, Hungarian and Spanish, word order is less restricted because dependency relationships can be marked by other cues, such as suffixes attached to words that convey their role in the sentence, as the ‘doer’ or ‘receiver’ of an action.

ASL, a natural human language used to communicate among deaf people in Canada and the United States, was ideal to study because it uses both grammatical devices: word order and inflection. Although ASL doesn’t use sound, it follows the same universal patterns of grammar as other human languages.

“By using ASL, we were able to study something we couldn’t do with spoken languages, which is to contrast the two different types of grammar,” says Dr. Newman. “When we did that, we showed they engaged the brain areas differently.”

LINK: "Dissociating neural subsystems for grammar by contrasting word order and inflection" in the Proceedings of the National Academy of Science 

