New research shows that the neurological connection between speech and gesture starts at a young age

Even over the phone, Professor of Psychology and Neuroscience Spencer Kelly speaks with his hands. “I’m gesturing as I’m talking right now!” he says.

It’s only natural: gesture is an integral part of language, as Kelly’s research illustrates. He’s interested in the relationship between our bodies and our speech. For his latest study, Kelly worked with collaborators in the Netherlands to learn how young children’s brains process speech and hand motions that happen at the same time. They found that the kids’ brains responded just as adults’ brains do. The finding supports Kelly’s hypothesis that waving our hands while we talk is no accident; it’s part of how we evolved to communicate.

“Everyone thinks speech is language,” Kelly says. But for a long time, he’s been interested in the unspoken part of language. In a 2004 study, just a few years after Kelly started at Colgate in 2001, he and colleagues studied the brains of college students who were watching videos of someone speaking and gesturing. They found evidence that the students’ brains were processing hand gestures in the moment, simultaneously with the spoken words, rather than focusing on the meaning of the speech first and the gestures later.

Kelly wished he could do the same research with children, to find out how early in development our brains tie gesture and speech processing together. “When does this system get wired up?” he says.

For his new study, Kelly finally got a chance to ask this research question of young people. He teamed up with researchers in the Netherlands to study 15 Dutch children who were six or seven years old.

In the experiment, each kid watched 120 two-second videos. The videos showed an adult saying a verb while making a gesture. Half of the time, the word and the gesture matched: for example, the person in the video said “cut” while making a scissors motion with her fingers. In the other videos, the gesture didn’t match: maybe she said “cut” while making a swimming motion with her arms.
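The study’s actual stimulus materials aren’t reproduced here, but the design is simple enough to sketch. Below is a minimal Python sketch of how a 120-trial match/mismatch list like this might be assembled; the verb list and trial structure are illustrative assumptions, not the study’s Dutch stimuli.

```python
import random

# Hypothetical stimulus set: each verb has one matching gesture clip.
# The study used Dutch verbs and recorded videos; these names are stand-ins.
VERBS = ["cut", "swim", "stir", "knock", "throw", "wave"]

def build_trials(n_trials=120, seed=0):
    """Assemble trials: half pair a spoken verb with its own gesture,
    half pair it with a different verb's gesture."""
    rng = random.Random(seed)
    trials = []
    for i in range(n_trials):
        verb = rng.choice(VERBS)
        if i < n_trials // 2:
            gesture = verb  # match: "cut" + scissors motion
        else:
            gesture = rng.choice([v for v in VERBS if v != verb])  # mismatch
        trials.append({"speech": verb, "gesture": gesture,
                       "condition": "match" if gesture == verb else "mismatch"})
    rng.shuffle(trials)  # present the two conditions in random order
    return trials

trials = build_trials()
print(trials[0])  # e.g. {'speech': 'cut', 'gesture': 'swim', 'condition': 'mismatch'}
```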

While the children watched the videos, electrode caps on their heads recorded electrical activity from their brains. Each time a signal travels between brain cells, a tiny bit of electricity escapes. The electrode nets Kelly used, which sit outside the skull, can’t pick up each little electrical burst or pinpoint exactly where in the brain it originated. But the technology can detect large waves of electrical activity. These surges, produced when thousands or millions of neurons fire at once, are called event-related potentials, or ERPs.
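To see why researchers average over many trials, here is a small simulation (not the study’s analysis code) of the underlying logic: a weak, consistent deflection buried in much larger background noise becomes visible once many time-locked trials are averaged. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

fs = 500                        # illustrative sampling rate, in Hz
t = np.arange(0, 0.8, 1 / fs)   # an 800 ms epoch, time-locked to video onset

# A made-up "true" ERP: a small negative deflection peaking near 400 ms,
# standing in for an N400-like component (amplitudes in microvolts).
true_erp = -4.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))

# Each single trial is that ERP buried in much larger background EEG noise.
n_trials = 60
epochs = true_erp + rng.normal(scale=20.0, size=(n_trials, t.size))

# Averaging across trials cancels the random noise (roughly by sqrt(n_trials))
# while the time-locked ERP survives.
average = epochs.mean(axis=0)

single_trial_snr = np.abs(true_erp).max() / epochs[0].std()
averaged_snr = np.abs(true_erp).max() / (epochs.std() / np.sqrt(n_trials))
print(f"single-trial SNR ~ {single_trial_snr:.2f}, averaged SNR ~ {averaged_snr:.2f}")
```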

Kelly and his coauthors looked for a certain wave of electrical activity called the N400. It happens about 400 milliseconds after the brain encounters something unexpected. “I’ll produce an N400 in your head right now,” Kelly says. “Here’s a sentence: ‘The man liked cream and sugar in his socks.’” When the sentence doesn’t end the way we expect, there’s a predictable wave of electricity in our heads, like an internal double-take.

That double-take signal was stronger in the kids’ brains when they saw a gesture that didn’t match the spoken word they were hearing. The result looked like what Kelly had seen in college students’ brains years earlier. It suggests that by age seven, the system in our brains that processes gesture as part of language is up and running smoothly.
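In ERP studies, a comparison like this typically comes down to averaging each participant’s trials per condition, measuring the mean voltage in a window around 400 milliseconds, and testing the match/mismatch difference across participants. Here is a sketch of that logic on simulated data; the 300–500 ms window, the amplitudes, and the paired t-test are illustrative assumptions, not the paper’s reported analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

fs = 500
t = np.arange(0, 0.8, 1 / fs)
window = (t >= 0.3) & (t <= 0.5)    # a typical N400 window, 300-500 ms

# Simulated per-child average waveforms for each condition (15 children,
# matching the study's sample size; all amplitudes are invented).
n_children = 15
match = rng.normal(0.0, 2.0, size=(n_children, t.size))
mismatch = rng.normal(0.0, 2.0, size=(n_children, t.size))
mismatch[:, window] -= 3.0          # mismatches get a more negative N400

# Mean amplitude in the window, per child and condition...
match_amp = match[:, window].mean(axis=1)
mismatch_amp = mismatch[:, window].mean(axis=1)

# ...then a paired t-test across children: is the mismatch response
# reliably more negative than the match response?
t_stat, p_val = stats.ttest_rel(mismatch_amp, match_amp)
print(f"t({n_children - 1}) = {t_stat:.2f}, p = {p_val:.4f}")
```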

This was just what Kelly expected to see, he says. Gesture is such an important part of language that it makes sense we learn to process it early. Even babies use gestures as a part of communication, he says. “They love to point, and they love to pay attention to where you’re looking. All this nonverbal stuff is happening before they even start speaking.”

Kelly hopes to keep pursuing this research with even younger children. With collaborators in Japan, he’s hoping to study two-year-olds. “I have to tell you, it is a challenge putting those nets on the two-year-olds,” he says. He’s studied the brain activity of infants as young as one day old, although those babies are easy, he says. “They’re just kind of lying around, unless they’re crying. But two-year-olds, they have better things they want to do.”

If researchers can manage it, though, they may get more insight into how our ability to process gestures and speech develops. And Kelly thinks this development echoes our long-ago evolution into a speaking species.

Imagine prehistoric humans who lived before speech evolved, he says. “It’s unlikely that someone just woke up speaking Swahili.” Instead, our earliest forms of communication were probably gestural: Pointing. Beckoning. Showing someone else how to pick berries, or how to make a fire by striking two stones together. These gestures would have stuck with us as our spoken language gradually appeared.

It’s hard to scientifically test these kinds of stories, Kelly says. But we can get hints from research about how things might have happened. “By looking at the present-day relationship of language and the body, you can make inferences about these evolutionarily ancient things.”

Recognizing that gesture is an important part of how language evolved could have benefits today, Kelly says. “If we can just broaden our idea of what language is, it might help intervene when something goes wrong.” For example, incorporating gesture alongside spoken words might help therapists rehabilitate people who have lost speech after a stroke or brain trauma. Or it might be helpful for children with speech delays or autism.

Maybe thinking of language as a whole-body experience could even help students to learn a new language. The mouth is a limited workspace for communicating, Kelly says, but we have much more available to us. “Language grew up in the body. So why do we forget our home?”