Hand movements help us understand language in a noisy environment

Hand gestures, and not lip movements alone as is often thought, help us recognise speech in noisy situations; a combination of lip and hand movements works best of all. This is the conclusion of a study by cognitive neuroscientist Linda Drijvers of Radboud University and the Max Planck Institute. She will defend her PhD thesis on the topic on 13 May.

A recognisable situation: in a busy café with a lot of background noise, your conversation partner tries to explain something to you. What do you look at: their lips or their hand movements? Drijvers: “It’s a bit counterintuitive, but my research shows that in noisy situations hand gestures contribute more to language comprehension than lip movements alone. Hand movements greatly improve a person’s understanding of language in a difficult listening situation, but combining lip and hand movements works best.”

Recognising verbs

Drijvers had test subjects watch videos, overlaid with heavy background noise, in which people pronounced action verbs while making matching hand gestures: words like stirring, sweeping, chopping, rubbing, salting and shoving. Using eye tracking, she recorded where viewers looked and measured how well they understood the verbs with and without the hand movements.

She also examined brainwaves in the parts of the brain that process visual, motor and language information. Drijvers noticed that when hand movements helped people recognise speech, the visual areas, the hand region of the motor cortex, and the brain areas linked to language and meaning all became more active.

To date, science has paid relatively little attention to the visual side of spoken language, and wrongly so, says Drijvers: “Spoken language is usually studied through sound, but language is not only what you hear; it is also what you see. Visual information can make it easier to understand language in difficult listening situations, such as a noisy café, but also when you have a hearing impairment or when Dutch is not your native language.”

Non-native speakers

Drijvers’ research on this latter group showed that non-native speakers find it harder than native speakers to recognise speech sounds in noisy environments. That is unsurprising in itself, but it also reduces the benefit they draw from the information conveyed by the lips and the meaningful information conveyed by the hands: they pick up less from hand and lip movements and are worse at linking the different sources of information. This was reflected in their behaviour, their eye movements and their brain activity.

Drijvers: “For example, in non-native speakers we saw less activity in the brain areas involved in recognising lip movements, in retrieving meaningful information, and in linking speech to hand movements. They were also less able than native speakers to recognise verbs in noisy conditions when only lip movements were available, and they benefited less from lip movements (and therefore from hand movements too), because they were less able to recognise words lost in the noise.”