The language we use influences how we view and understand the world, new research led by the University of Westminster and UCL has found.

This research, which examined British Sign Language and spoken English, is the first to assess whether the brain patterns evoked by spoken words resemble those evoked by the equivalent signs.

Using fMRI, the researchers scanned the brains of people bilingual in spoken British English and British Sign Language, comparing the brain patterns generated by listening to spoken words with those generated by watching the equivalent signs.

Nine words and their equivalent signs, belonging to three semantic categories – fruit, animals and transport – were used in the study. The researchers found that brain patterns were similar at the level of categories, for example whether an item was a fruit or an animal, but that neural patterns for individual items differed.

They concluded that broad semantic information is shared between speech and sign, but that finer semantic detail differs.
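
To make the logic of this comparison concrete, here is a minimal, hypothetical sketch in Python. It uses simulated data rather than anything from the study, and it is not the authors' analysis pipeline: it simply shows why response patterns that share a category-level signal but carry independent item-level detail correlate more strongly at the category level than at the level of individual items.

    # Toy sketch with simulated voxel patterns (nothing here comes from the study):
    # speech and sign share a category-level signal but carry independent
    # item-level detail.
    import numpy as np

    rng = np.random.default_rng(0)
    n_voxels, items_per_category = 200, 3

    def similarity(a, b):
        """Pearson correlation between two voxel response patterns."""
        return np.corrcoef(a, b)[0, 1]

    # Shared category-level signal (e.g. "fruit"), plus independent
    # item-level detail for each spoken word and each sign.
    category_signal = rng.normal(size=n_voxels)
    spoken = [category_signal + rng.normal(size=n_voxels) for _ in range(items_per_category)]
    signed = [category_signal + rng.normal(size=n_voxels) for _ in range(items_per_category)]

    # Item level: each speech-sign pair shares only the category signal,
    # so cross-language similarity is positive yet well below 1.
    item_sims = [similarity(s, g) for s, g in zip(spoken, signed)]

    # Category level: averaging over items cancels item-specific detail,
    # leaving the shared signal, so cross-language similarity rises.
    category_sim = similarity(np.mean(spoken, axis=0), np.mean(signed, axis=0))

    print(f"mean item-level similarity: {np.mean(item_sims):.2f}")
    print(f"category-level similarity:  {category_sim:.2f}")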

Speaking about the findings, lead researcher Dr Samuel Evans said: “Our results show that British Sign Language and spoken English generate similar, but slightly different meanings in the human brain.

“This suggests that the language we use, whether it is a visual or auditory language, influences the way that we view and interact with the world. This was unexpected as up until now, we have assumed that sign languages and spoken languages are processed in a very similar way in the brain.”

Read the full paper in the peer-reviewed journal Current Biology.
 
