Researchers at the UCL Institute of Cognitive Neuroscience and the Babylab (Birkbeck College, London) are collaborating to study the impact of early speech and language experience on the neural representation of language in infancy. Led by PI Dr. Evelyne Mercure, the group compares hearing infants of Deaf mothers who use a sign language as their dominant language (HoD infants) with hearing infants of hearing mothers (HoH infants). Despite having normal hearing, HoD infants have an early experience of speech and language that differs from that of HoH infants. First, since Deaf mothers are more likely to use signing than auditory speech, their infants are likely to have reduced exposure to auditory spoken language. Second, the experience of HoD infants includes both a visual language (e.g. British Sign Language (BSL)) and an auditory language (e.g. English). The team therefore compared HoD infants to bilingual HoH infants, since this latter group is also exposed to two languages (both auditory), and included a control group of monolingual HoH infants. They hypothesised that the distinctive experience of HoD infants would influence their neural representation of language.
fNIRS data were acquired from all subjects using the Gowerlabs NTS Optical Imaging System. Headgear developed at the Babylab contained an array of 38 source-detector channels with 2 cm source-detector separations, designed to image the temporal and temporoparietal areas of the brain; the latter area was included because it has previously been shown to play an important role in processing sign language. Lead PI Dr. Evelyne Mercure commented that, compared with her previous experiences with EEG and fMRI, she is very happy with fNIRS as a tool for scanning infant brain function: she can scan infants while they are awake and behaving naturally, and her success rate is higher since the subjects tolerate the headgear well.
Each of the three subject groups was shown videos of people telling short stories in four different languages, and their functional responses were then compared. The languages were: English (familiar spoken language), French (unfamiliar spoken language), British Sign Language (familiar sign language to HoD infants) and French Sign Language (unfamiliar sign language). The aim was to assess how the brain represents spoken and signed languages, as well as familiar and unfamiliar languages. To date the team have scanned around 100 infants with the NTS Optical Imaging System for this study!
Preliminary results from the temporal channels suggest that spoken language elicits strong activation in the temporal cortex across all subjects. In the HoH monolinguals the response is highly bilateral; in the HoD infants it is more restricted and left-lateralised, whilst in the HoH bilinguals it is more right-lateralised. The rich dataset collected by this group is still being analysed, and we can expect many more interesting results from this study in the near future.
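To illustrate what "left-lateralised" versus "bilateral" means quantitatively, a common convention in neuroimaging (not necessarily the exact method used by this team) is a laterality index comparing mean responses from left- and right-hemisphere channels. The sketch below uses entirely hypothetical channel values, not data from this study:

```python
# Illustrative sketch only: laterality index LI = (L - R) / (L + R),
# a standard convention in neuroimaging. LI near 0 means a bilateral
# response; LI > 0 means left-lateralised; LI < 0 means right-lateralised.
# Channel values are hypothetical mean haemodynamic response amplitudes,
# NOT data from the study described above.

def laterality_index(left_channels, right_channels):
    """Compute LI from per-channel response amplitudes in each hemisphere."""
    left = sum(left_channels) / len(left_channels)
    right = sum(right_channels) / len(right_channels)
    return (left - right) / (left + right)

# Hypothetical group-mean responses to spoken language:
bilateral_li = laterality_index([0.50, 0.55], [0.52, 0.53])  # near 0
left_lat_li = laterality_index([0.60, 0.65], [0.30, 0.35])   # clearly > 0

print(f"bilateral LI = {bilateral_li:.3f}")
print(f"left-lateralised LI = {left_lat_li:.3f}")
```

The index is bounded between -1 (all right) and +1 (all left) when responses are positive, which makes it easy to compare across groups with different overall activation strengths.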
For more information, please find Dr. Evelyne Mercure’s contact details on the UCL Institute of Cognitive Neuroscience website.