Bimodal bilinguals co-activate both languages during spoken comprehension

Cognition. 2012 Sep;124(3):314-24. doi: 10.1016/j.cognition.2012.05.014. Epub 2012 Jul 7.

Abstract

Bilinguals have been shown to activate their two languages in parallel, and this process can often be attributed to overlap in input between the two languages. The present study examines whether two languages that do not overlap in input structure, and that have distinct phonological systems, such as American Sign Language (ASL) and English, are also activated in parallel. Hearing ASL-English bimodal bilinguals' and English monolinguals' eye movements were recorded during a visual world paradigm, in which participants were instructed, in English, to select objects from a display. In critical trials, the target item appeared with a competing item that overlapped with the target in ASL phonology. Bimodal bilinguals looked more at the competing item than at phonologically unrelated items, and looked more at competing items than monolinguals did, indicating activation of the sign language during spoken English comprehension. The findings suggest that language co-activation is not modality specific, and provide insight into the mechanisms that may underlie cross-modal language co-activation in bimodal bilinguals, including the role that top-down and lateral connections between levels of processing may play in language comprehension.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Comprehension / physiology*
  • Data Interpretation, Statistical
  • Female
  • Humans
  • Language*
  • Male
  • Multilingualism*
  • Psychomotor Performance / physiology
  • Sign Language*
  • Speech Perception / physiology*
  • Young Adult