Sign language, like spoken language, promotes object categorization in young hearing infants

Cognition. 2021 Oct;215:104845. doi: 10.1016/j.cognition.2021.104845. Epub 2021 Jul 14.

Abstract

The link between language and cognition is unique to our species and emerges early in infancy. Here, we provide the first evidence that this precocious language-cognition link is not limited to spoken language, but is instead sufficiently broad to include sign language, a language presented in the visual modality. Four- to six-month-old hearing infants, never before exposed to sign language, were familiarized to a series of category exemplars, each presented by a woman who either signed in American Sign Language (ASL) while pointing and gazing toward the objects, or pointed and gazed without language (control). At test, infants viewed two images: one, a new member of the now-familiar category; and the other, a member of an entirely new category. Four-month-old infants who observed ASL distinguished between the two test objects, indicating that they had successfully formed the object category; they were as successful as age-mates who listened to their native (spoken) language. Moreover, it was specifically the linguistic elements of sign language that drove this facilitative effect: infants in the control condition, who observed the woman only pointing and gazing, failed to form object categories. Finally, the cognitive advantages of observing ASL narrow quickly in hearing infants: by 5 to 6 months, watching ASL no longer supports categorization, although listening to their native spoken language continues to do so. Together, these findings illuminate the breadth of infants' early link between language and cognition and offer insight into how it unfolds.

Keywords: Categorization; Gesture; Infants; Language; Sign language.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Auditory Perception
  • Female
  • Hearing
  • Humans
  • Infant
  • Language Development
  • Language*
  • Sign Language*