What the hands can tell us about language emergence

Psychon Bull Rev. 2017 Feb;24(1):213-218. doi: 10.3758/s13423-016-1074-x.

Abstract

Why, in all cultures in which hearing is possible, has language become the province of speech and the oral modality? I address this question by widening the lens with which we look at language to include the manual modality. I suggest that human communication is most effective when it makes use of two types of formats: a discrete and segmented code, produced simultaneously with an analog and mimetic code. The segmented code is supported by both the oral and the manual modalities. However, the mimetic code is more easily handled by the manual modality. We might then expect mimetic encoding to be done preferentially in the manual modality (gesture), leaving segmented encoding to the oral modality (speech). This argument rests on two assumptions: (1) The manual modality is as good at segmented encoding as the oral modality; sign languages, established and idiosyncratic, provide evidence for this assumption. (2) Mimetic encoding is important to human communication and best handled by the manual modality; co-speech gesture provides evidence for this assumption. By including the manual modality in our analysis of language in two contexts, when it takes on the primary function of communication (sign language) and when it takes on a complementary communicative function (gesture), we gain new perspectives on the origins and continuing development of language.

Keywords: Gesture; Homesign; Mimetic encoding; Sign language.

Publication types

  • Review

MeSH terms

  • Gestures*
  • Humans
  • Language*
  • Sign Language*