Monotone Quantifiers Emerge via Iterated Learning

Cogn Sci. 2021 Aug;45(8):e13027. doi: 10.1111/cogs.13027.

Abstract

Natural languages exhibit many semantic universals, that is, properties of meaning shared across all languages. In this paper, we develop an explanation of one very prominent semantic universal, the monotonicity universal. While existing work has shown that quantifiers satisfying the monotonicity universal are easier to learn, we provide a more complete explanation by considering the emergence of quantifiers from the perspective of cultural evolution. In particular, we show that quantifiers satisfying the monotonicity universal evolve reliably in an iterated learning paradigm with neural networks as agents.
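To make the monotonicity universal concrete, the sketch below (not the paper's code; the universe size and quantifier definitions are illustrative assumptions) models a quantifier as a boolean function over two subsets of a small universe and brute-force checks upward monotonicity in the right argument: if Q(A, B) holds and B ⊆ B′, then Q(A, B′) must also hold. Quantifiers like "some" and "at least two" pass this test, while "exactly two" does not.

```python
from itertools import combinations

# Hypothetical illustration: a quantifier is a function Q(A, B) -> bool
# over subsets of a small universe (size chosen for a fast exhaustive check).
UNIVERSE = frozenset(range(4))

def subsets(s):
    """Yield all subsets of s as frozensets."""
    elems = list(s)
    for r in range(len(elems) + 1):
        for combo in combinations(elems, r):
            yield frozenset(combo)

def is_upward_monotone(Q):
    """Check right upward monotonicity: Q(A, B) and B <= B' imply Q(A, B')."""
    for A in subsets(UNIVERSE):
        for B in subsets(UNIVERSE):
            if not Q(A, B):
                continue
            for Bp in subsets(UNIVERSE):
                if B <= Bp and not Q(A, Bp):
                    return False
    return True

some = lambda A, B: len(A & B) >= 1          # "some A are B"
at_least_two = lambda A, B: len(A & B) >= 2  # "at least two A are B"
exactly_two = lambda A, B: len(A & B) == 2   # "exactly two A are B"

print(is_upward_monotone(some))          # True
print(is_upward_monotone(at_least_two))  # True
print(is_upward_monotone(exactly_two))   # False
```

Growing B can only grow A ∩ B, so cardinality lower bounds ("some", "at least two") survive the enlargement, while exact counts ("exactly two") can be broken by it.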

Keywords: Cultural evolution; Generalized quantifiers; Iterated learning; Neural networks; Semantic universals.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Cultural Evolution*
  • Humans
  • Language
  • Learning*
  • Neural Networks, Computer
  • Semantics