Accelerating language emergence by functional pressures

PLoS One. 2023 Dec 14;18(12):e0295748. doi: 10.1371/journal.pone.0295748. eCollection 2023.

Abstract

In language emergence, neural agents acquire communication skills by interacting with one another and with their environment. Through these interactions, agents learn to connect, or ground, their observations to the messages they utter, forming a shared consensus about the meaning of those messages. Such connections form what we refer to as a grounding map. However, these maps are often complicated, unstructured, and contain redundant connections. In this paper, we introduce two novel functional pressures, modeled as differentiable auxiliary losses, that simplify and structure the grounding maps. The first pressure enforces compositionality via topological similarity, a measure that has been discussed previously but has not been modeled or utilized as a differentiable auxiliary loss. The second functional pressure, which is conceptually novel, imposes sparsity in the grounding map by pruning weaker connections while strengthening the stronger ones. We conduct experiments in multiple value-attribute environments with varying communication channels. Our methods achieve improved out-of-domain regularization and faster convergence than baseline approaches. Furthermore, the introduced functional pressures are robust to changes in experimental conditions and can operate with minimal training data. We observe that the functional pressures lead to simpler, more structured emergent languages, whose characteristics differ depending on the pressure employed. Enhancing grounding-map sparsity yields the best performance and the languages with the most compressible grammars. In summary, our novel functional pressures, focusing on compositionality and sparse groundings, expedite the development of simpler, more structured languages while enhancing their generalization capabilities. Exploring alternative types of functional pressures and combining them during agent training may prove beneficial in the ongoing quest for improved emergent languages.
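The abstract does not give the exact loss formulations, so the sketch below is only an illustration of how such differentiable auxiliary pressures could be written in PyTorch: a topological-similarity term that correlates pairwise distances in meaning space with pairwise distances between (soft) messages, and an entropy-based sparsity term that sharpens grounding-map rows. All names here (topsim_loss, sparsity_loss, grounding, alpha, beta) are assumptions for illustration, not the paper's actual API.

```python
import torch

def topsim_loss(meanings: torch.Tensor, message_logits: torch.Tensor) -> torch.Tensor:
    """Differentiable proxy for topological similarity: push pairwise
    distances between message distributions to correlate with pairwise
    distances between the corresponding meanings. Uses Pearson correlation
    over all pairs as a differentiable stand-in for the rank correlation
    usually reported in the emergent-language literature."""
    probs = message_logits.softmax(dim=-1).flatten(1)  # (B, L*V) soft messages
    feats = meanings.float().flatten(1)                # (B, D) meaning vectors
    d_msg = torch.cdist(probs, probs)                  # (B, B) message distances
    d_mean = torch.cdist(feats, feats)                 # (B, B) meaning distances
    dm = d_msg - d_msg.mean()
    dn = d_mean - d_mean.mean()
    corr = (dm * dn).sum() / (dm.norm() * dn.norm() + 1e-8)
    return 1.0 - corr  # minimizing this raises the correlation

def sparsity_loss(grounding: torch.Tensor) -> torch.Tensor:
    """Entropy pressure on a grounding map: each row (one observation
    feature) is pushed toward a single strong connection, pruning weak
    links while reinforcing strong ones."""
    p = grounding.softmax(dim=-1)
    return -(p * (p + 1e-8).log()).sum(dim=-1).mean()

# Hypothetical usage: add the pressures to the agents' task loss,
# weighted by coefficients alpha and beta chosen on validation data.
# total = task_loss + alpha * topsim_loss(x, logits) + beta * sparsity_loss(G)
```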

MeSH terms

  • Consensus
  • Generalization, Psychological
  • Language*
  • Learning
  • Linguistics*

Grants and funding

This research was funded by the CODEGEN QBITS lab of the University of Moratuwa. Kasun Vithanage was funded by the above. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.