Deep audio embeddings for vocalisation clustering

PLoS One. 2023 Jul 10;18(7):e0283396. doi: 10.1371/journal.pone.0283396. eCollection 2023.

Abstract

The study of non-human animals' communication systems generally relies on transcribing vocal sequences into a finite set of discrete units. This set is referred to as a vocal repertoire, which is specific to a species or to a sub-group within a species. When conducted by human experts, the formal description of vocal repertoires can be laborious and/or biased, which motivates computerised assistance for this procedure; machine learning algorithms represent a good opportunity here. Unsupervised clustering algorithms are suited to grouping close points together, provided they operate on a relevant representation. This paper therefore studies a new method for encoding vocalisations that enables automatic clustering to alleviate vocal repertoire characterisation. Borrowing from deep representation learning, we use a convolutional auto-encoder network to learn an abstract representation of vocalisations. We report on the quality of the learnt representation, as well as that of state-of-the-art methods, by quantifying their agreement with expert-labelled vocalisation types from 8 datasets of other studies across 6 species (birds and marine mammals). With this benchmark, we demonstrate that auto-encoders improve the relevance of vocalisation representations for repertoire characterisation while requiring very few settings. We also publish a Python package for the bioacoustic community to train their own vocalisation auto-encoders or use a pretrained encoder to browse vocal repertoires and ease unit-wise annotation.
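The pipeline the abstract describes, learning a compact embedding of each vocalisation with an auto-encoder and then clustering the embeddings into candidate repertoire units, can be sketched as follows. This is a minimal illustration, not the paper's method or its published package: the synthetic "spectrogram" vectors, the layer sizes, and the plain NumPy training loop are all assumptions, and a tiny linear auto-encoder stands in for the paper's convolutional one.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)

# Synthetic stand-in for spectrogram frames: two hypothetical call types,
# each a tight cluster of 64-dimensional vectors (illustrative data only).
n, d, k = 200, 64, 8
true_types = np.arange(n) % 2
centers = rng.normal(0.0, 1.0, (2, d))
X = centers[true_types] + 0.1 * rng.normal(0.0, 1.0, (n, d))

# Tiny linear auto-encoder trained by gradient descent on the mean squared
# reconstruction error; the paper uses a convolutional auto-encoder instead.
W_enc = 0.1 * rng.normal(0.0, 1.0, (d, k))
W_dec = 0.1 * rng.normal(0.0, 1.0, (k, d))
lr = 0.1
initial_loss = None
for _ in range(1000):
    Z = X @ W_enc                 # latent codes (the learnt embedding)
    err = Z @ W_dec - X           # reconstruction error
    loss = (err ** 2).mean()
    if initial_loss is None:
        initial_loss = loss
    # Gradients of the mean squared reconstruction error w.r.t. the weights.
    W_dec -= lr * (2.0 / (n * d)) * (Z.T @ err)
    W_enc -= lr * (2.0 / (n * d)) * (X.T @ (err @ W_dec.T))
    final_loss = loss

# Cluster the learnt embeddings, as in unsupervised repertoire discovery,
# and compare the clusters against the (here known) call types.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X @ W_enc)
ari = adjusted_rand_score(true_types, labels)
```

In practice one would replace the synthetic vectors with spectrograms of segmented vocalisations and the linear encoder with a convolutional one; the point of the sketch is the two-stage structure, where clustering quality depends entirely on the relevance of the learnt representation.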

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Animals
  • Cluster Analysis
  • Machine Learning*
  • Mammals
  • Vocalization, Animal

Associated data

  • figshare/10.6084/m9.figshare.23138210.v1
  • figshare/10.6084/m9.figshare.4805749.v5
  • figshare/10.6084/m9.figshare.3470165.v1

Grants and funding

Hervé Glotin received grants ANR-20-CHIA-0014 and ANR-21-CE04-0019, and Ricard Marxer received grant ANR-20-CE23-0012-01, from the Agence Nationale de la Recherche. These grants financed salaries and the computers used to run the experiments. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. To collect the humpback whale data, Renata Sousa-Lima received grants from the Fundação O Boticário de Proteção à Natureza / MacArthur Foundation and the Society for Marine Mammalogy (Small-Grants-in-Aid of Research). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.