Efficient encoding of spectrotemporal information for bat echolocation

PLoS Comput Biol. 2021 Jun 28;17(6):e1009052. doi: 10.1371/journal.pcbi.1009052. eCollection 2021 Jun.

Abstract

In most animals, natural stimuli are characterized by a high degree of redundancy, which confines the ensemble of ecologically valid stimuli to a much smaller subspace of the full representation space. Neural encodings can exploit this redundancy and increase sensing efficiency by generating low-dimensional representations that retain all information essential to support behavior. In this study, we investigate whether such an efficient encoding can be found to support a broad range of echolocation tasks in bats. Starting from an ensemble of echo signals collected with a biomimetic sonar system in natural indoor and outdoor environments, we use independent component analysis to derive a low-dimensional encoding of the output of a cochlear model. To show that this compressive encoding retains all essential information, we simulate a range of psychoacoustic experiments with bats, training a set of neural networks to perform the experimental tasks using the encoded echoes as input. The networks perform at least as well as the bats. We conclude that efficient encoding of echo information is feasible and, given its many advantages, very likely to be employed by bats. Previous studies have demonstrated that low-dimensional encodings can support high task performance. In contrast to that work, we show that high performance can also be achieved when the low-dimensional filters are derived from a data set of realistic echo signals rather than being tailored to specific experimental conditions.
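The abstract describes a three-stage pipeline: echoes are passed through a cochlear model, independent component analysis is used to learn a low-dimensional encoding from the resulting spectrotemporal patterns, and neural networks are trained on the encoded echoes to perform the simulated tasks. The sketch below is not the authors' code; it is a minimal illustration of that pipeline in which the sampling rate, filterbank, number of components, network size, and the toy two-class task are all assumptions made for the example, and synthetic signals stand in for the recorded echo ensemble.

```python
"""Minimal sketch of the pipeline outlined in the abstract (illustrative only)."""
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

FS = 400_000          # assumed sampling rate of the sonar recordings (Hz)
N_CHANNELS = 32       # assumed number of cochlear filterbank channels
N_COMPONENTS = 20     # assumed dimensionality of the compressive encoding


def cochlear_model(echo, fs=FS, n_channels=N_CHANNELS):
    """Highly simplified cochlea: bandpass filterbank plus envelope extraction."""
    edges = np.logspace(np.log10(20e3), np.log10(100e3), n_channels + 1)
    channels = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        band = sosfiltfilt(sos, echo)
        env = np.abs(hilbert(band))      # envelope of the band-limited signal
        channels.append(env[::200])      # crude temporal downsampling
    return np.concatenate(channels)      # flattened spectrotemporal pattern


# 1. Ensemble of echo signals (synthetic placeholders for recorded echoes)
rng = np.random.default_rng(0)
echoes = rng.standard_normal((200, 4000))
patterns = np.array([cochlear_model(e) for e in echoes])

# 2. Derive the low-dimensional encoding with ICA
ica = FastICA(n_components=N_COMPONENTS, random_state=0)
codes = ica.fit_transform(patterns)      # compressive code per echo

# 3. Train a network on the encoded echoes for a toy two-class task
labels = rng.integers(0, 2, size=len(codes))   # placeholder task labels
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(codes, labels)
print("training accuracy:", net.score(codes, labels))
```

Note the design point the abstract emphasizes: in this scheme the ICA filters are learned from the statistics of the echo ensemble itself, not from stimuli constructed for a particular experiment, and the same fixed encoding is then reused as input for every task-specific network.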

MeSH terms

  • Acoustic Stimulation
  • Animals
  • Chiroptera / physiology*
  • Cochlea / physiology
  • Echolocation*
  • Vision, Ocular

Associated data

  • figshare/10.6084/m9.figshare.14573652

Grants and funding

The author(s) received no specific funding for this work.