Neural decoding of inferior colliculus multiunit activity for sound category identification with temporal correlation and transfer learning

Network. 2024 May;35(2):101-133. doi: 10.1080/0954898X.2023.2282576. Epub 2023 Nov 20.

Abstract

Natural sounds are easily perceived and identified by humans and animals. Despite this, the neural transformations that enable sound perception remain largely unknown. The temporal characteristics of sounds are thought to be reflected in auditory assembly responses at the inferior colliculus (IC), and these responses may play an important role in the identification of natural sounds. In this study, we predict which natural sound was heard from multi-unit activity (MUA) signals recorded in the IC. The data were obtained from a publicly accessible international platform. The temporal correlation values of the MUA signals were converted into images. Using two different segment sizes, with and without a denoising method, we generated four subsets for classification. Features were extracted from the images with pre-trained convolutional neural networks (CNNs), and the type of heard sound was classified by transfer learning from the AlexNet, GoogLeNet and SqueezeNet CNNs. Support vector machine (SVM), k-nearest neighbour (KNN), Naive Bayes and ensemble classifiers were used. Accuracy, sensitivity, specificity, precision and F1 score were measured as evaluation parameters. Across all tests, removing the noise improved accuracy significantly. These results will allow neuroscientists to draw meaningful conclusions.
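
To make the described pipeline concrete, the sketch below outlines its main steps in Python, assuming PyTorch/torchvision and scikit-learn: computing a temporal correlation image from a segmented MUA recording, extracting features with a pretrained AlexNet (transfer learning as a fixed feature extractor), and classifying with an SVM. The synthetic MUA array, segment length, number of channels and class labels are placeholders, not the study's data, and the 227×227 input size and linear-kernel SVM are assumptions for illustration.

```python
# Illustrative sketch only: synthetic MUA stands in for the public IC recordings;
# sizes, labels and preprocessing choices are assumptions, not the paper's settings.
import numpy as np
import torch
import torch.nn.functional as F
from torchvision import models
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

def correlation_image(mua_segment):
    """Pairwise temporal correlation of MUA channels, rescaled to a 3-channel image tensor."""
    corr = np.corrcoef(mua_segment)                 # (channels, channels), values in [-1, 1]
    img = (corr + 1.0) / 2.0                        # rescale to [0, 1]
    t = torch.tensor(img, dtype=torch.float32).unsqueeze(0).unsqueeze(0)
    t = F.interpolate(t, size=(227, 227), mode="bilinear", align_corners=False)
    return t.repeat(1, 3, 1, 1)                     # replicate to RGB for AlexNet
                                                    # (ImageNet normalization omitted for brevity)

# Pretrained AlexNet as a fixed feature extractor: drop the final
# classification layer and keep the 4096-d penultimate activations.
alexnet = models.alexnet(weights="DEFAULT")
alexnet.classifier[6] = torch.nn.Identity()
alexnet.eval()

rng = np.random.default_rng(0)
n_trials, n_channels, seg_len, n_classes = 120, 32, 500, 4    # placeholder sizes
mua = rng.standard_normal((n_trials, n_channels, seg_len))    # synthetic MUA segments
labels = rng.integers(0, n_classes, size=n_trials)            # placeholder sound categories

with torch.no_grad():
    feats = np.stack([alexnet(correlation_image(seg)).squeeze(0).numpy() for seg in mua])

X_train, X_test, y_train, y_test = train_test_split(feats, labels, test_size=0.3, random_state=0)
clf = SVC(kernel="linear").fit(X_train, y_train)              # one of the compared classifiers
pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred), "macro-F1:", f1_score(y_test, pred, average="macro"))
```

In the same spirit, the SVM could be swapped for KNN, Naive Bayes or an ensemble classifier, and the feature extractor for GoogLeNet or SqueezeNet, to reproduce the comparisons reported in the abstract.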

Keywords: Multiunit activity; classification; inferior colliculus; sound; temporal correlation; transfer learning.

MeSH terms

  • Animals
  • Bayes Theorem
  • Hearing
  • Humans
  • Inferior Colliculi* / physiology
  • Machine Learning
  • Sound