Capturing the musical brain with Lasso: Dynamic decoding of musical features from fMRI data

Neuroimage. 2014 Mar:88:170-80. doi: 10.1016/j.neuroimage.2013.11.017. Epub 2013 Nov 19.

Abstract

We investigated neural correlates of musical feature processing with a decoding approach. To this end, we used a method that combines computational extraction of musical features with regularized multiple regression (LASSO). Optimal model parameters were determined by maximizing the decoding accuracy using a leave-one-out cross-validation scheme. The method was applied to functional magnetic resonance imaging (fMRI) data that were collected using a naturalistic paradigm, in which participants' brain responses were recorded while they continuously listened to pieces of real music. The dependent variables comprised musical feature time series that were computationally extracted from the stimulus. We expected timbral features to yield higher prediction accuracy than rhythmic and tonal ones. Moreover, we expected the areas significantly contributing to the decoding models to be consistent with areas of significant activation observed in previous fMRI research using a naturalistic paradigm. Of the six musical features considered, five could be significantly predicted for the majority of participants. The areas significantly contributing to the optimal decoding models largely agreed with the results of previous studies. In particular, areas in the superior temporal gyrus, Heschl's gyrus, Rolandic operculum, and cerebellum contributed to the decoding of timbral features. For the decoding of the rhythmic feature, we found the bilateral superior temporal gyrus, right Heschl's gyrus, and hippocampus to contribute most. The tonal feature, however, could not be significantly predicted, suggesting higher inter-participant variability in its neural processing. A subsequent classification experiment revealed that segments of the stimulus could be classified from the fMRI data with significant accuracy.
The present findings provide compelling evidence for the involvement of the auditory cortex, the cerebellum and the hippocampus in the processing of musical features during continuous listening to music.
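The decoding scheme described in the abstract, a per-feature LASSO regression over voxel time series with the regularization strength chosen by leave-one-out cross-validation, can be sketched roughly as follows. This is an illustrative reconstruction on synthetic data, not the authors' actual pipeline: the voxel count, alpha grid, and the use of Pearson correlation as the accuracy measure are all assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_scans, n_voxels = 60, 200

# Stand-in for a computationally extracted musical feature time series
# (e.g. a timbral feature sampled at the fMRI scan rate).
feature = np.sin(np.linspace(0, 8 * np.pi, n_scans))

# Synthetic "voxel" responses: most voxels are noise, a few track the feature.
X = rng.normal(size=(n_scans, n_voxels))
X[:, :5] += feature[:, None]

# Choose the LASSO regularization parameter by leave-one-out CV,
# keeping the alpha that maximizes prediction accuracy (here: Pearson r
# between the cross-validated predictions and the true feature).
best_r, best_alpha = -np.inf, None
for alpha in (0.01, 0.1, 1.0):  # hypothetical grid
    pred = cross_val_predict(Lasso(alpha=alpha), X, feature, cv=LeaveOneOut())
    r = np.corrcoef(pred, feature)[0, 1]
    if r > best_r:
        best_r, best_alpha = r, alpha

print(f"best alpha = {best_alpha}, LOO prediction r = {best_r:.2f}")
```

Because LASSO drives most voxel weights to exactly zero, the surviving non-zero coefficients of the optimal model identify the brain areas contributing to the decoding, which is how anatomical statements like those in the abstract can be read off a regression model.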

Keywords: Decoding; Music; Music Information Retrieval; Naturalistic paradigm; Time series; fMRI.

MeSH terms

  • Adult
  • Auditory Cortex / diagnostic imaging
  • Auditory Cortex / physiology*
  • Auditory Perception / physiology*
  • Brain Mapping / methods*
  • Cerebellum / diagnostic imaging
  • Cerebellum / physiology*
  • Female
  • Hippocampus / diagnostic imaging
  • Hippocampus / physiology*
  • Humans
  • Image Processing, Computer-Assisted
  • Magnetic Resonance Imaging
  • Male
  • Music*
  • Signal Processing, Computer-Assisted*
  • Young Adult