Colour Association with Music Is Mediated by Emotion: Evidence from an Experiment Using a CIE Lab Interface and Interviews

PLoS One. 2015 Dec 7;10(12):e0144013. doi: 10.1371/journal.pone.0144013. eCollection 2015.

Abstract

Crossmodal associations may arise at neurological, perceptual, cognitive, or emotional levels of brain processing. Higher-level crossmodal correspondences between musical timbre and visual colour have been investigated previously, though with limited sets of colours. We developed a novel response method that employs a tablet interface to navigate the CIE Lab colour space. The method was used in an experiment in which 27 film music excerpts were presented to participants (n = 22), who continuously manipulated the colour and size of an on-screen patch to match the music. Analysis of the data replicated and extended earlier research: for example, happy music was associated with yellow, music expressing anger with large red colour patches, and sad music with smaller patches towards dark blue. Correlation analysis suggested patterns of relationships between audio features and colour patch parameters. Using partial least squares regression, we tested models for predicting colour patch responses from audio features and from ratings of perceived emotion in the music. Parsimonious models that included emotion robustly explained between 60% and 75% of the variation in each of the colour patch parameters, as measured by cross-validated R². To illuminate the quantitative findings, we performed a content analysis of structured spoken interviews with the participants. This provided further evidence of an emotion mediation mechanism, whereby people tended to match colour associations to the perceived emotion in the music. The mixed-methods approach of our study provides strong evidence that emotion can mediate crossmodal associations between music and visual colour. The CIE Lab interface promises to be a useful tool for perceptual ratings of music and other sounds.
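For illustration only (this is not the authors' code): an interface like the one described must render each CIE Lab coordinate the participant navigates to as a displayable on-screen colour. A minimal Python sketch of that conversion step, assuming scikit-image is available; the function name and example values are ours.

    import numpy as np
    from skimage.color import lab2rgb

    def lab_to_srgb(L, a, b):
        """Convert a CIE Lab coordinate (D65 white point) to sRGB in [0, 1].

        L is lightness (0-100); a and b are the green-red and blue-yellow
        opponent axes. Out-of-gamut values are clipped by lab2rgb.
        """
        lab = np.array([[[L, a, b]]], dtype=float)  # shape (1, 1, 3)
        return tuple(lab2rgb(lab)[0, 0])

    # Example: a saturated yellow, the hue the study links to happy music.
    print(lab_to_srgb(80.0, -5.0, 70.0))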
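The regression step can be sketched in the same spirit. This hypothetical snippet (using scikit-learn, with random placeholder data rather than the study's measurements) shows the general shape of the analysis: a partial least squares model predicting a colour patch parameter from stacked audio features and emotion ratings, scored by cross-validated R².

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    n_excerpts = 27                                     # as in the experiment
    audio_features = rng.normal(size=(n_excerpts, 8))   # placeholder audio descriptors
    emotion_ratings = rng.normal(size=(n_excerpts, 3))  # placeholder emotion ratings
    X = np.hstack([audio_features, emotion_ratings])
    y = rng.normal(size=n_excerpts)                     # placeholder patch parameter, e.g. lightness

    pls = PLSRegression(n_components=2)                 # a parsimonious low-rank model
    y_pred = cross_val_predict(pls, X, y, cv=5)
    # Random data yields a score near zero or below; informative features would not.
    print(f"cross-validated R^2: {r2_score(y, y_pred):.2f}")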

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Auditory Perception / physiology*
  • Color Perception / physiology*
  • Emotions / physiology*
  • Female
  • Humans
  • Male
  • Music
  • User-Computer Interface*

Grants and funding

The work was supported by Tier 1 grant #4010958, Academic Research Fund, Ministry of Education, Singapore.