Towards brain-activity-controlled information retrieval: Decoding image relevance from MEG signals

Neuroimage. 2015 May 15;112:288-298. doi: 10.1016/j.neuroimage.2014.12.079. Epub 2015 Jan 13.

Abstract

We hypothesize that brain activity can be used to control future information retrieval systems. To this end, we conducted a feasibility study on predicting the relevance of visual objects from brain activity. We analyzed both magnetoencephalographic (MEG) and gaze signals from nine subjects who viewed image collages, a subset of which was relevant to a predetermined task. We report three findings: i) the relevance of the image a subject is looking at can be decoded from MEG signals with performance significantly better than chance, ii) fusing gaze-based and MEG-based classifiers significantly improves prediction performance over using either signal alone, and iii) non-linear classification of the MEG signals using Gaussian process classifiers outperforms linear classification. These findings break new ground for building brain-activity-controlled interactive image retrieval systems, as well as systems that exploit feedback from both brain activity and eye movements.
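The pipeline described above (a relevance classifier on MEG features, a parallel gaze-based classifier, and fusion of the two, with a Gaussian process model as the non-linear classifier) can be sketched with off-the-shelf tools. The sketch below is illustrative only and is not the authors' implementation: the simulated feature matrices, the scikit-learn GaussianProcessClassifier with an RBF kernel, the logistic-regression linear baseline, and the probability-averaging fusion rule are all assumptions made for the example.

# Minimal sketch (assumed setup, not the paper's code): per-fixation relevance
# decoding from MEG and gaze features, comparing a linear classifier with a
# Gaussian process classifier and fusing the two modalities by averaging
# predicted probabilities. All data below are random placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Placeholder data: one row per fixated image, binary relevance label.
n_epochs = 400
X_meg = rng.normal(size=(n_epochs, 60))   # hypothetical sensor-level MEG features per epoch
X_gaze = rng.normal(size=(n_epochs, 10))  # hypothetical fixation/saccade features
y = rng.integers(0, 2, size=n_epochs)     # 1 = relevant image, 0 = non-relevant

idx_train, idx_test = train_test_split(np.arange(n_epochs), test_size=0.3, random_state=0)

# Non-linear MEG classifier: Gaussian process with an RBF kernel.
gp_meg = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gp_meg.fit(X_meg[idx_train], y[idx_train])

# Linear baseline on the same MEG features.
lin_meg = LogisticRegression(max_iter=1000)
lin_meg.fit(X_meg[idx_train], y[idx_train])

# Gaze-based classifier.
gp_gaze = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gp_gaze.fit(X_gaze[idx_train], y[idx_train])

# Late fusion: average the relevance probabilities of the two modalities.
p_meg = gp_meg.predict_proba(X_meg[idx_test])[:, 1]
p_gaze = gp_gaze.predict_proba(X_gaze[idx_test])[:, 1]
p_fused = 0.5 * (p_meg + p_gaze)

print("AUC, linear MEG:      ", roc_auc_score(y[idx_test], lin_meg.predict_proba(X_meg[idx_test])[:, 1]))
print("AUC, GP MEG:          ", roc_auc_score(y[idx_test], p_meg))
print("AUC, fused MEG + gaze:", roc_auc_score(y[idx_test], p_fused))

With real MEG epochs and gaze features in place of the random placeholders, the three printed AUC scores would correspond to the comparisons reported in the abstract (linear vs. Gaussian process MEG decoding, and single-modality vs. fused prediction); the equal-weight probability average is just one simple fusion rule chosen for the sketch.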

Keywords: Bayesian classification; Gaussian processes; Gaze signal; Image relevance; Implicit relevance feedback; Information retrieval; Magnetoencephalography.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Algorithms
  • Bayes Theorem
  • Brain / physiology*
  • Eye Movements
  • Female
  • Fixation, Ocular
  • Humans
  • Image Processing, Computer-Assisted
  • Magnetoencephalography / methods*
  • Male
  • Mental Processes / physiology*
  • Normal Distribution
  • Photic Stimulation
  • Young Adult