Empirically Identifying and Computationally Modeling the Brain-Behavior Relationship for Human Scene Categorization

J Cogn Neurosci. 2023 Nov 1;35(11):1879-1897. doi: 10.1162/jocn_a_02043.

Abstract

Humans effortlessly make quick and accurate perceptual decisions about the nature of their immediate visual environment, such as the category of the scene they face. Previous research has revealed a rich set of cortical representations potentially underlying this feat. However, it remains unknown which of these representations are suitably formatted for decision-making. Here, we approached this question using neuroimaging and computational modeling. For the empirical part, we collected EEG data and reaction times (RTs) from human participants during a scene categorization task (natural vs. man-made). We then related the EEG data to behavior using a multivariate extension of signal detection theory. We observed a correlation between neural data and behavior specifically between ∼100 msec and ∼200 msec after stimulus onset, suggesting that the neural scene representations in this time period are suitably formatted for decision-making. For the computational part, we evaluated a recurrent convolutional neural network (RCNN) as a model of brain and behavior. Unifying our previous observations in an image-computable model, the RCNN accurately predicted the neural representations, the behavioral scene categorization data, and the relationship between them. Our results identify and computationally characterize the neural and behavioral correlates of scene categorization in humans.
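To make the abstract's core analysis concrete, the sketch below illustrates one common multivariate extension of signal detection theory, the neural distance-to-bound approach: at each EEG time point, a linear classifier separates the two scene categories, and each trial's signed distance to the decision boundary is correlated with that trial's RT. This is a minimal illustration, not the authors' code; the array shapes, variable names, and the use of scikit-learn's LinearSVC are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import LinearSVC
from scipy.stats import spearmanr

def distance_to_bound_correlation(eeg, labels, rts):
    """
    eeg    : (n_trials, n_channels, n_timepoints) preprocessed EEG epochs
    labels : (n_trials,) scene category (0 = natural, 1 = man-made)
    rts    : (n_trials,) reaction times in seconds
    Returns one RT correlation value per time point.
    """
    n_trials, _, n_time = eeg.shape
    corrs = np.empty(n_time)
    for t in range(n_time):
        X = eeg[:, :, t]                      # channel pattern at time t
        clf = LinearSVC(C=1.0).fit(X, labels)
        # Signed distance of each trial to the category boundary;
        # flip the sign for category 0 so that larger values always
        # mean more evidence for the trial's correct category.
        d = clf.decision_function(X)
        d[labels == 0] *= -1
        # Decisions are expected to be faster for trials far from the
        # boundary, so the predicted correlation with RT is negative.
        # (In practice one would cross-validate, fitting the classifier
        # on held-out trials, to avoid circularity.)
        corrs[t], _ = spearmanr(d, rts)
    return corrs
```

Under this scheme, a reliably negative correlation confined to a window such as ∼100-200 msec after stimulus onset would indicate that the representations in that window are formatted in a way the decision process can read out.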

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain Mapping / methods
  • Brain* / diagnostic imaging
  • Humans
  • Pattern Recognition, Visual*
  • Photic Stimulation / methods