Neural representations of self-generated thought during think-aloud fMRI

Neuroimage. 2023 Jan;265:119775. doi: 10.1016/j.neuroimage.2022.119775. Epub 2022 Nov 28.

Abstract

Is the brain at rest during the so-called resting state? Ongoing experiences in the resting state vary in unobserved and uncontrolled ways across time, individuals, and populations, yet the role of self-generated thoughts in resting-state fMRI remains largely unexplored. In this study, we collected real-time self-generated thoughts during "resting-state" fMRI scans via the think-aloud method (i.e., think-aloud fMRI), which required participants to report whatever they were currently thinking. We first examined brain activation during the think-aloud condition and found that the significantly activated areas included all brain regions required for speech. We then related divergence in thought content to brain activation during think-aloud and found that content divergence was associated with activation across many brain regions. Finally, we explored the neural representation of self-generated thoughts by performing representational similarity analysis (RSA) at three neural scales: the voxel level using a whole-brain searchlight, the region level using the Schaefer 400-parcel atlas, and the systems level using the Yeo seven networks. We found that "resting-state" self-generated thoughts were distributed across a wide range of brain regions spanning all seven Yeo networks. This study highlights the value of considering ongoing experiences during resting-state fMRI and provides preliminary methodological support for think-aloud fMRI.
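To illustrate the RSA approach named in the abstract, the following is a minimal sketch in Python; it is not the authors' actual pipeline. It assumes hypothetical inputs: thought_embeddings (text-embedding vectors of the reported thoughts, one row per report) and roi_patterns (activation patterns from a single parcel, one row per report). Both representational dissimilarity matrices (RDMs) use correlation distance, and the RSA statistic is the Spearman rank correlation between them; all variable names and placeholder data are illustrative.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_reports = 20
# Placeholder data standing in for the hypothetical inputs described above.
thought_embeddings = rng.normal(size=(n_reports, 300))  # text embeddings of reported thoughts
roi_patterns = rng.normal(size=(n_reports, 150))        # voxel patterns from one parcel

# Representational dissimilarity matrices (condensed upper triangles):
# correlation distance between every pair of thought reports.
model_rdm = pdist(thought_embeddings, metric="correlation")
neural_rdm = pdist(roi_patterns, metric="correlation")

# RSA statistic: rank correlation between the model RDM and the neural RDM.
rho, p = spearmanr(model_rdm, neural_rdm)
print(f"RSA Spearman rho = {rho:.3f} (p = {p:.3f})")

In a searchlight or parcel-wise analysis, this comparison would simply be repeated for each sphere or parcel, yielding a map of RSA correlations across the brain.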

Keywords: Natural language processing; Representational similarity analysis; Self-generated thoughts; Think-aloud fMRI.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain Mapping / methods
  • Brain* / diagnostic imaging
  • Brain* / physiology
  • Cognition
  • Humans
  • Magnetic Resonance Imaging*
  • Speech