Exploring self-generated thoughts in a resting state with natural language processing

Behav Res Methods. 2022 Aug;54(4):1725-1743. doi: 10.3758/s13428-021-01710-6. Epub 2021 Oct 13.

Abstract

The present study seeks to examine individuals' stream of thought in real time. Specifically, we asked participants to speak their thoughts aloud freely during a typical resting-state condition. We first examined the feasibility and reliability of this oral reporting method and found that it did not significantly change the frequency or content characteristics of self-generated thoughts; moreover, its test-retest reliability was high. Having established the method's feasibility, we combined natural language processing (NLP) with the Bidirectional Encoder Representations from Transformers (BERT) model to directly quantify thought content. We analyzed the divergence between self-generated thought content and expressions of sadness and empirically verified the validity and behavioral significance of the metrics computed with BERT. Furthermore, we found that reflection and brooding could be differentiated by detecting this divergence, deepening our understanding of rumination and depression and providing a way to distinguish adaptive from maladaptive rumination. Finally, this study provides a new framework for examining self-generated thoughts in a resting state with NLP, extending research on the continuous content of in-the-moment self-generated thoughts, with applicability to resting-state functional brain imaging.
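
The abstract does not detail the quantification pipeline; the sketch below is a hypothetical illustration of the general idea of measuring semantic divergence between a transcribed self-generated thought and reference expressions of sadness using BERT embeddings. The checkpoint name, mean pooling, cosine distance, and example sentences are assumptions, not the authors' exact method.

    # Hypothetical sketch: divergence between a transcribed thought and
    # sadness expressions via mean-pooled BERT embeddings and cosine distance.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed(texts):
        """Mean-pool the last hidden states into one vector per text."""
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state        # (n, seq_len, dim)
        mask = batch["attention_mask"].unsqueeze(-1)          # (n, seq_len, 1)
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (n, dim)

    # Illustrative inputs (invented for this sketch).
    thought = ["I keep wondering what to cook for dinner tonight."]
    sadness = ["I feel hopeless.", "Everything makes me want to cry."]

    t_vec = embed(thought)                                    # (1, dim)
    s_vecs = embed(sadness)                                   # (2, dim)
    sims = torch.nn.functional.cosine_similarity(t_vec, s_vecs)
    divergence = 1.0 - sims.mean().item()                     # higher = further from sadness
    print(f"divergence from sadness expressions: {divergence:.3f}")

Under these assumptions, a higher divergence score would indicate thought content semantically further from sadness-related language, which is the kind of metric the study relates to reflection versus brooding.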

Keywords: Natural language processing; Resting state; Rumination; Self-generated thoughts; Think-aloud method.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain
  • Brain Mapping*
  • Cognition
  • Humans
  • Natural Language Processing*
  • Reproducibility of Results