THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior

eLife. 2023 Feb 27;12:e82580. doi: 10.7554/eLife.82580.

Abstract

Understanding object representations requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. Here, we present THINGS-data, a multimodal collection of large-scale neuroimaging and behavioral datasets in humans, comprising densely sampled functional MRI and magnetoencephalographic recordings, as well as 4.70 million similarity judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless hypotheses at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows combining datasets for a much broader view of object processing than previously possible. Our analyses demonstrate the high quality of the datasets and provide five examples of hypothesis-driven and data-driven applications. THINGS-data constitutes the core public release of the THINGS initiative (https://things-initiative.org) for bridging the gap between disciplines and advancing cognitive neuroscience.
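To make the behavioral dataset concrete: the 4.70 million similarity judgments were collected with a triplet odd-one-out task, in which participants see three objects and pick the one least similar to the other two. The Python sketch below shows one way such trials could be aggregated into a pairwise similarity matrix over the 1,854 concepts. It is illustrative only; the file name triplets.csv, its column names, and the assumption of 0-based concept indices are hypothetical and need not match the layout of the actual data release.

    import numpy as np
    import pandas as pd

    N_CONCEPTS = 1854  # number of object concepts in THINGS

    # Hypothetical flat-file layout: one row per trial, with three 0-based
    # concept indices and the index of the item chosen as the odd one out.
    trials = pd.read_csv("triplets.csv")  # columns: concept_1, concept_2, concept_3, odd_one_out

    counts = np.zeros((N_CONCEPTS, N_CONCEPTS))  # times a pair appeared in the same triplet
    chosen = np.zeros((N_CONCEPTS, N_CONCEPTS))  # times that pair "survived" the odd-one-out choice

    for _, t in trials.iterrows():
        triplet = [t.concept_1, t.concept_2, t.concept_3]
        # Count every pair in the triplet as having appeared together
        # (upper triangle only, since similarity is symmetric).
        for a in triplet:
            for b in triplet:
                if a < b:
                    counts[a, b] += 1
        # The two items not chosen were implicitly judged most similar.
        a, b = sorted(c for c in triplet if c != t.odd_one_out)
        chosen[a, b] += 1

    # Empirical similarity: fraction of shared trials in which the pair was
    # kept together; pairs that never co-occurred stay NaN.
    with np.errstate(invalid="ignore", divide="ignore"):
        similarity = np.where(counts > 0, chosen / counts, np.nan)

At the scale of millions of trials, a vectorized aggregation (e.g. a pandas groupby over pair identifiers) would be preferable to the row-by-row loop shown here, which is written for clarity rather than speed.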

Keywords: MEG; behavior; fMRI; human; neuroscience; objects; research data; vision.

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, N.I.H., Intramural

MeSH terms

  • Brain Mapping / methods
  • Brain* / diagnostic imaging
  • Humans
  • Magnetic Resonance Imaging / methods
  • Magnetoencephalography / methods
  • Pattern Recognition, Visual* / physiology
  • Reproducibility of Results