Estimating the prevalence of missing experiments in a neuroimaging meta-analysis

Res Synth Methods. 2020 Nov;11(6):866-883. doi: 10.1002/jrsm.1448. Epub 2020 Sep 27.

Abstract

Coordinate-based meta-analyses (CBMA) allow researchers to combine the results from multiple functional magnetic resonance imaging experiments with the goal of obtaining results that are more likely to generalize. However, the interpretation of CBMA findings can be impaired by the file drawer problem, a type of publication bias that refers to experiments that are carried out but are not published. Using foci-per-contrast count data from the BrainMap database, we propose a zero-truncated modeling approach that allows us to estimate the prevalence of nonsignificant experiments. We validate our method with simulations and real coordinate data generated from the Human Connectome Project. Application of our method to the data from BrainMap provides evidence for the existence of a file drawer effect, with the rate of missing experiments estimated to be at least 6 per 100 reported. The R code that we used is available at https://osf.io/ayhfv/.
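
As a rough sketch of the zero-truncated counting idea described above (not the authors' model; their R code is at the OSF link), the R snippet below fits a zero-truncated Poisson to hypothetical foci-per-contrast counts and converts the implied probability of a zero-focus experiment into a rate of missing experiments. Real foci counts are overdispersed, so a Poisson is used here only to show the mechanics, and the simulated data, the lambda value, and the function name are illustrative assumptions.

  ## Minimal sketch, assuming simulated counts: fit a zero-truncated Poisson
  ## to observed (>= 1 focus) experiments, then estimate the unseen zero class.
  set.seed(1)
  foci <- rpois(500, lambda = 2.5)   # hypothetical foci-per-contrast counts
  foci <- foci[foci > 0]             # only experiments reporting foci are observed

  ## Negative log-likelihood of the zero-truncated Poisson
  ztp_nll <- function(log_lambda, y) {
    lambda <- exp(log_lambda)
    -sum(dpois(y, lambda, log = TRUE) - log1p(-exp(-lambda)))
  }

  fit <- optimize(ztp_nll, interval = c(-5, 10), y = foci)
  lambda_hat <- exp(fit$minimum)

  ## Zero probability under the untruncated model, and the implied number of
  ## unreported (zero-focus) experiments per 100 reported experiments
  p0 <- exp(-lambda_hat)
  cat(sprintf("lambda = %.2f, missing per 100 reported = %.1f\n",
              lambda_hat, 100 * p0 / (1 - p0)))

With lambda near 2.5 the untruncated zero probability is about 8%, so this toy fit would report roughly 9 missing experiments per 100 reported; the paper's figure of at least 6 per 100 comes from its own model fit to the BrainMap counts.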

Keywords: meta-analysis; neuroimaging; publication-bias; zero-truncated modeling.

Publication types

  • Meta-Analysis

MeSH terms

  • Brain Mapping
  • Computer Graphics
  • Computer Simulation
  • Connectome
  • Data Interpretation, Statistical
  • Databases, Factual
  • Humans
  • Image Processing, Computer-Assisted
  • Magnetic Resonance Imaging / methods*
  • Meta-Analysis as Topic
  • Monte Carlo Method
  • Neuroimaging / methods*
  • Prevalence