Ten questions to consider when interpreting results of a meta-epidemiological study: the MetaBLIND study as a case

Res Synth Methods. 2020 Mar;11(2):260-274. doi: 10.1002/jrsm.1392. Epub 2020 Jan 20.

Abstract

Randomized clinical trials underpin evidence-based clinical practice, but flaws in their conduct may lead to biased estimates of intervention effects and hence invalid treatment recommendations. The main approach to the empirical study of bias is to collate a number of meta-analyses and, within each, compare the results of trials with and without a methodological characteristic such as blinding of participants and health professionals. Estimated within-meta-analysis differences are combined across meta-analyses, leading to an estimate of mean bias. Such "meta-epidemiological" studies are published in increasing numbers and have the potential to inform trial design, assessment of risk of bias, and reporting guidelines. However, their interpretation is complicated by issues of confounding, imprecision, and applicability. We developed a guide for interpreting meta-epidemiological studies, illustrated using MetaBLIND, a large study on the impact of blinding. Applying generally accepted principles of research methodology to meta-epidemiology, we framed 10 questions covering the main issues to consider when interpreting the results of such studies, including risk of systematic error, risk of random error, issues related to heterogeneity, and theoretical plausibility. We suggest that readers of a meta-epidemiological study reflect comprehensively on the research question posed in the study, whether an experimental intervention was unequivocally identified for all included trials, the risk of misclassification of the trial characteristic, and the risk of confounding, i.e., the adequacy of any adjustment for likely confounders. We hope that our guide is helpful for readers of meta-epidemiological studies.
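
To make the two-stage approach described in the abstract concrete, the sketch below illustrates how within-meta-analysis differences might be computed and then combined across meta-analyses into an estimate of mean bias (a ratio of odds ratios, ROR). This is a minimal illustration under simplifying assumptions: the trial data are hypothetical, and simple inverse-variance (fixed-effect) pooling is used at both stages, whereas actual meta-epidemiological studies such as MetaBLIND typically use hierarchical or Bayesian models that account for between-meta-analysis heterogeneity.

```python
import math

# Hypothetical trial records: each has a log odds ratio (log_or), its
# standard error (se), and a flag for the trial characteristic of
# interest (here, blinding). Trials are grouped by meta-analysis.
meta_analyses = {
    "MA1": [
        {"log_or": -0.45, "se": 0.20, "blinded": True},
        {"log_or": -0.30, "se": 0.25, "blinded": True},
        {"log_or": -0.80, "se": 0.30, "blinded": False},
    ],
    "MA2": [
        {"log_or": 0.10, "se": 0.15, "blinded": True},
        {"log_or": -0.25, "se": 0.22, "blinded": False},
        {"log_or": -0.40, "se": 0.28, "blinded": False},
    ],
}

def pooled(trials):
    """Inverse-variance pooled log odds ratio and its variance."""
    weights = [1.0 / t["se"] ** 2 for t in trials]
    est = sum(w * t["log_or"] for w, t in zip(weights, trials)) / sum(weights)
    return est, 1.0 / sum(weights)

# Stage 1: within each meta-analysis, contrast trials without the
# characteristic against trials with it (log ratio of odds ratios).
rors = []
for name, trials in meta_analyses.items():
    with_flag = [t for t in trials if t["blinded"]]
    without_flag = [t for t in trials if not t["blinded"]]
    if not with_flag or not without_flag:
        continue  # no within-meta-analysis comparison possible
    est_with, var_with = pooled(with_flag)
    est_without, var_without = pooled(without_flag)
    rors.append((est_without - est_with, var_with + var_without))

# Stage 2: combine the within-meta-analysis differences across
# meta-analyses (again with inverse-variance weights) to obtain an
# estimate of mean bias, expressed as a ratio of odds ratios.
w = [1.0 / v for _, v in rors]
mean_log_ror = sum(wi * d for wi, (d, _) in zip(w, rors)) / sum(w)
se_log_ror = math.sqrt(1.0 / sum(w))
print(f"Mean ROR: {math.exp(mean_log_ror):.2f} "
      f"(95% CI {math.exp(mean_log_ror - 1.96 * se_log_ror):.2f} "
      f"to {math.exp(mean_log_ror + 1.96 * se_log_ror):.2f})")
```

An ROR below 1 (for a beneficial outcome coded so that smaller odds ratios favour the intervention) would suggest that trials lacking the characteristic report exaggerated effects; the interpretive questions raised in the paper (confounding, misclassification, imprecision, heterogeneity) concern whether such an estimate can be taken at face value.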

MeSH terms

  • Bias*
  • Empirical Research
  • Epidemiologic Studies*
  • Evidence-Based Practice
  • Humans
  • Meta-Analysis as Topic*
  • Randomized Controlled Trials as Topic
  • Research Design