Methodological quality assessment should move beyond design specificity

JBI Evid Synth. 2023 Mar 1;21(3):507-519. doi: 10.11124/JBIES-22-00362.

Abstract

Objective: This study aimed to assess the utility of a unified tool (MASTER) for bias assessment against design-specific tools in terms of content and coverage.

Methods: Each of the safeguards in the design-specific tools was compared and matched to safeguards in the unified MASTER scale. The design-specific tools were the JBI, Scottish Intercollegiate Guidelines Network (SIGN), and the Newcastle-Ottawa Scale (NOS) tools for analytic study designs. Duplicates, safeguards that could not be mapped to the MASTER scale, and items not applicable as safeguards against bias were flagged and described.

Results: Many safeguards were common across the JBI, SIGN, and NOS tools, with individual tools containing between 10 and 23 unique safeguards. These 3 design-specific toolsets were missing 14 to 26 safeguards found in the MASTER scale. The MASTER scale provided complete coverage of the safeguards within the 3 toolsets for analytic designs.

Conclusions: The MASTER scale provides a unified framework for bias assessment across analytic study designs, with good coverage, less duplication and redundancy, and greater convenience for methodological quality assessment in evidence synthesis. It also permits assessment across different study designs, which cannot be done with a design-specific tool.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Bias
  • Humans
  • Research Design*