CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility

F1000Res. 2021 Mar 30;10:253. doi: 10.12688/f1000research.51738.2. eCollection 2021.

Abstract

The traditional scientific paper falls short of effectively communicating computational research. To help improve this situation, we propose a system by which the computational workflows underlying research articles are checked. The CODECHECK system uses open infrastructure and tools and can be integrated into review and publication processes in multiple ways. We describe these integrations along multiple dimensions: the importance of the check, who conducts it, its openness, and when it occurs. In collaboration with academic publishers and conferences, we demonstrate CODECHECK with 25 reproductions of diverse scientific publications. These CODECHECKs show that asking for reproducible workflows during a collaborative review can effectively improve executability. While CODECHECK has clear limitations, it may represent a building block in Open Science and publishing ecosystems for improving the reproducibility, appreciation, and, potentially, the quality of non-textual research artefacts. The CODECHECK website is available at https://codecheck.org.uk/.

Keywords: Open Science; code sharing; data sharing; peer review; quality control; reproducibility; reproducible research; scholarly publishing.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Ecosystem*
  • Peer Review*
  • Reproducibility of Results
  • Workflow

Grants and funding

This work was financially supported by the UK Software Sustainability Institute and a Mozilla Science mini-grant. DN is supported by grant PE1632/17-1 from the German Research Foundation (DFG).