Making simulation results reproducible: Survey, guidelines, and examples based on Gradle and Docker

PeerJ Comput Sci. 2019 Dec 9;5:e240. doi: 10.7717/peerj-cs.240. eCollection 2019.

Abstract

This article addresses two research questions related to reproducibility in computer science research. First, a survey on reproducibility, addressed to researchers in the academic and private sectors, is described and evaluated. The survey indicates a strong need for open and easily accessible results; in particular, reproducing an experiment should not require excessive effort. The survey results are then used to formulate guidelines for making research results reproducible. In addition, this article explores four tool-based approaches that could advance the reproducibility of research results. After a general analysis of the tools, three examples based on actual research projects are investigated further and used to evaluate the previously introduced tools. The results indicate that the evaluated tools contribute well to making simulation results reproducible, but, due to conflicting requirements, none of the presented solutions fulfills all intended goals perfectly.
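The tool-based approaches evaluated in the article center on build automation (Gradle) and containerization (Docker). As a hedged illustration of the general idea, not taken from the article itself, a container definition can pin the execution environment so that a simulation is built and run against fixed tool versions on every machine; the image tag, paths, and task names below are illustrative assumptions:

```dockerfile
# Hypothetical sketch: fix the JDK and Gradle versions for a simulation project.
# The base-image tag and the Gradle tasks are assumptions, not from the article.
FROM gradle:6.9-jdk11

WORKDIR /home/gradle/project
COPY . .

# Building inside the container ties the result to the pinned environment,
# so anyone who rebuilds the image reproduces the same toolchain.
RUN gradle build --no-daemon

# Running the simulation is then a single, environment-independent command.
CMD ["gradle", "run", "--no-daemon"]
```

In this scheme, reproducing an experiment reduces to `docker build` followed by `docker run`, which matches the survey finding that reproduction should not require excessive effort.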

Keywords: In-silico research; Reproducibility; Simulation.

Grants and funding

This work was supported by the Austrian Science Fund (FWF) under the CHIST-ERA project CONCERT (project no. I1402). There was no additional external funding received for this study. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.