The importance of uncertainty quantification in model reproducibility

Philos Trans A Math Phys Eng Sci. 2021 May 17;379(2197):20200071. doi: 10.1098/rsta.2020.0071. Epub 2021 Mar 29.

Abstract

Many computer models have high-dimensional input spaces and require substantial computation time to produce a single model evaluation. Although such models are often 'deterministic', they are nevertheless subject to a wide range of uncertainties. We argue that uncertainty quantification is crucial for computer model validation and reproducibility. We present a statistical framework, termed history matching, for performing a global parameter search by comparing model output with observed data. We employ Gaussian process (GP) emulators to produce fast predictions of model behaviour at arbitrary input parameter settings, allowing output uncertainty distributions to be calculated. History matching identifies the sets of input parameters that yield acceptable matches between observed data and model output, given our representation of uncertainties. Modellers can then simulate the outputs of interest at these identified parameter settings to produce a range of predictions. This variability in model results is crucial for inter-model comparison as well as model development. We illustrate the performance of emulation and history matching on a simple one-dimensional toy model and in an application to a climate model. This article is part of the theme issue 'Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico'.
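As a rough sketch of the ideas in the abstract (not the authors' implementation), the following toy example fits a GP emulator to a few runs of a hypothetical one-dimensional simulator and then computes an implausibility measure for history matching. The simulator, the observation/model-discrepancy variances, the kernel length-scale, and the 3-sigma cutoff are all illustrative assumptions.

```python
import numpy as np

# Hypothetical cheap stand-in for an expensive 1-D simulator
def simulator(x):
    return np.sin(2 * np.pi * x) + 0.2 * x

# Squared-exponential covariance kernel (length-scale is an assumption)
def kernel(a, b, length=0.15, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Fit the GP emulator to a handful of simulator runs
X = np.linspace(0.0, 1.0, 8)
y = simulator(X)
K = kernel(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

def emulate(xs):
    """Posterior mean and variance of the emulator at new inputs xs."""
    Ks = kernel(xs, X)
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = kernel(xs, xs).diagonal() - np.sum(v ** 2, axis=0)
    return mean, np.maximum(var, 0.0)

# History matching: implausibility
#   I(x) = |z - E[f(x)]| / sqrt(Var[f(x)] + var_obs + var_md)
z = simulator(0.3)            # pretend observation from "true" input x* = 0.3
var_obs, var_md = 0.01, 0.01  # assumed observation and model-discrepancy variances
xs = np.linspace(0.0, 1.0, 200)
mean, var = emulate(xs)
impl = np.abs(z - mean) / np.sqrt(var + var_obs + var_md)
not_ruled_out = xs[impl < 3.0]  # inputs surviving a 3-sigma cutoff
```

Inputs with implausibility above the cutoff are ruled out; the surviving "not ruled out yet" set is where further simulator runs and predictions would be concentrated in later waves.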

Keywords: Bayesian methods; emulation; error estimates.