A probabilistic metric for the validation of computational models

R Soc Open Sci. 2018 Nov 14;5(11):180687. doi: 10.1098/rsos.180687. eCollection 2018 Nov.

Abstract

A new validation metric is proposed that combines a threshold based on the uncertainty in the measurement data with a normalized relative error, and that is robust in the presence of large variations in the data. The outcome of the metric is the probability that a model's predictions are representative of the real world, given the specific conditions and confidence level pertaining to the experiment from which the measurements were acquired. Relative error metrics are traditionally designed for use with a series of data values, but orthogonal decomposition has been employed to reduce the dimensionality of data matrices to feature vectors so that the metric can be applied to fields of data. Three previously published case studies are used to demonstrate the efficacy of this quantitative approach to the validation process in the discipline of structural analysis, for which historical data were available; however, the concept could be applied to a wide range of disciplines and sectors where modelling and simulation play a pivotal role.
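
To give a concrete sense of the approach, the following Python sketch shows one way such a metric could be assembled: each data field is reduced to a feature vector by an orthogonal decomposition (plain SVD is assumed here as one possible choice), a normalized relative error is computed between corresponding components of the measured and predicted feature vectors, and the probability is taken as the fraction of components whose error falls within a threshold set by the measurement uncertainty. The decomposition, the normalization and the probability estimate are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only: the decomposition basis, the error normalization
# and the probability estimate are assumptions, not the paper's formulation.
import numpy as np


def feature_vector(field, n_modes=10):
    """Reduce a 2-D data field to a feature vector via SVD
    (one possible orthogonal decomposition); keep the leading
    singular values as features."""
    s = np.linalg.svd(field, compute_uv=False)
    return s[:n_modes]


def validation_probability(measured, predicted, uncertainty, n_modes=10):
    """Estimate the probability that the model represents the measurement:
    the fraction of feature-vector components whose normalized relative
    error lies within a threshold set by the measurement uncertainty."""
    sm = feature_vector(measured, n_modes)
    sp = feature_vector(predicted, n_modes)
    # Normalized relative error per component (normalized by the measured
    # feature magnitude; a small floor avoids division by zero).
    rel_err = np.abs(sp - sm) / np.maximum(np.abs(sm), 1e-12)
    # Assumed threshold: the relative measurement uncertainty.
    return float(np.mean(rel_err <= uncertainty))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
    # Synthetic "measured" field with noise and a slightly scaled "prediction".
    measured = np.sin(np.pi * x) * np.cos(np.pi * y) + 0.01 * rng.normal(size=x.shape)
    predicted = 1.02 * np.sin(np.pi * x) * np.cos(np.pi * y)
    print(validation_probability(measured, predicted, uncertainty=0.05))
```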

Keywords: computational modelling; model validation; orthogonal decomposition; relative error.

Associated data

  • Dryad: doi:10.5061/dryad.2qp305p