Bias-variance decomposition of absolute errors for diagnosing regression models of continuous data

Patterns (N Y). 2021 Jul 21;2(8):100309. doi: 10.1016/j.patter.2021.100309. eCollection 2021 Aug 13.

Abstract

Bias-variance decomposition (BVD) is a powerful tool for understanding and improving data-driven models: it reveals the sources of estimation error. The existing literature has defined BVD for squared error but not for absolute error, even though absolute error is the more natural error metric and has shown advantages over squared error in many scientific fields. Here, I analytically derive the absolute-error BVD, empirically investigate its behavior, and compare it with BVDs under other error metrics. Different error metrics offer distinctly different perspectives. I find that the bias/variance trade-off commonly observed under squared error is often absent under absolute error, and that ensembling, a technique that never hurts performance under squared error, can harm it under absolute error. Compared with squared-error BVD, absolute-error BVD better promotes model traits that reduce estimation residuals and better illustrates the relative importance of different error sources. As data scientists pay increasing attention to uncertainty, the technique introduced here can be a useful addition to the data-driven modeler's toolset.
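The article's exact derivation is in the full text; as a minimal illustrative sketch of the contrast it studies, the Python snippet below (the simulated prediction ensemble and the names simulate_predictions, bias_shift, and noise_scale are hypothetical, not from the article) decomposes Monte Carlo prediction errors at a single test point. Under squared error the decomposition is exactly additive with the mean as the central prediction; under absolute error the median is the natural central prediction, and the leftover spread term need not separate cleanly from the bias, which is one reason L1 behavior can differ.

    # Illustrative Monte Carlo bias-variance decomposition at one test point.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_predictions(y_true, n_models=2000, noise_scale=1.0, bias_shift=0.5):
        # Hypothetical stand-in for retraining a model on many datasets:
        # each prediction is the truth plus a systematic shift (bias source)
        # and random training-set-dependent variation (variance source).
        return y_true + bias_shift + noise_scale * rng.standard_normal(n_models)

    y_true = 3.0
    preds = simulate_predictions(y_true)

    # Squared-error BVD: the central prediction is the mean, and
    # MSE = bias^2 + variance holds exactly (with ddof=0).
    bias_sq = (preds.mean() - y_true) ** 2
    var_sq = preds.var()
    mse = np.mean((preds - y_true) ** 2)
    print(f"MSE = {mse:.4f} = bias^2 {bias_sq:.4f} + variance {var_sq:.4f}")

    # Absolute-error analogue: the central prediction is the median
    # (the optimal point prediction under L1 loss).
    bias_abs = abs(np.median(preds) - y_true)
    mae = np.mean(np.abs(preds - y_true))
    spread_abs = mae - bias_abs  # residual spread term; unlike the L2 case it
                                 # need not equal np.mean(np.abs(preds - median)),
                                 # so bias and spread can interact under L1
    print(f"MAE = {mae:.4f} = bias {bias_abs:.4f} + spread {spread_abs:.4f}")

Running the sketch with bias_shift=0 versus bias_shift=0.5 shows the squared-error terms moving independently while the absolute-error terms do not, which is consistent with the abstract's claim that the two metrics offer distinctly different diagnostic perspectives.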

Keywords: L1 loss; absolute error; bias-variance decomposition; continuous data; continuous response; data-driven modeling; empirical models; inductive learning; regressions.