Winsorization for Robust Bayesian Neural Networks

Entropy (Basel). 2021 Nov 20;23(11):1546. doi: 10.3390/e23111546.

Abstract

With the advent of big data and the popularity of black-box deep learning methods, it is imperative to address the robustness of neural networks to noise and outliers. We propose the use of Winsorization to recover model performance when the data may contain outliers and other aberrant observations. We provide a comparative analysis of several probabilistic artificial intelligence and machine learning techniques for supervised learning case studies. Broadly, Winsorization is a versatile technique for accounting for outliers in data; however, different probabilistic machine learning techniques differ in their efficiency on outlier-prone data, with or without Winsorization. We find that Gaussian processes are extremely vulnerable to outliers, while deep learning techniques are in general more robust.

Keywords: Bayesian neural network; Winsorization; concrete dropout; flipout; mixture density networks; uncertainty quantification; variational Gaussian process.
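To illustrate the Winsorization step referenced in the abstract, the following is a minimal sketch of quantile-based clipping as a preprocessing transform. The 5th/95th percentile cutoffs, per-feature treatment, and helper name `winsorize` are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

def winsorize(X, lower=0.05, upper=0.95):
    """Clip each column of X to its empirical [lower, upper] quantiles.

    Note: cutoff fractions are an assumed choice for illustration.
    """
    lo = np.quantile(X, lower, axis=0)
    hi = np.quantile(X, upper, axis=0)
    return np.clip(X, lo, hi)

# Usage example: a feature matrix with a few injected outliers.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
X[:10] += 50.0                 # aberrant observations
X_w = winsorize(X)             # extreme values pulled back to the quantile bounds
print(X.max(axis=0), X_w.max(axis=0))
```

In practice, such a transform would be fit on the training split and applied to both training and test data before feeding any of the probabilistic models compared in the paper.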