Robust Bayesian Regression with Synthetic Posterior Distributions

Entropy (Basel). 2020 Jun 15;22(6):661. doi: 10.3390/e22060661.

Abstract

Although linear regression models are a fundamental tool in statistical science, their estimation results can be sensitive to outliers. While several robust estimation methods have been proposed within frequentist frameworks, statistical inference under these methods is not necessarily straightforward. We here propose a Bayesian approach to robust inference for linear regression models, using synthetic posterior distributions based on the γ-divergence, which enables us to assess the uncertainty of the estimation naturally through the posterior distribution. We also consider shrinkage priors for the regression coefficients to carry out robust Bayesian variable selection and estimation simultaneously. We develop an efficient posterior computation algorithm by adopting the Bayesian bootstrap within Gibbs sampling. The performance of the proposed method is illustrated through simulation studies and applications to well-known datasets.

Keywords: Bayesian bootstrap; Bayesian lasso; Gibbs sampling; γ-divergence; linear regression.
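
To make the abstract's idea concrete, the following is a minimal sketch, not the paper's exact Gibbs-sampling algorithm: a γ-divergence loss for Gaussian linear regression combined with Dirichlet-weighted (Bayesian-bootstrap) optimization to approximate a synthetic posterior. The function names, the value γ = 0.2, and the toy data are illustrative assumptions, not taken from the paper.

    # Minimal sketch (illustrative, not the authors' exact algorithm):
    # gamma-divergence loss for Gaussian linear regression plus a
    # Bayesian-bootstrap (Dirichlet-weighted) approximation of the
    # synthetic posterior.  All names below are hypothetical.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import logsumexp

    def gamma_loss(params, X, y, gamma, w):
        """Weighted gamma-cross-entropy loss for y | x ~ N(x'beta, sigma^2)."""
        beta, log_sigma = params[:-1], params[-1]
        sigma2 = np.exp(2.0 * log_sigma)
        resid = y - X @ beta
        log_f = -0.5 * np.log(2.0 * np.pi * sigma2) - resid**2 / (2.0 * sigma2)
        # -(1/gamma) * log( sum_i w_i f(y_i | x_i)^gamma )
        data_term = -logsumexp(gamma * log_f, b=w) / gamma
        # log of the integral of f^{1+gamma} over y (closed form for a Gaussian)
        log_norm = -0.5 * gamma * np.log(2.0 * np.pi * sigma2) - 0.5 * np.log(1.0 + gamma)
        return data_term + log_norm / (1.0 + gamma)

    def synthetic_posterior_draws(X, y, gamma=0.2, n_draws=200, seed=0):
        """Approximate synthetic-posterior draws via the Bayesian bootstrap."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        start = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], np.log(y.std())]
        draws = []
        for _ in range(n_draws):
            w = rng.dirichlet(np.ones(n))           # Bayesian-bootstrap weights
            fit = minimize(gamma_loss, start, args=(X, y, gamma, w),
                           method="Nelder-Mead")
            draws.append(fit.x)
        return np.asarray(draws)

    # Toy usage: regression data with a few gross outliers in the response.
    rng = np.random.default_rng(1)
    X = np.c_[np.ones(100), rng.normal(size=(100, 2))]
    y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=100)
    y[:5] += 20.0                                    # contaminate 5% of the responses
    post = synthetic_posterior_draws(X, y)
    print(post[:, :3].mean(axis=0))                  # should stay close to (1, 2, -1)

Downweighting observations through the γ-power of the density is what makes the fit resistant to the contaminated responses; the paper's actual sampler augments this idea with shrinkage priors and a Gibbs scheme rather than repeated numerical optimization.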