A Generalized Model for Robust Tensor Factorization With Noise Modeling by Mixture of Gaussians

IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5380-5393. doi: 10.1109/TNNLS.2018.2796606. Epub 2018 Mar 1.

Abstract

The low-rank tensor factorization (LRTF) technique has received increasing attention in many computer vision applications. Compared with the traditional matrix factorization technique, it better preserves the intrinsic structure information and thus achieves better low-dimensional subspace recovery performance. Basically, the desired low-rank tensor is recovered by minimizing the least-squares loss between the input data and its factorized representation. Since the least-squares loss is optimal only when the noise follows a Gaussian distribution, L1-norm-based methods have been designed to deal with outliers. Unfortunately, they may lose their effectiveness when dealing with real data, which are often contaminated by complex noise. In this paper, we integrate the noise modeling technique into a generalized weighted LRTF (GWLRTF) procedure. The resulting method, termed MoG GWLRTF, treats the original task as an LRTF problem and models the noise with a mixture of Gaussians (MoG). To extend the applicability of the model, two typical tensor factorization operations, i.e., CANDECOMP/PARAFAC (CP) factorization and Tucker factorization, are incorporated into the LRTF procedure. The model parameters are updated under the expectation-maximization (EM) framework. Extensive experiments indicate the respective advantages of the two versions of MoG GWLRTF in various applications and demonstrate their effectiveness compared with other competing methods.
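As a rough illustration of the idea summarized above (a minimal sketch in our own notation, not taken from the paper; the symbols x_i, l_i, pi_k, sigma_k, gamma_ik, and w_i are assumptions introduced here), an MoG noise model typically reduces, within an EM framework, to a weighted least-squares LRTF problem:

% Observation model: each entry of the data tensor is a low-rank entry plus
% noise drawn from a K-component mixture of zero-mean Gaussians.
\[
  x_{i} = l_{i} + e_{i}, \qquad
  p(e_{i}) = \sum_{k=1}^{K} \pi_k \,\mathcal{N}\!\left(e_{i} \mid 0, \sigma_k^{2}\right),
\]
% where l_i denotes an entry of the low-rank tensor L, represented either by a
% CP factorization or by a Tucker factorization:
\[
  \mathcal{L} = \sum_{r=1}^{R} u^{(1)}_{r} \circ u^{(2)}_{r} \circ \cdots \circ u^{(N)}_{r}
  \ \text{(CP)}, \qquad
  \mathcal{L} = \mathcal{G} \times_1 U^{(1)} \times_2 U^{(2)} \cdots \times_N U^{(N)}
  \ \text{(Tucker)}.
\]
% E-step: posterior responsibility of Gaussian component k for entry i.
\[
  \gamma_{ik} = \frac{\pi_k \,\mathcal{N}\!\left(x_i - l_i \mid 0, \sigma_k^{2}\right)}
                     {\sum_{j=1}^{K} \pi_j \,\mathcal{N}\!\left(x_i - l_i \mid 0, \sigma_j^{2}\right)}.
\]
% M-step: with pi_k and sigma_k updated in closed form, updating the factors
% amounts (up to a constant factor) to a weighted least-squares LRTF problem
% with per-entry weights w_i.
\[
  \min_{\mathcal{L}} \; \sum_{i} w_i \,\left(x_i - l_i\right)^{2},
  \qquad w_i = \sum_{k=1}^{K} \frac{\gamma_{ik}}{\sigma_k^{2}}.
\]

In this sketch, the weighted problem in the M-step is minimized with respect to either the CP factor vectors or the Tucker core and factor matrices, which is what makes the scheme a generalized weighted LRTF rather than a plain least-squares one.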

Publication types

  • Research Support, Non-U.S. Gov't