Imbalanced low-rank tensor completion via latent matrix factorization

Neural Netw. 2022 Nov;155:369-382. doi: 10.1016/j.neunet.2022.08.023. Epub 2022 Sep 6.

Abstract

Tensor completion has been widely used in computer vision and machine learning. Most existing tensor completion methods empirically assume that the intrinsic tensor is simultaneously low-rank over all modes. However, tensor data recorded from real-world applications may conflict with this assumption; e.g., face images taken from different subjects often lie in a union of low-rank subspaces, which may result in a rather high-rank or even full-rank structure in the sample mode. To this end, in this paper, we propose an imbalanced low-rank tensor completion method, which can flexibly estimate the low-rank incomplete tensor by decomposing it into a mixture of multiple latent tensor ring (TR) rank components. Specifically, each latent component is approximated using low-rank matrix factorization based on the TR unfolding matrix. In addition, an effective proximal alternating minimization algorithm is developed and theoretically proven to possess the global convergence property, that is, the whole sequence of iterates converges to a critical point. Extensive experiments on both synthetic and real-world tensor data demonstrate that the proposed method achieves more favorable completion results with less computational cost compared to state-of-the-art tensor completion methods.
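To make the two building blocks mentioned in the abstract concrete, the sketch below shows (i) a circular TR-style unfolding of a tensor into a matrix and (ii) a generic masked low-rank matrix factorization fitted by alternating least squares. This is a minimal illustration under simplifying assumptions: the function names (`tr_unfold`, `masked_lowrank_factorization`), the choice of regularized ALS, and the specific unfolding convention are ours for exposition and do not reproduce the paper's latent-component mixture or its proximal alternating minimization algorithm.

```python
import numpy as np

def tr_unfold(tensor, k, l):
    """Circular TR-style unfolding: shift the modes so mode k comes first,
    then group the first l shifted modes as rows and the rest as columns.
    (Illustrative convention; the paper's exact unfolding may differ.)"""
    d = tensor.ndim
    perm = [(k + i) % d for i in range(d)]
    shifted = np.transpose(tensor, perm)
    rows = int(np.prod(shifted.shape[:l]))
    return shifted.reshape(rows, -1)

def masked_lowrank_factorization(M, mask, rank, n_iters=100, reg=1e-3):
    """Fit M ~= U @ V.T on the observed entries (mask == True) with
    regularized alternating least squares. This is only the generic
    masked matrix-factorization step, not the full method of the paper."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = reg * np.eye(rank)
    for _ in range(n_iters):
        # Update each row of U using only the observed entries in that row.
        for i in range(m):
            cols = mask[i]
            Vi = V[cols]
            U[i] = np.linalg.solve(Vi.T @ Vi + I, Vi.T @ M[i, cols])
        # Update each row of V symmetrically, column by column of M.
        for j in range(n):
            rows = mask[:, j]
            Uj = U[rows]
            V[j] = np.linalg.solve(Uj.T @ Uj + I, Uj.T @ M[rows, j])
    return U, V

# Tiny usage example: complete one unfolding of a partially observed 3-way tensor.
X = np.random.randn(8, 9, 10)
M = tr_unfold(X, k=1, l=1)                      # 9 x 80 unfolding matrix
mask = np.random.rand(*M.shape) < 0.6           # ~60% of entries observed
U, V = masked_lowrank_factorization(M, mask, rank=3)
M_hat = U @ V.T                                 # low-rank estimate of the unfolding
```

In the paper, such low-rank factorizations are applied to the unfoldings of several latent components whose mixture forms the recovered tensor, which is what allows modes with very different (imbalanced) ranks to be handled flexibly.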

Keywords: Image/video inpainting; Low-rank tensor recovery; Tensor analysis; Tensor completion; Tensor ring decomposition.

MeSH terms

  • Algorithms*
  • Humans
  • Machine Learning*