A Generalized Nesterov-Accelerated Second-Order Latent Factor Model for High-Dimensional and Incomplete Data

IEEE Trans Neural Netw Learn Syst. 2023 Oct 13:PP. doi: 10.1109/TNNLS.2023.3321915. Online ahead of print.

Abstract

High-dimensional and incomplete (HDI) data are frequently encountered in big data-related applications for describing restricted observed interactions among large node sets. How to perform accurate and efficient representation learning on such HDI data is a hot yet thorny issue. A latent factor (LF) model has proven to be efficient in addressing it. However, the objective function of an LF model is nonconvex. Commonly adopted first-order methods cannot approach its second-order stationary point, thereby resulting in accuracy loss. On the other hand, traditional second-order methods are impractical for LF models since they suffer from high computational costs due to the required operations on the objective's huge Hessian matrix. To address this issue, this study proposes a generalized Nesterov-accelerated second-order LF (GNSLF) model that integrates twofold conceptions: 1) acquiring a proper second-order step efficiently by adopting a Hessian-vector algorithm and 2) embedding the second-order step into a generalized Nesterov's acceleration (GNA) method for speeding up its linear search process. The analysis focuses on the local convergence of GNSLF's nonconvex cost function rather than its global convergence; theoretical proofs of its local convergence properties are provided. Experimental results on six HDI data cases demonstrate that GNSLF outperforms state-of-the-art LF models in accuracy for missing data estimation while maintaining high efficiency, i.e., a second-order model can be accelerated by incorporating GNA without accuracy loss.
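The sketch below is not the authors' GNSLF implementation; it is a minimal illustration, under assumed toy data and parameter choices, of the two ideas the abstract names: obtaining a second-order step from Hessian-vector products alone (here approximated by a finite difference of gradients and a few conjugate-gradient iterations, so the huge Hessian is never formed) and wrapping that step in a Nesterov-style extrapolation. All symbols (the rank k, the mask, lam, beta, iteration counts) are illustrative assumptions.

```python
# Illustrative sketch only: second-order latent-factor update via Hessian-vector
# products, embedded in a Nesterov-style acceleration loop (not the GNSLF algorithm).
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 30, 20, 4                        # row nodes, column nodes, latent dimension
mask = rng.random((m, n)) < 0.2            # observed entries of the HDI matrix
R = rng.random((m, n)) * mask              # sparse target matrix
lam = 0.05                                 # regularization coefficient (assumed)

def unpack(x):
    P = x[:m * k].reshape(m, k)
    Q = x[m * k:].reshape(n, k)
    return P, Q

def grad(x):
    # Gradient of 0.5*||mask*(PQ^T - R)||^2 + 0.5*lam*(||P||^2 + ||Q||^2)
    P, Q = unpack(x)
    E = (P @ Q.T - R) * mask               # residual on observed entries only
    gP = E @ Q + lam * P
    gQ = E.T @ P + lam * Q
    return np.concatenate([gP.ravel(), gQ.ravel()])

def hvp(x, v, eps=1e-6):
    # Hessian-vector product without forming the Hessian:
    # H v ≈ (grad(x + eps*v) - grad(x)) / eps
    return (grad(x + eps * v) - grad(x)) / eps

def second_order_step(x, g, cg_iters=10):
    # Approximately solve H d = -g with conjugate gradient, using only hvp calls.
    d = np.zeros_like(g)
    r = -g.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(cg_iters):
        Hp = hvp(x, p)
        alpha = rs / (p @ Hp + 1e-12)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / (rs + 1e-12)) * p
        rs = rs_new
    return d

x = rng.standard_normal(m * k + n * k) * 0.1
y = x.copy()                               # Nesterov look-ahead point
for t in range(1, 31):
    d = second_order_step(y, grad(y))      # second-order step at the extrapolated point
    x_next = y + d
    beta = (t - 1) / (t + 2)               # a generic Nesterov momentum schedule (assumed)
    y = x_next + beta * (x_next - x)       # extrapolation
    x = x_next
    P, Q = unpack(x)
    loss = 0.5 * np.sum(((P @ Q.T - R) * mask) ** 2)
    if t % 10 == 0:
        print(f"iter {t:3d}  observed-entry loss {loss:.4f}")
```

Because the objective is nonconvex, the Hessian need not be positive definite; practical second-order LF solvers therefore add safeguards (damping, trust regions, or truncated CG), which this sketch omits for brevity.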