Efficient tracking of the dominant eigenspace of a normalized kernel matrix

Neural Comput. 2008 Feb;20(2):523-54. doi: 10.1162/neco.2007.05-06-213.

Abstract

Various machine learning problems rely on kernel-based methods. The power of these methods resides in their ability to solve highly nonlinear problems by reformulating them in a linear context. The dominant eigenspace of a (normalized) kernel matrix is often required. Unfortunately, the computational requirements of existing kernel methods restrict their applicability to relatively small data sets. This letter therefore focuses on a kernel-based method for large data sets. More specifically, a numerically stable tracking algorithm for the dominant eigenspace of a normalized kernel matrix is proposed, which proceeds by an update (the addition of a new data point) followed by a downdate (the exclusion of an old data point) of the kernel matrix. Tests on representative case studies show that a very good approximation of the dominant eigenspace is obtained, while only a minimal number of operations and a minimal amount of memory per iteration step are required.
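The abstract describes the algorithm only at a high level. As a rough, generic illustration of what such a tracker does (not the authors' actual updating/downdating scheme), the sketch below maintains a sliding window over a data stream, updates the raw kernel matrix with one new row and column per step, removes the oldest row and column, and refreshes the dominant eigenbasis of the normalized matrix with a few warm-started subspace-iteration sweeps. All function names, the RBF kernel choice, and the parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of sliding-window dominant-eigenspace tracking for a
# normalized kernel matrix. This is NOT the paper's exact updating/downdating
# algorithm: it only illustrates the generic idea of refreshing the previous
# dominant eigenbasis cheaply after one data point enters the window and the
# oldest one leaves. All names (rbf_kernel, track_step, ...) are illustrative.

import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def normalize(K):
    """Symmetric normalization D^{-1/2} K D^{-1/2} of a kernel matrix."""
    d = np.sqrt(K.sum(axis=1))
    return K / np.outer(d, d)


def track_step(K, U_prev, n_iter=2):
    """Refresh a k-dimensional dominant eigenbasis of the current (normalized)
    kernel matrix K, warm-started from the previous basis U_prev (n x k)."""
    U = U_prev
    for _ in range(n_iter):              # a few subspace-iteration sweeps
        U, _ = np.linalg.qr(K @ U)       # power step + re-orthonormalization
    # Rayleigh-Ritz: a small k x k eigenproblem rotates U onto approximate
    # eigenvectors and yields eigenvalue estimates.
    w, V = np.linalg.eigh(U.T @ K @ U)
    order = np.argsort(w)[::-1]
    return U @ V[:, order], w[order]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.normal(size=(300, 5))   # synthetic data stream
    n, k = 100, 5                        # window size, tracked eigenspace dim

    window = stream[:n]
    K_raw = rbf_kernel(window, window)
    w, U = np.linalg.eigh(normalize(K_raw))
    U = U[:, np.argsort(w)[::-1][:k]]    # exact dominant basis to start from

    for t in range(n, stream.shape[0]):
        # Downdate (drop the oldest point) and update (append the newest one);
        # only the new kernel column is computed, the rest is reused.
        window = np.vstack([window[1:], stream[t]])
        col = rbf_kernel(window, window[-1:])          # O(n) kernel evaluations
        K_raw = np.block([[K_raw[1:, 1:], col[:-1]],
                          [col[:-1].T,    col[-1:]]])
        K = normalize(K_raw)
        # Shift the warm start to match the shifted window (tiny noise guards
        # against an exactly rank-deficient starting basis).
        U0, _ = np.linalg.qr(np.vstack([U[1:], np.zeros((1, k))])
                             + 1e-8 * rng.normal(size=(n, k)))
        U, evals = track_step(K, U0)

    exact = np.sort(np.linalg.eigvalsh(K))[::-1][:k]
    print("tracked eigenvalues:", np.round(evals, 4))
    print("exact eigenvalues  :", np.round(exact, 4))
```

Note that this sketch still touches the full n x n window matrix at every step; the point of the letter is precisely to avoid that, requiring only a minimal number of operations and a minimal amount of memory per iteration.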

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Artificial Intelligence*
  • Humans
  • Information Storage and Retrieval*
  • Neural Networks, Computer*
  • Normal Distribution
  • Pattern Recognition, Automated / methods*