Principal component analysis by Lp-norm maximization

IEEE Trans Cybern. 2014 May;44(5):594-609. doi: 10.1109/TCYB.2013.2262936. Epub 2013 Jun 25.

Abstract

This paper proposes several principal component analysis (PCA) methods based on Lp-norm optimization. The objective function is defined using the Lp-norm with an arbitrary value of p, and its gradient is computed by exploiting the fact that the number of training samples is finite. The first part addresses the simpler problem of extracting a single feature: the principal component is sought either by a gradient ascent method or by a Lagrangian multiplier method, and when more than one feature is needed, features can be extracted greedily, one at a time, with the proposed method. The second part tackles the harder problem of extracting several features simultaneously. The proposed methods are shown to find a locally optimal solution, and they are easy to implement without a significant increase in computational complexity. Finally, the methods are applied to several datasets with different values of p, and their performance is compared with that of conventional PCA methods.
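The single-feature, one-at-a-time variant described in the abstract can be illustrated with a minimal sketch: maximize f(w) = Σ_i |wᵀx_i|^p subject to ‖w‖₂ = 1 by iteratively moving w along the gradient direction and renormalizing, then deflate the data to extract further features greedily. The function names, the fixed-point-style update (renormalizing the gradient each step), and the deflation used for greedy extraction are assumptions made for illustration, not the paper's exact algorithm, and the sketch assumes p ≥ 1 and centered data.

```python
import numpy as np


def pca_lp_single(X, p=1.5, n_iter=200, tol=1e-8, seed=0):
    """Sketch: one Lp-norm principal direction (hypothetical helper, not the paper's code).

    Maximizes f(w) = sum_i |w^T x_i|^p over unit vectors w by repeatedly
    replacing w with the normalized gradient
        grad f(w) = p * sum_i sign(w^T x_i) * |w^T x_i|^(p-1) * x_i.
    X has shape (n_samples, n_features) and is assumed centered.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        proj = X @ w                                            # w^T x_i for each sample
        grad = X.T @ (np.sign(proj) * np.abs(proj) ** (p - 1))  # gradient direction (scale dropped)
        w_new = grad / np.linalg.norm(grad)                     # project back onto the unit sphere
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w


def pca_lp_greedy(X, n_components=2, p=1.5):
    """Sketch of greedy extraction: find a direction, deflate, repeat (assumed scheme)."""
    Xd = X - X.mean(axis=0)
    W = []
    for _ in range(n_components):
        w = pca_lp_single(Xd, p=p)
        W.append(w)
        Xd = Xd - np.outer(Xd @ w, w)   # remove the found component before the next pass
    return np.array(W)
```

For p = 2 this reduces to the usual power iteration for the leading eigenvector of the covariance matrix, and for p = 1 it matches the familiar L1-PCA fixed-point style update; for other values of p it is only meant to convey the shape of the gradient-based search described in the abstract.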

Publication types

  • Research Support, Non-U.S. Gov't