k-NS: a classifier by the distance to the nearest subspace

IEEE Trans Neural Netw. 2011 Aug;22(8):1256-68. doi: 10.1109/TNN.2011.2153210. Epub 2011 Jun 30.

Abstract

To improve the classification performance of k-NN, this paper presents a classifier, called k-NS, based on the Euclidean distances from a query sample to the nearest subspaces, each spanned by the k nearest samples of a single class. A simple discriminant for computing these distances is derived from the geometric meaning of the Gramian, and the numerical stability of the discriminant is ensured by incorporating Tikhonov regularization. The proposed classifier, k-NS, assigns a query sample to the class whose corresponding subspace is nearest. Because the Gramian involves only inner products, the classifier extends naturally to the high-dimensional feature space induced by kernel functions. Experimental results on 13 publicly available benchmark datasets show that k-NS compares favorably with several other nearest-neighbor-based classifiers in terms of training and test accuracy and efficiency.
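
As a rough illustration of the method summarized above, the sketch below computes a regularized distance from a query to the subspace spanned by its k nearest same-class samples, using the Gram-matrix identity dist^2(x, span(A)) ≈ x^T x - x^T A (A^T A + lam*I)^{-1} A^T x. This is an assumed reconstruction, not the authors' code; the function names (knn_subspace_distance, k_ns_predict) and the regularization parameter lam are illustrative.

    import numpy as np

    def knn_subspace_distance(x, X_class, k, lam=1e-6):
        """Squared Euclidean distance from query x to the subspace spanned
        by its k nearest neighbors within one class (Tikhonov-regularized).
        Illustrative sketch; lam and the neighbor selection are assumptions."""
        d2 = np.sum((X_class - x) ** 2, axis=1)        # distances to the class samples
        A = X_class[np.argsort(d2)[:k]].T              # d x k matrix of the k nearest samples
        G = A.T @ A                                    # Gram matrix of the k samples
        b = A.T @ x                                    # inner products with the query
        coef = np.linalg.solve(G + lam * np.eye(len(b)), b)  # regularized projection coefficients
        return float(x @ x - b @ coef)                 # ||x||^2 - x^T A (G + lam I)^{-1} A^T x

    def k_ns_predict(x, X_train, y_train, k=3, lam=1e-6):
        """Assign x to the class whose k-nearest-sample subspace is closest."""
        classes = np.unique(y_train)
        dists = [knn_subspace_distance(x, X_train[y_train == c], k, lam)
                 for c in classes]
        return classes[int(np.argmin(dists))]

Because only inner products (x^T x, A^T x, A^T A) appear in the discriminant, replacing them with kernel evaluations gives the kernelized variant mentioned in the abstract.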

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Artificial Intelligence*
  • Models, Statistical*
  • Pattern Recognition, Automated / methods*