Interactive image data labeling using self-organizing maps in an augmented reality scenario

Neural Netw. 2005 Jun-Jul;18(5-6):566-74. doi: 10.1016/j.neunet.2005.06.040.

Abstract

We present an approach for the convenient labeling of image patches gathered from an unrestricted environment. The system is deployed on mobile Augmented Reality (AR) gear: while the user walks around wearing the head-mounted AR gear, context-free focus-of-attention modules continuously sample the most 'interesting' image patches. After this acquisition phase, a Self-Organizing Map (SOM) is trained on the complete set of patches, using combinations of MPEG-7 features as the data representation. The SOM allows visualization of the sampled patches and easy manual sorting into categories. With very little effort, the user can compose a training set for a classifier, so that previously unknown objects can be made known to the system. We evaluate the system on COIL imagery and demonstrate that a user can reach a satisfactory categorization within a few steps, even for image data sampled while walking in an office environment. (An abbreviated version of some portions of this article appeared in [Bekel, H., Heidemann, G., & Ritter, H. (2005). SOM Based Image Data Structuring in an Augmented Reality Scenario. In Proceedings of the International Joint Conference on Neural Networks, Montreal, Canada.], as part of the IJCNN 2005 conference proceedings, published under the IEEE copyright.)
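The core structuring step in the abstract is a standard SOM trained on per-patch feature vectors. As a rough illustration of that step only, the sketch below trains a small rectangular SOM with NumPy on a placeholder feature matrix and then assigns each patch to its best-matching map node, which is the grouping that would be presented to the user for per-node labeling. All names (`train_som`, `patch_features`, grid sizes, decay schedule) are illustrative assumptions; the paper's MPEG-7 feature extraction and AR acquisition stages are not reproduced here.

```python
import numpy as np

def train_som(features, grid_w=10, grid_h=10, n_iter=5000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Train a rectangular SOM on patch feature vectors (one row per patch)."""
    rng = np.random.default_rng(seed)
    n, _ = features.shape
    # Grid coordinates of each map node, shape (grid_w * grid_h, 2).
    coords = np.array([[i, j] for i in range(grid_w) for j in range(grid_h)],
                      dtype=float)
    # Initialize node weights with random samples drawn from the data.
    weights = features[rng.integers(0, n, grid_w * grid_h)].astype(float).copy()

    for t in range(n_iter):
        x = features[rng.integers(0, n)]
        # Best-matching unit: node whose weight vector is closest to x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Exponentially decaying learning rate and neighbourhood radius
        # (assumed schedule, not taken from the paper).
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        # Gaussian neighbourhood on the map grid around the BMU.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))
        # Pull all nodes toward x, weighted by neighbourhood strength.
        weights += lr * h[:, None] * (x - weights)

    return weights, coords

def map_patches_to_nodes(features, weights):
    """Assign each patch to its best-matching node for visualization/labeling."""
    d = np.linalg.norm(features[:, None, :] - weights[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Usage: `patch_features` stands in for the MPEG-7 descriptors of the sampled
# patches. Labeling a handful of map nodes then labels all patches mapped to them.
patch_features = np.random.rand(500, 64)      # placeholder feature matrix
weights, coords = train_som(patch_features)
node_of_patch = map_patches_to_nodes(patch_features, weights)
```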

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Color
  • Computer Graphics
  • Image Processing, Computer-Assisted / statistics & numerical data*
  • Information Storage and Retrieval
  • Memory / physiology
  • Models, Neurological*
  • Visual Perception