A Motion-Based Feature for Event-Based Pattern Recognition

Front Neurosci. 2017 Jan 4;10:594. doi: 10.3389/fnins.2016.00594. eCollection 2016.

Abstract

This paper introduces an event-based, luminance-free feature computed from the output of asynchronous event-based neuromorphic retinas. The feature maps the distribution of the optical flow along the contours of moving objects in the visual scene into a matrix. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each asynchronously generating "spiking" events that encode relative changes in pixel illumination at high temporal resolution. The optical flow is computed at each event and integrated, locally or globally, into a grid defined in a speed-and-direction coordinate frame, using speed-tuned temporal kernels. The latter ensure that the resulting feature represents the distribution of the normal motion along the current moving edges equitably, regardless of their respective dynamics. The usefulness and generality of the proposed feature are demonstrated in two pattern recognition applications: local corner detection and global gesture recognition.
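The accumulation scheme described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the bin layout (log-spaced speed bins, uniform direction bins), the event representation `(t, vx, vy)` produced by a preceding optical-flow stage, and the choice of an exponential kernel whose time constant scales inversely with speed are all assumptions, made to show how a speed-tuned kernel lets slow and fast edges contribute comparably to the feature matrix.

```python
import math

def update_motion_feature(feature, flows, t_now,
                          n_speed=8, n_dir=8,
                          v_min=0.01, v_max=1.0):
    """Accumulate per-event optical-flow estimates into a
    speed x direction histogram (the feature matrix).

    feature : n_speed x n_dir list of lists, updated in place.
    flows   : iterable of (t, vx, vy) tuples, one per event, where
              (vx, vy) is the flow estimated at that event (assumption).
    t_now   : current time, used by the temporal kernel.
    """
    for (t, vx, vy) in flows:
        speed = math.hypot(vx, vy)
        if speed <= 0:
            continue
        # Direction bin: uniform partition of [0, 2*pi).
        angle = math.atan2(vy, vx) % (2 * math.pi)
        d = min(int(angle / (2 * math.pi) * n_dir), n_dir - 1)
        # Speed bin: log-spaced between v_min and v_max (assumption).
        s = int(math.log(speed / v_min) / math.log(v_max / v_min) * n_speed)
        s = min(max(s, 0), n_speed - 1)
        # Speed-tuned time constant: faster edges generate more events
        # per unit time, so their contributions decay faster, keeping
        # the histogram balanced across dynamics (assumption).
        tau = 1.0 / speed
        feature[s][d] += math.exp(-(t_now - t) / tau)
    return feature
```

Computing the feature globally over the whole scene yields a descriptor for gesture recognition, while restricting `flows` to a local neighborhood yields the local variant used for corner detection.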

Keywords: corner detection; event-driven vision; gesture recognition; histogram of oriented optical flow; motion-based feature; neuromorphic sensor; pattern recognition; speed-tuned integration time.