A bio-inspired, computational model suggests velocity gradients of optic flow locally encode ordinal depth at surface borders and globally they encode self-motion

Neural Comput. 2013 Sep;25(9):2421-49. doi: 10.1162/NECO_a_00479. Epub 2013 May 10.

Abstract

Visual navigation requires the estimation of self-motion as well as the segmentation of objects from the background. We suggest a definition of local velocity gradients to compute types of self-motion, to segment objects, and to compute local properties of optical flow fields, such as divergence, curl, and shear. These velocity gradients are computed as velocity differences measured locally, tangent and normal to the direction of flow; the differences are then rotated according to the local flow direction so that the result is independent of that direction. We propose a bio-inspired model that computes these velocity gradients from video sequences. Simulation results show that local gradients encode ordinal surface depth, assuming self-motion in a rigid scene or object motion in a nonrigid scene. For translational self-motion, velocity gradients can be used to distinguish between static and moving objects. The information about ordinal surface depth and self-motion can support steering control for visual navigation.
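
As a rough, non-authoritative illustration of the computation sketched in the abstract, the following NumPy code estimates flow-aligned velocity gradients from a dense optic flow field. The function name, the array layout, and the use of finite differences (np.gradient) are assumptions made for this example; the published model computes the gradients with a bio-inspired neural architecture rather than explicit derivatives.

    import numpy as np

    def flow_aligned_gradients(u, v):
        # u, v: 2D arrays holding the horizontal and vertical flow components
        # (hypothetical helper; illustrates the tangent/normal differences, not the authors' code).
        du_dy, du_dx = np.gradient(u)          # spatial derivatives of the flow field
        dv_dy, dv_dx = np.gradient(v)

        theta = np.arctan2(v, u)               # local flow direction
        c, s = np.cos(theta), np.sin(theta)    # tangent (c, s) and normal (-s, c) unit vectors

        # velocity differences measured tangent and normal to the flow direction
        du_t = c * du_dx + s * du_dy
        dv_t = c * dv_dx + s * dv_dy
        du_n = -s * du_dx + c * du_dy
        dv_n = -s * dv_dx + c * dv_dy

        # rotate each difference vector by -theta so the result no longer depends
        # on the local flow direction
        g_tt = c * du_t + s * dv_t             # stretch along the flow
        g_tn = -s * du_t + c * dv_t            # shear across the flow
        g_nt = c * du_n + s * dv_n
        g_nn = -s * du_n + c * dv_n

        divergence = du_dx + dv_dy             # rotation-invariant combinations,
        curl = dv_dx - du_dy                   # for comparison with the local gradients
        return g_tt, g_tn, g_nt, g_nn, divergence, curl

For a purely expanding flow (u = x, v = y), this yields g_tt = g_nn = 1, zero shear terms, and a divergence of 2, the pattern expected for an approaching frontoparallel surface.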

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Computer Simulation*
  • Humans
  • Models, Neurological*
  • Motion Perception / physiology*
  • Optic Flow / physiology*
  • Surface Properties