Learning eye vergence control from a distributed disparity representation

Int J Neural Syst. 2010 Aug;20(4):267-78. doi: 10.1142/S0129065710002425.

Abstract

We present two neural models for vergence angle control of a robotic head: a simplified one and a more complex one. Both models work in a closed-loop manner and do not rely on explicitly computed disparity; instead, they extract the desired vergence angle from the post-processed response of a population of disparity-tuned complex cells, the current gaze direction, and the current vergence angle. The first model assumes that the gaze direction of the robotic head is orthogonal to its baseline and that the stimulus is a frontoparallel plane orthogonal to the gaze direction. The second model goes beyond these assumptions and operates reliably in the general case, where all restrictions on gaze orientation, as well as on stimulus position, type, and orientation, are dropped.
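The geometry underlying the first model's restricted setting, and the closed-loop idea of driving vergence from a population readout, can be sketched as follows. This is an illustrative sketch, not the paper's method: the function names, the population-vector readout, and the proportional update gain are assumptions for exposition; the paper's models learn the mapping from the population response rather than applying an explicit formula.

```python
import math

def desired_vergence(baseline, distance):
    """Vergence angle (radians) that fixates a frontoparallel target.

    Under the first model's assumptions (gaze orthogonal to the baseline,
    target at distance `distance` straight ahead), triangle geometry gives
        vergence = 2 * atan(baseline / (2 * distance)).
    """
    return 2.0 * math.atan(baseline / (2.0 * distance))

def decode_disparity(responses, preferred_disparities):
    """Population-vector readout: response-weighted mean of each
    disparity-tuned cell's preferred disparity (an illustrative decoder)."""
    total = sum(responses)
    return sum(r * d for r, d in zip(responses, preferred_disparities)) / total

def vergence_step(current_vergence, responses, preferred_disparities, gain=0.5):
    """One closed-loop iteration: nudge the vergence angle in proportion to
    the decoded residual disparity; at fixation the decoded disparity is ~0
    and the vergence angle stops changing."""
    return current_vergence + gain * decode_disparity(responses, preferred_disparities)
```

For example, with a 10 cm baseline and a target 1 m away, `desired_vergence(0.1, 1.0)` is about 0.1 rad; a population response peaked symmetrically around zero disparity leaves the vergence angle unchanged, which is the closed-loop fixed point.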

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Convergence, Ocular / physiology*
  • Eye Movements / physiology*
  • Humans
  • Models, Neurological*
  • Robotics
  • Vision Disparity*
  • Vision, Binocular / physiology