Iterative color constancy with temporal filtering for an image sequence with no relative motion between the camera and the scene

J Opt Soc Am A Opt Image Sci Vis. 2015 Nov 1;32(11):2033-43. doi: 10.1364/JOSAA.32.002033.

Abstract

Color constancy is the ability to perceive the color of a surface as invariant even under changing illumination. In outdoor applications, such as mobile robot navigation or surveillance, the lack of this ability degrades segmentation, tracking, and object recognition. The main approaches to color constancy are generally targeted at static images and aim to estimate the scene illuminant color from the image itself. We present an iterative color constancy method with temporal filtering applied to image sequences, in which reference colors are estimated from previously corrected images. Furthermore, two strategies for sampling colors from the images are tested. The proposed method has been evaluated on image sequences with no relative motion between the scene and the camera, and it has also been compared with well-known color constancy algorithms such as gray-world, max-RGB, and gray-edge. In most cases, the iterative color constancy method achieved better results than the other approaches.
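
The sketch below is not the authors' implementation; it only illustrates the gray-world and max-RGB baselines named in the abstract and one plausible way an illuminant estimate could be filtered temporally across a static-scene sequence. The exponential smoothing factor alpha, the diagonal (von Kries) correction step, and the assumption of float RGB frames in [0, 1] are all illustrative choices, since the abstract does not specify these details.

```python
import numpy as np


def gray_world(img):
    """Estimate the illuminant as the mean RGB of the image (gray-world)."""
    return img.reshape(-1, 3).mean(axis=0)


def max_rgb(img):
    """Estimate the illuminant as the per-channel maximum (max-RGB)."""
    return img.reshape(-1, 3).max(axis=0)


def correct(img, illuminant):
    """Apply a diagonal (von Kries) correction toward a neutral illuminant."""
    gain = illuminant.mean() / np.maximum(illuminant, 1e-6)
    return np.clip(img * gain, 0.0, 1.0)


def iterative_sequence_correction(frames, estimator=gray_world, alpha=0.2):
    """Correct a static-scene sequence, temporally filtering the illuminant.

    Each frame's estimate is blended with the estimate carried over from the
    previously corrected frames (simple exponential smoothing, assumed here;
    the paper's actual filter may differ).
    """
    filtered = None
    corrected = []
    for frame in frames:
        estimate = estimator(frame)
        filtered = estimate if filtered is None else (
            alpha * estimate + (1.0 - alpha) * filtered)
        corrected.append(correct(frame, filtered))
    return corrected
```

Because the scene and camera do not move relative to each other, successive frames see the same surfaces, so smoothing the illuminant estimate over time mainly suppresses frame-to-frame fluctuations in the estimate rather than blurring scene content.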