A Comparison and Evaluation of Stereo Matching on Active Stereo Images

Sensors (Basel). 2022 Apr 26;22(9):3332. doi: 10.3390/s22093332.

Abstract

The disparity of corresponding pixels is inversely proportional to their depth. Thus, to accurately estimate depth from stereo vision, it is important to obtain accurate disparity maps, which encode the difference between the horizontal coordinates of corresponding image points. Stereo vision can be classified as either passive or active. Active stereo vision projects a pattern texture onto the scene, which passive stereo vision lacks, in order to fill textureless regions. In passive stereo vision, many surveys have shown that disparity accuracy depends heavily on attributes such as radiometric variation and color variation, and have identified the best-performing conditions. In active stereo matching, however, the accuracy of the disparity map is influenced not only by the factors affecting the passive stereo technique, but also by the attributes of the projected pattern texture. Therefore, in this paper, we analyze and evaluate the relationship between the performance of the active stereo technique and the attributes of the pattern texture. In our evaluation, experiments are conducted under various settings that may affect the overall performance of active stereo matching, such as pattern intensity, pattern contrast, number of pattern dots, and global gain. Through this evaluation, our findings can serve as a useful reference for constructing an active stereo system.
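As context for the inverse depth-disparity relationship stated above, the following is a minimal sketch (not from the paper; the function name and the focal length, baseline, and disparity values are illustrative assumptions) of recovering depth Z = f * B / d from a disparity map:

    import numpy as np

    def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
        """Depth Z = f * B / d: depth is inversely proportional to disparity.

        disparity_px    : disparity values in pixels (corresponding-point offset)
        focal_length_px : camera focal length in pixels
        baseline_m      : distance between the two camera centers in meters
        """
        d = np.asarray(disparity_px, dtype=np.float64)
        depth = np.full(d.shape, np.inf)   # zero disparity -> point at infinity
        valid = d > 0
        depth[valid] = focal_length_px * baseline_m / d[valid]
        return depth

    # Illustrative values only: with f = 700 px and B = 0.05 m,
    # a 35 px disparity maps to 1.0 m depth, 70 px to 0.5 m.
    print(depth_from_disparity([35.0, 70.0], 700.0, 0.05))  # [1.  0.5]

Doubling the disparity halves the estimated depth, which is why small disparity errors on distant (low-disparity) points translate into large depth errors.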

Keywords: active stereo matching; disparity accuracy; infrared image; matching cost; off-the-shelf active stereo sensor; performance evaluation.

MeSH terms

  • Algorithms*
  • Imaging, Three-Dimensional* / methods
  • Vision, Ocular

Grants and funding

This work was supported by the National Research Foundation of Korea (NRF), funded by the Korea Government (Ministry of Science and ICT, MSIT) under Grant NRF-2020R1A2C3011697, the Yonsei University Research Fund of 2021 (2021-22-0001), and the Sookmyung Women’s University Research Grants (No. 1-2203-2002).