Visual-quality-driven unsupervised image dehazing

Neural Netw. 2023 Oct:167:1-9. doi: 10.1016/j.neunet.2023.08.010. Epub 2023 Aug 9.

Abstract

Most existing learning-based dehazing methods require a large and diverse collection of paired hazy/clean images, which is intractable to obtain. They therefore resort to training on synthetic images, which may cause a domain shift when applied to real scenes. In this paper, we propose a novel lightweight unsupervised dehazing network that predicts clear images directly from the original hazy images without any reference images. It consists of an interactive fusion module (IFM) and an iterative optimization module (IOM). Specifically, the IFM interactively fuses multi-level features to make up for the information missing between deep and shallow features, while the IOM iteratively optimizes dehazed results to obtain pleasing visual effects. In particular, based on the observation that hazy images usually suffer from quality degradation, four non-reference visual-quality-driven loss functions are designed to enable the network to be trained in an unsupervised way: a dark channel loss, a contrast loss, a saturation loss, and an edge sharpness loss. Extensive experiments on two synthetic datasets and one real-world dataset demonstrate that our method performs favorably against state-of-the-art unsupervised dehazing methods and even matches some supervised methods on metrics such as PSNR, SSIM, and UQI.
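The paper does not give the exact form of its losses in this abstract, but the dark channel loss presumably builds on the dark channel prior (He et al.): in haze-free outdoor images, the per-pixel minimum over color channels within a local patch is close to zero, whereas haze raises it. A minimal NumPy sketch of such a non-reference loss, under that assumption (function names and the 15-pixel patch size are illustrative, not taken from the paper):

```python
import numpy as np

def dark_channel(image, patch_size=15):
    """Dark channel prior: per-pixel minimum over RGB channels,
    followed by a minimum filter over a local patch.
    image: H x W x 3 float array with values in [0, 1]."""
    min_rgb = image.min(axis=2)  # per-pixel minimum across channels
    h, w = min_rgb.shape
    pad = patch_size // 2
    padded = np.pad(min_rgb, pad, mode="edge")
    dark = np.empty_like(min_rgb)
    for i in range(h):
        for j in range(w):
            # minimum over the local patch centered at (i, j)
            dark[i, j] = padded[i:i + patch_size, j:j + patch_size].min()
    return dark

def dark_channel_loss(dehazed, patch_size=15):
    """Hypothetical non-reference loss: penalize the mean dark channel
    of the network output, pushing it toward haze-free statistics."""
    return float(dark_channel(dehazed, patch_size).mean())
```

A lower loss indicates statistics closer to a haze-free image; in training, this would be minimized on the network output alongside the other three quality-driven losses, with no clean reference required.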

Keywords: Image dehazing; Interactive fusion; Iterative enhancement; Unsupervised learning; Visual-quality-driven.

MeSH terms

  • Image Processing, Computer-Assisted*
  • Neural Networks, Computer*