A Two-Stage Density-Aware Single Image Deraining Method

IEEE Trans Image Process. 2021:30:6843-6854. doi: 10.1109/TIP.2021.3099396. Epub 2021 Aug 3.

Abstract

Although advanced single-image deraining methods have been proposed, one main challenge remains: existing methods usually perform well on specific rain patterns but struggle with scenes whose rain densities differ dramatically, especially when the impacts of rain streaks and the veiling effect caused by rain accumulation are heavily coupled. To tackle this challenge, we propose a two-stage density-aware single-image deraining method with gated multi-scale feature fusion. In the first stage, a realistic physics model closer to real rain scenes is leveraged for initial deraining, and a network branch is also trained for rain density estimation to guide the subsequent refinement. The second stage performs model-independent refinement with a conditional Generative Adversarial Network (cGAN), aiming to eliminate artifacts and improve restoration quality. In both stages, dilated convolutions extract rain features at multiple scales, and gated feature fusion aggregates multi-level contextual information. Extensive experiments on representative synthetic rain datasets and real rain scenes demonstrate, both quantitatively and qualitatively, the effectiveness and generalization ability of our method, which outperforms state-of-the-art approaches.
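The abstract does not spell out the physics model or the gating mechanism. As a rough illustration only, the sketch below assumes a commonly used heavy-rain composition (scene radiance plus rain streaks, attenuated by a transmission map that also produces the veiling effect) and a hypothetical sigmoid-gated fusion of multi-scale feature maps; neither is taken verbatim from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rain_composite(B, S, T, A=1.0):
    """Hypothetical heavy-rain model (an assumption, not the paper's exact one):
    observed = T * (B + S) + (1 - T) * A,
    where B is the clean background, S the rain-streak layer, T the
    transmission map, and A the atmospheric light. Low T couples the
    streaks with the veiling effect, as the abstract describes."""
    return T * (B + S) + (1.0 - T) * A

def gated_fusion(features, gate_logits, eps=1e-8):
    """Illustrative gated fusion of multi-scale features.
    features: list of (H, W) maps, e.g. from branches with different
    dilation rates; gate_logits: list of (H, W) maps from a hypothetical
    gate branch. Per-pixel sigmoid gates are normalized to sum to 1, then
    used as weights for the aggregation."""
    gates = [sigmoid(g) for g in gate_logits]
    total = sum(gates)
    return sum(f * g / (total + eps) for f, g in zip(features, gates))
```

With equal gate logits, `gated_fusion` reduces to a per-pixel average of the branches; in a trained network the gate branch would instead weight the dilation rate best matched to the local rain density.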