Retain and Recover: Delving Into Information Loss for Few-Shot Segmentation

IEEE Trans Image Process. 2023;32:5353-5365. doi: 10.1109/TIP.2023.3315555. Epub 2023 Oct 2.

Abstract

Benefiting from advances in few-shot learning, the application of such techniques to dense prediction tasks (e.g., segmentation) has also made great strides in the past few years. However, most existing few-shot segmentation (FSS) approaches follow a pipeline similar to that of few-shot classification, directly reusing some core components regardless of the differences between the two tasks. We note that such an ill-conceived framework introduces unnecessary information loss, which is clearly unacceptable given the already very limited training samples. To this end, we delve into the typical types of information loss and provide a reasonably effective solution, namely Retain And REcover (RARE). The losses addressed in this paper can be summarized as follows: (i) the loss of spatial information due to global pooling; (ii) the loss of boundary information due to mask interpolation; (iii) the degradation of representational power due to sample averaging. Accordingly, we propose a series of strategies to retain the avoidable information and recover the unavoidable loss, including unidirectional pooling, error-prone region focusing, and adaptive integration. Extensive experiments on two popular benchmarks (i.e., PASCAL-5i and COCO-20i) demonstrate the effectiveness of our scheme, which is not restricted to a particular baseline approach. The ultimate goal of our work is to address different information loss problems within a unified framework, and our method also exhibits superior performance compared to other approaches with similar motivations. The source code will be made available at https://github.com/chunbolang/RARE.
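The two most concrete loss types listed above (spatial information discarded by global pooling, and boundary information blurred by mask interpolation) can be illustrated with a toy NumPy sketch. The shapes, variable names, and the block-averaging stand-in for interpolation below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Hypothetical sketch of two information-loss types (toy shapes, not the paper's code).

# (i) Spatial information loss from pooling.
# Global average pooling collapses an H x W feature map to one value per
# channel, discarding all spatial layout; pooling along a single axis
# ("unidirectional" here) keeps the positions along the other axis.
feat = np.arange(24, dtype=float).reshape(2, 3, 4)  # (C, H, W) toy feature map
global_pooled = feat.mean(axis=(1, 2))              # (C,)   -> no layout left
row_pooled = feat.mean(axis=2)                      # (C, H) -> vertical layout kept
col_pooled = feat.mean(axis=1)                      # (C, W) -> horizontal layout kept

# (ii) Boundary information loss from mask downsampling.
# 2x2 block averaging stands in for interpolation-based resizing: blocks that
# straddle the object boundary receive fractional values, so the exact edge
# location cannot be recovered after re-thresholding.
mask = np.zeros((8, 8))
mask[2:7, 2:7] = 1.0                                   # a 5x5 foreground square
small = mask.reshape(4, 2, 4, 2).mean(axis=(1, 3))     # 2x downsampled soft mask
fractional = np.logical_and(small > 0.0, small < 1.0)  # ambiguous boundary blocks
```

In this toy setting, `fractional` marks exactly the downsampled cells whose values lie strictly between 0 and 1, i.e., the error-prone boundary region where the original mask edge has been lost.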