Contrastive Open-Set Active Learning-Based Sample Selection for Image Classification
Zizheng Yan, Delian Ruan, Yushuang Wu, Junshi Huang, Zhenhua Chai, Xiaoguang Han, Shuguang Cui, Guanbin Li
Published 2024 in IEEE Transactions on Image Processing
ABSTRACT
In this paper, we address a complex but practical scenario in Active Learning (AL) known as open-set AL, where the unlabeled data consists of both in-distribution (ID) and out-of-distribution (OOD) samples. Standard AL methods fail in this scenario because OOD samples are highly likely to be regarded as uncertain samples, leading to their selection and wasting the budget. Existing methods focus on selecting samples that are most likely to be ID, which tend to be easy and less informative. To this end, we introduce two criteria, namely contrastive confidence and historical divergence, which measure the likelihood of a sample being ID and its hardness, respectively. By balancing the two proposed criteria, as many highly informative ID samples as possible can be selected. Furthermore, unlike previous methods that require additional neural networks to detect OOD samples, we propose a contrastive clustering framework that endows the classifier itself with the ability to identify OOD samples and further enhances the network's representation learning. The experimental results demonstrate that the proposed method achieves state-of-the-art performance on several benchmark datasets.
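The selection rule sketched in the abstract — rank unlabeled samples by balancing a contrastive-confidence score (likelihood of being ID) against a historical-divergence score (hardness) — can be illustrated with a minimal sketch. The function name, the linear weighting `alpha`, and the min-max normalization are all illustrative assumptions, not the paper's actual formulation, which is not given on this page.

```python
import numpy as np

def select_batch(contrastive_conf, historical_div, budget, alpha=0.5):
    """Illustrative sketch: balance two per-sample criteria and pick a batch.

    contrastive_conf -- higher means more likely in-distribution (ID)
    historical_div   -- higher means harder / more informative
    budget           -- number of samples to label
    alpha            -- hypothetical trade-off weight between the criteria
    """
    conf = np.asarray(contrastive_conf, dtype=float)
    div = np.asarray(historical_div, dtype=float)

    # Min-max normalize each criterion to [0, 1] so the weighting is comparable.
    def norm(x):
        return (x - x.min()) / (np.ptp(x) + 1e-12)

    # A sample scores highly only if it is both likely ID and hard,
    # mirroring the balance the abstract describes.
    score = alpha * norm(conf) + (1 - alpha) * norm(div)

    # Return the indices of the `budget` highest-scoring samples.
    return np.argsort(-score)[:budget]
```

For example, a sample with high contrastive confidence but low divergence (easy and uninformative) is ranked below one that scores moderately on both, which is the behavior the abstract motivates.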
PUBLICATION RECORD
- Publication date: 2024-09-05
- Venue: IEEE Transactions on Image Processing
- Fields of study: Medicine, Computer Science
- Source metadata: Semantic Scholar, PubMed