Noise-Robust Bidirectional Learning with Dynamic Sample Reweighting
Chen-Chen Zong, Zhengyang Cao, Honglin Guo, Yunshu Du, Mingshan Xie, Shao-Yuan Li, Sheng-Jun Huang
Published 2022 in arXiv.org
ABSTRACT
Deep neural networks trained with the standard cross-entropy loss are prone to memorizing noisy labels, which degrades their performance. Negative learning with complementary labels is more robust to label noise, but the model converges extremely slowly. In this paper, we first introduce a bidirectional learning scheme, in which positive learning ensures convergence speed while negative learning robustly copes with label noise. Further, a dynamic sample reweighting strategy is proposed to globally weaken the effect of noise-labeled samples by exploiting negative learning's strong ability to discriminate samples by their probability distributions. In addition, we combine self-distillation to further improve model performance. The code is available at https://github.com/chenchenzong/BLDR.
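The two directions of the learning scheme described in the abstract can be illustrated with a minimal sketch. Positive learning is ordinary cross-entropy on the given label; negative learning instead samples a complementary label (a class the example supposedly does not belong to) and pushes its predicted probability toward zero. The function names and the uniform complementary-label sampling below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def positive_loss(probs, labels):
    # Positive learning: standard cross-entropy, -log p(given label).
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12)

def negative_loss(probs, comp_labels):
    # Negative learning: drive the probability of a complementary
    # label toward zero via -log(1 - p(complementary label)).
    return -np.log(1.0 - probs[np.arange(len(comp_labels)), comp_labels] + 1e-12)

rng = np.random.default_rng(0)
num_classes = 10
logits = rng.normal(size=(4, num_classes))
labels = rng.integers(0, num_classes, size=4)
# Sample complementary labels uniformly from the classes != given label.
comp_labels = (labels + rng.integers(1, num_classes, size=4)) % num_classes
probs = softmax(logits)
total_loss = positive_loss(probs, labels).mean() + negative_loss(probs, comp_labels).mean()
```

A noisy given label makes the positive term misleading, while the negative term stays correct with high probability (a random complementary label rarely hits the true class), which is the intuition behind negative learning's robustness.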
PUBLICATION RECORD
- Publication year
2022
- Venue
arXiv.org
- Publication date
2022-09-03
- Fields of study
Computer Science
- Source metadata
Semantic Scholar
REFERENCES
- 18 references
CITED BY
- 2 citing papers