Decomposed Meta Batch Normalization for Fast Domain Adaptation in Face Recognition
Jianzhu Guo, Xiangyu Zhu, Zhen Lei, Stan Z. Li
Published 2021 in IEEE Transactions on Information Forensics and Security
ABSTRACT
Face recognition systems are sometimes deployed to a target domain where only limited unlabeled samples are available. For instance, a model trained on large-scale web faces may need to adapt to a NIR-VIS scenario using very few unlabeled faces. This setting poses a great challenge for Unsupervised Domain Adaptation with Limited samples for Face Recognition (UDAL-FR), which has received little attention in previous work. In this paper, we propose a novel training remedy for deep networks that decomposes the model into the weight parameters and the BN statistics during the training phase. Based on this decomposition, we design a meta-learning framework, called Decomposed Meta Batch Normalization (DMBN), for fast domain adaptation in face recognition. DMBN trains the network so that domain-invariant information tends to be stored in the weight parameters while domain-specific knowledge tends to be captured by the BN statistics. Specifically, DMBN constructs distribution-shifted tasks via domain-aware sampling, from which several meta-gradients are obtained by optimizing discriminative representations across different BNs. Finally, the weight parameters are updated with these meta-gradients to improve consistency across different BNs. With the learned weight parameters, adaptation is very fast, since only the BN statistics need to be updated on the limited target data. We further propose two UDAL-FR benchmarks to evaluate a model's domain-adaptive ability with limited unlabeled samples. Extensive experiments validate the efficacy of the proposed DMBN.
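The fast-adaptation step described in the abstract — freezing the learned weight parameters and refreshing only the BatchNorm statistics on the limited unlabeled target data — can be sketched as follows. This is an illustrative toy implementation in NumPy, not the authors' code: the class `BatchNormStats` and the function `adapt_bn` are hypothetical names, and the standard momentum-based running-statistics update stands in for a full BN layer (affine weights and the surrounding network are omitted).

```python
import numpy as np

class BatchNormStats:
    """Running mean/variance of one BN layer (affine weights omitted)."""
    def __init__(self, num_features, momentum=0.1):
        self.mean = np.zeros(num_features)
        self.var = np.ones(num_features)
        self.momentum = momentum

    def update(self, batch):
        # Standard momentum-based running-statistics update on one batch
        # of activations, as used by BN layers in training mode.
        m = self.momentum
        self.mean = (1 - m) * self.mean + m * batch.mean(axis=0)
        self.var = (1 - m) * self.var + m * batch.var(axis=0)

def adapt_bn(bn, target_batches):
    """Fast adaptation: forward the unlabeled target samples and refresh
    only the BN statistics; no gradient step touches the weight parameters."""
    for batch in target_batches:
        bn.update(batch)
    return bn

# Simulated target-domain activations whose distribution is shifted
# (mean 2.0) relative to the source statistics the BN was initialized with.
rng = np.random.default_rng(0)
target_batches = [rng.normal(loc=2.0, scale=1.0, size=(64, 8)) for _ in range(50)]
bn = adapt_bn(BatchNormStats(num_features=8), target_batches)
```

After adaptation, `bn.mean` has drifted toward the target-domain mean, while the (frozen) weight parameters would carry the domain-invariant knowledge learned by DMBN; this is why the paper can adapt with only a handful of unlabeled faces.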
PUBLICATION RECORD
- Publication year
2021
- Venue
IEEE Transactions on Information Forensics and Security
- Fields of study
Computer Science
- Source metadata
Semantic Scholar
REFERENCES
72 references
CITED BY
5 citing papers