Self-supervised Distillation Method for Lightweight Convolutional Networks
Published 2025 in 2025 6th International Conference on Computer Engineering and Intelligent Control (ICCEIC)
ABSTRACT
Convolutional neural networks (CNNs) have demonstrated excellent performance across a wide range of image classification tasks. However, their large parameter counts and dependence on large-scale annotated data severely restrict their use in small-sample and resource-constrained settings. To address this issue, we present a lightweight image classification framework that couples self-supervised teacher pre-training with knowledge distillation to a compact student. The teacher (ResNet-18) is pre-trained with SimCLR and, under the same unlabeled budget, with MoCo v2 and BYOL; the student (LiteNet-DSC) uses depthwise-separable convolutions with batch normalization. Beyond final-logit distillation, we add a lightweight feature-level transfer that aligns two intermediate stages, and we incorporate consistency regularization with semi-supervised pseudo-labels. On MNIST and CIFAR-10 under a low-label regime (CIFAR-10: 10k labeled / 35k unlabeled), the student reaches up to 93.1% top-1 accuracy with ~0.5M parameters and 0.50 ms single-image latency, about 5× faster than the teacher. Ablations show consistent gains from alternative SSL objectives (BYOL ≥ MoCo v2 > SimCLR) and from feature-level transfer. The approach achieves a favorable accuracy-overhead trade-off suitable for mobile and embedded deployment.
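The abstract combines three training signals on top of a depthwise-separable student: temperature-scaled logit distillation from the self-supervised teacher, feature-level transfer between two intermediate stages, and pseudo-label consistency on unlabeled data. The sketch below is a minimal PyTorch illustration of how such components are commonly assembled; the module layout, loss formulations, and hyperparameters (temperature, mixing weights, confidence threshold) are assumptions for illustration, not the authors' released code.

```python
# Minimal sketch (not the paper's implementation) of the three loss components
# described in the abstract. All names and default values are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthwiseSeparableConv(nn.Module):
    """Depthwise-separable conv block with batch norm, as in the student (assumed layout)."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))

def distillation_loss(student_logits, teacher_logits, labels=None, T=4.0, alpha=0.7):
    """Soft-target KD (KL divergence at temperature T), optionally mixed with supervised CE."""
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    if labels is None:
        return kd
    return alpha * kd + (1.0 - alpha) * F.cross_entropy(student_logits, labels)

def feature_transfer_loss(student_feats, teacher_feats, projectors):
    """MSE alignment of two intermediate stages; 1x1 projectors match channel widths (assumption)."""
    return sum(F.mse_loss(p(s), t.detach())
               for s, t, p in zip(student_feats, teacher_feats, projectors))

def consistency_loss(logits_weak, logits_strong, threshold=0.95):
    """Pseudo-label consistency on unlabeled data (one plausible instantiation)."""
    probs = F.softmax(logits_weak.detach(), dim=1)
    conf, pseudo = probs.max(dim=1)
    mask = (conf >= threshold).float()
    return (F.cross_entropy(logits_strong, pseudo, reduction="none") * mask).mean()
```

A total objective would then sum these terms with scalar weights, e.g. loss = distillation_loss(...) + lam_f * feature_transfer_loss(...) + lam_u * consistency_loss(...), where lam_f and lam_u are hypothetical weighting coefficients.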
PUBLICATION RECORD
- Publication year: 2025
- Venue: 2025 6th International Conference on Computer Engineering and Intelligent Control (ICCEIC)
- Publication date: 2025-10-17
- Fields of study: not labeled
- Source metadata: Semantic Scholar