An Uncertainty-Aware Bayesian Deep Learning Method for Automatic Identification and Capacitance Estimation of Compensation Capacitors
Published 2026 in Italian National Conference on Sensors
ABSTRACT
This paper addresses the challenges of misclassification and reliability assessment in compensation capacitor detection under strong noise in high-speed railway track circuits. A hierarchical Bayesian deep learning framework is proposed, integrating multi-domain signal enhancement in the time, frequency, and time–frequency (TF) domains with bidirectional long short-term memory (BiLSTM) sequence modeling for robust feature extraction. Bayesian classification and regression based on Monte Carlo (MC) Dropout and stochastic weight averaging Gaussian (SWAG) enable posterior inference, confidence-interval estimation, and uncertainty-aware prediction, while a rejection mechanism filters low-confidence outputs. Experiments on 8782 real-world segments from five railway lines show that the proposed method achieves 97.8% state-recognition accuracy, a mean absolute error of 0.084 μF, and an R² of 0.96. It further outperforms threshold-based, convolutional neural network (CNN), and standard BiLSTM models in negative log-likelihood (NLL), expected calibration error (ECE), and overall calibration quality, approaching the nominal 95% interval coverage. The framework substantially improves robustness, accuracy, and reliability, providing a viable solution for intelligent monitoring and safety assurance of compensation capacitors in track circuits.
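The abstract's MC Dropout posterior inference and low-confidence rejection can be illustrated with a minimal sketch. The toy network, weights, dropout rate, and rejection threshold below are illustrative assumptions standing in for the authors' BiLSTM-based model, not a reproduction of it: dropout is kept active at test time, many stochastic forward passes approximate the posterior predictive, and a prediction whose predictive standard deviation exceeds a threshold is rejected.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer regressor (hypothetical stand-in for the BiLSTM head).
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 1))

def mc_dropout_predict(x, T=100, p_drop=0.2):
    """Run T stochastic forward passes with dropout active at test time.

    The mean and standard deviation over passes approximate the Bayesian
    posterior predictive under the MC Dropout interpretation.
    """
    preds = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
        h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
        preds.append((h @ W2).item())
    preds = np.array(preds)
    return preds.mean(), preds.std()

def predict_or_reject(x, sigma_max):
    """Return the prediction, or None when predictive uncertainty is too high."""
    mu, sigma = mc_dropout_predict(x)
    return None if sigma > sigma_max else mu

x = rng.normal(size=(1, 8))       # one illustrative input sample
mu, sigma = mc_dropout_predict(x)
```

The rejection threshold `sigma_max` plays the role of the paper's low-confidence filter: a loose threshold accepts the estimate, while a tight one abstains rather than emit an unreliable capacitance value.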
PUBLICATION RECORD
- Publication year: 2026
- Venue: Italian National Conference on Sensors
- Publication date: 2026-01-01
- Fields of study: Medicine, Computer Science, Engineering
- Source metadata: Semantic Scholar, PubMed