Dual-Aspect Self-Attention Based on Transformer for Remaining Useful Life Prediction
Zhizheng Zhang, Wen Song, Qiqiang Li
Published 2021 in IEEE Transactions on Instrumentation and Measurement
ABSTRACT
Remaining useful life (RUL) prediction is one of the key technologies of condition-based maintenance (CBM) and is important for maintaining the reliability and safety of industrial equipment. Massive industrial measurement data have effectively improved the performance of data-driven RUL prediction methods. While deep learning has achieved great success in RUL prediction, existing methods have difficulty processing long sequences and extracting information from both the sensor and time-step aspects. In this article, we propose dual-aspect self-attention based on transformer (DAST), a novel deep RUL prediction method with an encoder-decoder structure based purely on self-attention, without any recurrent neural network (RNN) or convolutional neural network (CNN) modules. DAST consists of two encoders that work in parallel to simultaneously extract features of different sensors and time steps. Based solely on self-attention, the DAST encoders are more effective at processing long data sequences and can adaptively learn to focus on the more important parts of the input. Moreover, the parallel feature extraction design avoids mutual interference between information from the two aspects. Experiments on two widely used turbofan engine datasets show that our method significantly outperforms state-of-the-art RUL prediction methods.
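The dual-aspect idea in the abstract can be illustrated with a minimal sketch: the same input window of sensor readings is fed to two self-attention passes, one attending across time steps and one across sensors (by transposing the window). This is not the authors' implementation; it assumes a single attention head with identity query/key/value projections and omits the learned projections, feed-forward layers, and decoder that a full transformer encoder would have.

```python
import numpy as np

def self_attention(X):
    # Scaled dot-product self-attention over the rows of X (shape (n, d)).
    # Single head with identity Q/K/V projections -- a deliberate simplification.
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability for softmax
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)            # each row of w sums to 1
    return w @ X

def dual_aspect_features(X):
    # X: one window of measurements, shape (time_steps, sensors).
    time_feat = self_attention(X)      # attend across time steps
    sensor_feat = self_attention(X.T)  # attend across sensors (transposed view)
    return time_feat, sensor_feat

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 14))  # e.g. 30 time steps, 14 sensors
t_feat, s_feat = dual_aspect_features(X)
print(t_feat.shape, s_feat.shape)  # (30, 14) (14, 30)
```

Because the two passes run on independent views of the window, neither aspect's attention weights depend on the other's output, which mirrors the parallel (non-interfering) feature extraction the abstract describes.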
PUBLICATION RECORD
- Publication year
2021
- Venue
IEEE Transactions on Instrumentation and Measurement
- Publication date
2021-06-30
- Fields of study
Computer Science, Engineering
- Source metadata
Semantic Scholar