P3T: A Transformer Model for Enhancing Character Recognition Rates in P300 Speller Systems
Published 27 October 2024 in the Asilomar Conference on Signals, Systems and Computers
ABSTRACT
Transformer architectures have recently been widely adopted for natural language processing tasks; however, their application to physiological signals, such as electroencephalography (EEG), remains limited. In this paper, we introduce P300-Transformer (P3T), a new single-trial P300 detection model designed to optimize the information transfer rate (ITR) of P300-BCI speller systems while maintaining a high character recognition rate. Evaluated on the publicly available P300 dataset from BCI Competition III, the proposed P3T model achieves an average character recognition rate of 97.50% and an ITR of 20.52 bits/min. Compared with state-of-the-art models, these results suggest that the transformer-based P3T model can significantly enhance the character recognition rate of P300 speller systems while eliminating the need for extensive feature extraction.
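For context, ITR in P300 spellers is conventionally reported with the standard Wolpaw formula, which combines the number of selectable symbols N, the selection accuracy P, and the time T needed per selection. The abstract does not state the paper's timing parameters, so the sketch below only illustrates the standard metric: wolpaw_itr is a hypothetical helper name, and the 14-second selection time is an assumed placeholder, not a value taken from the paper.

    import math

    def wolpaw_itr(n_classes: int, accuracy: float, time_per_selection_s: float) -> float:
        """Information transfer rate in bits/min via the standard Wolpaw formula."""
        n, p = n_classes, accuracy
        if p >= 1.0:
            bits = math.log2(n)  # perfect accuracy: full log2(N) bits per symbol
        else:
            bits = (math.log2(n)
                    + p * math.log2(p)
                    + (1 - p) * math.log2((1 - p) / (n - 1)))
        return bits * 60.0 / time_per_selection_s

    # 6x6 speller matrix (36 symbols) at the reported 97.50% accuracy;
    # the 14 s per selection is an assumed placeholder.
    print(f"{wolpaw_itr(36, 0.975, 14.0):.2f} bits/min")

At roughly 14 s per selection this evaluates to about 20.9 bits/min, in the same range as the reported 20.52 bits/min; the paper's actual stimulation and selection timing may differ.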