P3T: A Transformer Model for Enhancing Character Recognition Rates in P300 Speller Systems

Jiazhen Hong, L. Najafizadeh

Published 2024 in Asilomar Conference on Signals, Systems and Computers

ABSTRACT

Transformer architectures have recently been widely used for natural language processing tasks; however, their application to the processing of physiological signals, such as electroencephalography (EEG), has remained limited. In this paper, we introduce P300-Transformer (P3T), a new single-trial P300 detection model designed to optimize the information transfer rate (ITR) in P300-BCI speller systems while maintaining a high character recognition rate. Evaluation of the proposed P3T model on the publicly available P300 dataset from BCI Competition III demonstrates an average character recognition rate of 97.50% and an ITR of 20.52 bits/min. Compared to state-of-the-art models, our results suggest that the proposed transformer-based P3T model can significantly enhance the character recognition rate in P300 speller systems while eliminating the need for extensive feature extraction.
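The ITR figure reported in the abstract is conventionally computed with the standard Wolpaw formula, which combines the number of possible selections, the selection accuracy, and the time per selection. The sketch below is an illustration of that standard formula, not code from the paper; the 6x6 (36-character) speller matrix and the 97.5% accuracy are assumptions drawn from typical P300 speller setups and the reported recognition rate, and the time per selection is left as a parameter since the abstract does not state it.

```python
import math

def itr_bits_per_selection(n_classes, accuracy):
    """Wolpaw ITR: bits conveyed per selection.

    n_classes: number of possible selections (e.g., 36 for a 6x6 speller matrix).
    accuracy: probability of a correct selection, in (0, 1].
    """
    if accuracy >= 1.0:
        return math.log2(n_classes)
    return (math.log2(n_classes)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_classes - 1)))

def itr_bits_per_min(n_classes, accuracy, seconds_per_selection):
    """Scale bits per selection to bits per minute given the selection time."""
    return itr_bits_per_selection(n_classes, accuracy) * 60.0 / seconds_per_selection

# Hypothetical example: a 36-character matrix at the reported 97.5% accuracy
# yields roughly 4.87 bits per selection; bits/min then depends on the
# (unstated) time each selection takes.
bits = itr_bits_per_selection(36, 0.975)
```

Note that the bits/min value depends strongly on the per-selection time (number of stimulus repetitions, inter-stimulus interval), which is why single-trial detection models such as P3T can raise ITR even at a fixed accuracy.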
