Multi-scale EEG feature decoding with Swin Transformers for subject-independent motor imagery BCIs
Wasi Ur Rehman Qamar, B. Abibullaev
Published 2026 in Scientific Reports

ABSTRACT
High inter-subject variability and the non-stationary nature of EEG signals pose significant challenges for subject-independent Brain-Computer Interfaces (BCIs), leading to poor model generalization. Differences in neural activity patterns, electrode placement, and external noise further degrade performance, making it difficult to develop BCIs that remain reliable across users without extensive recalibration. This study presents a Compact Convolutional Swin Transformer (CCST) that addresses this issue by combining hierarchical window-based self-attention with convolutional feature extraction to efficiently capture both local electrode interactions and global temporal dependencies. The resulting multi-scale feature representation enhances generalization across subjects, a critical factor for real-world BCI deployment. We evaluated CCST on the BCI Competition IV (2a, 2b) and PhysioNet MI datasets using Leave-One-Subject-Out (LOSO) cross-validation, achieving state-of-the-art classification accuracies of 68.27%, 76.61%, and 71.70%, respectively. Statistical analysis using the Wilcoxon signed-rank test with Bonferroni correction confirms significant performance improvements over benchmark models. Additionally, CCST reduces parameter count and FLOPs compared to full self-attention models, making it more efficient for real-time BCI applications. These results establish CCST as a scalable and efficient framework for adaptive, subject-independent BCIs with promising applications in neurorehabilitation, assistive technology, and cognitive training.
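The LOSO evaluation protocol and the Wilcoxon signed-rank test with Bonferroni correction mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the per-subject accuracies are synthetic placeholders, and the number of benchmark comparisons (`m = 3`) is an assumption.

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic per-subject LOSO accuracies (illustrative only, not the paper's data).
n_subjects = 9  # BCI Competition IV-2a has 9 subjects
rng = np.random.default_rng(0)
ccst_acc = rng.uniform(0.60, 0.80, n_subjects)
baseline_acc = ccst_acc - rng.uniform(0.02, 0.10, n_subjects)

# Leave-One-Subject-Out: each fold trains on all subjects but one
# and tests on the held-out subject's trials.
subjects = np.arange(n_subjects)
for held_out in subjects:
    train_subjects = subjects[subjects != held_out]
    # ... train on train_subjects' EEG trials, evaluate on held_out ...

# Paired, one-sided Wilcoxon signed-rank test over per-subject accuracies.
stat, p = wilcoxon(ccst_acc, baseline_acc, alternative="greater")

# Bonferroni correction for m pairwise comparisons against m benchmark models.
m = 3
p_corrected = min(p * m, 1.0)
print(f"W={stat:.1f}, raw p={p:.5f}, Bonferroni-corrected p={p_corrected:.5f}")
```

Pairing the test over subjects (rather than pooling trials) matches the LOSO setting, since each subject contributes one accuracy per model.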
PUBLICATION RECORD
- Publication date: 2026-01-20
- Venue: Scientific Reports
- Fields of study: Medicine, Computer Science, Engineering
- Source metadata: Semantic Scholar, PubMed