GFASNet: Gait feature attention-driven deep sequential network for dementia-related gait pattern analysis

Quynh Hoang Ngan Nguyen, Ankhzaya Jamsrandorj, Dawoon Jung, Sung Woo Kim, Jinwook Kim, Min Seok Baek, Kyung-Ryoul Mun

Published 2026 in Artificial Intelligence in Medicine

ABSTRACT

Deep learning models leveraging human activity data, such as gait, have shown promise for dementia prediction. However, their limited interpretability and lack of clinically meaningful insights restrict their translational value in cognitive health research. This study introduces GFASNet, a Gait Feature Attention-driven Deep Sequential Network designed primarily to identify dementia-related gait alterations through model-derived attention mechanisms, which may serve as candidate digital biomarkers. GFASNet incorporates feature-level attention into sequential deep learning architectures to enhance model transparency and quantify the relative contribution of individual gait parameters. Spatiotemporal gait data were collected from 232 participants performing free-walking tests on a pressure-sensor walkway. Gait sequences composed of eight consecutive strides were used to train and evaluate four GFASNet variants based on distinct recurrent architectures: Long Short-Term Memory, Bidirectional Long Short-Term Memory, Gated Recurrent Unit, and Bidirectional Gated Recurrent Unit. All GFASNet models outperformed non-attention baselines in classification tasks. Crucially, attention weight analysis indicated that the models consistently focused on specific gait features when distinguishing dementia cases, highlighting their potential relevance as digital biomarkers. These findings demonstrate that GFASNet not only enhances dementia identification but also facilitates interpretable and clinically relevant gait analysis for cognitive health research.
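The feature-level attention described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the number of gait features, the learned scoring vector, and the softmax-based weighting are all hypothetical stand-ins for whatever GFASNet actually learns. The sketch shows only the core idea — each of the eight strides gets a normalized weight per gait parameter, and averaging those weights over strides yields the kind of per-feature importance profile the paper treats as a candidate digital biomarker.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n_strides, n_features = 8, 12          # 8 strides per sequence; feature count is hypothetical
X = rng.standard_normal((n_strides, n_features))   # stand-in spatiotemporal gait features

# Feature-level attention: a (hypothetical) learned scoring vector rates each
# gait parameter; softmax across the feature axis turns the scores into
# weights that sum to 1 for every stride.
w_score = rng.standard_normal(n_features)   # stand-in for learned parameters
scores = X * w_score                        # per-stride, per-feature scores
alpha = softmax(scores, axis=1)             # attention weights over features
X_att = alpha * X                           # re-weighted input fed to the recurrent encoder

# Averaging the weights over strides gives one importance value per gait
# parameter -- the profile analyzed as a candidate digital biomarker.
feature_importance = alpha.mean(axis=0)

print(alpha.shape)                          # (8, 12)
print(np.allclose(alpha.sum(axis=1), 1.0))  # True: weights normalized per stride
```

In the full model this re-weighted sequence would be passed to one of the four recurrent encoders (LSTM, BiLSTM, GRU, or BiGRU); the attention weights themselves are what make the gait-parameter contributions inspectable.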
