Three-Branch Network for Multi-Scale Spatiotemporal Feature Fusion in Remote Physiological Measurement
Zhipeng Li, Hanguang Xiao, Ziyi Xia, Feizhong Zhou, Xiaoxuan Huang, Tianqi Liu
Published 2025 in IEEE Transactions on Consumer Electronics

ABSTRACT
Remote photoplethysmography (rPPG) is a non-contact technique for measuring physiological signals from facial videos, with significant potential across a range of applications. Recent advances leveraging multi-scale feature fusion have shown superior performance in extracting physiological signals. However, many existing methods treat the temporal and spatial dimensions in isolation, overlooking the inherent correlations between them in video data. To address this gap, we propose a novel three-branch architecture, termed TBNet, for rPPG signal measurement. TBNet employs a fast stream, a global stream, and a local stream to independently capture multi-scale facial video features across the temporal and spatial dimensions, then fuses them for robust physiological signal extraction. To further enhance feature representation and mitigate the impact of complex noise, we introduce a feature enhancement module. Additionally, we design a hybrid loss function that, based on the three-branch structure, simultaneously optimizes performance in both the temporal and frequency domains. Experimental results on three public datasets demonstrate that TBNet achieves state-of-the-art performance in the heart rate (HR) estimation task, with MAE/RMSE values of 0.16/0.40 on UBFC-rPPG, 0.47/0.69 on PURE, and 0.51/1.39 on COHFACE.
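The abstract describes a hybrid loss combining temporal- and frequency-domain objectives. The paper's exact formulation is not given here, so the following is only a minimal sketch of how such a hybrid rPPG loss is commonly built: a negative-Pearson term on the predicted pulse waveform plus a penalty on the dominant-frequency (heart-rate) mismatch. All function names, the HR band limits, and the `alpha`/`beta` weights are illustrative assumptions, not TBNet's definitions.

```python
import numpy as np

def neg_pearson_loss(pred, target):
    """Time-domain term: 1 - Pearson correlation between predicted
    and reference pulse waveforms (a common rPPG training loss)."""
    pred = pred - pred.mean()
    target = target - target.mean()
    denom = np.sqrt((pred ** 2).sum() * (target ** 2).sum()) + 1e-8
    return 1.0 - (pred * target).sum() / denom

def freq_loss(pred, target, fs=30.0):
    """Frequency-domain term: squared difference between the dominant
    spectral peaks of the two signals, restricted to a plausible HR
    band of 0.7-4.0 Hz (42-240 bpm). Band limits are an assumption."""
    freqs = np.fft.rfftfreq(len(pred), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    f_pred = freqs[band][np.argmax(np.abs(np.fft.rfft(pred))[band])]
    f_targ = freqs[band][np.argmax(np.abs(np.fft.rfft(target))[band])]
    return (f_pred - f_targ) ** 2

def hybrid_loss(pred, target, fs=30.0, alpha=1.0, beta=1.0):
    # alpha/beta weights are placeholders, not values from the paper
    return alpha * neg_pearson_loss(pred, target) + beta * freq_loss(pred, target, fs)

# Example: two 10-second pulse waveforms at 30 fps, 72 bpm vs 75 bpm
t = np.arange(300) / 30.0
target = np.sin(2 * np.pi * 1.20 * t)  # 72 bpm reference
pred = np.sin(2 * np.pi * 1.25 * t)    # 75 bpm prediction
print(hybrid_loss(pred, target))
```

A perfect prediction drives both terms to zero, while the two terms penalize complementary errors: waveform-shape distortion in the time domain and HR-peak drift in the frequency domain.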
PUBLICATION RECORD
- Publication year: 2025
- Venue: IEEE Transactions on Consumer Electronics
- Publication date: 2025-11-01
- Fields of study: Medicine, Computer Science, Engineering
- Source metadata: Semantic Scholar