MSCNN-GCN: Graph-Based Tactile Learning Enhancement for Property and Object Recognition With Sparse Sensor Configuration
Zhangyi Chen, Long Wang, Yao Luo, Songyuan Han, Jinbo Jiang, Yuhao Liu, Xiaoling Li
Published 2025 in IEEE Sensors Journal
ABSTRACT
Tactile perception is essential for autonomous robots to perform intelligent manipulation and effective human-robot interaction. Unlike visual sensors, tactile sensors acquire rich physical information through direct contact, enabling recognition of both object properties and categories. However, tactile data are often irregular, heterogeneous, and complex, making effective representation and analysis challenging. In this study, a tactile recognition framework named MSCNN-GCN is proposed. The method combines multiscale convolutional neural networks (CNNs) and graph convolutional networks (GCNs) to enable structured feature extraction from tactile signals. Specifically, a graph is constructed to represent the correlations among tactile channels, and GCNs are used to model their topological and semantic relationships. In parallel, multiscale CNNs extract local spatial features from the tactile input. The two types of features are then integrated through an attention-based fusion mechanism, yielding enhanced representation capability and improved recognition robustness. Experiments are conducted on a self-collected dataset of nine objects for two tasks: property recognition and object recognition. MSCNN-GCN achieves accuracies of 93.33% for stiffness recognition, 96.11% for shape recognition, and 95.72% for object recognition, outperforming several baseline methods. The results highlight the effectiveness of GCN-based modeling for tactile data and provide a promising approach for structured learning from multichannel tactile signals.
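The abstract describes a two-branch pipeline: a GCN branch that propagates features over a graph of tactile channels, a multiscale CNN branch that extracts temporal features at several kernel sizes, and an attention mechanism that fuses the two. The following is a minimal numpy sketch of that pipeline under loose assumptions; the graph construction, layer shapes, and attention scoring are illustrative placeholders, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(X, A, W):
    # Symmetrically normalized graph convolution: relu(D^-1/2 (A+I) D^-1/2 X W)
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def multiscale_conv(x, kernel_sizes=(3, 5, 7)):
    # 1-D convolutions at several kernel sizes; mean-pool each scale
    feats = []
    for k in kernel_sizes:
        w = rng.standard_normal(k) / k          # random filter stands in for learned weights
        feats.append(np.convolve(x, w, mode="same").mean())
    return np.array(feats)

def attention_fuse(f_cnn, f_gcn):
    # Softmax attention over the two branch features (placeholder scoring)
    scores = np.array([f_cnn.mean(), f_gcn.mean()])
    alpha = np.exp(scores) / np.exp(scores).sum()
    n = min(len(f_cnn), len(f_gcn))
    return alpha[0] * f_cnn[:n] + alpha[1] * f_gcn[:n]

# Toy tactile input: 4 channels x 64 time steps; fully connected channel graph
X = rng.standard_normal((4, 64))
A = np.ones((4, 4)) - np.eye(4)
W = rng.standard_normal((64, 3))

f_gcn = gcn_layer(X, A, W).mean(axis=0)   # pooled per-channel graph features, shape (3,)
f_cnn = multiscale_conv(X.mean(axis=0))   # one feature per kernel size, shape (3,)
fused = attention_fuse(f_cnn, f_gcn)      # fused representation, shape (3,)
```

In practice the fused vector would feed a classifier head for the stiffness, shape, and object recognition tasks; here it simply illustrates how the channel-graph and multiscale branches contribute complementary features.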
PUBLICATION RECORD
- Publication year: 2025
- Venue: IEEE Sensors Journal
- Publication date: 2025-12-15
- Source metadata: Semantic Scholar