Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing
Ying Yuan, Haichuan Che, Yuzhe Qin, Binghao Huang, Zhao-Heng Yin, Kang-Won Lee, Yi Wu, Soo-Chul Lim, Xiaolong Wang
Published 2023 in IEEE International Conference on Robotics and Automation
ABSTRACT
Executing contact-rich manipulation tasks necessitates fusing tactile and visual feedback. However, the distinct nature of these modalities poses significant challenges. In this paper, we introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation. Specifically, we propose Robot Synesthesia, a novel point cloud-based tactile representation inspired by human tactile-visual synesthesia. This approach allows simultaneous and seamless integration of both sensory inputs, offering richer spatial information and facilitating better reasoning about robot actions. We perform comprehensive ablations on how the integration of vision and touch improves reinforcement learning and Sim2Real performance. Our project page is available at https://yingyuan0414.github.io/visuotactile/.
PUBLICATION RECORD
- Publication date: 2023-12-04
- Fields of study: Computer Science, Engineering
- Source metadata: Semantic Scholar