Characterizing Input Methods for Human-to-Robot Demonstrations
Pragathi Praveena, G. Subramani, Bilge Mutlu, Michael Gleicher
Published 2019 in IEEE/ACM International Conference on Human-Robot Interaction

ABSTRACT
Human demonstrations are important in a range of robotics applications and are created with a variety of input methods. However, the design space for these input methods has not been extensively studied. In this paper, focusing on demonstrations of hand-scale object manipulation tasks to robot arms with two-finger grippers, we identify distinct usage paradigms in robotics that utilize human-to-robot demonstrations, extract abstract features that form a design space for input methods, and characterize existing input methods as well as a novel input method that we introduce, the instrumented tongs. We detail the design specifications for our method and present a user study that compares it against three common input methods: free-hand manipulation, kinesthetic guidance, and teleoperation. Study results show that instrumented tongs provide high-quality demonstrations and a positive experience for the demonstrator while offering good correspondence to the target robot.
PUBLICATION RECORD
- Publication date: 2019-01-31
- Fields of study: Computer Science, Engineering
- Source metadata: Semantic Scholar