The Yale human grasping dataset: Grasp, object, and task data in household and machine shop environments
Ian M. Bullock, Thomas Feix, Aaron M. Dollar
Published 2015 in Int. J. Robotics Res.

ABSTRACT
This paper presents a dataset of human grasping behavior in unstructured environments. Wide-angle head-mounted camera video was recorded from two housekeepers and two machinists during their regular work activities, and the grasp types, objects, and tasks were analyzed and coded by study staff. The full dataset contains 27.7 hours of tagged video and represents a wide range of manipulative behaviors spanning much of typical human hand usage. We provide the original videos, a spreadsheet including the tagged grasp type, object, and task parameters, timing information for each successive grasp, and video screenshots for each instance. Example code is provided for MATLAB and R, demonstrating how to load the dataset and produce simple plots.
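The abstract describes a spreadsheet in which each row is one tagged grasp instance (grasp type, object, task, and timing). The paper's own examples are in MATLAB and R; as a rough illustration of the kind of analysis the dataset supports, here is a minimal Python sketch. The column names and sample rows below are assumptions for illustration, not the dataset's actual headers or contents.

```python
import csv
import io
from collections import Counter

# Hypothetical rows mimicking the dataset's spreadsheet layout.
# Column names (grasp, object, task, start_s, end_s) are assumed,
# not the actual headers used in the released files.
SAMPLE = """grasp,object,task,start_s,end_s
medium wrap,mug,drinking,0.0,4.2
precision pinch,screw,fastening,4.2,9.8
medium wrap,mug,drinking,9.8,12.1
"""

def grasp_frequencies(csv_text):
    """Count how often each tagged grasp type occurs."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["grasp"] for row in reader)

def grasp_durations(csv_text):
    """Total time in seconds spent in each grasp type."""
    reader = csv.DictReader(io.StringIO(csv_text))
    totals = Counter()
    for row in reader:
        totals[row["grasp"]] += float(row["end_s"]) - float(row["start_s"])
    return totals

if __name__ == "__main__":
    print(grasp_frequencies(SAMPLE))
    print(grasp_durations(SAMPLE))
```

With real data one would read the released spreadsheet from disk instead of the inline sample; the per-grasp counts and cumulative durations computed here correspond to the simple summary plots the authors mention.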
PUBLICATION RECORD
- Publication year: 2015
- Venue: Int. J. Robotics Res.
- Publication date: 2015-03-01
- Fields of study: Computer Science, Engineering
- Source metadata: Semantic Scholar