A Learning Scheme for EMG Based Decoding of Dexterous, In-Hand Manipulation Motions
Anany Dwivedi, Yongje Kwon, A. McDaid, Minas Liarokapis
Published 2019 in IEEE Transactions on Neural Systems and Rehabilitation Engineering
ABSTRACT
Electromyography (EMG) based interfaces are among the most common solutions for controlling robotic, orthotic, prosthetic, assistive, and rehabilitation devices, translating myoelectric activations into meaningful actions. In recent years, considerable emphasis has been placed on EMG based decoding of human intention, but few studies have focused on the continuous decoding of human motion. In this work, we present a learning scheme for the EMG based decoding of object motions in dexterous, in-hand manipulation tasks. We also study the contribution of different muscles to these tasks and the effect of gender and hand size on the overall decoding accuracy. To do so, we use EMG signals derived from 16 muscle sites (8 on the hand and 8 on the forearm) of 11 subjects, together with an optical motion capture system that records the object motion. The object motion decoding is formulated as a regression problem using the Random Forests methodology. Regarding feature selection, we use the following time-domain features: root mean square, waveform length, and zero crossings. A 10-fold cross-validation procedure is used for model assessment, and feature variable importance values are calculated for each feature. This study shows that subject-specific, hand-specific, and object-specific decoding models offer better decoding accuracy than generic models.
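The abstract names three time-domain EMG features: root mean square, waveform length, and zero crossings. As a minimal sketch of how such features could be computed per channel and per window (the function name and the threshold-free zero-crossing count are illustrative assumptions, not the authors' implementation, which may apply an amplitude threshold):

```python
import numpy as np

def extract_td_features(window):
    """Compute the three time-domain EMG features named in the abstract
    for a single-channel signal window: RMS, waveform length, zero crossings.
    Illustrative sketch; not the paper's exact implementation."""
    window = np.asarray(window, dtype=float)
    rms = np.sqrt(np.mean(window ** 2))       # root mean square (signal energy)
    wl = np.sum(np.abs(np.diff(window)))      # waveform length (cumulative amplitude change)
    # Zero crossings: number of sign changes between consecutive samples
    # (no amplitude threshold applied in this sketch).
    zc = int(np.sum(np.signbit(window[:-1]) != np.signbit(window[1:])))
    return rms, wl, zc
```

With 16 muscle sites and 3 features per channel, stacking these values would yield a 48-dimensional feature vector per window, which could then feed a Random Forest regressor of the kind the abstract describes.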
PUBLICATION RECORD
- Publication year
2019
- Venue
IEEE Transactions on Neural Systems and Rehabilitation Engineering
- Publication date
2019-08-21
- Fields of study
Medicine, Computer Science, Engineering
- Source metadata
Semantic Scholar, PubMed
REFERENCES
46 references
CITED BY
39 citing papers