Towards Autonomous Micromanipulation Using Imitation Learning via Action Chunking with Transformer
Yun Long, Tiexin Wang, Tianle Weng, Liangjing Yang
Published 2025 in 2025 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM)
ABSTRACT
In micromanipulation, modeling the dynamics and environmental interactions of micro-objects is challenging. This study employs imitation learning (IL) to avoid explicit modeling, enabling stable and efficient autonomous operation in robotic micromanipulation systems (RMS) by learning strategies directly from human demonstrations. We propose Micro-ACT, an IL model tailored for RMS and based on the Action Chunking with Transformers (ACT) framework. Micro-ACT uses multimodal observations and latent variables to generate action sequences for the tool tip. Users can demonstrate actions intuitively by dragging the tool tip with a cursor on a graphical interface. Tool-tip positioning relies on an enhanced self-calibration method in which calibration samples are weighted by their template-matching similarity and the calibration matrix is optimized with weighted least squares. Experiments confirm that this method tracks the tool tip accurately during continuous demonstration and inference. On a pushing task spanning approximately 6.79 mm, Micro-ACT performs comparably to expert operation, averaging 19.56 s with an 80% success rate and an IoU of 0.79. Incorporating motor-velocity data further improves control of subtle movements. Moreover, Micro-ACT remains robust under artificial visual and physical disturbances, underscoring its reliability.
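The abstract describes a self-calibration step in which each calibration sample is weighted by its template-matching similarity and the calibration matrix is then fit by weighted least squares. A minimal sketch of that idea, assuming an affine image-to-stage mapping and similarity scores in [0, 1] (the function name, matrix shape, and data layout here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def weighted_calibration(img_pts, stage_pts, sims):
    """Fit a 2x3 affine matrix mapping image (pixel) coordinates of the
    tool tip to stage coordinates, weighting each calibration sample by
    its template-matching similarity score (hypothetical sketch)."""
    img_pts = np.asarray(img_pts, dtype=float)    # (N, 2) detected tool-tip pixels
    stage_pts = np.asarray(stage_pts, dtype=float)  # (N, 2) known stage positions
    w = np.asarray(sims, dtype=float)             # (N,) similarity weights in [0, 1]

    # Homogeneous image coordinates: [x, y, 1]
    A = np.hstack([img_pts, np.ones((len(img_pts), 1))])  # (N, 3)

    # Weighted least squares: scale each row of A and the targets by sqrt(w),
    # so samples with low template-matching similarity contribute less.
    s = np.sqrt(w)[:, None]
    M, *_ = np.linalg.lstsq(s * A, s * stage_pts, rcond=None)  # (3, 2)
    return M.T  # 2x3 calibration matrix


def image_to_stage(M, img_pt):
    """Apply the calibration matrix to a single image point."""
    x, y = img_pt
    return M @ np.array([x, y, 1.0])
```

With uniform weights this reduces to ordinary least squares; down-weighting low-similarity samples is what makes the fit tolerant of occasional poor template matches during continuous tracking.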
PUBLICATION RECORD
- Publication date: 2025-07-14
- Source metadata: Semantic Scholar