Automatic Action Space Curriculum Learning with Dynamic Per-Step Masking
Published 2023 in 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE)
ABSTRACT
Curriculum learning (CL) over the action space is an effective approach to deep reinforcement learning (DRL) in high-dimensional action spaces with limited training data. However, it can be challenging to determine which part of the action space to focus on during training. We propose an automatic CL approach that dynamically restricts the action space using masking. Our approach selects the part of the action space to focus on based on its learning progress at each step of the training process, rather than once per episode, which encourages the policy to explore the actions on which learning is progressing fastest while still allowing it to learn coordination with the remaining actions. To evaluate the proposed CL approach, we tested it on an industrial robot performing anchor bolt insertion, a task in high demand for automation in the construction industry. The results indicate that our approach improved the learning performance of the DRL policy, resulting in better task execution: policies trained with our approach outperformed those trained from scratch and those trained with similar action-space CL approaches. Although we only tested our approach on the anchor bolt insertion task, we believe it can be applied to other DRL policies with high-dimensional discrete action spaces. Overall, our automatic CL approach shows promise for improving the automation of field tasks with real robots.
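The mechanism described in the abstract (tracking per-group learning progress and building an action mask at every step) can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the grouping of actions, the `ActionCurriculum` class, and the use of an exponential moving average of return improvement as the learning-progress signal are all assumptions for the sake of the example.

```python
# Hypothetical sketch of per-step action-space curriculum learning via
# masking. The grouping scheme and progress estimator are assumptions,
# not the paper's exact method.
import numpy as np

def masked_softmax_sample(logits, mask, rng):
    """Sample an action from logits restricted to entries where mask is True."""
    masked = np.where(mask, logits, -np.inf)
    z = masked - masked.max()          # numerically stable softmax
    probs = np.exp(z)
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))

class ActionCurriculum:
    """Tracks per-group learning progress and builds a per-step action mask."""
    def __init__(self, groups, beta=0.1):
        self.groups = groups                    # list of index arrays, one per group
        self.beta = beta                        # EMA smoothing factor
        self.progress = np.zeros(len(groups))   # EMA of recent return improvement

    def update(self, group_id, return_delta):
        # Learning progress = smoothed recent improvement in return
        # attributed to actions in this group.
        self.progress[group_id] = ((1 - self.beta) * self.progress[group_id]
                                   + self.beta * return_delta)

    def mask(self, n_actions):
        # Enable the fastest-learning group for this step; a fuller version
        # would also let other actions through so the policy can keep
        # learning cross-action coordination.
        m = np.zeros(n_actions, dtype=bool)
        focus = int(np.argmax(self.progress))
        m[self.groups[focus]] = True
        return m, focus
```

Usage: after each step (or small window of steps), call `update` with the observed change in return for the active group, then rebuild the mask before sampling the next action, e.g. `mask, _ = curriculum.mask(n_actions); a = masked_softmax_sample(policy_logits, mask, rng)`.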
PUBLICATION RECORD
- Publication year
2023
- Venue
2023 IEEE 19th International Conference on Automation Science and Engineering (CASE)
- Publication date
2023-08-26
- Fields of study
Computer Science
- Source metadata
Semantic Scholar
REFERENCES
21 references.
CITED BY
2 citing papers.