Unsupervised Discovery of Objects Physical Properties Through Maximum Entropy Reinforcement Learning
Maxime Chareyre, Pierre Fournier, Julien Moras, Jean-Marc Bourinet, Y. Mezouar
Published 2025 in IEEE Robotics and Automation Letters

ABSTRACT
Understanding the environment is crucial for autonomous robots to perform navigation and manipulation tasks. Never-before-seen objects may have complex appearances and dynamics, and only physical interaction can reveal visually hidden properties such as mass or friction. In this work we propose a baseline for the newly defined problem of using physical interactions to discover unknown properties of objects, without prior knowledge of them or any supervision. The agent first uses intrinsically motivated, unsupervised reinforcement learning to learn how to interact with objects, so as to obtain observations informative enough to ease the estimation of physical properties. A self-supervised predictive task is then set up while following the learned behaviour, in order to extract a latent representation of an object's physical properties. When applied to a simulated mobile robot facing varying objects, the proposed baseline identifies and differentiates categorical properties, e.g. shape, and quantifies continuous properties, e.g. mass and friction, with excellent correlation to their true values, even from noisy observations. It achieves significantly better results than the simple interactions produced by a poorly exploring policy. This work provides an implementation of a functional, object-oriented action-perception cycle for embodied robotic agents.
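For context on the method summarized above: a maximum-entropy policy, as referenced in the title, maximizes the expected return augmented by a policy-entropy bonus, J(pi) = E[ sum_t r(s_t, a_t) + alpha * H(pi(.|s_t)) ], which pushes the agent toward diverse, information-rich interactions. The paper's own code is not published on this page; the sketch below is only a minimal illustration of the second stage, the self-supervised predictive task, under assumed architecture choices (a GRU trajectory encoder and an MLP forward model in PyTorch). All class names, dimensions, and hyperparameters are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of a self-supervised predictive task of the kind the
# abstract describes (not the authors' actual model): an encoder compresses an
# interaction trajectory into a latent vector z, and a forward model predicts
# the next observation from (observation, action, z). Minimizing prediction
# error end-to-end forces z to capture whatever object properties (e.g. mass,
# friction, shape) make the dynamics predictable.
import torch
import torch.nn as nn

class PropertyEncoder(nn.Module):
    """GRU over (obs, action) pairs -> latent property vector z."""
    def __init__(self, obs_dim, act_dim, latent_dim=8, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(obs_dim + act_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, obs_seq, act_seq):
        # obs_seq: (B, T, obs_dim), act_seq: (B, T, act_dim)
        _, h = self.rnn(torch.cat([obs_seq, act_seq], dim=-1))
        return self.head(h[-1])  # (B, latent_dim)

class ForwardModel(nn.Module):
    """Predict the next observation from (obs, action, z)."""
    def __init__(self, obs_dim, act_dim, latent_dim=8, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + act_dim + latent_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, obs_dim),
        )

    def forward(self, obs, act, z):
        return self.net(torch.cat([obs, act, z], dim=-1))

def prediction_loss(encoder, model, obs_seq, act_seq):
    # Encode the whole interaction once, then predict every next step
    # conditioned on the same latent z.
    z = encoder(obs_seq, act_seq)                          # (B, latent)
    z_rep = z.unsqueeze(1).expand(-1, obs_seq.size(1) - 1, -1)
    pred = model(obs_seq[:, :-1], act_seq[:, :-1], z_rep)
    return nn.functional.mse_loss(pred, obs_seq[:, 1:])

# Hypothetical usage with random tensors standing in for real trajectories:
obs_dim, act_dim, B, T = 12, 2, 32, 50
enc, fwd = PropertyEncoder(obs_dim, act_dim), ForwardModel(obs_dim, act_dim)
obs, act = torch.randn(B, T, obs_dim), torch.randn(B, T, act_dim)
prediction_loss(enc, fwd, obs, act).backward()
```

In a setup of this shape, the latent z computed for different objects can then be inspected after training: categorical properties such as shape would appear as clusters, while continuous properties such as mass or friction would correlate with directions in the latent space, consistent with the behaviour the abstract reports.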
PUBLICATION RECORD
- Publication year: 2025
- Venue: IEEE Robotics and Automation Letters
- Publication date: 2025-04-01
- Fields of study: Computer Science, Engineering
- Source metadata: Semantic Scholar