Predicting Motion Plans for Articulating Everyday Objects

03/02/2023
by   Arjun Gupta, et al.

Mobile manipulation tasks such as opening a door, pulling open a drawer, or lifting a toilet lid require constrained motion of the end-effector under environmental and task constraints. This, coupled with partial information in novel environments, makes it challenging to employ classical motion planning approaches at test time. Our key insight is to cast motion planning as a learning problem, leveraging past experience of solving similar planning problems to directly predict motion plans for mobile manipulation tasks in novel situations at test time. To enable this, we develop a simulator, ArtObjSim, that simulates articulated objects placed in real scenes. We then introduce SeqIK+θ_0, a fast and flexible representation for motion plans. Finally, we learn models that use SeqIK+θ_0 to quickly predict motion plans for articulating novel objects at test time. Experimental evaluation shows improved speed and accuracy at generating motion plans compared to both pure search-based methods and pure learning-based methods.

