kPAM: KeyPoint Affordances for Category-Level Robotic Manipulation

03/15/2019
by Lucas Manuelli, et al.

We would like robots to achieve purposeful manipulation by placing any instance from a category of objects into a desired set of goal states. Existing manipulation pipelines typically specify the desired configuration as a target 6-DOF pose and rely on explicitly estimating the pose of the manipulated objects. However, representing an object with a parameterized transformation defined on a fixed template cannot capture large intra-category shape variation, and specifying a target pose at a category level can be physically infeasible or fail to accomplish the task -- e.g., knowing the pose and size of a coffee mug relative to some canonical mug is not sufficient to successfully hang it on a rack by its handle. Hence we propose a novel formulation of category-level manipulation that uses semantic 3D keypoints as the object representation. This keypoint representation enables a simple and interpretable specification of the manipulation target as geometric costs and constraints on the keypoints, which flexibly generalizes existing pose-based manipulation methods. Using this formulation, we factor the manipulation policy into instance segmentation, 3D keypoint detection, optimization-based robot action planning, and local dense-geometry-based action execution. This factorization allows us to leverage advances in these sub-problems and combine them into a general and effective perception-to-action manipulation pipeline. Our pipeline is robust to large intra-category shape variation and topology changes because the keypoint representation ignores task-irrelevant geometric details. Extensive hardware experiments demonstrate that our method can reliably accomplish tasks with never-before-seen objects in a category, such as placing shoes and mugs with significant shape variation into category-level target configurations.
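To make the target specification concrete, here is a minimal sketch (not the authors' code) of the core idea: the manipulation target is expressed as geometric costs and constraints on detected 3D keypoints, and an object transform is found by optimization. The keypoint coordinates, the peg position, and the constraint threshold below are illustrative assumptions, and SciPy is used only as a convenient off-the-shelf solver.

# Sketch of a kPAM-style target specification: costs/constraints on semantic
# 3D keypoints, solved for a rigid transform. All numbers are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation as R

# Detected keypoints for one mug instance, in the robot frame (assumed values).
keypoints = {
    "bottom_center": np.array([0.60, 0.00, 0.02]),
    "top_center":    np.array([0.60, 0.00, 0.12]),
    "handle_center": np.array([0.65, 0.00, 0.07]),
}

# Hypothetical target: the handle keypoint should land on the rack peg,
# while the mug axis (bottom -> top) stays roughly horizontal.
peg_position = np.array([0.40, 0.30, 0.25])

def apply_transform(x, p):
    # x = [axis-angle rotation (3), translation (3)]
    return R.from_rotvec(x[:3]).apply(p) + x[3:]

def cost(x):
    # Quadratic cost: transformed handle keypoint close to the peg.
    handle = apply_transform(x, keypoints["handle_center"])
    return float(np.sum((handle - peg_position) ** 2))

def axis_constraint(x):
    # Inequality constraint (>= 0): mug axis nearly horizontal,
    # i.e. its vertical component stays below an assumed threshold of 0.1.
    axis = apply_transform(x, keypoints["top_center"]) - \
           apply_transform(x, keypoints["bottom_center"])
    axis = axis / np.linalg.norm(axis)
    return 0.1 - abs(axis[2])

x0 = np.zeros(6)  # start from the identity transform
result = minimize(cost, x0,
                  constraints=[{"type": "ineq", "fun": axis_constraint}])
print("transform parameters:", result.x)
print("residual cost:", result.fun)

Because the target is written only in terms of task-relevant keypoints (handle, top, bottom), the same specification applies to any mug instance whose keypoints can be detected, regardless of its detailed shape.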


Related research

09/16/2019 · kPAM-SC: Generalizable Manipulation Planning using KeyPoint Affordance and Shape Completion
Manipulation planning is the task of computing robot trajectories that m...

02/11/2021 · kPAM 2.0: Feedback Control for Category-Level Robotic Manipulation
In this paper, we explore generalizable, perception-to-action robotic ma...

01/30/2022 · You Only Demonstrate Once: Category-Level Manipulation from Single Visual Demonstration
Promising results have been achieved recently in category-level manipula...

10/16/2020 · Manipulation-Oriented Object Perception in Clutter through Affordance Coordinate Frames
In order to enable robust operation in unstructured environments, robots...

06/24/2022 · Optimal and Robust Category-level Perception: Object Pose and Shape Estimation from 2D and 3D Semantic Keypoints
We consider a category-level perception problem, where one is given 2D o...

10/18/2021 · Keypoint-Based Bimanual Shaping of Deformable Linear Objects under Environmental Constraints using Hierarchical Action Planning
This paper addresses the problem of contact-based manipulation of deform...

03/25/2018 · StarMap for Category-Agnostic Keypoint and Viewpoint Estimation
Semantic keypoints provide concise abstractions for a variety of visual ...
