Category-Independent Articulated Object Tracking with Factor Graphs

05/07/2022
by   Nick Heppert, et al.

Robots deployed in human-centric environments may need to manipulate a diverse range of articulated objects, such as doors, dishwashers, and cabinets. Articulated objects often come with unexpected articulation mechanisms that are inconsistent with categorical priors: for example, a drawer might rotate about a hinge joint instead of sliding open. We propose a category-independent framework for predicting the articulation models of unknown objects from sequences of RGB-D images. The prediction is performed in two steps: first, a visual perception module tracks object part poses from raw images, and second, a factor graph takes these poses and infers the articulation model, including the current configuration between the parts, as a 6D twist. We also propose a manipulation-oriented metric that evaluates predicted joint twists in terms of how well a compliant robot controller could manipulate the articulated object given the predicted twist. We demonstrate that our visual perception and factor graph modules outperform baselines on simulated data and show the applicability of our factor graph on real-world data.
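The central quantity in the abstract is the 6D twist that parameterizes the relative motion between two object parts. The sketch below is not code from the paper; it is a minimal illustration, under the standard screw-motion convention, of how a unit twist (v, w) together with a scalar configuration q maps to the relative SE(3) pose between two parts, covering both revolute and prismatic joints. All names and the example values are illustrative.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def twist_exp(xi, q):
    """SE(3) exponential of a unit 6D twist xi = (v, w) scaled by configuration q.

    For a revolute joint, w is the unit rotation axis and q the joint angle;
    for a prismatic joint, w = 0, v is the unit translation axis, and q the
    displacement. Returns the 4x4 homogeneous transform between the two parts.
    """
    v, w = xi[:3], xi[3:]
    T = np.eye(4)
    if np.allclose(w, 0):                      # prismatic joint
        T[:3, 3] = v * q
        return T
    W = skew(w)
    theta = q
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * (W @ W)  # Rodrigues
    G = (np.eye(3) * theta + (1 - np.cos(theta)) * W
         + (theta - np.sin(theta)) * (W @ W))
    T[:3, :3] = R
    T[:3, 3] = G @ v
    return T

# Example: a "drawer" that unexpectedly rotates about a hinge (revolute twist).
hinge_axis = np.array([0.0, 0.0, 1.0])         # unit axis direction
point_on_axis = np.array([0.4, 0.0, 0.0])      # a point the axis passes through
xi = np.concatenate([-np.cross(hinge_axis, point_on_axis), hinge_axis])
print(twist_exp(xi, np.deg2rad(30)))           # relative part pose at 30 degrees
```

A predicted twist in this form is directly usable by a compliant controller, which can servo along the instantaneous motion direction the twist prescribes at the current configuration; this is what motivates evaluating predictions with a manipulation-oriented metric rather than raw parameter error alone.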

