OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction

03/29/2022
by   Lixin Yang, et al.
Learning how humans manipulate objects requires machines to acquire knowledge from two perspectives: one for understanding object affordances and the other for learning human interactions based on those affordances. Although both knowledge bases are crucial, we find that current databases lack comprehensive awareness of either. In this work, we propose OakInk, a multi-modal, richly-annotated knowledge repository for the visual and cognitive understanding of hand-object interactions. We first collect 1,800 common household objects and annotate their affordances to construct the first knowledge base: Oak. Given the affordances, we record rich human interactions with 100 selected objects in Oak. Finally, we transfer the interactions on the 100 recorded objects to their virtual counterparts through a novel method, Tink. The recorded and transferred hand-object interactions together constitute the second knowledge base: Ink. In total, OakInk contains 50,000 distinct affordance-aware and intent-oriented hand-object interactions. We benchmark OakInk on pose estimation and grasp generation tasks, and further propose two practical applications of OakInk: intent-based interaction generation and handover generation. Our datasets and source code are publicly available at https://github.com/lixiny/OakInk.
