Synthesizing Physical Character-Scene Interactions

02/02/2023
by Mohamed Hassan, et al.

Movement is how people interact with and affect their environment. For realistic character animation, it is necessary to synthesize such interactions between virtual characters and their surroundings. Despite recent progress in character animation using machine learning, most systems focus on controlling an agent's movements in fairly simple and homogeneous environments, with limited interactions with other objects. Furthermore, many previous approaches that synthesize human-scene interactions require significant manual labeling of the training data. In contrast, we present a system that uses adversarial imitation learning and reinforcement learning to train physically-simulated characters that perform scene interaction tasks in a natural and life-like manner. Our method learns scene interaction behaviors from large unstructured motion datasets, without manual annotation of the motion data. These scene interactions are learned using an adversarial discriminator that evaluates the realism of a motion within the context of a scene. The key novelty involves conditioning both the discriminator and the policy networks on scene context. We demonstrate the effectiveness of our approach through three challenging scene interaction tasks: carrying, sitting, and lying down, which require coordination of a character's movements in relation to objects in the environment. Our policies learn to seamlessly transition between different behaviors like idling, walking, and sitting. By randomizing the properties of the objects and their placements during training, our method is able to generalize beyond the objects and scenarios depicted in the training dataset, producing natural character-scene interactions for a wide variety of object shapes and placements. The approach takes physics-based character motion generation a step closer to broad applicability.
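To make the key idea concrete: the scene-conditioned discriminator can be thought of as a standard adversarial-imitation discriminator whose input is augmented with a scene feature, and the policy receives a style reward derived from its score. The sketch below is a minimal PyTorch illustration under stated assumptions only; the class name, network sizes, scene encoding, and the GAIL-style reward form are hypothetical choices for exposition, not the paper's exact architecture or reward.

```python
import torch
import torch.nn as nn

class SceneConditionedDiscriminator(nn.Module):
    """Scores the realism of a state transition, conditioned on scene context.

    Hypothetical sketch: input layout and sizes are assumptions, not the
    authors' published architecture.
    """
    def __init__(self, state_dim: int, scene_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            # Transition (state, next_state) concatenated with a scene feature.
            nn.Linear(2 * state_dim + scene_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # scalar realism score
        )

    def forward(self, state, next_state, scene_context):
        x = torch.cat([state, next_state, scene_context], dim=-1)
        return self.net(x)

def style_reward(disc, state, next_state, scene_context):
    """GAIL-style reward: transitions the discriminator finds realistic
    (in the given scene) earn higher reward for the RL policy.

    This reward form is a common adversarial-imitation choice, assumed here
    for illustration rather than taken from the paper.
    """
    with torch.no_grad():
        score = disc(state, next_state, scene_context)
    return -torch.log(1.0 - torch.sigmoid(score) + 1e-6)
```

Conditioning on `scene_context` (for example, an encoding of the target object's shape and placement) is what lets the discriminator judge whether a motion is realistic for that particular object, rather than merely realistic in isolation.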
