Unpaired Motion Style Transfer from Video to Animation

05/12/2020
by Kfir Aberman, et al.

Transferring the motion style from one animation clip to another, while preserving the motion content of the latter, has been a long-standing problem in character animation. Most existing data-driven approaches are supervised and rely on paired data, where motions with the same content are performed in different styles. In addition, these approaches are limited to transferring styles that were seen during training. In this paper, we present a novel data-driven framework for motion style transfer, which learns from an unpaired collection of motions with style labels, and enables transferring motion styles not observed during training. Furthermore, our framework is able to extract motion styles directly from videos, bypassing 3D reconstruction, and apply them to the 3D input motion. Our style transfer network encodes motions into two latent codes, one for content and one for style, each of which plays a different role in the decoding (synthesis) process. While the content code is decoded into the output motion by several temporal convolutional layers, the style code modifies deep features via temporally invariant adaptive instance normalization (AdaIN). Moreover, while the content code is encoded from 3D joint rotations, we learn a common embedding for style from either 3D or 2D joint positions, enabling style extraction from videos. Our results are comparable to the state of the art, despite not requiring paired training data, and outperform other methods when transferring previously unseen styles. To our knowledge, we are the first to demonstrate style transfer directly from videos to 3D animations, an ability which enables one to extend the set of style examples far beyond motions captured by MoCap systems.
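The temporally invariant AdaIN modulation mentioned above can be sketched in a few lines. This is a minimal NumPy illustration of the general technique, not the authors' implementation: the `adain` function name and the per-channel `style_gamma`/`style_beta` parameters (which in the paper's framework would be produced from the style code) are assumptions for illustration.

```python
import numpy as np

def adain(content_feat, style_gamma, style_beta, eps=1e-5):
    """Temporally invariant adaptive instance normalization (sketch).

    content_feat: (channels, frames) deep features, e.g. from a temporal
    convolutional encoder.
    style_gamma, style_beta: (channels,) scale and shift derived from the
    style code; constant across all frames of the clip.
    """
    mean = content_feat.mean(axis=1, keepdims=True)   # per-channel temporal mean
    std = content_feat.std(axis=1, keepdims=True)     # per-channel temporal std
    normalized = (content_feat - mean) / (std + eps)  # whiten along the time axis
    # Re-style: one (gamma, beta) pair per channel for the whole clip,
    # so the style modulation itself carries no temporal variation.
    return style_gamma[:, None] * normalized + style_beta[:, None]
```

Because the statistics are computed over the time axis and the style parameters are frame-independent, the content features keep their temporal dynamics while their per-channel first- and second-order statistics are replaced by those dictated by the style.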


Related Research

03/04/2022 · Style-ERD: Responsive and Coherent Online Motion Style Transfer
Motion style transfer is a common method for enriching character animati...

12/23/2019 · One-Shot Imitation Filming of Human Motion Videos
Imitation learning has been applied to mimic the operation of a human ca...

01/12/2022 · Real-Time Style Modelling of Human Locomotion via Feature-Wise Transformations and Local Motion Phases
Controlling the manner in which a character moves in a real-time animati...

12/16/2022 · Unifying Human Motion Synthesis and Style Transfer with Denoising Diffusion Probabilistic Models
Generating realistic motions for digital humans is a core but challengin...

05/30/2023 · HuMoT: Human Motion Representation using Topology-Agnostic Transformers for Character Animation Retargeting
Motion retargeting is the long-standing problem in character animation t...

08/11/2023 · Semantics2Hands: Transferring Hand Motion Semantics between Avatars
Human hands, the primary means of non-verbal communication, convey intri...

02/10/2022 · Motion Puzzle: Arbitrary Motion Style Transfer by Body Part
This paper presents Motion Puzzle, a novel motion style transfer network...
