Semantics2Hands: Transferring Hand Motion Semantics between Avatars

08/11/2023
by Zijie Ye, et al.

Human hands are a primary means of non-verbal communication and convey intricate semantics in a wide range of scenarios. Because people are highly sensitive to hand motions, even minor errors can noticeably degrade the user experience. Real applications often involve multiple avatars with differing hand shapes, which makes it important to preserve the intricate semantics of hand motions across avatars. This paper therefore aims to transfer hand motion semantics between diverse avatars based on their respective hand models. To address this problem, we introduce a novel anatomy-based semantic matrix (ASM) that encodes the semantics of hand motions: the ASM quantifies the positions of the palm and the other joints relative to the local frame of the corresponding joint, enabling precise retargeting of hand motions. We then obtain a mapping function from the source ASM to the target hand joint rotations with an anatomy-based semantics reconstruction network (ASRN), trained with a semi-supervised learning strategy on the Mixamo and InterHand2.6M datasets. We evaluate our method on intra-domain and cross-domain hand motion retargeting tasks; the qualitative and quantitative results show that our ASRN clearly outperforms state-of-the-art methods.
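The abstract only sketches how the ASM is built, so the snippet below is a minimal, hypothetical illustration of the stated idea: express the position of the palm and every other joint in each joint's local coordinate frame. The function name anatomy_semantic_matrix, the 21-joint hand layout, and the use of global rotation matrices as local frames are assumptions made for illustration; the paper's actual ASM construction and the learned ASRN mapping to target joint rotations may differ.

```python
import numpy as np

def anatomy_semantic_matrix(joint_pos, joint_rot):
    """Sketch of an ASM-style encoding (assumption, not the paper's exact recipe).

    joint_pos: (J, 3) global joint positions of one hand pose (palm included).
    joint_rot: (J, 3, 3) global rotation matrices defining each joint's local frame.
    Returns an (J, J, 3) array whose entry [i, j] is joint j's position
    expressed in joint i's local frame.
    """
    J = joint_pos.shape[0]
    asm = np.zeros((J, J, 3))
    for i in range(J):
        offsets = joint_pos - joint_pos[i]   # world-frame offsets to every joint
        asm[i] = offsets @ joint_rot[i]      # rotate offsets into joint i's local frame (R_i^T v)
    return asm

# Toy usage with random data and identity frames (21 joints, hypothetical layout).
pos = np.random.randn(21, 3)
rot = np.tile(np.eye(3), (21, 1, 1))
asm = anatomy_semantic_matrix(pos, rot)      # shape (21, 21, 3)
```

Encoding joint positions in per-joint local frames, rather than as raw joint angles, is what makes the representation comparable across hands of different shapes; a network such as the ASRN can then map this shape-agnostic encoding back to rotations for a specific target hand model.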
