Dance with You: The Diversity Controllable Dancer Generation via Diffusion Models

08/23/2023
by   Siyue Yao, et al.
Recently, digital humans for interpersonal interaction in virtual environments have gained significant attention. In this paper, we introduce a novel multi-dancer synthesis task, partner dancer generation, which involves synthesizing virtual human dancers capable of dancing with users. The task aims to control the pose diversity between the lead dancer and the partner dancer; its core is to ensure controllable diversity in the generated partner dancer while maintaining temporal coordination with the lead dancer. This setting differs from earlier research on music-driven dance generation, as our emphasis is on automatically designing the partner dancer's postures according to a pre-defined diversity, the lead dancer's poses, and the accompanying music. To achieve this objective, we propose a three-stage framework called Dance-with-You (DanY). First, a 3D Pose Collection stage gathers a wide range of basic dance poses as references for motion generation. Then, we introduce a hyper-parameter that coordinates the similarity between dancers by masking poses, preventing the generation of sequences that are overly diverse or overly consistent. To avoid rigid movements, a Dance Pre-generated stage pre-generates these masked poses instead of filling them with zeros. Finally, a Dance Motion Transfer stage is adopted with the leader sequences and music, in which a multi-conditional sampling formula is rewritten to transfer the pre-generated poses into a sequence with a partner style. To address the lack of multi-person datasets, we also introduce AIST-M, a new, publicly available dataset for partner dancer generation. Comprehensive evaluations on AIST-M demonstrate that the proposed DanY synthesizes satisfactory partner dancer results with controllable diversity.
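The diversity hyper-parameter described above can be pictured as controlling what fraction of the lead dancer's frames are masked out for re-generation: unmasked frames keep the partner close to the lead, while masked frames leave room for new poses. The abstract does not give the exact masking rule, so the sketch below is a hypothetical illustration (the function name `diversity_mask`, the random frame selection, and the pose tensor shape are all assumptions, not the paper's method):

```python
import numpy as np

def diversity_mask(lead_poses, diversity, seed=0):
    """Select a fraction of frames, proportional to the desired
    diversity, to be re-generated for the partner dancer.

    lead_poses: array of shape (T, J, 3) - T frames, J joints.
    diversity:  float in [0, 1]; 0 copies the lead, 1 masks everything.
    Returns the partner initialization and a boolean frame mask.
    """
    rng = np.random.default_rng(seed)
    T = lead_poses.shape[0]
    n_masked = int(round(diversity * T))
    idx = rng.choice(T, size=n_masked, replace=False)
    mask = np.zeros(T, dtype=bool)
    mask[idx] = True
    # Unmasked frames start from the lead's poses; masked frames would
    # be pre-generated rather than zero-filled (DanY's Dance
    # Pre-generated stage) before the final Dance Motion Transfer.
    partner_init = lead_poses.copy()
    return partner_init, mask
```

In this picture, `diversity=0` would yield a partner identical to the lead and `diversity=1` a fully re-generated sequence, with intermediate values trading off similarity against novelty.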

Related research:
07/08/2022
Music-driven Dance Regeneration with Controllable Key Pose Constraints
In this paper, we propose a novel framework for music-driven dance motio...

11/23/2021
Rhythm is a Dancer: Music-Driven Motion Synthesis with Global Structure
Synthesizing human motion with a global structure, such as a choreograph...

06/01/2023
Controllable Motion Diffusion Model
Generating realistic and controllable motions for virtual characters is ...

04/30/2021
Dance Generation with Style Embedding: Learning and Transferring Latent Representations of Dance Styles
Choreography refers to creation of dance steps and motions for dances ac...

11/05/2019
Dancing to Music
Dancing to music is an instinctive move by humans. Learning to model the...

09/16/2020
ChoreoNet: Towards Music to Dance Synthesis with Choreographic Action Unit
Dance and music are two highly correlated artistic forms. Synthesizing d...

10/21/2021
MUGL: Large Scale Multi Person Conditional Action Generation with Locomotion
We introduce MUGL, a novel deep neural model for large-scale, diverse ge...
