The secret of immersion: actor driven camera movement generation for auto-cinematography

03/29/2023
by Xinyi Wu, et al.

Immersion plays a vital role in the design of cinematic creations, yet the difficulty of immersive shooting prevents designers from producing satisfactory outputs. In this work, we analyze the specific components that contribute to cinematographic immersion at the spatial, emotional, and aesthetic levels, and combine these components into a high-level evaluation mechanism. Guided by this immersion mechanism, we propose a GAN-based camera control system that generates actor-driven camera movements in a 3D virtual environment to obtain immersive film sequences. The encoder-decoder architecture in the generation flow transfers character motion into a camera trajectory conditioned on an emotion factor, ensuring spatial and emotional immersion by synchronizing actor and camera both physically and psychologically. Emotional immersion is further strengthened by a regularization term that controls camera shakiness to express different mental states. To achieve aesthetic immersion, we improve the aesthetic composition of frames by modifying the synthesized camera trajectory: a self-supervised adjustor shifts camera placements so that the character is projected onto appropriate on-frame locations following aesthetic rules. Experimental results indicate, both quantitatively and qualitatively, that the proposed camera control system efficiently produces immersive cinematic videos through fine-grained immersive shooting. Live examples are shown in the supplementary video.
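As a concrete illustration of the generation flow described above, the sketch below shows how an encoder-decoder model might map a character motion sequence to a per-frame camera trajectory conditioned on an emotion factor. This is a minimal, hypothetical sketch and not the authors' implementation: the GRU backbone, the dimensions (63-D motion, 8-D emotion, 6-D camera pose), and the additive fusion of the emotion embedding are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code): encoder-decoder mapping a character
# motion sequence to a camera trajectory, conditioned on an emotion factor.
import torch
import torch.nn as nn

class MotionToCameraGenerator(nn.Module):
    def __init__(self, motion_dim=63, emotion_dim=8, hidden_dim=256, camera_dim=6):
        super().__init__()
        # Encoder: summarize the actor's motion sequence (e.g., joint positions per frame).
        self.encoder = nn.GRU(motion_dim, hidden_dim, batch_first=True)
        # Embed the emotion factor so it can be fused with the motion encoding.
        self.emotion_fc = nn.Linear(emotion_dim, hidden_dim)
        # Decoder: emit one camera pose (position + orientation) per frame.
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, camera_dim)

    def forward(self, motion, emotion):
        # motion: (batch, frames, motion_dim); emotion: (batch, emotion_dim)
        enc, _ = self.encoder(motion)                 # (batch, frames, hidden)
        cond = self.emotion_fc(emotion).unsqueeze(1)  # (batch, 1, hidden)
        dec, _ = self.decoder(enc + cond)             # condition broadcast over time
        return self.head(dec)                         # (batch, frames, camera_dim)

# Usage: 60 frames of actor motion and one emotion vector -> 60 camera poses.
motion = torch.randn(1, 60, 63)
emotion = torch.zeros(1, 8)
emotion[0, 2] = 1.0
trajectory = MotionToCameraGenerator()(motion, emotion)  # (1, 60, 6)
```

In the paper's framework, the synthesized trajectory would additionally be regularized for camera shakiness and refined by the self-supervised aesthetic adjustor; neither step is shown in this sketch.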
