Artemis: Articulated Neural Pets with Appearance and Motion Synthesis

02/11/2022
by Haimin Luo, et al.

We humans are entering a virtual era and naturally want to bring animals into the virtual world as companions. Yet computer-generated imagery (CGI) of furry animals is limited by tedious offline rendering, let alone interactive motion control. In this paper, we present ARTEMIS, a novel neural modeling and rendering pipeline for generating ARTiculated neural pets with appEarance and Motion synthesIS. ARTEMIS enables interactive motion control, real-time animation, and photo-realistic rendering of furry animals. At its core is a neural-generated (NGI) animal engine, which adopts an efficient octree-based representation for animal animation and fur rendering. Animation then becomes equivalent to voxel-level deformation based on explicit skeletal warping. We further use a fast octree indexing and efficient volumetric rendering scheme to generate appearance and density feature maps. Finally, we propose a novel shading network that generates high-fidelity details of appearance and opacity under novel poses from the appearance and density feature maps. For the motion control module of ARTEMIS, we combine a state-of-the-art animal motion capture approach with a recent neural character control scheme. We introduce an effective optimization scheme to reconstruct the skeletal motion of real animals captured by a multi-view RGB and Vicon camera array. We feed all the captured motion into a neural character control scheme to generate abstract control signals with motion styles. We further integrate ARTEMIS into existing engines that support VR headsets, providing an unprecedented immersive experience in which a user can intimately interact with a variety of virtual animals exhibiting vivid movements and photo-realistic appearance. We make our ARTEMIS model and dynamic furry animal dataset available at https://haiminluo.github.io/publication/artemis/.
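The abstract describes animation as voxel-level deformation driven by explicit skeletal warping of the octree representation. The sketch below illustrates that general idea, assuming plain linear blend skinning applied to canonical voxel centers; the function name, array shapes, and the use of NumPy are illustrative assumptions, not the paper's actual octree warping or indexing code.

```python
import numpy as np

def warp_voxels(voxel_centers, skin_weights, rest_bone_transforms, posed_bone_transforms):
    """Warp canonical voxel centers to a target pose with linear blend skinning.

    voxel_centers:         (V, 3) canonical-space voxel centers (e.g. octree leaf centers)
    skin_weights:          (V, B) per-voxel skinning weights, each row summing to 1
    rest_bone_transforms:  (B, 4, 4) bone-to-world transforms in the rest pose
    posed_bone_transforms: (B, 4, 4) bone-to-world transforms in the target pose
    """
    V = voxel_centers.shape[0]
    homo = np.concatenate([voxel_centers, np.ones((V, 1))], axis=1)            # (V, 4)

    # Per-bone warp: undo the rest-pose transform, then apply the posed one.
    bone_warps = posed_bone_transforms @ np.linalg.inv(rest_bone_transforms)   # (B, 4, 4)

    # Apply every bone warp to every voxel, then blend with the skinning weights (LBS).
    per_bone = np.einsum('bij,vj->vbi', bone_warps, homo)                      # (V, B, 4)
    warped = np.einsum('vb,vbi->vi', skin_weights, per_bone)                   # (V, 4)
    return warped[:, :3]
```

In a pipeline like the one outlined above, the warped voxels would then be rendered volumetrically into appearance and density feature maps that a shading network decodes into the final image; that stage is not shown here.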


