Dressing Avatars: Deep Photorealistic Appearance for Physically Simulated Clothing

06/30/2022
by Donglai Xiang, et al.

Despite recent progress in developing animatable full-body avatars, realistic modeling of clothing - one of the core aspects of human self-expression - remains an open challenge. State-of-the-art physical simulation methods can generate realistically behaving clothing geometry at interactive rates. Modeling photorealistic appearance, however, usually requires physically-based rendering which is too expensive for interactive applications. On the other hand, data-driven deep appearance models are capable of efficiently producing realistic appearance, but struggle at synthesizing geometry of highly dynamic clothing and handling challenging body-clothing configurations. To this end, we introduce pose-driven avatars with explicit modeling of clothing that exhibit both photorealistic appearance learned from real-world data and realistic clothing dynamics. The key idea is to introduce a neural clothing appearance model that operates on top of explicit geometry: at training time we use high-fidelity tracking, whereas at animation time we rely on physically simulated geometry. Our core contribution is a physically-inspired appearance network, capable of generating photorealistic appearance with view-dependent and dynamic shadowing effects even for unseen body-clothing configurations. We conduct a thorough evaluation of our model and demonstrate diverse animation results on several subjects and different types of clothing. Unlike previous work on photorealistic full-body avatars, our approach can produce much richer dynamics and more realistic deformations even for many examples of loose clothing. We also demonstrate that our formulation naturally allows clothing to be used with avatars of different people while staying fully animatable, thus enabling, for the first time, photorealistic avatars with novel clothing.
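The abstract describes an appearance network that maps explicit clothing geometry and viewpoint to photorealistic, view-dependent color with dynamic shadowing. The paper's actual architecture is not reproduced here; the following is only a minimal NumPy sketch of that idea: a small MLP mapping per-texel geometry and view features (position, normal, view direction, and a scalar ambient-occlusion proxy standing in for shadowing cues) to RGB. All layer sizes, feature choices, and names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights for a small MLP (illustrative stand-in for the learned network)."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def appearance_net(params, feats):
    """Map per-texel geometry/view features to RGB; sigmoid keeps output in [0, 1]."""
    h = feats
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:
            h = np.maximum(h, 0.0)       # ReLU on hidden layers
    return 1.0 / (1.0 + np.exp(-h))      # sigmoid -> RGB in [0, 1]

# Per-texel inputs (10 features total): 3D position, surface normal,
# view direction, and an ambient-occlusion proxy for shadowing.
n_texels = 4096
position = rng.standard_normal((n_texels, 3))
normal = rng.standard_normal((n_texels, 3))
view_dir = rng.standard_normal((n_texels, 3))
ao_proxy = rng.random((n_texels, 1))
feats = np.concatenate([position, normal, view_dir, ao_proxy], axis=1)

params = init_mlp([10, 64, 64, 3], rng)
rgb = appearance_net(params, feats)
print(rgb.shape)  # (4096, 3)
```

Because the network conditions only on geometry and view features, the same sketch applies whether the geometry comes from tracked meshes (training) or from physical simulation (animation), which mirrors the decoupling the abstract describes.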



Related research:

- 03/24/2022: Learning Motion-Dependent Appearance for High-Fidelity Rendering of Dynamic Humans from a Single Camera. "Appearance of dressed humans undergoes a complex geometric transformatio..."
- 06/07/2022: Garment Avatars: Realistic Cloth Driving using Pattern Registration. "Virtual telepresence is the future of online communication. Clothing is ..."
- 06/15/2023: DreamHuman: Animatable 3D Avatars from Text. "We present DreamHuman, a method to generate realistic animatable 3D huma..."
- 05/04/2021: Real-time Deep Dynamic Characters. "We propose a deep videorealistic 3D human character model displaying hig..."
- 04/25/2019: Mechanics-Aware Modeling of Cloth Appearance. "Micro-appearance models have brought unprecedented fidelity and details ..."
- 09/30/2021: HUMBI: A Large Multiview Dataset of Human Body Expressions and Benchmark Challenge. "This paper presents a new large multiview dataset called HUMBI for human..."
- 06/28/2021: Modeling Clothing as a Separate Layer for an Animatable Human Avatar. "We have recently seen great progress in building photorealistic animatab..."
