Controlling Material Appearance by Examples

06/30/2022
by Yiwei Hu, et al.

Despite the ubiquity of material maps in modern rendering pipelines, editing and controlling them remains a challenge. In this paper, we present an example-based material control method that augments input material maps based on user-provided material photos. We train a tileable version of MaterialGAN and leverage its material prior to guide the appearance transfer, optimizing its latent space using differentiable rendering. Our method transfers the micro- and meso-structure textures of the user-provided target photograph(s) while preserving the structure and quality of the input material. We show that our method can control existing material maps, increasing realism or generating new, visually appealing materials.
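The abstract's core mechanism, optimizing a generator's latent code through a differentiable renderer so the rendered output matches a target photo, can be sketched in miniature. This is not the paper's implementation: the linear `generate` (a stand-in for MaterialGAN), the fixed-light Lambertian `render`, and all names (`G`, `z_true`, `LATENT_DIM`) are invented for illustration, and the gradient is written out analytically instead of using an autodiff framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a fixed linear "generator" mapping a latent code z
# to a flat albedo map, and a constant-light Lambertian "renderer".
LATENT_DIM, N_PIXELS = 8, 64
G = rng.normal(size=(N_PIXELS, LATENT_DIM)) / np.sqrt(LATENT_DIM)
light = 0.9  # fixed cosine shading term for one light/view configuration

def generate(z):
    """Map a latent code to an albedo map (stand-in for MaterialGAN maps)."""
    return G @ z

def render(albedo):
    """Differentiable shading: Lambertian response under the fixed light."""
    return light * albedo

# "Target photo": the rendering of an unknown latent code we try to recover.
z_true = rng.normal(size=LATENT_DIM)
target = render(generate(z_true))

# Latent-space optimization: gradient descent on the rendering (MSE) loss,
# back-propagated by hand through render and generate via the chain rule.
z = np.zeros(LATENT_DIM)
lr = 1.0
for _ in range(500):
    residual = render(generate(z)) - target
    grad_z = (2 / N_PIXELS) * (G.T @ (light * residual))
    z -= lr * grad_z

loss = np.mean((render(generate(z)) - target) ** 2)
```

Because the toy pipeline is linear, gradient descent recovers `z_true` essentially exactly; in the actual method the generator and renderer are nonlinear, so the same loop relies on automatic differentiation and the GAN's material prior rather than on a closed-form gradient.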


Related research

TileGen: Tileable, Controllable Material Generation and Capture (06/12/2022)
Recent methods (e.g. MaterialGAN) have used unconditional GANs to genera...

PhotoMat: A Material Generator Learned from Single Flash Photos (05/20/2023)
Authoring high-quality digital materials is key to realism in 3D renderi...

An Inverse Procedural Modeling Pipeline for SVBRDF Maps (09/14/2021)
Procedural modeling is now the de facto standard of material modeling in...

Node Graph Optimization Using Differentiable Proxies (07/15/2022)
Graph-based procedural materials are ubiquitous in content production in...

Progressive Material Caching (05/12/2023)
The evaluation of material networks is a relatively resource-intensive p...

NeuMIP: Multi-Resolution Neural Materials (04/06/2021)
We propose NeuMIP, a neural method for representing and rendering a vari...

Generative Modelling of BRDF Textures from Flash Images (02/23/2021)
We learn a latent space for easy capture, semantic editing, consistent i...
