In-the-wild Material Appearance Editing using Perceptual Attributes

02/07/2023
by J. Daniel Subias, et al.

Intuitively editing the appearance of materials from a single image is a challenging task, given the complexity of the interactions between light and matter and the ambiguity of human perception. This problem has traditionally been addressed by estimating additional factors of the scene, such as geometry or illumination, thus solving an inverse rendering problem and tying the final quality of the results to the quality of these estimations. We present a single-image appearance editing framework that allows us to intuitively modify the material appearance of an object by increasing or decreasing high-level perceptual attributes describing that appearance (e.g., glossy or metallic). Our framework takes as input an in-the-wild image of a single object, where geometry, material, and illumination are not controlled, and inverse rendering is not required. We rely on generative models and devise a novel architecture with Selective Transfer Unit (STU) cells that allow us to preserve the high-frequency details of the input image in the edited one. To train our framework, we leverage a dataset of paired synthetic images rendered with physically-based algorithms and the corresponding crowd-sourced ratings of high-level perceptual attributes. We show that our material editing framework outperforms the state of the art, and we showcase its applicability on synthetic images, in-the-wild real-world photographs, and video sequences.
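The abstract describes an editing interface: the user nudges a high-level perceptual attribute (e.g., "glossy") up or down, a generative model decodes an edited image conditioned on that shift, and gated skip connections (the STU cells) re-inject high-frequency detail from the input. The toy sketch below illustrates only that interface and the gating idea with NumPy stand-ins; the encoder, decoder, attribute direction, and gate weights here are all hypothetical placeholders, not the authors' trained architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def stu_gate(skip, decoded, gate_w):
    """Toy 'selective transfer' blend: a per-channel sigmoid gate mixes
    high-frequency skip features with decoded features. Illustrative
    stand-in for the paper's STU cells, not their actual formulation."""
    g = 1.0 / (1.0 + np.exp(-gate_w))          # gate values in (0, 1)
    return g * skip + (1.0 - g) * decoded

def edit_appearance(image, attr_delta, weights):
    """Hypothetical editing interface: shift one perceptual attribute
    (e.g., 'glossy') by attr_delta in [-1, 1], decode, then gate in
    detail from the input image."""
    latent = image.mean(axis=(0, 1))           # stand-in "encoder": channel means
    conditioned = latent + attr_delta * weights["attr_dir"]
    decoded = np.broadcast_to(conditioned, image.shape)  # stand-in "decoder"
    return stu_gate(image, decoded, weights["gate"])

# Example usage with placeholder weights.
img = rng.random((8, 8, 3))
weights = {"attr_dir": np.array([0.2, 0.1, -0.05]),   # hypothetical attribute axis
           "gate": np.zeros(3)}                        # gate = 0.5 everywhere
edited = edit_appearance(img, attr_delta=0.8, weights=weights)
```

The key design point the sketch mirrors is that a pure encode-decode path loses texture and specular detail; gating the input features back in (as the STU cells do in the real architecture) lets the edit change the attribute while the blend preserves high-frequency content.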


Related research

05/19/2022 — Physically-Based Editing of Indoor Scene Lighting from a Single Image
We present a method to edit complex indoor lighting from a single image ...

06/13/2018 — An Intuitive Control Space for Material Appearance
Many different techniques for measuring material appearance have been pr...

04/01/2021 — PhySG: Inverse Rendering with Spherical Gaussians for Physics-based Material Editing and Relighting
We present PhySG, an end-to-end inverse rendering pipeline that includes...

01/08/2019 — Neural Inverse Rendering of an Indoor Scene from a Single Image
Inverse rendering aims to estimate physical scene attributes (e.g., refl...

12/25/2019 — Inverse Rendering Techniques for Physically Grounded Image Editing
From a single picture of a scene, people can typically grasp the spatial...

09/30/2020 — MaterialGAN: Reflectance Capture using a Generative SVBRDF Model
We address the problem of reconstructing spatially-varying BRDFs from a ...

03/24/2023 — WildLight: In-the-wild Inverse Rendering with a Flashlight
This paper proposes a practical photometric solution for the challenging...
