CLIP-based Neural Neighbor Style Transfer for 3D Assets

08/08/2022
by Shailesh Mishra et al.

We present a method for transferring the style of a set of images to a 3D object. The texture appearance of an asset is optimized with a differentiable renderer in a pipeline whose losses are computed from pretrained deep neural networks. More specifically, we use a nearest-neighbor feature matching loss on CLIP-ResNet50 features to extract the style from images. We show that this CLIP-based style loss yields a different appearance than a VGG-based loss, emphasizing texture over geometric shapes. Additionally, we extend the loss to support multiple style images and add loss-based control over the color palette, combined with automatic color palette extraction from the style images.
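The core of such a pipeline is the nearest-neighbor feature matching loss: each feature of the rendered asset is matched to its most similar style feature, and the loss pulls matched pairs together. The following is a minimal sketch of that matching step, not the authors' implementation; it assumes feature vectors have already been extracted (e.g., from a CLIP-RN50 image encoder) and flattened to one vector per spatial location, and it uses cosine distance between each rendered feature and its nearest style feature.

```python
import numpy as np

def nn_feature_match_loss(render_feats: np.ndarray, style_feats: np.ndarray) -> float:
    """Nearest-neighbor feature matching loss (sketch).

    render_feats: (N, C) features from the rendered 3D asset
    style_feats:  (M, C) features pooled from the style image(s);
                  concatenating features from several images extends
                  the loss to multiple style inputs.
    Returns the mean cosine distance between each rendered feature
    and its nearest (most cosine-similar) style feature.
    """
    # L2-normalize so dot products are cosine similarities
    r = render_feats / np.linalg.norm(render_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)

    sim = r @ s.T                 # (N, M) cosine similarity matrix
    nn_idx = sim.argmax(axis=1)   # index of nearest style feature per render feature
    matched = s[nn_idx]           # (N, C) matched style features

    # cosine distance = 1 - cosine similarity, averaged over all locations
    return float(np.mean(1.0 - np.sum(r * matched, axis=1)))
```

In an optimization loop this scalar would be backpropagated through the feature extractor and the differentiable renderer to update the asset's texture; here it only illustrates the matching itself.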


Related research

- 04/28/2022: An Overview of Color Transfer and Style Transfer for Images and Videos
  "Image or video appearance features (e.g., color, texture, tone, illumina..."
- 12/30/2015: Improving Style Similarity Metrics of 3D Shapes
  "The idea of style similarity metrics has been recently developed for var..."
- 05/02/2017: Visual Attribute Transfer through Deep Image Analogy
  "We propose a new technique for visual attribute transfer across images t..."
- 12/18/2019: Neural Smoke Stylization with Color Transfer
  "Artistically controlling fluid simulations requires a large amount of ma..."
- 06/13/2022: ARF: Artistic Radiance Fields
  "We present a method for transferring the artistic features of an arbitra..."
- 03/14/2019: Superpixel-based Color Transfer
  "In this work, we propose a fast superpixel-based color transfer method (..."
- 09/03/2023: S2RF: Semantically Stylized Radiance Fields
  "We present our method for transferring style from any arbitrary image(s)..."
