Improving the Latent Space of Image Style Transfer

05/24/2022
by Yunpeng Bai, et al.

Existing neural style transfer research matches statistical information between the deep features of content and style images extracted by a pre-trained VGG, and has achieved significant improvements in synthesizing artistic images. However, in some cases the feature statistics from the pre-trained encoder are not consistent with the visual style we perceive: for example, the feature-space style distance between images of different styles can be smaller than the distance between images of the same style. In such an inappropriate latent space, the objective functions of existing methods are optimized in the wrong direction, producing poor stylization results. In addition, the lack of content detail in the features extracted by the pre-trained encoder leads to the content-leak problem. To address these issues in the latent space used for style transfer, we propose two contrastive training schemes that yield a refined encoder better suited to this task. The style contrastive loss pulls the stylized result closer to images of the same visual style and pushes it away from the content image. The content contrastive loss enables the encoder to retain more usable content details. Our training scheme can be added directly to several existing style transfer methods and significantly improves their results. Extensive experimental results demonstrate the effectiveness and superiority of our method.
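The abstract does not give the exact formulation of the two losses. The sketch below is a minimal, illustrative PyTorch version of a style contrastive loss of the kind described: the stylized result's features are pulled toward the style image (positive) and pushed away from the content image (negative). The function name, use of cosine similarity, single-negative InfoNCE form, and temperature value are assumptions for illustration, not the paper's released implementation.

# Hypothetical sketch of the style contrastive loss (InfoNCE-style, PyTorch).
# Feature extraction, projection, and the temperature are illustrative assumptions.
import torch
import torch.nn.functional as F

def style_contrastive_loss(stylized_feat, style_feat, content_feat, temperature=0.1):
    """Pull stylized features toward the style image (positive) and
    push them away from the content image (negative)."""
    # L2-normalize so the dot product is a cosine similarity.
    z_s = F.normalize(stylized_feat, dim=-1)
    z_pos = F.normalize(style_feat, dim=-1)
    z_neg = F.normalize(content_feat, dim=-1)

    pos_sim = (z_s * z_pos).sum(dim=-1) / temperature  # similarity to style image
    neg_sim = (z_s * z_neg).sum(dim=-1) / temperature  # similarity to content image

    # InfoNCE with one negative per sample: the positive sits at index 0.
    logits = torch.stack([pos_sim, neg_sim], dim=1)
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)

# Example usage with random vectors standing in for encoder features.
if __name__ == "__main__":
    batch, dim = 8, 512
    stylized = torch.randn(batch, dim)
    style = torch.randn(batch, dim)
    content = torch.randn(batch, dim)
    print(style_contrastive_loss(stylized, style, content))

A content contrastive loss could be sketched analogously by swapping the roles of the positive and negative features, with the content image's features serving as the positive.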
