Domain Enhanced Arbitrary Image Style Transfer via Contrastive Learning

05/19/2022
by Yuxin Zhang, et al.

In this work, we tackle the challenging problem of arbitrary image style transfer using a novel style feature representation learning method. A suitable style representation, as a key component of image stylization tasks, is essential to achieving satisfactory results. Existing deep neural network based approaches achieve reasonable results with guidance from second-order statistics, such as the Gram matrix of content features. However, they do not leverage sufficient style information, which results in artifacts such as local distortions and style inconsistency. To address these issues, we propose to learn style representations directly from image features rather than from their second-order statistics, by analyzing the similarities and differences between multiple styles and considering the style distribution. Specifically, we present Contrastive Arbitrary Style Transfer (CAST), a new style representation learning and style transfer method based on contrastive learning. Our framework consists of three key components: a multi-layer style projector for style code encoding, a domain enhancement module for effective learning of the style distribution, and a generative network for image style transfer. We conduct comprehensive qualitative and quantitative evaluations demonstrating that our approach achieves significantly better results than state-of-the-art methods. Code and models are available at https://github.com/zyxElsa/CAST_pytorch
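The abstract does not spell out how the contrastive objective is formed; the authors' actual implementation is in the linked repository. As a minimal sketch of the general idea only, assuming an InfoNCE-style loss over style codes produced by a style projector (the function name, tensor shapes, and temperature below are illustrative assumptions, not the paper's definitions):

# Hypothetical sketch of a contrastive style loss in the spirit of CAST.
# Not the authors' code; shapes and the InfoNCE formulation are assumptions.
import torch
import torch.nn.functional as F

def contrastive_style_loss(anchor_codes, positive_codes, negative_codes, temperature=0.07):
    """Pull the style code of a stylized output (anchor) toward codes of images
    in the same style (positives) and push it away from codes of other styles
    (negatives).

    anchor_codes:   (B, D) style codes of generated images
    positive_codes: (B, D) style codes of same-style reference images
    negative_codes: (B, K, D) style codes of images from other styles
    """
    anchor = F.normalize(anchor_codes, dim=-1)
    pos = F.normalize(positive_codes, dim=-1)
    neg = F.normalize(negative_codes, dim=-1)

    # Cosine similarities: positives (B, 1), negatives (B, K)
    l_pos = (anchor * pos).sum(dim=-1, keepdim=True)
    l_neg = torch.einsum('bd,bkd->bk', anchor, neg)

    # InfoNCE: the positive sits at index 0 of each row of logits
    logits = torch.cat([l_pos, l_neg], dim=1) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)

# Example with random tensors standing in for style-projector outputs:
B, K, D = 8, 16, 128
loss = contrastive_style_loss(torch.randn(B, D), torch.randn(B, D), torch.randn(B, K, D))

In such a setup the style projector plays the role of the encoder whose outputs are contrasted; the domain enhancement and generative components described in the abstract would add further losses beyond this single term.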


Related research

03/09/2023 · A Unified Arbitrary Style Transfer Framework via Adaptive Contrastive Learning
We present Unified Contrastive Arbitrary Style Transfer (UCAST), a novel...

01/24/2023 · Few-shot Font Generation by Learning Style Difference and Similarity
Few-shot font generation (FFG) aims to preserve the underlying global st...

03/15/2022 · Exact Feature Distribution Matching for Arbitrary Style Transfer and Domain Generalization
Arbitrary style transfer (AST) and domain generalization (DG) are import...

10/26/2019 · ETNet: Error Transition Network for Arbitrary Style Transfer
Numerous valuable efforts have been devoted to achieving arbitrary style...

11/20/2022 · Font Representation Learning via Paired-glyph Matching
Fonts can convey profound meanings of words in various forms of glyphs. ...

06/02/2020 · Distribution Aligned Multimodal and Multi-Domain Image Stylization
Multimodal and multi-domain stylization are two important problems in th...

03/16/2023 · SpectralCLIP: Preventing Artifacts in Text-Guided Style Transfer from a Spectral Perspective
Contrastive Language-Image Pre-Training (CLIP) has refreshed the state o...
