Name Your Style: An Arbitrary Artist-aware Image Style Transfer

02/28/2022
by Zhi-Song Liu, et al.

Image style transfer has attracted widespread attention in the past few years. Despite its remarkable results, it requires additional style images as references, which makes it less flexible and less convenient. Text is the most natural way to describe a style. More importantly, text can describe implicit, abstract styles, such as the styles of specific artists or art movements. In this paper, we propose a text-driven image style transfer (TxST) that leverages advanced image-text encoders to control arbitrary style transfer. We introduce a contrastive training strategy to effectively extract style descriptions from an image-text model (i.e., CLIP), which aligns the stylization with the text description. To this end, we also propose a novel and efficient attention module that uses cross-attention to fuse style and content features. Finally, we achieve arbitrary artist-aware image style transfer, learning and transferring specific artistic characteristics such as those of Picasso, oil painting, or a rough sketch. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods on both image and textual styles. Moreover, it can mimic the styles of one or many artists to achieve attractive results, highlighting a promising direction in image style transfer.
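To illustrate the cross-attention idea mentioned in the abstract, the sketch below shows scaled dot-product cross-attention in which content features act as queries and style features (e.g., derived from a CLIP encoder) act as keys and values. This is a minimal, pure-Python illustration of the general mechanism, not the paper's actual architecture; in a real model the queries, keys, and values would come from learned linear projections, and the feature names and shapes here are illustrative assumptions.

```python
import math

def matmul(A, B):
    """Naive matrix multiply for lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def softmax(row):
    """Numerically stable softmax over one row of scores."""
    m = max(row)
    e = [math.exp(x - m) for x in row]
    s = sum(e)
    return [x / s for x in e]

def cross_attention(content, style, d):
    """Fuse style into content via cross-attention.

    content: (Nc, d) feature rows used as queries.
    style:   (Ns, d) feature rows used as keys and values.
    Returns stylized content features of shape (Nc, d).
    """
    # Similarity of each content query to each style key, scaled by sqrt(d).
    style_t = [list(col) for col in zip(*style)]
    scores = matmul(content, style_t)                       # (Nc, Ns)
    attn = [softmax([s / math.sqrt(d) for s in row]) for row in scores]
    # Each output row is a convex combination of style rows.
    return matmul(attn, style)
```

With identical style rows, every output row collapses to that style vector, which makes the averaging behavior easy to verify by hand.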


