A Unified Framework for Generalizable Style Transfer: Style and Content Separation

06/13/2018
by Yexun Zhang, et al.

Image style transfer has drawn broad attention in recent years. However, most existing methods explicitly model the transformation between specific styles, so the learned model does not generalize to new styles. We propose a unified style transfer framework for both character typeface transfer and neural style transfer that leverages the separation of style and content. A key merit of such a framework is its generalizability to new styles and contents. The framework consists of a style encoder, a content encoder, a mixer, and a decoder. The style encoder and content encoder extract style and content representations from the corresponding reference images. The mixer integrates these two representations and feeds the result to the decoder, which generates an image with the target style and content. During training, the encoder networks learn to extract styles and contents from a limited number of style/content reference images. This learning framework allows simultaneous style transfer among multiple styles and can be regarded as a special `multi-task' learning scenario. The encoders are expected to capture underlying features of different styles and contents that generalize to new ones. Under this framework, we design two individual networks for character typeface transfer and neural style transfer, respectively. For character typeface transfer, we separate style features from content features by leveraging the conditional dependence of styles and contents given an image. For neural style transfer, we represent style by the statistical information of feature maps in certain layers. Extensive experimental results demonstrate the effectiveness and robustness of the proposed methods.
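The data flow of the encoder–mixer–decoder framework described above can be sketched as follows. This is a minimal illustration only: the dimensions, the random projection matrices standing in for learned weights, the averaging over reference images, and the concatenation-based mixer are all assumptions for the sketch, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration only.
IMG_DIM, STYLE_DIM, CONTENT_DIM = 64, 8, 16

# Random projections stand in for the learned encoder/decoder weights.
W_style = rng.standard_normal((IMG_DIM, STYLE_DIM))
W_content = rng.standard_normal((IMG_DIM, CONTENT_DIM))
W_mix = rng.standard_normal((STYLE_DIM + CONTENT_DIM, IMG_DIM))

def style_encoder(style_refs):
    # Pool a style code over several reference images sharing one style.
    return np.tanh(style_refs @ W_style).mean(axis=0)

def content_encoder(content_refs):
    # Pool a content code over several references sharing one content.
    return np.tanh(content_refs @ W_content).mean(axis=0)

def mixer(style_code, content_code):
    # Integrate the two representations; concatenation is one simple choice.
    return np.concatenate([style_code, content_code])

def decoder(mixed_code):
    # Map the integrated code back to image space.
    return np.tanh(mixed_code @ W_mix)

# Toy forward pass: 3 style references, 2 content references,
# each flattened to an IMG_DIM-dimensional vector.
style_refs = rng.standard_normal((3, IMG_DIM))
content_refs = rng.standard_normal((2, IMG_DIM))
output = decoder(mixer(style_encoder(style_refs), content_encoder(content_refs)))
print(output.shape)  # (64,)
```

Because only the encoders, mixer, and decoder are learned (not a per-style-pair transformation), new styles or contents at test time only require swapping in new reference images.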

