Two-Stream FCNs to Balance Content and Style for Style Transfer

11/19/2019
by Duc Minh Vo, et al.

Style transfer renders the content of a given image in a given style, and it plays an important role in both fundamental computer vision research and industrial applications. Following the success of deep learning based approaches, the problem has recently attracted renewed attention, but it remains difficult because of the trade-off between preserving content and faithfully rendering style. In this paper, we propose an end-to-end two-stream Fully Convolutional Network (FCN) that balances the contributions of content and style in the rendered images. Our proposed network consists of an encoder and a decoder. The encoder uses one FCN for content and one FCN for style; the two FCNs exchange feature injections and are trained independently, one to preserve the semantic content and the other to learn a faithful style representation. The semantic content features and the style representation features are then adaptively concatenated and fed into the decoder to generate style-transferred (stylized) images. To train the network, we employ a loss network, the pre-trained VGG-16, to compute the content loss and the style loss, both of which also guide the feature injection and the feature concatenation. Our extensive experiments show that our model generates stylized images that are better balanced between content and style than those of state-of-the-art methods. Moreover, our network is also efficient in terms of speed.
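The abstract does not spell out the architecture in detail, so the PyTorch sketch below is only an illustration of how a two-stream encoder with cross-stream feature injection, adaptive concatenation, and a decoder might be wired together. The layer widths, injection by addition, and the learnable balance weight alpha are assumptions for illustration, not the paper's actual design.

    import torch
    import torch.nn as nn

    class StreamFCN(nn.Module):
        """One encoder stream: a small fully convolutional feature extractor.
        Depths and widths are illustrative assumptions, not the paper's exact ones."""
        def __init__(self, in_ch=3, width=64):
            super().__init__()
            self.blocks = nn.ModuleList([
                nn.Sequential(nn.Conv2d(in_ch, width, 3, stride=2, padding=1),
                              nn.InstanceNorm2d(width), nn.ReLU(inplace=True)),
                nn.Sequential(nn.Conv2d(width, width * 2, 3, stride=2, padding=1),
                              nn.InstanceNorm2d(width * 2), nn.ReLU(inplace=True)),
            ])

        def forward(self, x, injected=None):
            feats = []
            for i, block in enumerate(self.blocks):
                x = block(x)
                if injected is not None:
                    # "Feature injection": mix in the other stream's feature map at the same depth.
                    x = x + injected[i]
                feats.append(x)
            return x, feats

    class TwoStreamStyleTransfer(nn.Module):
        """Two-stream encoder (content + style) with feature injection,
        adaptive concatenation, and a decoder that renders the stylized image."""
        def __init__(self, width=64):
            super().__init__()
            self.content_enc = StreamFCN(width=width)
            self.style_enc = StreamFCN(width=width)
            # Learnable scalar controlling the content/style balance in the concatenation
            # (one possible reading of "adaptive" concatenation; an assumption).
            self.alpha = nn.Parameter(torch.tensor(0.5))
            self.decoder = nn.Sequential(
                nn.Conv2d(width * 4, width * 2, 3, padding=1), nn.ReLU(inplace=True),
                nn.Upsample(scale_factor=2, mode='nearest'),
                nn.Conv2d(width * 2, width, 3, padding=1), nn.ReLU(inplace=True),
                nn.Upsample(scale_factor=2, mode='nearest'),
                nn.Conv2d(width, 3, 3, padding=1), nn.Tanh(),
            )

        def forward(self, content_img, style_img):
            # Encode the style image, then inject its features into the content stream.
            s_feat, s_all = self.style_enc(style_img)
            c_feat, _ = self.content_enc(content_img, injected=s_all)
            # Adaptive concatenation: weight each stream before joining channel-wise.
            fused = torch.cat([self.alpha * c_feat, (1.0 - self.alpha) * s_feat], dim=1)
            return self.decoder(fused)

A minimal version of the VGG-16 loss network, computing a feature-reconstruction content loss and a Gram-matrix style loss, might look like the following; the layer choices and loss weighting are assumptions, and inputs are assumed to already be normalized for ImageNet.

    import torchvision

    def gram(feat):
        # Gram matrix of a feature map, normalized by its size.
        b, c, h, w = feat.shape
        f = feat.view(b, c, h * w)
        return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

    class VGGLoss(nn.Module):
        """Content/style losses from frozen, pre-trained VGG-16 features."""
        def __init__(self):
            super().__init__()
            vgg = torchvision.models.vgg16(
                weights=torchvision.models.VGG16_Weights.DEFAULT).features.eval()
            for p in vgg.parameters():
                p.requires_grad_(False)
            self.vgg = vgg
            self.layers = (3, 8, 15, 22)  # relu1_2, relu2_2, relu3_3, relu4_3

        def features(self, x):
            feats = []
            for i, layer in enumerate(self.vgg):
                x = layer(x)
                if i in self.layers:
                    feats.append(x)
                if i == self.layers[-1]:
                    break
            return feats

        def forward(self, stylized, content_img, style_img):
            f_out = self.features(stylized)
            f_c = self.features(content_img)
            f_s = self.features(style_img)
            content_loss = nn.functional.mse_loss(f_out[2], f_c[2])   # deeper layer for content
            style_loss = sum(nn.functional.mse_loss(gram(a), gram(b))
                             for a, b in zip(f_out, f_s))             # Gram matrices for style
            return content_loss, style_loss

In training, the two losses would be combined with weights chosen to trade off content preservation against style fidelity; the abstract states they also guide the feature injection and concatenation, which the sketch above does not model.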
