Learning Linear Transformations for Fast Arbitrary Style Transfer

08/14/2018
by Xueting Li, et al.

Given a random pair of images, an arbitrary style transfer method extracts the style from the reference image to synthesize an output based on the content of the other image. Recent arbitrary style transfer methods transfer second-order statistics from the reference image onto the content image via a multiplication between content image features and a transformation matrix, which is computed from the features with a pre-determined algorithm. These algorithms either require computationally expensive operations or fail to model the feature covariance, producing artifacts in the synthesized images. Generalizing from these methods, we derive the form of the transformation matrix theoretically and present an arbitrary style transfer approach that learns the transformation matrix with a feed-forward network. Our algorithm is highly efficient yet allows a flexible combination of multi-level styles while preserving content affinity during the style transfer process. We demonstrate the effectiveness of our approach on four tasks: artistic style transfer, video and photo-realistic style transfer, and domain adaptation, including comparisons with state-of-the-art methods.
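The abstract's core operation is a single matrix multiplication between content features and a transformation matrix predicted by a feed-forward network from the covariance (second-order) statistics of content and style features. Below is a minimal sketch of that idea in PyTorch; it is not the authors' architecture. The module name `LinearTransferModule`, the channel sizes, and the compress/uncompress layers are all hypothetical stand-ins chosen for illustration.

```python
import torch
import torch.nn as nn

class LinearTransferModule(nn.Module):
    """Hypothetical sketch: predict a transformation matrix T from
    content/style feature covariances, then stylize content features
    with one matrix multiplication."""

    def __init__(self, in_channels=512, mid_channels=32):
        super().__init__()
        # Compress encoder features so T stays small (mid x mid).
        self.compress_content = nn.Conv2d(in_channels, mid_channels, 1)
        self.compress_style = nn.Conv2d(in_channels, mid_channels, 1)
        # Predict T from the concatenated, flattened covariance matrices.
        self.fc = nn.Linear(2 * mid_channels * mid_channels,
                            mid_channels * mid_channels)
        self.uncompress = nn.Conv2d(mid_channels, in_channels, 1)

    @staticmethod
    def covariance(f):
        # Channel-wise covariance (second-order statistics) of a feature map.
        b, c, h, w = f.shape
        f = f.view(b, c, h * w)
        f = f - f.mean(dim=2, keepdim=True)
        return torch.bmm(f, f.transpose(1, 2)) / (h * w)

    def forward(self, content_feat, style_feat):
        fc = self.compress_content(content_feat)   # B x C' x H x W
        fs = self.compress_style(style_feat)
        stats = torch.cat([self.covariance(fc).flatten(1),
                           self.covariance(fs).flatten(1)], dim=1)
        b, cm, h, w = fc.shape
        T = self.fc(stats).view(b, cm, cm)         # learned transformation matrix
        # Style transfer = one matmul on centered content features.
        fc_flat = fc.view(b, cm, h * w)
        mean = fc_flat.mean(dim=2, keepdim=True)
        out = torch.bmm(T, fc_flat - mean) + mean
        return self.uncompress(out.view(b, cm, h, w))

# Usage with dummy VGG-like features (shapes are assumptions):
feats_c = torch.randn(1, 512, 32, 32)
feats_s = torch.randn(1, 512, 32, 32)
stylized = LinearTransferModule()(feats_c, feats_s)  # same shape as feats_c
```

Because T is produced by a single forward pass rather than an eigendecomposition (as in whitening-and-coloring approaches), a module like this stays fast at inference, which matches the efficiency claim in the abstract.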

research · 09/27/2019
Style Transfer by Rigid Alignment in Neural Net Feature Space
Arbitrary style transfer is an important problem in computer vision that...

research · 05/25/2018
Beyond Textures: Learning from Multi-domain Artistic Images for Arbitrary Style Transfer
We propose a fast feed-forward network for arbitrary style transfer, whi...

research · 06/26/2022
Non-Parametric Style Transfer
Recent feed-forward neural methods of arbitrary image style transfer mai...

research · 08/16/2019
How Sequence-to-Sequence Models Perceive Language Styles?
Style is ubiquitous in our daily language uses, while what is language s...

research · 08/22/2018
Manipulating Attributes of Natural Scenes via Hallucination
In this study, we explore building a two-stage framework for enabling us...

research · 07/11/2022
CCPL: Contrastive Coherence Preserving Loss for Versatile Style Transfer
In this paper, we aim to devise a universally versatile style transfer m...

research · 09/13/2017
Meta Networks for Neural Style Transfer
In this paper we propose a new method to get the specified network param...