Instance Normalization: The Missing Ingredient for Fast Stylization

07/27/2016
by Dmitry Ulyanov, et al.

In this paper we revisit the fast stylization method introduced in Ulyanov et al. (2016). We show how a small change in the stylization architecture results in a significant qualitative improvement in the generated images. The change is limited to swapping batch normalization with instance normalization, and to applying the latter at both training and testing time. The resulting method can be used to train high-performance architectures for real-time image generation. The code is made available on GitHub at https://github.com/DmitryUlyanov/texture_nets. The full paper can be found at arXiv:1701.02096.
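To illustrate the change the abstract describes, here is a minimal sketch (in PyTorch, not the authors' Torch code from the repository above; the helper name instance_norm and its eps parameter are hypothetical). Instance normalization computes the mean and variance per sample and per channel over the spatial dimensions only, whereas batch normalization pools statistics over the whole batch, and the same per-instance statistics are used at both training and test time.

import torch

def instance_norm(x, eps=1e-5):
    # x has shape (N, C, H, W). Instance norm normalizes each (H, W) slice
    # per sample and per channel; batch norm would instead average over
    # (N, H, W), coupling the samples within a batch.
    mean = x.mean(dim=(2, 3), keepdim=True)
    var = x.var(dim=(2, 3), unbiased=False, keepdim=True)
    return (x - mean) / torch.sqrt(var + eps)

# In a stylization network the architectural swap amounts to replacing
# BatchNorm2d layers with InstanceNorm2d:
norm_layer = torch.nn.InstanceNorm2d(64)  # instead of torch.nn.BatchNorm2d(64)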


