Incorporating long-range consistency in CNN-based texture generation

06/03/2016
by G. Berger, et al.

Gatys et al. (2015) showed that pair-wise products of features in a convolutional network are a very effective representation of image textures. We propose a simple modification to that representation which makes it possible to incorporate long-range structure into image generation, and to render images that satisfy various symmetry constraints. We show how this can greatly improve rendering of regular textures and of images that contain other kinds of symmetric structure. We also present applications to inpainting and season transfer.
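The pair-wise products in question are the Gram matrices of CNN feature maps, and the paper's modification additionally correlates each feature map with a spatially translated copy of itself so that co-occurrences at a fixed offset are matched as well. Below is a minimal sketch of both descriptors, assuming PyTorch, features of shape (channels, height, width), and a hypothetical horizontal offset delta; it illustrates the idea rather than reproducing the authors' implementation.

import torch

def gram_matrix(features):
    # Standard texture descriptor (Gatys et al., 2015): pair-wise products
    # of feature maps. `features` has shape (channels, height, width).
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.t() / (h * w)

def shifted_gram_matrix(features, delta):
    # Long-range variant (sketch): correlate each feature map with a copy of
    # itself translated by `delta` pixels along the width axis, so the
    # descriptor also records how features co-occur at a fixed spatial offset.
    c, h, w = features.shape
    left = features[:, :, :w - delta].reshape(c, -1)
    right = features[:, :, delta:].reshape(c, -1)
    return left @ right.t() / (h * (w - delta))

During synthesis, the squared difference between these matrices for the generated image and the exemplar would be added to the usual Gram-matrix loss; a vertical offset works the same way along the height axis.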


Related research

08/04/2020 · High resolution neural texture synthesis with long range constraints
The field of texture synthesis has witnessed important progresses over t...

09/08/2022 · Lightweight Long-Range Generative Adversarial Networks
In this paper, we introduce novel lightweight generative adversarial net...

02/03/2023 · TEXTure: Text-Guided Texturing of 3D Shapes
In this paper, we present TEXTure, a novel method for text-guided genera...

07/14/2020 · Rethinking Image Inpainting via a Mutual Encoder-Decoder with Feature Equalizations
Deep encoder-decoder based CNNs have advanced image inpainting methods f...

11/21/2022 · Long Range Constraints for Neural Texture Synthesis Using Sliced Wasserstein Loss
In the past decade, exemplar-based texture synthesis algorithms have see...

12/15/2020 · Image Inpainting Guided by Coherence Priors of Semantics and Textures
Existing inpainting methods have achieved promising performance in recov...

01/12/2021 · MP3net: coherent, minute-long music generation from raw audio with a simple convolutional GAN
We present a deep convolutional GAN which leverages techniques from MP3/...
