Two-Stream Convolutional Networks for Dynamic Texture Synthesis

06/21/2017
by Matthew Tesfaldet et al.

We introduce a two-stream model for dynamic texture synthesis. Our model is based on pre-trained convolutional networks (ConvNets) that target two independent tasks: (i) object recognition and (ii) optical flow prediction. Given an input dynamic texture, statistics of filter responses from the object recognition ConvNet encapsulate the per-frame appearance of the input texture, while statistics of filter responses from the optical flow ConvNet model its dynamics. To generate a novel texture, a noise input sequence is optimized to simultaneously match the feature statistics from each stream of an example texture. Inspired by recent work on image style transfer and enabled by the two-stream model, we also apply the synthesis approach to combine the texture appearance from one texture with the dynamics of another to generate entirely novel dynamic textures. We show that our approach generates novel, high-quality samples that match both the framewise appearance and temporal evolution of input texture imagery. Finally, we quantitatively evaluate our approach with a thorough user study.
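To make the optimization concrete, below is a minimal sketch of the two-stream objective, assuming PyTorch. The names `appearance_net` and `flow_net` are hypothetical stand-ins for the paper's pre-trained object-recognition and optical-flow ConvNets; each is assumed to map a frame (or a pair of consecutive frames) to a list of feature maps. Matching Gram matrices of filter responses follows the standard style-transfer recipe; the specific layers, weights, and normalization used in the paper are not reproduced here.

```python
# Sketch of the two-stream synthesis loss, assuming PyTorch.
# appearance_net(x)       -> list of (C, H', W') feature maps for one frame (hypothetical)
# flow_net(x0, x1)        -> list of (C, H', W') feature maps for a frame pair (hypothetical)
import torch

def gram_matrix(feats):
    # feats: (channels, height, width) feature map from one ConvNet layer.
    c, h, w = feats.shape
    f = feats.reshape(c, h * w)
    return (f @ f.t()) / (c * h * w)  # normalized channel-correlation statistics

def two_stream_loss(frames, target_frames, appearance_net, flow_net):
    """Sum of squared Gram-matrix differences over both streams.

    frames, target_frames: (T, 3, H, W) tensors; `frames` is the noise
    sequence being optimized, `target_frames` the exemplar texture.
    """
    loss = frames.new_zeros(())
    # Appearance stream: per-frame statistics from the recognition ConvNet.
    for x, t in zip(frames, target_frames):
        for fx, ft in zip(appearance_net(x), appearance_net(t)):
            loss = loss + ((gram_matrix(fx) - gram_matrix(ft)) ** 2).sum()
    # Dynamics stream: statistics over consecutive frame pairs from the flow ConvNet.
    for i in range(len(frames) - 1):
        fx_list = flow_net(frames[i], frames[i + 1])
        ft_list = flow_net(target_frames[i], target_frames[i + 1])
        for fx, ft in zip(fx_list, ft_list):
            loss = loss + ((gram_matrix(fx) - gram_matrix(ft)) ** 2).sum()
    return loss

# Synthesis: start from noise and descend on the combined loss, e.g. with
# a closure-based L-BFGS loop:
#   frames = torch.randn(T, 3, H, W, requires_grad=True)
#   opt = torch.optim.LBFGS([frames])
```

The appearance/dynamics transfer described in the abstract drops out of the same objective: take the appearance-stream targets from one video and the flow-stream targets from another, and the optimized sequence inherits the texture of the first and the motion of the second.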


