Learning the Synthesizability of Dynamic Texture Samples

02/03/2018
by Feng Yang, et al.

A dynamic texture (DT) is a sequence of images that exhibits temporal regularities and has many applications in computer vision and graphics. Given an exemplar dynamic texture, generating new samples of high quality that are perceptually similar to the input is a challenging task, known as example-based dynamic texture synthesis (EDTS). Numerous approaches have been devoted to this problem over the past decades, but none of them can handle all kinds of dynamic textures equally well. In this paper, we investigate the synthesizability of dynamic texture samples: given a dynamic texture sample, how synthesizable is it by EDTS, and which EDTS method is the most suitable for synthesizing it? To this end, we propose to learn regression models that connect dynamic texture samples with synthesizability scores, with the help of a compiled dynamic texture dataset annotated in terms of synthesizability. More precisely, we first define the synthesizability of DT samples and characterize them by a set of spatiotemporal features. Based on these features and the annotated dataset, we then train regression models to predict the synthesizability scores of texture samples and learn classifiers to select the most suitable EDTS methods. We further perform the selection, partition, and synthesizability prediction of dynamic texture samples in a hierarchical scheme. Finally, we apply the learned synthesizability to detecting synthesizable regions in videos. Experiments demonstrate that our method can effectively learn and predict the synthesizability of DT samples.
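
The abstract does not specify which regression or classification models are used, nor the exact spatiotemporal features. As a rough illustration only, the sketch below shows how precomputed feature vectors of annotated DT samples might be mapped to a synthesizability score (regression) and to the most suitable EDTS method (classification); the random-forest models, feature dimensionality, and stubbed-out features are assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): predict a synthesizability score
# and select an EDTS method from precomputed spatiotemporal features.
# Feature extraction is stubbed with random vectors; model choices are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder for spatiotemporal descriptors of N annotated DT samples
# (in the paper these come from the compiled, synthesizability-annotated dataset).
n_samples, n_features = 200, 128
X = rng.normal(size=(n_samples, n_features))      # spatiotemporal feature vectors
y_score = rng.uniform(0.0, 1.0, size=n_samples)   # annotated synthesizability scores
y_method = rng.integers(0, 3, size=n_samples)     # index of the best EDTS method

X_tr, X_te, s_tr, s_te, m_tr, m_te = train_test_split(
    X, y_score, y_method, test_size=0.25, random_state=0)

# Regression: DT sample features -> synthesizability score
regressor = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, s_tr)

# Classification: DT sample features -> most suitable EDTS method
selector = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, m_tr)

# Score an unseen DT sample and suggest a synthesis method for it
new_features = rng.normal(size=(1, n_features))
print("predicted synthesizability:", regressor.predict(new_features)[0])
print("suggested EDTS method id:", selector.predict(new_features)[0])
```

The same scoring model could, in principle, be applied to spatiotemporal windows of a video to flag synthesizable regions, in the spirit of the region-detection application mentioned above.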


