Cooperative Training of Descriptor and Generator Networks
This paper studies the cooperative training of two probabilistic models of signals such as images. Both models are parametrized by convolutional neural networks (ConvNets). The first network is a descriptor network, which is an exponential family model or an energy-based model, whose feature statistics or energy function are defined by a bottom-up ConvNet, which maps the observed signal to the feature statistics. The second network is a generator network, which is a non-linear version of factor analysis. It is defined by a top-down ConvNet, which maps the latent factors to the observed signal. The maximum likelihood training algorithms of both the descriptor net and the generator net are in the form of alternating back-propagation, and both algorithms involve Langevin sampling. We observe that the two training algorithms can cooperate with each other by jumpstarting each other's Langevin sampling, and they can be naturally and seamlessly interwoven into a CoopNets algorithm that can train both nets simultaneously.
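The cooperative loop described above can be sketched numerically. The following is a minimal toy illustration, not the paper's method: the data are 1-D Gaussian samples, the descriptor is an exponential tilt of a standard-normal reference (so its energy is quadratic with one learnable parameter `w`), and the generator is linear (`x = a*z + b`) instead of the ConvNets used in the paper. All names, learning rates, and step sizes are illustrative assumptions. The loop still exhibits the key structure: the generator jumpstarts the descriptor's Langevin chains, the descriptor is updated by observed-minus-synthesized statistics, and the generator learns from the Langevin-revised samples using the latent factors it already knows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "signals": Gaussian data with mean 3 (illustrative, not from the paper)
data = rng.normal(3.0, 1.0, size=2000)

# Descriptor: exponential tilt of a N(0,1) reference, p(x) ∝ exp(w*x - x²/2),
# i.e. a Gaussian with learnable mean w. Energy U(x) = x²/2 - w*x.
w = 0.0

def energy_grad(x):
    return x - w  # dU/dx, drives Langevin sampling toward the descriptor's mode

# Generator: x = a*z + b with latent z ~ N(0,1)
# (a linear stand-in for the paper's top-down ConvNet)
a, b = 1.0, 0.0

n_chains, langevin_steps, delta = 200, 30, 0.1
lr_desc, lr_gen = 0.1, 0.3

for it in range(300):
    # 1) Generator jumpstarts the descriptor's Langevin chains
    z = rng.normal(size=n_chains)
    x = a * z + b
    # 2) Langevin revision of the synthesized examples under the descriptor
    for _ in range(langevin_steps):
        x = x - 0.5 * delta**2 * energy_grad(x) + delta * rng.normal(size=n_chains)
    # 3) Descriptor update: observed statistics minus synthesized statistics
    w += lr_desc * (data.mean() - x.mean())
    # 4) Generator update: reconstruct the revised x from the known latents z
    err = x - (a * z + b)
    a += lr_gen * (err * z).mean()
    b += lr_gen * err.mean()
```

After a few hundred iterations, both the descriptor parameter `w` and the generator bias `b` settle near the data mean: the short Langevin runs pull the synthesized examples toward the data statistics, and the generator absorbs that revision, so the two nets converge together rather than separately.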