Learning and Inference in Sparse Coding Models with Langevin Dynamics

04/23/2022
by Michael Y.-S. Fang, et al.

We describe a stochastic dynamical system capable of inference and learning in a probabilistic latent variable model. The most challenging problem in such models, sampling the posterior distribution over latent variables, is proposed to be solved by harnessing natural sources of stochasticity inherent in electronic and neural systems. We demonstrate this idea for a sparse coding model by deriving a continuous-time equation for inferring its latent variables via Langevin dynamics. The model parameters are learned by simultaneously evolving according to another continuous-time equation, thus bypassing the need for digital accumulators or a global clock. Moreover, we show that Langevin dynamics lead to an efficient procedure for sampling from the posterior distribution in the 'L0 sparse' regime, where latent variables are encouraged to be exactly zero rather than merely having a small L1 norm. This allows the model to incorporate the notion of sparsity directly, rather than resorting to a relaxed version of sparsity to make optimization tractable. Simulations of the proposed dynamical system on both synthetic and natural image datasets demonstrate that the model is capable of probabilistically correct inference, enabling learning of the dictionary as well as the parameters of the prior.
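The core idea, posterior sampling in a sparse coding model via Langevin dynamics, can be sketched in a few lines. The sketch below is a minimal illustration under simplifying assumptions, not the paper's method: it uses unadjusted Langevin dynamics on a Gaussian likelihood with a Laplacian (L1) prior for differentiability, whereas the paper works in the L0 sparse regime; all sizes, the dictionary `Phi`, and the constants `sigma` and `lam` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumptions, not the paper's exact model):
# D-dimensional data x, K latent coefficients a, dictionary Phi.
D, K, sigma, lam = 8, 16, 0.5, 1.0
Phi = rng.normal(size=(D, K)) / np.sqrt(D)              # random dictionary
a_true = rng.normal(size=K) * (rng.random(K) < 0.2)     # sparse ground truth
x = Phi @ a_true + 0.05 * rng.normal(size=D)            # noisy observation

def grad_energy(a):
    """Gradient of E(a) = ||x - Phi a||^2 / (2 sigma^2) + lam * ||a||_1,
    i.e. the negative log posterior up to a constant, assuming a
    Laplacian prior (an L1 relaxation of the paper's L0 sparsity)."""
    return -Phi.T @ (x - Phi @ a) / sigma**2 + lam * np.sign(a)

# Unadjusted Langevin update: a <- a - eta * dE/da + sqrt(2 eta) * noise.
# The injected Gaussian noise is what turns gradient descent on the
# energy into (approximate) sampling from the posterior.
eta, steps = 1e-3, 5000
a = np.zeros(K)
samples = []
for t in range(steps):
    a = a - eta * grad_energy(a) + np.sqrt(2 * eta) * rng.normal(size=K)
    if t > steps // 2:          # discard burn-in
        samples.append(a.copy())

post_mean = np.mean(samples, axis=0)   # posterior-mean estimate of a
```

In the paper's setting, this injected noise would come from physical stochasticity in the hardware rather than a pseudorandom generator, and the dictionary would itself evolve under a second continuous-time equation instead of being fixed.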


