On Transformations in Stochastic Gradient MCMC

03/07/2019
by Soma Yokoi, et al.

Stochastic gradient Langevin dynamics (SGLD) is a widely used sampler for posterior inference on large-scale datasets. Although SGLD is designed for unbounded random variables, many practical models involve variables with boundaries, such as non-negative variables or variables confined to a finite interval. Existing modifications of SGLD for handling bounded random variables resort to heuristics without a formal guarantee of sampling from the true stationary distribution. In this paper, we reformulate the SGLD algorithm to incorporate a deterministic transformation, with rigorous theoretical guarantees. Our method transforms unbounded samples obtained by SGLD into the domain of interest. We demonstrate transformed SGLD both in artificial problem settings and in real-world applications to Bayesian non-negative matrix factorization and binary neural networks.
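The transformation idea the abstract describes can be illustrated with a minimal sketch. Here plain Langevin dynamics (the full-gradient special case of SGLD) runs on an unconstrained variable theta = log(x), and the samples are mapped back through exp to obtain non-negative draws. The target (x ~ Exp(1)), the step size, and the iteration counts are illustrative assumptions for this sketch, not the paper's actual setup or algorithm.

```python
import numpy as np

# Hedged sketch: sample a non-negative variable x ~ Exp(1) by running
# Langevin dynamics on the unconstrained theta = log(x).
# The change of variables adds a log-Jacobian term, giving
#   log p(theta) = log p(x = e^theta) + theta = -e^theta + theta + const.

rng = np.random.default_rng(0)

def grad_log_p_theta(theta):
    # Gradient of -exp(theta) + theta with respect to theta.
    return -np.exp(theta) + 1.0

def langevin_sample(n_steps=50000, step=1e-2, burn_in=5000):
    theta = 0.0
    samples = []
    for t in range(n_steps):
        # Unadjusted Langevin update: drift along the gradient plus
        # Gaussian noise with variance 2 * step.
        theta = theta + step * grad_log_p_theta(theta) \
                + rng.normal(scale=np.sqrt(2.0 * step))
        if t >= burn_in:
            samples.append(np.exp(theta))  # map back to the domain x > 0
    return np.array(samples)

xs = langevin_sample()
print(xs.min() > 0, round(xs.mean(), 2))  # all samples non-negative; mean near 1
```

Every sample is non-negative by construction, which is the point of sampling in the transformed space rather than clipping or reflecting at the boundary as heuristic modifications do.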


Related research

03/05/2015 · Large-Scale Distributed Bayesian Matrix Factorization using Stochastic Gradient MCMC
Despite having various attractive qualities such as high prediction accu...

09/20/2018 · Optimal Bayesian clustering using non-negative matrix factorization
Bayesian model-based clustering is a widely applied procedure for discov...

09/14/2018 · Canonical spectral representation for exchangeable max-stable sequences
The set of infinite-dimensional, symmetric stable tail dependence functi...

10/27/2016 · Rapid Posterior Exploration in Bayesian Non-negative Matrix Factorization
Non-negative Matrix Factorization (NMF) is a popular tool for data explo...

01/10/2019 · On large deviations for sums of discrete m-dependent random variables
The ratio P(S_n=x)/P(Z_n=x) is investigated for three cases: (a) when S_...

12/14/2018 · Stochastic comparisons of the largest claim amounts from two sets of interdependent heterogeneous portfolios
Let X_λ_1,...,X_λ_n be dependent non-negative random variables and Y_i=...

05/01/2021 · Stochastic Block-ADMM for Training Deep Networks
In this paper, we propose Stochastic Block-ADMM as an approach to train ...
