Bayesian neural networks and dimensionality reduction

08/18/2020
by Deborshee Sen, et al.

In conducting non-linear dimensionality reduction and feature learning, it is common to suppose that the data lie near a lower-dimensional manifold. A class of model-based approaches for such problems introduces latent variables into an unknown non-linear regression function; this class includes Gaussian process latent variable models and variational auto-encoders (VAEs) as special cases. VAEs are artificial neural networks (ANNs) that employ approximations to make computation tractable; however, current implementations lack adequate uncertainty quantification in estimating the parameters, predictive densities, and lower-dimensional subspace, and can be unstable and lack interpretability in practice. We attempt to solve these problems by deploying Markov chain Monte Carlo (MCMC) sampling algorithms for Bayesian inference in ANN models with latent variables. We address issues of identifiability by imposing constraints on the ANN parameters as well as by using anchor points, and demonstrate the approach on simulated and real data examples. We find that current MCMC sampling schemes face fundamental challenges in neural networks involving latent variables, motivating new research directions.
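To make the setting concrete, the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of MCMC in an ANN latent variable model: a one-hidden-layer decoder network maps low-dimensional latents z_i to observations x_i, and a random-walk Metropolis sampler targets the joint posterior over the network weights and the latents. The network sizes, priors, noise level, and step size are all assumptions made for illustration; the identifiability issues mentioned above (e.g., sign flips and rotations of the latent space) are deliberately left unaddressed here, which is exactly where parameter constraints and anchor points would enter.

```python
# Minimal sketch (illustrative only, not the paper's code): random-walk Metropolis
# over the weights and latent variables of a one-hidden-layer "decoder" network,
#   x_i ~ Normal(f_W(z_i), sigma^2 I),  z_i ~ Normal(0, I),  W ~ Normal(0, 1).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2-D observations generated from a 1-D latent curve.
n, d_obs, d_lat, d_hid = 100, 2, 1, 8
z_true = rng.normal(size=(n, d_lat))
x = np.hstack([np.sin(2 * z_true), np.cos(2 * z_true)]) + 0.05 * rng.normal(size=(n, d_obs))

def unpack(theta):
    """Split a flat parameter vector into network weights, biases, and latents."""
    w1 = theta[: d_lat * d_hid].reshape(d_lat, d_hid)
    b1 = theta[d_lat * d_hid : d_lat * d_hid + d_hid]
    k = d_lat * d_hid + d_hid
    w2 = theta[k : k + d_hid * d_obs].reshape(d_hid, d_obs)
    b2 = theta[k + d_hid * d_obs : k + d_hid * d_obs + d_obs]
    z = theta[k + d_hid * d_obs + d_obs :].reshape(n, d_lat)
    return w1, b1, w2, b2, z

def log_post(theta, sigma=0.05):
    """Log joint density: Gaussian likelihood plus standard-normal priors."""
    w1, b1, w2, b2, z = unpack(theta)
    pred = np.tanh(z @ w1 + b1) @ w2 + b2            # decoder f_W(z_i)
    log_lik = -0.5 * np.sum((x - pred) ** 2) / sigma**2
    log_prior = -0.5 * np.sum(theta**2)              # N(0, 1) priors on weights and latents
    return log_lik + log_prior

n_params = d_lat * d_hid + d_hid + d_hid * d_obs + d_obs + n * d_lat
theta = 0.1 * rng.normal(size=n_params)
step, n_iter, accepted = 0.01, 20000, 0
lp = log_post(theta)
for t in range(n_iter):
    prop = theta + step * rng.normal(size=n_params)  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:         # Metropolis accept/reject step
        theta, lp, accepted = prop, lp_prop, accepted + 1
print(f"acceptance rate: {accepted / n_iter:.2f}")
```

Even in this toy example the posterior is highly multimodal: permuting hidden units or flipping the sign of the latents leaves the likelihood unchanged, which is the kind of fundamental challenge for MCMC that the abstract refers to.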

Related research

08/09/2014  Gaussian Process Structural Equation Models with Latent Variables
01/21/2022  Curved factor analysis with the Ellipsoid-Gaussian distribution
03/28/2018  Pseudo-marginal Bayesian inference for supervised Gaussian process latent variable models
12/22/2021  A Comparison of Bayesian Inference Techniques for Sparse Factor Analysis
07/13/2018  Sequential sampling of Gaussian process latent variable models
08/05/2020  Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian Processes
