Deep Exponential Families

11/10/2014
by Rajesh Ranganath, et al.

We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.
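To make the construction concrete, below is a minimal generative sketch of a two-layer DEF with gamma latent layers and Poisson count observations (the "sparse gamma" style of model the paper studies). The dimensions, the shape parameter alpha, and the gamma priors on the weight matrices W1 and W2 are illustrative assumptions rather than the paper's experimental settings, and the black box variational inference step is not shown.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (hypothetical, not taken from the paper's experiments).
K2, K1, V = 10, 50, 1000   # top-layer size, bottom-layer size, vocabulary size
alpha = 0.1                # gamma shape; small values encourage sparse activations

# Weights linking adjacent layers; gamma-distributed so all rates stay positive.
W2 = rng.gamma(shape=0.1, scale=1.0, size=(K2, K1))  # layer 2 -> layer 1
W1 = rng.gamma(shape=0.1, scale=1.0, size=(K1, V))   # layer 1 -> observations

def sample_document():
    """Draw one document's word counts from a two-layer sparse-gamma DEF."""
    # Top layer: gamma latent variables with mean 1.
    z2 = rng.gamma(shape=alpha, scale=1.0 / alpha, size=K2)
    # Middle layer: each unit's mean is an inner product of the layer above
    # with that unit's weights, plugged into the gamma's mean parameterization.
    mean1 = z2 @ W2                                   # shape (K1,)
    z1 = rng.gamma(shape=alpha, scale=mean1 / alpha)  # E[z1] = mean1
    # Observation layer: Poisson word counts whose rate is set by the bottom layer.
    rate = z1 @ W1                                    # shape (V,)
    return rng.poisson(rate)

x = sample_document()
print(x.shape, x.sum())

Each layer's latent variables are drawn from an exponential family whose parameter is a function of the inner product between the layer above and a set of weights; stacking these conditional distributions is what produces the hierarchy of dependencies described in the abstract.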


