Inference in Deep Networks in High Dimensions

06/20/2017
by Alyson K. Fletcher et al.

Deep generative networks provide a powerful tool for modeling complex data in a wide range of applications. In inverse problems that use these networks as generative priors on data, one must often perform inference of the inputs of the networks from the outputs. Inference is also required for sampling during stochastic training of these generative models. This paper considers inference in a deep stochastic neural network where the parameters (e.g., weights, biases, and activation functions) are known and the problem is to estimate the values of the input and hidden units from the output. While several approximate algorithms have been proposed for this task, there are few analytic tools that can provide rigorous guarantees on the reconstruction error. This work presents a novel and computationally tractable output-to-input inference method called Multi-Layer Vector Approximate Message Passing (ML-VAMP). The proposed algorithm, derived from expectation propagation, extends earlier AMP methods that are known to achieve the replica predictions for optimality in simple linear inverse problems. Our main contribution shows that the mean-squared error (MSE) of ML-VAMP can be exactly predicted in a certain large system limit (LSL), where the number of layers is fixed and the weight matrices are random and orthogonally invariant with dimensions that grow to infinity. ML-VAMP is thus a principled method for output-to-input inference in deep networks, with a rigorous and precise performance achievability result in high dimensions.
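To make the setup concrete, the following is a minimal NumPy sketch (not the authors' code) of the forward, generative direction of the multi-layer stochastic network described above. The layer widths, ReLU activations, Gaussian noise level, and random seed are illustrative assumptions; only the overall structure of known linear maps interleaved with known activations, with inference running output-to-input, follows the paper.

```python
import numpy as np

# Minimal sketch of the multi-layer stochastic model: each layer applies
# a known linear map, adds Gaussian noise, and passes the result through
# a known activation (ReLU here). Generation runs input-to-output; the
# inference problem studied in the paper runs in the reverse direction.
rng = np.random.default_rng(0)

dims = [50, 100, 200]   # layer widths, input -> hidden -> output (assumed)
noise_std = 0.1         # per-layer Gaussian noise level (assumed)
weights = [rng.standard_normal((dims[l + 1], dims[l])) / np.sqrt(dims[l])
           for l in range(len(dims) - 1)]
biases = [rng.standard_normal(dims[l + 1]) for l in range(len(dims) - 1)]

def forward(z0):
    """Generate the network output from the input z0 (the easy direction)."""
    z = z0
    for W, b in zip(weights, biases):
        pre = W @ z + b + noise_std * rng.standard_normal(W.shape[0])
        z = np.maximum(pre, 0.0)  # known ReLU activation
    return z

z0_true = rng.standard_normal(dims[0])
y = forward(z0_true)
# Output-to-input inference: estimate z0_true and the hidden units from y,
# given the weights, biases, activations, and noise statistics.
```

ML-VAMP attacks this reverse problem by passing messages through the layers in both directions, alternating linear estimation steps for the weight matrices with scalar denoising steps for the activations; the paper's analysis predicts the MSE of the resulting estimates exactly in the large system limit.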

