Inference in Multi-Layer Networks with Matrix-Valued Unknowns

01/26/2020
by Parthe Pandit, et al.

We consider the problem of inferring the input and hidden variables of a stochastic multi-layer neural network from an observation of the output. The hidden variables in each layer are represented as matrices. This problem arises in signal recovery via deep generative prior models, multi-task and mixed regression, and learning certain classes of two-layer neural networks. A unified approximation algorithm for both MAP and MMSE inference is proposed by extending the recently developed Multi-Layer Vector Approximate Message Passing (ML-VAMP) algorithm to handle matrix-valued unknowns. It is shown that the performance of the proposed Multi-Layer Matrix VAMP (ML-Mat-VAMP) algorithm can be exactly predicted in a certain random large-system limit, where the dimensions N × d of the unknown quantities grow as N → ∞ with d fixed. In the two-layer neural-network learning problem, this scaling corresponds to the case where the number of input features and training samples grow to infinity while the number of hidden nodes stays fixed. The analysis enables a precise prediction of the parameter error and test error of the learned network.
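To make the setup concrete, here is a minimal sketch of the kind of stochastic multi-layer model with matrix-valued unknowns described above. The layer widths, the choice of ReLU nonlinearity, and the Gaussian weights and noise are all illustrative assumptions, not the paper's exact model; the point is only that an unknown input matrix Z0 ∈ R^(N×d) passes through alternating linear and componentwise stages to produce an observed output Y, with N large and d fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

N, d = 500, 3        # N grows to infinity with d fixed in the analyzed limit
hidden = 200         # hypothetical hidden width

Z0 = rng.standard_normal((N, d))                    # unknown input matrix
W1 = rng.standard_normal((hidden, N)) / np.sqrt(N)  # known layer weights
Z1 = np.maximum(W1 @ Z0, 0.0)                       # hidden matrix (ReLU stage)
W2 = rng.standard_normal((N, hidden)) / np.sqrt(hidden)
Y = W2 @ Z1 + 0.01 * rng.standard_normal((N, d))    # noisy observed output

# Inference task: given Y (and the weights), estimate Z0 and Z1,
# e.g. via MAP or MMSE estimation as ML-Mat-VAMP does.
print(Y.shape)  # (500, 3): every unknown keeps the fixed column dimension d
```

Note that each unknown (Z0, Z1) is a matrix with the same fixed number of columns d, which is exactly the structure that distinguishes ML-Mat-VAMP from the vector-valued ML-VAMP it extends.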


Related research

11/08/2019 · Inference with Deep Generative Priors in High Dimensions
Deep generative priors offer powerful models for complex-structured data...

01/31/2022 · Equivariant neural networks for recovery of Hadamard matrices
We propose a message passing neural network architecture designed to be ...

03/01/2019 · Asymptotics of MAP Inference in Deep Networks
Deep generative priors are a powerful tool for reconstruction problems w...

06/20/2017 · Inference in Deep Networks in High Dimensions
Deep generative networks provide a powerful tool for modeling complex da...

12/03/2022 · Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models
We consider the problem of reconstructing the signal and the hidden vari...

02/13/2022 · Information Density in Multi-Layer Resistive Memories
Resistive memories store information in a crossbar arrangement of two-te...

12/14/2021 · Training Multi-Layer Over-Parametrized Neural Network in Subquadratic Time
We consider the problem of training a multi-layer over-parametrized neur...
