Convergence of neural networks to Gaussian mixture distribution

04/26/2022
by Yasuhiko Asao et al.

We prove that, under relatively mild conditions, fully-connected feed-forward deep random neural networks converge to a Gaussian mixture distribution as the width of the last hidden layer alone goes to infinity. We also conducted experiments on a simple model that support this result and describe the convergence in more detail: widening the last hidden layer brings the output distribution closer to the Gaussian mixture, while widening the other layers in turn brings that Gaussian mixture closer to the normal distribution.
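To make the claimed limit concrete, below is a minimal simulation sketch, not the paper's construction: it samples many independent random tanh networks whose first hidden layer is kept narrow while the last hidden layer is wide. Conditional on the first-layer weights, the readout is an average of i.i.d. terms, so a central limit argument makes it approximately Gaussian with a variance that depends on the first layer; marginalising over the first layer then yields a Gaussian mixture. The widths, the tanh activation, the weight scaling, and the kurtosis diagnostic are all illustrative assumptions, and the function name `sample_outputs` is hypothetical.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)

def sample_outputs(x, n_last, n_first=3, n_nets=20_000):
    """Draw the scalar output of n_nets independent random networks.

    Architecture (an illustrative choice, not the paper's exact setup):
    input -> narrow tanh layer (width n_first) -> wide tanh layer
    (width n_last) -> linear readout, with i.i.d. N(0, 1/fan_in) weights.
    """
    outs = np.empty(n_nets)
    for i in range(n_nets):
        W1 = rng.normal(0.0, 1.0, (n_first, x.size)) / np.sqrt(x.size)
        h1 = np.tanh(W1 @ x)
        W2 = rng.normal(0.0, 1.0, (n_last, n_first)) / np.sqrt(n_first)
        h2 = np.tanh(W2 @ h1)
        w3 = rng.normal(0.0, 1.0, n_last) / np.sqrt(n_last)
        # Conditional on W1, the readout is a normalised sum of n_last
        # i.i.d. symmetric terms, hence approximately Gaussian with a
        # variance determined by the narrow first layer.
        outs[i] = w3 @ h2
    return outs

x = np.ones(2)
samples = sample_outputs(x, n_last=500)
# A scale mixture of centred Gaussians has positive excess kurtosis,
# so a value clearly above 0 is consistent with a Gaussian mixture
# rather than a single normal distribution.
print("excess kurtosis:", kurtosis(samples))
```

Because the first layer has only `n_first = 3` units, the conditional output variance genuinely varies across first-layer draws, which is what makes the marginal a nontrivial mixture; letting the earlier widths grow as well would, per the abstract, shrink that variation and push the mixture toward a single normal distribution.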
