Asymptotic properties of one-layer artificial neural networks with sparse connectivity

12/01/2021
by Christian Hirsch, et al.

A law of large numbers for the empirical distribution of the parameters of a one-layer artificial neural network with sparse connectivity is derived, as the number of neurons and the number of training iterations of stochastic gradient descent increase simultaneously.
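The abstract gives no code, but the setting it describes can be illustrated with a minimal sketch. The following assumes a mean-field one-layer ReLU network, a Bernoulli sparsity mask, and a synthetic regression target; these are hypothetical choices for illustration, not the paper's exact model. It runs SGD while the connectivity mask keeps off-mask weights at zero, then summarizes the empirical distribution of the parameters, the object whose law of large numbers the paper studies.

```python
# Hypothetical sketch: a one-layer network f(x) = (1/N) * sum_i c_i * relu(w_i . x),
# where each neuron is connected only to a random sparse subset of inputs,
# trained by SGD on a toy regression task.
import numpy as np

rng = np.random.default_rng(0)
d, N, steps, lr = 10, 2000, 2000, 0.1    # input dim, neurons, SGD steps, step size
p_connect = 0.3                          # sparsity: fraction of active connections

# Sparse connectivity mask: neuron i only sees inputs j with mask[i, j] == 1.
mask = (rng.random((N, d)) < p_connect).astype(float)
W = rng.normal(size=(N, d)) * mask       # input weights, zero off the mask
c = rng.normal(size=N)                   # output weights

def target(x):
    # Toy teacher function; purely illustrative.
    return np.sin(x[0]) + 0.5 * x[1]

for t in range(steps):
    x = rng.normal(size=d)
    pre = W @ x                          # pre-activations, shape (N,)
    act = np.maximum(pre, 0.0)           # ReLU
    err = (c * act).mean() - target(x)   # mean-field 1/N output scaling
    # One SGD step on the squared loss; gradients respect the sparsity mask.
    # (Step size effectively scaled by N, a common mean-field convention.)
    grad_c = err * act
    grad_W = err * (c[:, None] * (pre > 0)[:, None] * x[None, :]) * mask
    c -= lr * grad_c
    W -= lr * grad_W

# The empirical measure (1/N) * sum_i delta_{(c_i, w_i)}, summarized here by a
# histogram of the output weights c_i.
hist, edges = np.histogram(c, bins=20, range=(-3.0, 3.0), density=True)
print("empirical density of output weights c_i:", np.round(hist, 2))
```

In the regime the abstract refers to, where the number of neurons and SGD iterations grow together, a histogram like the one printed above would concentrate around a deterministic limiting distribution; the paper's result makes this precise under sparse connectivity.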


