Structural Analysis of Sparse Neural Networks

10/16/2019
by Julian Stier, et al.

Sparse neural networks have regained attention due to their potential mathematical and computational advantages. We motivate the study of Artificial Neural Networks (ANNs) from a network science perspective, provide a technique for embedding arbitrary Directed Acyclic Graphs (DAGs) into ANNs, and report results on predicting the performance of image classifiers from the structural properties of the networks' underlying graphs. These results could further progress neuroevolution and help explain the success of distinct architectures from a structural perspective.
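To make the idea of treating an ANN as a graph concrete, here is a minimal illustrative sketch of turning a DAG into a feedforward computation: each node becomes a scalar neuron, each edge a weight, and activations are evaluated in topological order. This is an assumption-laden toy, not the paper's actual embedding technique; the function names, the `tanh` nonlinearity, and the edge/weight representation are all choices made here for illustration.

```python
import math
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def dag_forward(preds, weights, inputs):
    """Evaluate a DAG as a feedforward net (illustrative sketch).

    preds:   dict mapping each node to the set of its predecessor nodes
    weights: dict mapping each edge (pred, node) to a float weight
    inputs:  dict mapping each source node (no predecessors) to its input value

    Returns a dict of all node activations.
    """
    # static_order() yields every node after all of its predecessors,
    # so each neuron's inputs are ready when we reach it.
    order = TopologicalSorter(preds).static_order()
    values = dict(inputs)
    for node in order:
        p = preds.get(node, set())
        if not p:
            continue  # source node: keep the externally supplied input
        s = sum(weights[(q, node)] * values[q] for q in p)
        values[node] = math.tanh(s)  # elementwise nonlinearity (assumed)
    return values

# Tiny DAG with a skip connection x1 -> y, bypassing the hidden node h.
preds = {"h": {"x1", "x2"}, "y": {"h", "x1"}}
weights = {("x1", "h"): 0.5, ("x2", "h"): -0.25,
           ("h", "y"): 1.0, ("x1", "y"): 0.75}
out = dag_forward(preds, weights, {"x1": 1.0, "x2": 2.0})
```

Because arbitrary DAGs allow skip connections and irregular fan-in, structural graph properties (depth, degree distribution, path counts) vary freely between such networks, which is what makes them usable as predictors of classifier performance.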

Related research

- Transition to Linearity of General Neural Networks with Directed Acyclic Graph Architecture (05/24/2022)
  In this paper we show that feedforward neural networks corresponding to ...

- Asymptotic properties of one-layer artificial neural networks with sparse connectivity (12/01/2021)
  A law of large numbers for the empirical distribution of parameters of a...

- A Network Science perspective of Graph Convolutional Networks: A survey (01/12/2023)
  The mining and exploitation of graph structural information have been th...

- The Mean Dimension of Neural Networks – What causes the interaction effects? (07/11/2022)
  Owen and Hoyt recently showed that the effective dimension offers key st...

- Neural Networks on Random Graphs (02/19/2020)
  We performed a massive evaluation of neural networks with architectures ...

- Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot (06/29/2018)
  The article substantiates the necessity to develop training methods of c...

- Deciding How to Decide: Dynamic Routing in Artificial Neural Networks (03/17/2017)
  We propose and systematically evaluate three strategies for training dyn...
