Sparse Super-Regular Networks

01/04/2022
by Andrew W. E. McDonald, et al.

Thom and Palm have argued that sparsely-connected neural networks (SCNs) show improved performance over fully-connected networks (FCNs). Super-regular networks (SRNs) are neural networks composed of stacked sparse layers that form (epsilon, delta)-super-regular pairs, with node order randomly permuted between layers. Using the Blow-up Lemma, we prove that because each pair of layers is super-regular, SRNs guarantee a number of properties that make them suitable replacements for FCNs for many tasks. These guarantees include edge uniformity across all sufficiently large subsets, minimum node in- and out-degree, input-output sensitivity, and the ability to embed pre-trained constructs. Indeed, SRNs have the capacity to act like FCNs and eliminate the need for costly regularization schemes like Dropout. Through readily reproducible experiments, we show that SRNs perform similarly to X-Nets while offering far greater guarantees and control over network structure.
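
The following is a minimal, hypothetical PyTorch sketch of this idea, not the authors' implementation: a sparse linear layer whose fixed connectivity mask is a d-regular bipartite graph with randomly permuted node order. The regular mask is used here only as a simple stand-in for a true (epsilon, delta)-super-regular pair, which additionally requires near-uniform edge density across all sufficiently large subsets; the class name, degree values, and layer sizes are illustrative assumptions.

import torch
import torch.nn as nn

class SparseSuperRegularLayer(nn.Module):
    """Sparse linear layer with a fixed d-regular bipartite connectivity mask
    and randomly permuted node order (an illustrative stand-in for one
    (epsilon, delta)-super-regular pair of an SRN, not the paper's construction)."""

    def __init__(self, n_in: int, n_out: int, degree: int):
        super().__init__()
        # Circulant-style mask: input j connects to outputs j, j+1, ..., j+degree-1
        # (mod n_out), so every input has out-degree `degree` and, when
        # n_in == n_out, every output has in-degree `degree`.
        mask = torch.zeros(n_out, n_in)
        for j in range(n_in):
            for k in range(degree):
                mask[(j + k) % n_out, j] = 1.0
        # Randomly permute node order on both sides, mirroring the random
        # permutation of node order between stacked sparse layers.
        mask = mask[torch.randperm(n_out)][:, torch.randperm(n_in)]
        self.register_buffer("mask", mask)
        self.weight = nn.Parameter(torch.randn(n_out, n_in) * (degree ** -0.5))
        self.bias = nn.Parameter(torch.zeros(n_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Only edges present in the sparse mask contribute to the output.
        return x @ (self.weight * self.mask).t() + self.bias

# Example: stack a few sparse layers in place of fully-connected ones.
model = nn.Sequential(
    SparseSuperRegularLayer(784, 512, degree=32), nn.ReLU(),
    SparseSuperRegularLayer(512, 512, degree=32), nn.ReLU(),
    SparseSuperRegularLayer(512, 10, degree=4),
)
out = model(torch.randn(8, 784))  # shape (8, 10)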


