Duality of Width and Depth of Neural Networks

02/06/2020
by Feng-Lei Fan, et al.

Here, we report that the depth and the width of a neural network are dual to each other from two perspectives. First, we employ the partially separable representation to determine the width and depth. Second, we use the De Morgan law to guide the conversion between a deep network and a wide network. Furthermore, we propose a generalized De Morgan law that extends this duality to network equivalency.
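As an illustrative sketch only (not the paper's construction), the logical De Morgan law NOT(a AND b) = (NOT a) OR (NOT b) has a continuous analogue, min(x, y) = -max(-x, -y), which hints at how a nested (deep) composition can be traded for a single wide aggregation:

```python
# Hypothetical illustration of De Morgan-style depth/width duality.
# The identity min(xs) = -max(-x for x in xs) lets a nested chain of
# pairwise min's (depth grows with n) be rewritten as one wide max
# over negated inputs (depth constant, width grows with n).

def deep_min(xs):
    """Compute min(x1, ..., xn) by nesting pairwise min's (depth ~ n)."""
    acc = xs[0]
    for x in xs[1:]:
        acc = min(acc, x)
    return acc

def wide_via_demorgan(xs):
    """Same value via the dual wide form: min(xs) = -max(-x)."""
    return -max(-x for x in xs)

vals = [3.0, -1.5, 2.0, 0.5]
assert deep_min(vals) == wide_via_demorgan(vals)  # both yield -1.5
```

The names `deep_min` and `wide_via_demorgan` are hypothetical; the paper's actual conversion operates on network representations rather than scalar lists.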

