Reframing Neural Networks: Deep Structure in Overcomplete Representations

03/10/2021
by Calvin Murdock, et al.

In comparison to classical shallow representation learning techniques, deep neural networks have achieved superior performance in nearly every application benchmark. But despite their clear empirical advantages, it is still not well understood what makes them so effective. To approach this question, we introduce deep frame approximation, a unifying framework for representation learning with structured overcomplete frames. While exact inference requires iterative optimization, it may be approximated by the operations of a feed-forward deep neural network. We then indirectly analyze how model capacity relates to the frame structure induced by architectural hyperparameters such as depth, width, and skip connections. We quantify these structural differences with the deep frame potential, a data-independent measure of coherence linked to representation uniqueness and stability. As a criterion for model selection, we show correlation with generalization error on a variety of common deep network architectures such as ResNets and DenseNets. We also demonstrate how recurrent networks implementing iterative optimization algorithms achieve performance comparable to their feed-forward approximations. This connection to the established theory of overcomplete representations suggests promising new directions for principled deep network architecture design with less reliance on ad-hoc engineering.
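The coherence measure referenced above can be illustrated with the classical frame potential, the squared Frobenius norm of the Gram matrix of unit-normalized frame vectors. The sketch below is a simplified, single-matrix illustration of that quantity, not the paper's full deep frame potential (which aggregates coherence over the structured frames induced by depth, width, and skip connections); the function name `frame_potential` is our own.

```python
import numpy as np

def frame_potential(W):
    """Frame potential of a matrix whose columns are frame vectors.

    Columns are normalized to unit length; the potential is the squared
    Frobenius norm of the Gram matrix W^T W. It is minimized by maximally
    incoherent (e.g. orthonormal or tight) frames and grows as columns
    become more correlated, which is linked to less unique, less stable
    overcomplete representations.
    """
    W = W / np.linalg.norm(W, axis=0, keepdims=True)
    G = W.T @ W
    return float(np.sum(G ** 2))

# An orthonormal basis attains the minimum for k <= d vectors:
# the Gram matrix is the identity, so the potential equals k.
print(frame_potential(np.eye(4)))  # 4.0

# An overcomplete frame (more columns than dimensions) cannot be
# orthogonal, so off-diagonal correlations push the potential higher.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
print(frame_potential(A) > 8.0)  # True
```

In this view, architectural hyperparameters matter because they change the structure of the induced frame, and hence how low this coherence measure can be driven for a fixed parameter budget.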


Related research:

- Dataless Model Selection with the Deep Frame Potential (03/30/2020)
- Deep Component Analysis via Alternating Direction Neural Networks (03/16/2018)
- Explaining Deep Learning Representations by Tracing the Training Process (09/13/2021)
- CortexNet: a Generic Network Family for Robust Visual Temporal Representations (06/08/2017)
- Architectural Adversarial Robustness: The Case for Deep Pursuit (11/29/2020)
- CrossNets: A New Approach to Complex Learning (05/21/2017)
- Building Program Vector Representations for Deep Learning (09/11/2014)
