The energy landscape of a simple neural network

06/21/2017
by Anthony Collins Gamst, et al.

We explore the energy landscape of a simple neural network. In particular, we expand upon previous work demonstrating that the empirical complexity of fitted neural networks is far smaller than a naive parameter count would suggest, and that this implicit regularization is beneficial for generalization from fitted models.
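The "energy landscape" here is the loss surface viewed as a function of the network's parameters. As a minimal illustrative sketch (not the authors' method), the code below evaluates the loss of a tiny two-layer network along a straight line between two arbitrary points in parameter space, a common way to probe such a landscape; the toy regression data, network size, and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D regression data (illustrative only).
X = rng.normal(size=(64, 1))
y = np.sin(X).ravel()

HIDDEN = 8
N_PARAMS = 3 * HIDDEN + 1  # W1 (1 x 8) + b1 (8) + w2 (8) + b2 (1)

def loss(theta):
    """Mean-squared error of a two-layer tanh network with flat parameter vector theta."""
    i = 0
    W1 = theta[i:i + HIDDEN].reshape(1, HIDDEN); i += HIDDEN
    b1 = theta[i:i + HIDDEN]; i += HIDDEN
    w2 = theta[i:i + HIDDEN]; i += HIDDEN
    b2 = theta[i]
    h = np.tanh(X @ W1 + b1)        # hidden activations, shape (64, 8)
    pred = h @ w2 + b2              # network output, shape (64,)
    return np.mean((pred - y) ** 2)

# Two arbitrary points in parameter space (e.g. two fitted models).
theta_a = rng.normal(size=N_PARAMS)
theta_b = rng.normal(size=N_PARAMS)

# Loss profile along the straight line connecting the two parameter vectors;
# barriers between minima would show up as bumps in this profile.
alphas = np.linspace(0.0, 1.0, 11)
profile = [loss((1 - a) * theta_a + a * theta_b) for a in alphas]
print([round(v, 3) for v in profile])
```

In practice one would use trained parameter vectors for `theta_a` and `theta_b`; random points merely show the mechanics of the 1D slice.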

Related research

04/30/2020: A simple geometric method for navigating the energy landscape of centroidal Voronoi tessellations
Finding optimal centroidal Voronoi tessellations (CVTs) of a 2D domain p...

03/02/2018: Essentially No Barriers in Neural Network Energy Landscape
Training neural networks involves finding minima of a high-dimensional n...

06/10/2020: All Local Minima are Global for Two-Layer ReLU Neural Networks: The Hidden Convex Optimization Landscape
We are interested in two-layer ReLU neural networks from an optimization...

07/23/2021: Taxonomizing local versus global structure in neural network loss landscapes
Viewing neural network models in terms of their loss landscapes has a lo...

02/12/2020: Topologically Densified Distributions
We study regularization in the context of small sample-size learning wit...

02/18/2021: How we are leading a 3-XORSAT challenge: from the energy landscape to the algorithm and its efficient implementation on GPUs
A recent 3-XORSAT challenge required to minimize a very complex and roug...

03/15/2018: Sonifying stochastic walks on biomolecular energy landscapes
Translating the complex, multi-dimensional data from simulations of biom...
