Criticality & Deep Learning I: Generally Weighted Nets

02/26/2017
by Dan Oprisa, et al.

Motivated by the idea that criticality and the universality of phase transitions might play a crucial role in achieving and sustaining learning and intelligent behaviour in biological and artificial networks, we analyse a theoretical and a pragmatic experimental setup for critical phenomena in deep learning. On the theoretical side, we use results from statistical physics to carry out critical-point calculations in feed-forward/fully connected networks, while on the experimental side we set out to find traces of criticality in deep neural networks. This is the first step in a series of upcoming investigations to map out the relationship between criticality and learning in deep networks.
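The abstract does not spell out the calculation, but a standard empirical diagnostic for criticality in fully connected networks (from the related mean-field "edge of chaos" literature, not necessarily the authors' exact method) is to propagate two nearby inputs through a random tanh network and measure whether their distance shrinks (ordered phase), grows (chaotic phase), or is marginally preserved (the critical point). The sketch below is illustrative; the function name, widths, and depths are my own choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturbation_growth(sigma_w, width=512, depth=30, eps=1e-3):
    """Average per-layer growth factor of the distance between two
    nearby inputs pushed through a random fully connected tanh net.

    Weights are drawn i.i.d. with variance sigma_w**2 / width (no biases).
    Mean-field theory places the critical point for tanh at sigma_w = 1:
    below it perturbations decay, above it they are amplified.
    """
    x = rng.standard_normal(width)
    y = x + eps * rng.standard_normal(width)
    ratios = []
    for _ in range(depth):
        # Fresh random layer at every depth (untrained network).
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        d_before = np.linalg.norm(x - y)
        x, y = np.tanh(W @ x), np.tanh(W @ y)
        ratios.append(np.linalg.norm(x - y) / d_before)
    return float(np.mean(ratios))

if __name__ == "__main__":
    for s in (0.5, 1.0, 2.0):
        print(f"sigma_w = {s}: mean growth factor = {perturbation_growth(s):.3f}")
```

Sweeping `sigma_w` across the critical value shows the ordered/chaotic transition directly: the growth factor crosses 1 near `sigma_w = 1`, which is the kind of signature a search for "traces of criticality" in deep networks looks for.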


Related research:

- DLL: A Blazing Fast Deep Neural Network Library (04/11/2018)
- Do Deep Nets Really Need to be Deep? (12/21/2013)
- Exact Phase Transitions in Deep Learning (05/25/2022)
- Absorbing Phase Transitions in Artificial Deep Neural Networks (07/05/2023)
- Power Law in Sparsified Deep Neural Networks (05/04/2018)
- Deep Markov Random Field for Image Modeling (09/07/2016)
- A mathematical theory of semantic development in deep neural networks (10/23/2018)
