Expressive Power and Loss Surfaces of Deep Learning Models

08/08/2021
by Simant Dube, et al.

The goals of this paper are two-fold. The first goal is to serve as an expository tutorial on the workings of deep learning models, emphasizing geometrical intuition about the reasons for the success of deep learning. The second goal is to complement the current results on the expressive power of deep learning models and their loss surfaces with novel insights and results. In particular, we describe how deep neural networks carve out manifolds, especially when multiplication neurons are introduced. Multiplication is used in dot products and in the attention mechanism, and it is employed in capsule networks and self-attention-based transformers. We also describe how the random polynomial, random matrix, spin glass, and computational complexity perspectives on loss surfaces are interconnected.
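As a concrete illustration (not taken from the paper), the role of multiplication in self-attention can be sketched in NumPy: dot products between queries and keys produce pairwise scores, and a softmax-weighted sum over the values mixes the inputs multiplicatively. The function name and shapes here are illustrative, not the paper's notation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: multiplication enters twice,
    in the Q K^T dot products and in the weighted sum over V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise dot products
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # attention-weighted mixture

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because each output row is a convex combination of the rows of V, such multiplicative interactions let the network carve out curved regions that pure piecewise-linear (ReLU) layers approximate only with many pieces.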

