Notes on Deep Learning Theory

12/10/2020
by Eugene A. Golikov, et al.

These are the notes for the lectures I gave during Fall 2020 at the Moscow Institute of Physics and Technology (MIPT) and at the Yandex School of Data Analysis (YSDA). The notes cover some aspects of initialization, the loss landscape, generalization, and neural tangent kernel theory. Many other topics (e.g. expressivity, mean-field theory, the double descent phenomenon) are still missing from the current version; we plan to add them in future revisions.
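The abstract only names the topics covered; as a rough illustration of what the neural tangent kernel refers to, the minimal sketch below computes one entry of the empirical NTK of a small two-layer network in JAX. It is not taken from the notes themselves: the network, the function names, and the 1/sqrt(fan-in) scaling at initialization are illustrative assumptions.

import jax
import jax.numpy as jnp

def mlp(params, x):
    # Two-layer network f(x) = w2 . tanh(W1 x), scalar output.
    W1, w2 = params
    return jnp.tanh(W1 @ x) @ w2

def ntk_entry(params, x1, x2):
    # Empirical NTK: Theta(x1, x2) = <grad_params f(x1), grad_params f(x2)>.
    g1 = jax.grad(mlp)(params, x1)
    g2 = jax.grad(mlp)(params, x2)
    return sum(jnp.vdot(a, b) for a, b in zip(jax.tree_util.tree_leaves(g1),
                                              jax.tree_util.tree_leaves(g2)))

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
width, d = 256, 8
# Illustrative 1/sqrt(fan-in) scaling folded into the initialization.
params = (jax.random.normal(k1, (width, d)) / jnp.sqrt(d),
          jax.random.normal(k2, (width,)) / jnp.sqrt(width))
x1, x2 = jax.random.normal(k3, (2, d))
print(ntk_entry(params, x1, x2))  # one entry of the kernel matrix

In the infinite-width limit this kernel concentrates around a deterministic value and stays nearly constant during gradient-descent training, which is the regime neural tangent kernel theory studies.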

Related research:

08/29/2018 · Notes on Deep Learning for NLP
My notes on Deep Learning for NLP....

08/28/2019 · Lecture Notes: Selected topics on robust statistical learning theory
These notes gather recent results on robust statistical learning theory....

12/03/2021 · Lecture Notes on Support Preconditioning
These are lecture notes on support preconditioning, originally written ...

05/10/2021 · Lecture notes on descriptional complexity and randomness
A didactical survey of the foundations of Algorithmic Information Theory...

10/13/2022 · Notes on CSPs and Polymorphisms
These are notes from a multi-year learning seminar on the algebraic appr...

08/28/2019 · The Rise and Fall of the Note: Changing Paper Lengths in ACM CSCW, 2000-2018
In this note, I quantitatively examine various trends in the lengths of ...

07/05/2023 · Kernels, Data & Physics
Lecture notes from the course given by Professor Julia Kempe at the summ...
