Kernels, Data & Physics

07/05/2023
by Francesco Cagnetta et al.

Lecture notes from the course given by Professor Julia Kempe at the summer school "Statistical Physics of Machine Learning" in Les Houches. The notes discuss the so-called NTK approach to problems in machine learning, which consists of gaining an understanding of otherwise intractable problems by finding a tractable kernel formulation. The notes focus mainly on practical applications such as data distillation and adversarial robustness; examples of inductive bias are also discussed.
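For reference, the kernel formulation the abstract alludes to is the neural tangent kernel (NTK); the definition below is the standard one, given here only as a sketch and not as a construction specific to these notes. For a network f(x; θ) with parameters θ,

\Theta(x, x') = \nabla_\theta f(x; \theta)^\top \, \nabla_\theta f(x'; \theta),

and in the infinite-width limit, gradient-descent training reduces to kernel regression with \Theta, which is what makes otherwise intractable questions about training and generalization analytically tractable.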


Related research

Lecture Notes: Optimization for Machine Learning (09/08/2019)
Lecture notes on optimization for machine learning, derived from a cours...

Notes on computational-to-statistical gaps: predictions using statistical physics (03/29/2018)
In these notes we describe heuristics to predict computational-to-statis...

Sparse Representations, Inference and Learning (06/28/2023)
In recent years statistical physics has proven to be a valuable tool to ...

Deep Learning and Computational Physics (Lecture Notes) (01/03/2023)
These notes were compiled as lecture notes for a course developed and ta...

Notes on Deep Learning Theory (12/10/2020)
These are the notes for the lectures that I was giving during Fall 2020 ...

Information Geometry of the Probability Simplex: A Short Course (11/05/2019)
This set of notes is intended for a short course aiming to provide an (a...

Algorithms for Massive Data – Lecture Notes (01/02/2023)
These are the lecture notes for the course CM0622 - Algorithms for Massi...
