
Classifying high-dimensional Gaussian mixtures: Where kernel methods fail and neural networks succeed
A recent series of theoretical works showed that the dynamics of neural ...

Capturing the learning curves of generic features maps for realistic data sets with a teacher-student model
Teacher-student models provide a powerful framework in which the typical...

Adversarial Robustness by Design through Analog Computing and Synthetic Gradients
We propose a new defense mechanism against adversarial attacks inspired ...

Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment
The scaling hypothesis motivates the expansion of models past trillions ...

Construction of optimal spectral methods in phase retrieval
We consider the phase retrieval problem, in which the observer wishes to...

Epidemic mitigation by statistical inference from contact tracing data
Contact tracing is an essential tool in order to mitigate the impact of ...

The Gaussian equivalence of generative models for learning with two-layer neural networks
Understanding the impact of data structure on learning in neural network...

Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures
Despite being the workhorse of deep learning, the backpropagation algori...

Reservoir Computing meets Recurrent Kernels and Structured Transforms
Reservoir Computing is a class of simple yet efficient Recurrent Neural ...

Complex Dynamics in Simple Neural Networks: Understanding Gradient Flow in Phase Retrieval
Despite the widespread use of gradient-based algorithms for optimizing h...

Asymptotic Errors for Teacher-Student Convex Generalized Linear Models (or: How to Prove Kabashima's Replica Formula)
There has been a recent surge of interest in the study of asymptotic rec...

Generalization error in high-dimensional perceptrons: Approaching Bayes error with convex optimization
We consider a commonly studied supervised classification of a synthetic ...

Dynamical mean-field theory for stochastic gradient descent in Gaussian mixture classification
We analyze in a closed form the learning dynamics of stochastic gradient...

Phase retrieval in high dimensions: Statistical and computational phase transitions
We consider the phase retrieval problem of reconstructing an n-dimensiona...

Light-in-the-loop: using a photonics co-processor for scalable training of neural networks
As neural networks grow larger and more complex and data-hungry, trainin...

TRAMP: Compositional Inference with TRee Approximate Message Passing
We introduce tramp, standing for TRee Approximate Message Passing, a pyt...

Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime
Deep neural networks can achieve remarkable generalization performances ...

The role of regularization in classification of high-dimensional noisy Gaussian mixture
We consider a high-dimensional mixture of two Gaussians in the noisy reg...

Generalisation error in learning with random features and the hidden manifold model
We study generalised linear regression and classification for a syntheti...

Asymptotic errors for convex penalized linear regression beyond Gaussian matrices
We consider the problem of learning a coefficient vector x_0 in R^N from...

Rademacher complexity and spin glasses: A link between the replica and statistical theories of learning
Statistical learning theory provides bounds of the generalization gap, u...

Exact asymptotics for phase retrieval and compressed sensing with random generative priors
We consider the problem of compressed sensing and of (real-valued) phase...

Kernel computations from large-scale random features obtained by Optical Processing Units
Approximating kernel functions with random features (RFs) has been a succ...

Blind calibration for compressed sensing: State evolution and an online algorithm
Compressed sensing allows one to acquire compressible signals with a small...

Modelling the influence of data structure on learning in neural networks
The lack of crisp mathematical models that capture the structure of real...

Who is Afraid of Big Bad Minima? Analysis of Gradient Flow in a Spiked Matrix-Tensor Model
Gradient-based algorithms are effective for many machine learning tasks,...

High-temperature Expansions and Message Passing Algorithms
Improved mean-field techniques are a central theme of statistical physics ...

Dynamics of stochastic gradient descent for two-layer neural networks in the teacher-student setup
Deep neural networks achieve stellar generalisation even when they have ...

Principled Training of Neural Networks with Direct Feedback Alignment
The backpropagation algorithm has long been the canonical training metho...

On the Universality of Noiseless Linear Estimation with Respect to the Measurement Matrix
In a noiseless linear estimation problem, one aims to reconstruct a vect...

The spiked matrix model with generative priors
Using a low-dimensional parametrization of signals is a generic and powe...

Passed & Spurious: analysing descent algorithms and local minima in spiked matrix-tensor model
In this work we analyse quantitatively the interplay between the loss la...

Generalisation dynamics of online learning in overparameterised neural networks
Deep neural networks achieve stellar generalisation on a variety of prob...

Marvels and Pitfalls of the Langevin Algorithm in Noisy High-dimensional Inference
Gradient-descent-based algorithms and their stochastic versions have wid...

Rank-one matrix estimation: analysis of algorithmic and information theoretic limits by the spatial coupling method
Factorizing low-rank matrices is a problem with many applications in mac...

Approximate message-passing for convex optimization with non-separable penalties
We introduce an iterative optimization scheme for convex objectives cons...

Approximate Survey Propagation for Statistical Inference
The approximate message passing algorithm has enjoyed considerable attention in ...

Fundamental limits of detection in the spiked Wigner model
We study the fundamental limits of detecting the presence of an additive...

The committee machine: Computational to statistical gaps in learning a two-layers neural network
Heuristic tools from statistical physics have been used in the past to l...

Entropy and mutual information in models of deep neural networks
We examine a class of deep learning models with a tractable method to co...

The Mutual Information in Random Linear Estimation Beyond i.i.d. Matrices
There has been definite progress recently in proving the variational sin...

Estimation in the Spiked Wigner Model: A Short Proof of the Replica Formula
We consider the problem of estimating the rank-one perturbation of a Wig...

Phase Transitions, Optimal Errors and Optimality of Message-Passing in Generalized Linear Models
We consider generalized linear models (GLMs) where an unknown n-dimensio...

Streaming Bayesian inference: theoretical limits and mini-batch approximate message-passing
In statistical learning for real-world large-scale data problems, one mu...

A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines
Restricted Boltzmann machines (RBMs) are energy-based neural networks wh...

Multi-Layer Generalized Linear Estimation
We consider the problem of reconstructing a signal from multi-layered (p...

Phase transitions and optimal algorithms in high-dimensional Gaussian mixture clustering
We consider the problem of Gaussian mixture clustering in the high-dimen...

Inferring Sparsity: Compressed Sensing using Generalized Restricted Boltzmann Machines
In this work, we consider compressed sensing reconstruction from M measu...

Fast Randomized Semi-Supervised Clustering
We consider the problem of clustering partially labeled data from a mini...

Statistical physics of inference: Thresholds and algorithms
Many questions of fundamental interest in today's science can be formulat...
Florent Krzakala
Florent Krzakala is a professor at Sorbonne Université and a researcher at École Normale Supérieure in Paris. His research interests include statistical physics, machine learning, statistics, signal processing, computer science, and computational optics.
He leads the SPHINX team at École Normale in Paris, holds the CFM-ENS Data Science chair and a PRAIRIE Institute chair, and is a fellow and member of ELLIS. He is also the founder and scientific advisor of the startup LightOn.