Calibration measures and reliability diagrams are two fundamental tools ...
Optimizing proper loss functions is popularly believed to yield predicto...
Multicalibration is a notion of fairness that aims to provide accurate p...
We study the fundamental question of how to define and measure the dista...
Recent advances in learning aligned multimodal representations have been...
Calibration is a fundamental property of a good predictive model: it req...
The practical success of overparameterized neural networks has motivated...
The “Neural Tangent Kernel” (NTK) (Jacot et al., 2018), and its empirical ...
We investigate and leverage a connection between Differential Privacy (D...
Large neural networks trained in the overparameterized regime are able t...
In machine learning, we traditionally evaluate the performance of a sing...
The recent work of Papyan, Han, & Donoho (2020) presented an intriguin...
For a given distribution, learning algorithm, and performance metric, th...
We revisit and extend model stitching (Lenc & Vedaldi 2015) as a metho...
We propose a new framework for reasoning about generalization in deep le...
We introduce a new notion of generalization – Distributional Generalizat...
Learning rate schedule can significantly affect generalization performan...
Recent empirical and theoretical studies have shown that many learning a...
In this expository note we describe a surprising phenomenon in overparam...
We show that a variety of modern deep learning tasks exhibit a "double-d...
We perform an experimental study of the dynamics of Stochastic Gradient ...
Current techniques in machine learning are so far unable to learn cl...
Using a mild variant of polar codes we design linear compression schemes...
Adaptive data analysis has posed a challenge to science due to its abili...
The ℓ_2 tracking problem is the task of obtaining a streaming algorithm ...
Arıkan's exciting discovery of polar codes has provided an altogether n...
Social networks and interactions in social media involve both positive a...