
Universal Prediction Band via Semi-Definite Programming
We propose a computationally efficient method to construct nonparametric...

Interpolating Classifiers Make Few Mistakes
This paper provides elementary analyses of the regret and generalization...

Deep Learning for Individual Heterogeneity
We propose a methodology for effectively modeling individual heterogenei...

Mehler's Formula, Branching Process, and Compositional Kernels of Deep Neural Networks
In this paper, we utilize a connection between compositional kernels and...

A Precise High-Dimensional Asymptotic Theory for Boosting and Min-L1-Norm Interpolated Classifiers
This paper establishes a precise high-dimensional asymptotic theory for ...

Estimating Certain Integral Probability Metric (IPM) is as Hard as Estimating under the IPM
We study the minimax optimal rates for estimating a range of Integral Pr...

On the Minimax Optimality of Estimating the Wasserstein Metric
We study the minimax optimal rate for estimating the Wasserstein-1 metri...

On the Risk of Minimum-Norm Interpolants and Restricted Lower Isometry of Kernels
We study the risk of minimum-norm interpolants of data in a Reproducing ...

Training Neural Networks as Learning Data-adaptive Kernels: Provable Representation and Approximation Benefits
Consider the problem: given data pair (x, y) drawn from a population wit...

On How Well Generative Adversarial Networks Learn Densities: Nonparametric and Parametric Results
We study in this paper the rate of convergence for learning distribution...

Deep Neural Networks for Estimation and Inference: Application to Causal Effects and Other Semiparametric Estimands
We study deep neural networks and their use in semiparametric inference....

Just Interpolate: Kernel "Ridgeless" Regression Can Generalize
In the absence of explicit regularization, Kernel "Ridgeless" Regression...

Local Optimality and Generalization Guarantees for the Langevin Algorithm via Empirical Metastability
We study the detailed pathwise behavior of the discrete-time Langevin a...

Interaction Matters: A Note on Non-asymptotic Local Convergence of Generative Adversarial Networks
Motivated by the pursuit of a systematic computational and algorithmic u...

How Well Can Generative Adversarial Networks Learn Densities: A Nonparametric View
We study in this paper the rate of convergence for learning densities un...

Statistical Inference for the Population Landscape via Moment Adjusted Stochastic Gradients
Modern statistical inference tasks often require iterative optimization ...

Fisher-Rao Metric, Geometry, and Complexity of Neural Networks
We study the relationship between geometry and capacity measures for dee...

Weighted Message Passing and Minimum Energy Flow for Heterogeneous Stochastic Block Models with Side Information
We study the misclassification error for community detection in general ...

Inference via Message Passing on Partially Labeled Stochastic Block Models
We study the community detection and recovery problem in partially-label...

Learning with Square Loss: Localization through Offset Rademacher Complexity
We consider regression with square loss and general classes of functions...

Computational and Statistical Boundaries for Submatrix Localization in a Large Noisy Matrix
The interplay between computational efficiency and statistical accuracy ...

Geometric Inference for General High-Dimensional Linear Inverse Problems
This paper presents a unified geometric framework for the statistical an...

On Zeroth-Order Stochastic Convex Optimization via Random Walks
We propose a method for zeroth-order stochastic convex optimization that...
Tengyuan Liang
Assistant Professor at Chicago Booth and the George C. Tiao Faculty Fellow in Data Science Research.