
Dataset Distillation with Infinitely Wide Convolutional Networks
The effectiveness of machine learning algorithms arises from being able ...

OCT-GAN: Neural ODE-based Conditional Tabular GANs
Synthesizing tabular data is attracting much attention these days for va...

Explaining Neural Scaling Laws
The test loss of well-trained neural networks often follows precise powe...

Towards NNGP-guided Neural Architecture Search
The predictions of wide Bayesian neural networks are described by a Gaus...

Dataset Meta-Learning from Kernel Ridge-Regression
One of the most fundamental aspects of any machine learning algorithm is...

Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit
Modern deep learning models have achieved great success in predictive ac...

Finite Versus Infinite Neural Networks: an Empirical Study
We perform a careful, thorough, and large scale empirical study of the c...

Data-Efficient Deep Learning Method for Image Classification Using Data Augmentation, Focal Cosine Loss, and Ensemble
In general, sufficient data is essential for the better performance and ...

NTIRE 2020 Challenge on Video Quality Mapping: Methods and Results
This paper reviews the NTIRE 2020 challenge on video quality mapping (VQ...

On the infinite width limit of neural networks with a standard parameterization
There are currently two parameterizations used to derive fixed kernels c...

Neural Tangents: Fast and Easy Infinite Neural Networks in Python
Neural Tangents is a library designed to enable research into infinite-w...

On Empirical Comparisons of Optimizers for Deep Learning
Selecting an optimizer is a central step in the contemporary deep learni...

Physics Enhanced Artificial Intelligence
We propose that intelligently combining models from the domains of Artif...

Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
A long-standing goal in deep learning research has been to precisely char...

Measuring the Effects of Data Parallelism on Neural Network Training
Recent hardware developments have made unprecedented amounts of data par...

Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes
There is a previously identified equivalence between wide fully-connecte...

Deep Neural Networks as Gaussian Processes
A deep fully-connected neural network with an i.i.d. prior over its para...
Jaehoon Lee