Deep neural network classifiers partition input space into high confiden...
Deep equilibrium networks (DEQs) are a promising way to construct models...
The Neural Tangent Kernel (NTK), defined as Θ_θ^f(x_1, x_2) = [∂f(θ, x_1)/∂θ] [∂f(θ, x_2)/∂θ]^T, ...
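The finite-width NTK above can be formed directly with automatic differentiation. A minimal JAX sketch, assuming a toy one-hidden-layer network with scalar output (the network f and all shapes are illustrative, not taken from the paper):

    import jax
    import jax.numpy as jnp

    # Finite-width NTK via Jacobian contraction:
    #   Θ(x1, x2) = [∂f(θ, x1)/∂θ] [∂f(θ, x2)/∂θ]^T.
    # The network f and all shapes below are illustrative assumptions.

    def f(theta, x):
        # Toy one-hidden-layer network with a scalar output.
        w1, w2 = theta
        return jnp.tanh(x @ w1) @ w2

    def ntk(theta, x1, x2):
        j1 = jax.jacobian(f)(theta, x1)  # pytree of ∂f(θ, x1)/∂θ
        j2 = jax.jacobian(f)(theta, x2)  # pytree of ∂f(θ, x2)/∂θ
        # Inner product over every parameter, leaf by leaf.
        pairs = zip(jax.tree_util.tree_leaves(j1), jax.tree_util.tree_leaves(j2))
        return sum(jnp.vdot(a, b) for a, b in pairs)

    k1, k2 = jax.random.split(jax.random.PRNGKey(0))
    theta = (jax.random.normal(k1, (8, 16)), jax.random.normal(k2, (16,)))
    print(ntk(theta, jnp.ones(8), -jnp.ones(8)))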
Differentiable programming techniques are widely used in the community a...
We show that learning can be improved by using loss functions that evolv...
Machine learning is predicated on the concept of generalization: a model...
We perform a careful, thorough, and large-scale empirical study of the c...
There are currently two parameterizations used to derive fixed kernels c...
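Assuming the truncated sentence refers to the NTK and standard parameterizations of a dense layer, a minimal sketch of the two forward passes (constants and names are illustrative):

    import jax
    import jax.numpy as jnp

    # Two dense-layer parameterizations. Both match in distribution at
    # initialization, but their gradients with respect to W scale
    # differently with the input width n_in.

    def standard_dense(key, x, n_out, sigma_w=1.0):
        # Standard parameterization: width enters through the weight variance.
        n_in = x.shape[-1]
        W = sigma_w / jnp.sqrt(n_in) * jax.random.normal(key, (n_out, n_in))
        return W @ x

    def ntk_dense(key, x, n_out, sigma_w=1.0):
        # NTK parameterization: W ~ N(0, 1); width enters in the forward pass.
        n_in = x.shape[-1]
        W = jax.random.normal(key, (n_out, n_in))
        return sigma_w / jnp.sqrt(n_in) * (W @ x)

    key = jax.random.PRNGKey(0)
    x = jnp.ones(512)
    print(standard_dense(key, x, 4), ntk_dense(key, x, 4))  # identical outputs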
A fundamental goal in deep learning is the characterization of trainabil...
A large fraction of computational science involves simulating the dynami...
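A minimal differentiable-simulation sketch in JAX, pairing a pairwise Lennard-Jones energy with forces obtained by autodiff; the potential, unit masses, lattice initialization, and step size are illustrative assumptions:

    import jax
    import jax.numpy as jnp

    def energy(R):
        # Total pairwise Lennard-Jones energy (no cutoff, no periodic box).
        n = R.shape[0]
        dr = R[:, None, :] - R[None, :, :]
        r2 = jnp.sum(dr**2, axis=-1) + jnp.eye(n)  # pad diagonal to avoid 0/0
        inv6 = r2**-3
        pair = 4.0 * (inv6**2 - inv6) * (1.0 - jnp.eye(n))
        return 0.5 * jnp.sum(pair)  # each pair counted twice above

    forces = jax.grad(lambda R: -energy(R))  # F = -∇E via autodiff

    @jax.jit
    def verlet_step(R, V, dt=1e-3):
        # Velocity Verlet with unit mass.
        V = V + 0.5 * dt * forces(R)
        R = R + dt * V
        V = V + 0.5 * dt * forces(R)
        return R, V

    grid = jnp.arange(3) * 1.5
    R = jnp.stack(jnp.meshgrid(grid, grid, grid), axis=-1).reshape(-1, 3)
    V = jnp.zeros_like(R)
    for _ in range(100):
        R, V = verlet_step(R, V)
    print(energy(R))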
Neural Tangents is a library designed to enable research into infinite-w...
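A short usage sketch of the library's stax API for closed-form infinite-width kernels (the architecture and input shapes are placeholder choices):

    from jax import random
    from neural_tangents import stax

    # Closed-form infinite-width kernels for a 3-layer ReLU network.
    init_fn, apply_fn, kernel_fn = stax.serial(
        stax.Dense(512), stax.Relu(),
        stax.Dense(512), stax.Relu(),
        stax.Dense(1),
    )

    key1, key2 = random.split(random.PRNGKey(1))
    x1 = random.normal(key1, (4, 8))  # 4 training points in 8 dimensions
    x2 = random.normal(key2, (3, 8))  # 3 test points

    nngp = kernel_fn(x1, x2, 'nngp')  # Bayesian (NNGP) kernel, shape (4, 3)
    ntk = kernel_fn(x1, x2, 'ntk')    # kernel of gradient-descent training
    print(nngp.shape, ntk.shape)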
We develop a mean field theory for batch normalization in fully-connecte...
A longstanding goal in deep learning research has been to precisely char...
Training recurrent neural networks (RNNs) on long sequence tasks is plag...
As in many other scientific domains, we face a fundamental problem when ...
Recurrent neural networks have gained widespread use in modeling sequenc...
In recent years, state-of-the-art methods in computer vision have utiliz...
Recent work has shown that tight concentration of the entire spectrum of...
State-of-the-art computer vision models have been shown to be vulnerable...
We study randomly initialized residual networks using mean field theory ...
It is well known that the initialization of weights in deep neural netwo...
It is becoming increasingly clear that many machine learning classifiers...
A deep fully-connected neural network with an i.i.d. prior over its para...
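The resulting Gaussian process kernel can be computed layer by layer. A minimal sketch of the deep NNGP recursion, assuming ReLU units and illustrative weight/bias variances:

    import jax.numpy as jnp

    # Deep NNGP kernel recursion for ReLU units,
    #   K^{l+1}(x, x') = sigma_b^2 + sigma_w^2 * E[relu(u) relu(u')],
    # with (u, u') ~ N(0, K^l) and the expectation in closed (arc-cosine) form.
    # Depth and the variances sigma_w, sigma_b are illustrative assumptions.

    def relu_ev(k12, k11, k22):
        # E[relu(u) relu(u')] for a centered Gaussian with covariance
        # [[k11, k12], [k12, k22]].
        norm = jnp.sqrt(k11 * k22)
        theta = jnp.arccos(jnp.clip(k12 / norm, -1.0, 1.0))
        return norm * (jnp.sin(theta) + (jnp.pi - theta) * jnp.cos(theta)) / (2 * jnp.pi)

    def nngp(x1, x2, depth=3, sigma_w=1.5, sigma_b=0.1):
        d = x1.shape[0]
        k12 = sigma_b**2 + sigma_w**2 * jnp.dot(x1, x2) / d
        k11 = sigma_b**2 + sigma_w**2 * jnp.dot(x1, x1) / d
        k22 = sigma_b**2 + sigma_w**2 * jnp.dot(x2, x2) / d
        for _ in range(depth):
            k12, k11, k22 = (
                sigma_b**2 + sigma_w**2 * relu_ev(k12, k11, k22),
                sigma_b**2 + sigma_w**2 * relu_ev(k11, k11, k11),  # = sb^2 + sw^2 * k11 / 2
                sigma_b**2 + sigma_w**2 * relu_ev(k22, k22, k22),
            )
        return k12

    print(nngp(jnp.ones(8), -jnp.ones(8)))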
A number of recent papers have provided evidence that practical design q...
Our understanding of supercooled liquids and glasses has lagged signific...
We study the behavior of untrained neural networks whose weights and bia...