Many areas of machine learning and science involve large linear algebra ...
While Gaussian processes are a mainstay for various engineering and scie...
Classical results establish that ensembles of small models benefit when ...
Gaussian processes scale prohibitively with the size of the dataset. In ...
Ensembling neural networks is an effective way to increase accuracy, and...
Variational approximations to Gaussian processes (GPs) typically use a s...
Gaussian processes remain popular as a flexible and expressive model cla...
Large width limits have been a recent focus of deep learning research: m...
Normalizing flows are invertible neural networks with tractable change-o...
We introduce a simple and scalable method for training Gaussian process ...
We examine the general problem of inter-domain Gaussian Processes (GPs):...
Scalable Gaussian Process methods are computationally attractive, yet in...
Modern deep learning is primarily an experimental science, in which empi...
Matrix square roots and their inverses arise frequently in machine learn...
We introduce Deep Sigma Point Processes, a class of parametric models in...
Not all data in a typical training set help with generalization; some sa...
Recent work has shown that convolutional networks can be substantially d...
The combination of inducing point methods with stochastic variational in...
Detecting objects such as cars and pedestrians in 3D plays an indispensa...
Gaussian processes (GPs) are flexible models with state-of-the-art perfo...
Despite advances in scalable models, the inference tools used for Gaussi...
One of the most compelling features of Gaussian process (GP) regression ...
Recent work shows that inference for Gaussian processes can be performed...
The machine learning community has become increasingly concerned with th...
The DenseNet architecture is highly computationally efficient as a resul...
We propose Deep Feature Interpolation (DFI), a new data-driven baseline ...