Monotonicity and Double Descent in Uncertainty Estimation with Gaussian Processes

10/14/2022
by Liam Hodgkinson, et al.

The quality of many modern machine learning models improves as model complexity increases, an effect that has been quantified, for predictive performance, with the non-monotonic double descent learning curve. Here, we address the overarching question: is there an analogous theory of double descent for models which estimate uncertainty? We provide a partially affirmative and partially negative answer in the setting of Gaussian processes (GPs). Under standard assumptions, we prove that higher model quality for optimally-tuned GPs (including uncertainty prediction) under marginal likelihood is realized for larger input dimensions, and therefore exhibits a monotone error curve. After showing that marginal likelihood does not naturally exhibit double descent in the input dimension, we highlight related forms of posterior predictive loss that do exhibit non-monotonicity. Finally, we verify empirically that our results hold for real data, beyond our considered assumptions, and we explore consequences involving synthetic covariates.
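To make the two notions of model quality in the abstract concrete, the sketch below fits Gaussian process regressors whose hyperparameters are tuned by maximizing the log marginal likelihood, and then evaluates a posterior predictive loss (average negative log predictive density) on held-out data for a range of input dimensions. This is a minimal illustration using scikit-learn, not the authors' experimental setup; the synthetic data, the RBF-plus-noise kernel, and the dimension grid are all assumptions made for the example.

# Minimal sketch (assumptions, not the paper's experiment): compare the fitted
# log marginal likelihood with a posterior predictive loss (test NLPD) for GPs
# whose hyperparameters are tuned by marginal likelihood, across input dimensions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n_train, n_test = 100, 100

for d in (2, 8, 32, 128):  # dimension grid chosen for illustration only
    # Synthetic regression task: linear signal plus observation noise in d dimensions.
    w = rng.normal(size=d) / np.sqrt(d)
    X_tr = rng.normal(size=(n_train, d))
    X_te = rng.normal(size=(n_test, d))
    y_tr = X_tr @ w + 0.1 * rng.normal(size=n_train)
    y_te = X_te @ w + 0.1 * rng.normal(size=n_test)

    # fit() maximizes the log marginal likelihood over the kernel hyperparameters.
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_tr, y_tr)

    # Posterior predictive loss: average negative log predictive density on test data.
    mu, sigma = gp.predict(X_te, return_std=True)
    nlpd = np.mean(0.5 * np.log(2 * np.pi * sigma**2)
                   + (y_te - mu) ** 2 / (2 * sigma**2))

    print(f"d={d:4d}  log marginal likelihood={gp.log_marginal_likelihood_value_:9.2f}  "
          f"test NLPD={nlpd:6.3f}")

The printed log marginal likelihood corresponds to the training criterion discussed in the abstract, while the held-out negative log predictive density is one example of the posterior predictive losses whose behavior in the input dimension the paper contrasts with it.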


