Invariant Priors for Bayesian Quadrature

12/02/2021
by Masha Naslidnyk, et al.

Bayesian quadrature (BQ) is a model-based numerical integration method that can increase sample efficiency by encoding and leveraging known structure of the integration task at hand. In this paper, we explore priors that encode invariance of the integrand under a set of bijective transformations of the input domain, in particular certain unitary transformations such as rotations, axis-flips, or point symmetries. We present initial results showing superior performance compared to standard Bayesian quadrature on several synthetic tasks and one real-world application.
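A common way to encode such an invariance in a Gaussian process prior, sketched below under assumptions not taken from the paper, is to symmetrise a base kernel over a finite group G of input transformations, k_G(x, y) = (1/|G|²) Σ_{g,h∈G} k(gx, hy). The resulting BQ estimate is the usual weighted sum z·K⁻¹y, where z holds the kernel mean embeddings (here approximated by Monte Carlo). The base kernel, lengthscale, integrand, and group below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def rbf(X, Y, ls=0.5):
    # Squared-exponential base kernel on (n, d) and (m, d) inputs.
    d = X[:, None, :] - Y[None, :, :]
    return np.exp(-0.5 * np.sum(d**2, axis=-1) / ls**2)

def invariant_kernel(X, Y, group, base=rbf):
    # Symmetrise the base kernel over a finite group of linear maps:
    # k_G(x, y) = (1/|G|^2) sum_{g, h in G} k(g x, h y).
    K = np.zeros((len(X), len(Y)))
    for g in group:
        for h in group:
            K += base(X @ g.T, Y @ h.T)
    return K / len(group)**2

# Group encoding point symmetry about the origin in 1-D: {I, -I}.
group = [np.eye(1), -np.eye(1)]

rng = np.random.default_rng(0)
f = lambda x: np.cos(3 * x[:, 0])        # even (point-symmetric) integrand
X = rng.uniform(-1, 1, size=(8, 1))      # a handful of evaluations
y = f(X)

# GP posterior weights, with a small jitter for numerical stability.
K = invariant_kernel(X, X, group) + 1e-6 * np.eye(len(X))
w = np.linalg.solve(K, y)

# Kernel mean embeddings z_i = E_{x ~ U[-1, 1]}[k_G(x, x_i)],
# approximated here by plain Monte Carlo.
U = rng.uniform(-1, 1, size=(20000, 1))
z = invariant_kernel(U, X, group).mean(axis=0)

bq_estimate = z @ w   # BQ estimate of E_{x ~ U[-1, 1]}[f(x)] = sin(3)/3
```

Symmetrising the kernel makes every sample count twice for an even integrand: observing f at x also pins down the posterior at -x, which is where the sample-efficiency gain over standard BQ comes from.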
