Bayesian Experimental Design for Symbolic Discovery

11/29/2022
by Kenneth L. Clarkson et al.

This study concerns the formulation and application of Bayesian optimal experimental design to symbolic discovery: the inference, from observational data, of predictive models that take general functional forms. We apply constrained first-order methods to optimize an appropriate selection criterion, using Hamiltonian Monte Carlo to sample from the prior. The step that computes the predictive distribution, which involves a convolution, is carried out either by numerical integration or by fast transform methods.
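To make the pipeline in the abstract concrete, the following is a minimal sketch of how its pieces could fit together, under illustrative assumptions that go beyond the abstract: a toy symbolic model family y = a*sin(b*x), a standard-normal prior, Gaussian observation noise, predictive entropy as the selection criterion, and projected gradient ascent with a finite-difference gradient as the constrained first-order method. All function and variable names are hypothetical; this is not the authors' implementation.

```python
# Sketch: HMC prior sampling -> predictive density via FFT convolution ->
# entropy-based selection criterion -> projected (constrained) gradient ascent
# over the design point. Assumptions are noted in the comments.
import numpy as np

rng = np.random.default_rng(0)

# Toy symbolic model family: y = a * sin(b * x), parameters theta = (a, b).
def model(theta, x):
    a, b = theta
    return a * np.sin(b * x)

# Standard-normal prior on the parameters (an assumption of this sketch).
def log_prior(theta):
    return -0.5 * np.sum(theta ** 2)

def grad_log_prior(theta):
    return -theta

# Hamiltonian Monte Carlo on the prior, as described in the abstract. For this
# toy Gaussian prior, direct sampling would suffice; HMC is shown for illustration.
def hmc_sample(n_samples, dim=2, step=0.1, n_leapfrog=20):
    samples = np.empty((n_samples, dim))
    theta = np.zeros(dim)
    for i in range(n_samples):
        p = rng.standard_normal(dim)                      # resample momentum
        theta_new, p_new = theta.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step * grad_log_prior(theta_new)
        for _ in range(n_leapfrog - 1):
            theta_new += step * p_new
            p_new += step * grad_log_prior(theta_new)
        theta_new += step * p_new
        p_new += 0.5 * step * grad_log_prior(theta_new)
        # Metropolis accept/reject on the Hamiltonian
        h_old = -log_prior(theta) + 0.5 * p @ p
        h_new = -log_prior(theta_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            theta = theta_new
        samples[i] = theta
    return samples

# Predictive density at design x: the prior-sample histogram of noiseless model
# outputs convolved with a Gaussian noise kernel. The convolution uses the FFT
# ("fast transform") route; a direct quadrature would be the alternative.
def predictive_density(x, thetas, grid, noise_sd=0.1):
    hist, edges = np.histogram(model(thetas.T, x), bins=grid, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    dz = centers[1] - centers[0]
    kernel = np.exp(-0.5 * ((centers - centers.mean()) / noise_sd) ** 2)
    kernel /= kernel.sum() * dz                           # normalize the noise density
    dens = np.fft.irfft(np.fft.rfft(hist) * np.fft.rfft(np.fft.ifftshift(kernel)),
                        n=hist.size) * dz
    return centers, np.clip(dens, 1e-12, None)

# Selection criterion: entropy of the predictive distribution at x. With fixed
# homoscedastic Gaussian noise this is equivalent, up to a constant, to the
# expected information gain of observing at x.
def criterion(x, thetas, grid):
    centers, dens = predictive_density(x, thetas, grid)
    dz = centers[1] - centers[0]
    return -np.sum(dens * np.log(dens)) * dz

# Constrained first-order optimization of the design: projected gradient ascent,
# with a central finite difference standing in for the gradient.
def optimize_design(thetas, bounds=(0.0, 3.0), x0=1.0, lr=0.05, iters=50, eps=1e-3):
    grid = np.linspace(-4, 4, 257)
    x = x0
    for _ in range(iters):
        g = (criterion(x + eps, thetas, grid) - criterion(x - eps, thetas, grid)) / (2 * eps)
        x = np.clip(x + lr * g, *bounds)                  # project onto the feasible interval
    return x

thetas = hmc_sample(500)
print("selected design point:", optimize_design(thetas))
```

The FFT product corresponds to the "fast transform" option mentioned in the abstract; replacing it with a direct quadrature over the histogram and noise kernel would give the numerical-integration alternative.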


