
Variational Estimators for Bayesian Optimal Experimental Design

03/13/2019
by   Adam Foster, et al.

Bayesian optimal experimental design (BOED) is a principled framework for making efficient use of limited experimental resources. Unfortunately, its applicability is hampered by the difficulty of obtaining accurate estimates of the expected information gain (EIG) of an experiment. To address this, we introduce several classes of fast EIG estimators suited to the experiment design context by building on ideas from variational inference and mutual information estimation. We show theoretically and empirically that these estimators can provide significant gains in speed and accuracy over previous approaches. We demonstrate the practicality of our approach via a number of experiments, including an adaptive experiment with human participants.
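For context, the EIG that these estimators target is the mutual information between the model parameters θ and the experimental outcome y under a design d. In the standard BOED setup (this is the usual textbook formulation, not a formula quoted from this page), it can be written as:

```latex
\mathrm{EIG}(d)
  = \mathbb{E}_{p(y \mid d)}\!\bigl[ H[p(\theta)] - H[p(\theta \mid y, d)] \bigr]
  = \mathbb{E}_{p(\theta)\,p(y \mid \theta, d)}\!\bigl[ \log p(y \mid \theta, d) - \log p(y \mid d) \bigr],
\quad
p(y \mid d) = \int p(y \mid \theta, d)\, p(\theta)\, d\theta .
```

The marginal likelihood p(y | d) nested inside the outer expectation is what makes naive estimation expensive, and is the difficulty the variational estimators are designed to sidestep.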

Related Research

03/13/2019 · Variational Bayesian Optimal Experimental Design
05/20/2022 · Robust Expected Information Gain for Optimal Bayesian Experimental Design Using Ambiguity Sets
10/07/2022 · Design Amortization for Bayesian Optimal Experimental Design
03/03/2021 · Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design
11/03/2021 · Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods
10/14/2019 · Understanding the Limitations of Variational Mutual Information Estimators
03/04/2020 · Neural-Network Heuristics for Adaptive Bayesian Quantum Estimation

Code Repositories

boed-pytorch

A simple project that explores the variational estimators of Foster et al. (https://arxiv.org/abs/1903.05480) in a Bayesian linear regression setting. Nested Monte Carlo estimators are used to compute the exact information gain for the regression.
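As a rough illustration of the nested Monte Carlo (NMC) approach the repo description mentions, here is a minimal sketch for a toy one-dimensional Bayesian linear regression. All modeling choices below (theta ~ N(0, 1), y | theta, d ~ N(d·theta, sigma²), and the function names) are illustrative assumptions for this sketch, not taken from the repo:

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
SIGMA = 1.0  # observation noise (assumed for this toy example)

def log_lik(y, theta, d):
    # log N(y; d * theta, SIGMA^2)
    return -0.5 * ((y - d * theta) / SIGMA) ** 2 - np.log(SIGMA * np.sqrt(2 * np.pi))

def nested_mc_eig(d, n_outer=2000, n_inner=2000):
    """Nested Monte Carlo estimate of
    EIG(d) = E_{p(theta) p(y|theta,d)}[log p(y|theta,d) - log p(y|d)]."""
    theta_out = rng.standard_normal(n_outer)              # theta ~ N(0, 1)
    y = d * theta_out + SIGMA * rng.standard_normal(n_outer)
    outer = log_lik(y, theta_out, d)                      # log p(y | theta, d)
    theta_in = rng.standard_normal(n_inner)               # fresh inner samples
    inner = log_lik(y[:, None], theta_in[None, :], d)     # shape (n_outer, n_inner)
    log_marg = logsumexp(inner, axis=1) - np.log(n_inner)  # log p(y | d)
    return float(np.mean(outer - log_marg))
```

For this conjugate Gaussian model the EIG is available in closed form, EIG(d) = 0.5 · log(1 + d²/σ²), which is what makes such a regression a convenient test bed for checking estimators against ground truth.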
