Scalable Bayesian Optimization with Sparse Gaussian Process Models

10/26/2020
by Ang Yang, et al.

This thesis focuses on Bayesian optimization, with improvements along two axes: (i) the use of derivative information to accelerate convergence of the optimization; and (ii) scalable Gaussian process (GP) models for handling massive data.
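As a rough illustration of the second ingredient, the sketch below shows one common way to make a GP scalable: a subset-of-regressors approximation with m inducing points, which costs O(nm^2) instead of the O(n^3) of an exact GP, driving a simple upper-confidence-bound acquisition step. This is a minimal assumed implementation for intuition only, not the method from the thesis; all function names, kernel choices, and hyperparameters here are illustrative.

```python
import numpy as np

def rbf(A, B, lengthscale=0.3):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_posterior(X, y, Z, Xs, noise=1e-2):
    """Subset-of-regressors sparse GP posterior at test points Xs.

    With n data points and m inducing points Z, the dominant cost is the
    m x m solve, giving O(n m^2) instead of the exact GP's O(n^3).
    """
    m = len(Z)
    Kzz = rbf(Z, Z)
    Kzx = rbf(Z, X)
    Ksz = rbf(Xs, Z)
    # Sigma = (Kzz + sigma^-2 Kzx Kxz)^-1, with a small jitter for stability.
    Sigma = np.linalg.inv(Kzz + Kzx @ Kzx.T / noise + 1e-8 * np.eye(m))
    mean = Ksz @ Sigma @ (Kzx @ y) / noise
    var = np.einsum('ij,jk,ik->i', Ksz, Sigma, Ksz)  # diag of Ksz Sigma Kzs
    return mean, np.maximum(var, 0.0)

def ucb_next(X, y, Z, candidates, beta=2.0, noise=1e-2):
    """Propose the next query point by upper confidence bound on a grid."""
    mean, var = sparse_gp_posterior(X, y, Z, candidates, noise)
    return candidates[np.argmax(mean + beta * np.sqrt(var))]

# Fit a toy 1-D function with 10 inducing points and check the fit quality.
X = np.linspace(0.0, 1.0, 50)[:, None]
y = np.sin(3.0 * X[:, 0])
Z = np.linspace(0.0, 1.0, 10)[:, None]
mean, var = sparse_gp_posterior(X, y, Z, X)
fit_err = np.max(np.abs(mean - y))

# One Bayesian-optimization step: propose a point from a candidate grid.
candidates = np.linspace(0.0, 1.0, 101)[:, None]
next_x = ucb_next(X, y, Z, candidates)
```

In a full optimization loop, the proposed `next_x` would be evaluated, appended to `(X, y)`, and the procedure repeated; derivative observations, the thesis's first ingredient, would enter through an extended kernel rather than this plain RBF.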


Related research

06/05/2015
Local Nonstationarity for Efficient Bayesian Optimization
Bayesian optimization has been shown to be a fundamental global optimization ...

03/28/2019
Using Gaussian process regression for efficient parameter reconstruction
Optical scatterometry is a method to measure the size and shape of perio...

07/31/2021
BoA-PTA, A Bayesian Optimization Accelerated Error-Free SPICE Solver
One of the greatest challenges in IC design is the repeated executions o...

06/29/2021
Attentive Neural Processes and Batch Bayesian Optimization for Scalable Calibration of Physics-Informed Digital Twins
Physics-informed dynamical system models form critical components of dig...

05/24/2023
The Behavior and Convergence of Local Bayesian Optimization
A recent development in Bayesian optimization is the use of local optimi...

01/08/2021
Bayesian optimization with improved scalability and derivative information for efficient design of nanophotonic structures
We propose the combination of forward shape derivatives and the use of a...

12/17/2018
Bayesian Optimization in AlphaGo
During the development of AlphaGo, its many hyper-parameters were tuned ...
