
Lower Bounds on the Worst-Case Complexity of Efficient Global Optimization

by Wenjie Xu, et al.

Efficient global optimization is a widely used method for optimizing expensive black-box functions, such as in hyperparameter tuning and new material design. Despite its popularity, less attention has been paid to analyzing the inherent hardness of the problem, although, given its extensive use, it is important to understand the fundamental limits of efficient global optimization algorithms. In this paper, we study the worst-case complexity of the efficient global optimization problem and, in contrast to existing kernel-specific results, we derive a unified lower bound for the complexity of efficient global optimization in terms of the metric entropy of a ball in its corresponding reproducing kernel Hilbert space (RKHS). Specifically, we show that if there exists a deterministic algorithm that achieves a suboptimality gap smaller than ϵ for any function f ∈ S within T function evaluations, then T must be at least Ω(log 𝒩(S(𝒳), 4ϵ, ‖·‖_∞) / log(R/ϵ)), where 𝒩(·,·,·) is the covering number, S is the ball centered at 0 with radius R in the RKHS, and S(𝒳) is the restriction of S to the feasible set 𝒳. Moreover, we show that this lower bound nearly matches the upper bound attained by non-adaptive search algorithms for the commonly used squared exponential kernel and for the Matérn kernel with a large smoothness parameter ν, up to a replacement of d/2 by d and a logarithmic term log(R/ϵ). That is to say, our lower bound is nearly optimal for these kernels.


Instance-Dependent Regret Analysis of Kernelized Bandits

We study the kernelized bandit problem, which involves designing an adapt...

Spectral bounds of the ε-entropy of kernel classes

We develop new upper and lower bounds on the ε-entropy of a unit ball in...

Multi-Scale Zero-Order Optimization of Smooth Functions in an RKHS

We aim to optimize a black-box function f: X → R under the assumption that f...

Bayesian Optimization under Heavy-tailed Payoffs

We consider black box optimization of an unknown function in the nonpara...

Order-Optimal Error Bounds for Noisy Kernel-Based Bayesian Quadrature

In this paper, we study the sample complexity of noisy Bayesian quadratu...

Kernel quadrature by applying a point-wise gradient descent method to discrete energies

We propose a method for generating nodes for kernel quadrature by a poin...

Finding Global Minima via Kernel Approximations

We consider the global minimization of smooth functions based solely on ...