Learning to Guide Random Search

04/25/2020
by Ozan Sener, et al.

We are interested in derivative-free optimization of high-dimensional functions. The sample complexity of existing methods is high and depends on problem dimensionality, unlike the dimensionality-independent rates of first-order methods. The recent success of deep learning suggests that many datasets lie on low-dimensional manifolds that can be represented by deep nonlinear models. We therefore consider derivative-free optimization of a high-dimensional function that lies on a latent low-dimensional manifold. We develop an online learning approach that learns this manifold while performing the optimization. In other words, we jointly learn the manifold and optimize the function. Our analysis suggests that the presented method significantly reduces sample complexity. We empirically evaluate the method on continuous optimization benchmarks and high-dimensional continuous control problems. Our method achieves significantly lower sample complexity than Augmented Random Search, Bayesian optimization, covariance matrix adaptation (CMA-ES), and other derivative-free optimization algorithms.
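The joint scheme described above can be sketched as antithetic random search whose perturbation directions are drawn in a low-dimensional latent space that is refit online. Everything below is an illustrative assumption rather than the paper's exact algorithm: the dimensions, the synthetic objective, the exploration rate, and the use of PCA over recent gradient estimates as the "manifold learning" step are all choices made for this sketch (the paper learns a nonlinear deep model).

```python
import numpy as np

rng = np.random.default_rng(0)

# Ambient and latent dimensionality (illustrative choices, not from the paper).
n, d = 100, 5

# Synthetic objective that varies only on a hidden d-dimensional subspace of R^n.
A = rng.standard_normal((n, d))
def f(x):
    r = A.T @ x - 1.0
    return float(r @ r)

def guided_random_search(f, x, steps=300, sigma=0.1, lr=0.005,
                         explore=0.2, buf_size=50):
    """Antithetic random search whose perturbation directions live in a
    learned low-dimensional subspace. The latent map M is refit online to
    the principal directions of recent gradient estimates -- one simple
    way to 'learn the manifold while optimizing', not the authors'
    procedure."""
    M = rng.standard_normal((x.size, d))     # initial, uninformed latent map
    buf = []
    for _ in range(steps):
        if rng.random() < explore:
            # Occasional full-space direction, so the hidden subspace
            # can actually be discovered rather than assumed.
            u = rng.standard_normal(x.size)
        else:
            u = M @ rng.standard_normal(d)
        u /= np.linalg.norm(u) + 1e-12
        # Antithetic finite-difference estimate of the directional derivative.
        s = (f(x + sigma * u) - f(x - sigma * u)) / (2.0 * sigma)
        g = s * u
        x = x - lr * g
        buf.append(g)
        if len(buf) == buf_size:
            # Refit M to the top-d principal directions of recent
            # gradient estimates (PCA via SVD), then clear the buffer.
            _, _, vt = np.linalg.svd(np.array(buf), full_matrices=False)
            M = vt[:d].T
            buf = []
    return x

x0 = rng.standard_normal(n)
x_star = guided_random_search(f, x0.copy())
print(f"f(x0) = {f(x0):.2f} -> f(x*) = {f(x_star):.2f}")
```

Because the objective is flat off the hidden subspace, directions with large finite-difference slopes concentrate there, so the periodic PCA refit steers subsequent perturbations toward the subspace that matters; this is the intuition behind the reduced sample complexity claimed above.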


