
Federated Bayesian Optimization via Thompson Sampling

10/20/2020
by Zhongxiang Dai, et al.

Bayesian optimization (BO) is a prominent approach to optimizing expensive-to-evaluate black-box functions. The massive computational capability of edge devices such as mobile phones, coupled with privacy concerns, has led to a surging interest in federated learning (FL) which focuses on collaborative training of deep neural networks (DNNs) via first-order optimization techniques. However, some common machine learning tasks such as hyperparameter tuning of DNNs lack access to gradients and thus require zeroth-order/black-box optimization. This hints at the possibility of extending BO to the FL setting (FBO) for agents to collaborate in these black-box optimization tasks. This paper presents federated Thompson sampling (FTS) which overcomes a number of key challenges of FBO and FL in a principled way: We (a) use random Fourier features to approximate the Gaussian process surrogate model used in BO, which naturally produces the parameters to be exchanged between agents, (b) design FTS based on Thompson sampling, which significantly reduces the number of parameters to be exchanged, and (c) provide a theoretical convergence guarantee that is robust against heterogeneous agents, which is a major challenge in FL and FBO. We empirically demonstrate the effectiveness of FTS in terms of communication efficiency, computational efficiency, and practical performance.
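As a rough illustration of the mechanics described in the abstract, the sketch below (plain NumPy, not the authors' code) approximates an RBF-kernel GP with random Fourier features and runs Thompson sampling on a toy 1-D objective: the posterior over the M-dimensional feature weights is computed in closed form, a weight vector is sampled, and the next query maximizes the sampled linear model. The lengthscale, noise variance, number of features M, toy objective, and candidate grid are all illustrative assumptions; the federated part of FTS, in which an agent sometimes acts on a weight vector received from another agent instead of its own sample, is only indicated in a comment.

import numpy as np

rng = np.random.default_rng(0)

# Random Fourier features for an RBF kernel:
# k(x, x') ~= phi(x) @ phi(x'), with phi(x) = sqrt(2/M) * cos(W x + b).
M, d, lengthscale, noise_var = 100, 1, 0.2, 1e-3   # illustrative settings
W = rng.normal(scale=1.0 / lengthscale, size=(M, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=M)

def features(X):
    """Map inputs of shape (n, d) to RFF features of shape (n, M)."""
    return np.sqrt(2.0 / M) * np.cos(X @ W.T + b)

def posterior(X, y):
    """Gaussian posterior N(nu, Sigma) over weights theta, with f(x) ~= phi(x) @ theta."""
    Phi = features(X)
    Sigma = np.linalg.inv(Phi.T @ Phi / noise_var + np.eye(M))
    nu = Sigma @ Phi.T @ y / noise_var
    return nu, Sigma

def thompson_sample(nu, Sigma):
    """Draw one M-dimensional weight vector from the posterior."""
    return rng.multivariate_normal(nu, Sigma)

# Toy black-box objective and a small single-agent BO loop.
f = lambda X: np.sin(3.0 * np.pi * X[:, 0]) * X[:, 0]       # unknown to the optimizer
X_obs = rng.uniform(0.0, 1.0, size=(3, d))
y_obs = f(X_obs) + np.sqrt(noise_var) * rng.normal(size=3)
candidates = np.linspace(0.0, 1.0, 500).reshape(-1, d)

for t in range(20):
    nu, Sigma = posterior(X_obs, y_obs)
    theta = thompson_sample(nu, Sigma)      # in FTS, an agent would with some probability
                                            # use a theta received from another agent instead
    x_next = candidates[np.argmax(features(candidates) @ theta)].reshape(1, d)
    y_next = f(x_next) + np.sqrt(noise_var) * rng.normal(size=1)
    X_obs, y_obs = np.vstack([X_obs, x_next]), np.concatenate([y_obs, y_next])

print("best observed value:", y_obs.max())

Note that the only quantity an agent would need to pass on is the sampled M-dimensional weight vector rather than its raw observations, so the message size is fixed by the number of random features, which is in line with the communication-efficiency claim in the abstract.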


Related Research

01/24/2022 · Communication-Efficient Stochastic Zeroth-Order Optimization for Federated Learning
Federated learning (FL), as an emerging edge artificial intelligence par...

07/23/2021 · Communication Efficiency in Federated Learning: Achievements and Challenges
Federated Learning (FL) is known to perform Machine Learning tasks in a ...

10/15/2021 · Evaluation of Hyperparameter-Optimization Approaches in an Industrial Federated Learning System
Federated Learning (FL) decouples model training from the need for direc...

07/01/2022 · Asynchronous Distributed Bayesian Optimization at HPC Scale
Bayesian optimization (BO) is a widely used approach for computationally...

02/01/2022 · Federated Active Learning (F-AL): an Efficient Annotation Strategy for Federated Learning
Federated learning (FL) has been intensively investigated in terms of co...

04/23/2021 · Scalable and Flexible Deep Bayesian Optimization with Auxiliary Information for Scientific Problems
Bayesian optimization (BO) is a popular paradigm for global optimization...

10/21/2019 · Bayesian Optimization Allowing for Common Random Numbers
Bayesian optimization is a powerful tool for expensive stochastic black-...

Code Repositories

Federated_Bayesian_Optimization
