Federated Bayesian Optimization via Thompson Sampling

10/20/2020
by Zhongxiang Dai, et al.

Bayesian optimization (BO) is a prominent approach to optimizing expensive-to-evaluate black-box functions. The massive computational capability of edge devices such as mobile phones, coupled with privacy concerns, has led to a surging interest in federated learning (FL) which focuses on collaborative training of deep neural networks (DNNs) via first-order optimization techniques. However, some common machine learning tasks such as hyperparameter tuning of DNNs lack access to gradients and thus require zeroth-order/black-box optimization. This hints at the possibility of extending BO to the FL setting (FBO) for agents to collaborate in these black-box optimization tasks. This paper presents federated Thompson sampling (FTS) which overcomes a number of key challenges of FBO and FL in a principled way: We (a) use random Fourier features to approximate the Gaussian process surrogate model used in BO, which naturally produces the parameters to be exchanged between agents, (b) design FTS based on Thompson sampling, which significantly reduces the number of parameters to be exchanged, and (c) provide a theoretical convergence guarantee that is robust against heterogeneous agents, which is a major challenge in FL and FBO. We empirically demonstrate the effectiveness of FTS in terms of communication efficiency, computational efficiency, and practical performance.
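The core mechanism the abstract describes, approximating the Gaussian process surrogate with random Fourier features so that a Thompson sample reduces to drawing one linear weight vector, can be sketched in a minimal single-agent form. Everything below (the RBF approximation, the toy objective, the grid maximization, and all parameter choices such as M = 100 features) is an illustrative assumption, not the paper's implementation; in FTS, the length-M weight vector is roughly what an agent would exchange, which is why communication stays small.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, W, b):
    # Random Fourier features approximating an RBF kernel:
    # phi(x) = sqrt(2/M) * cos(W x + b), so k(x, x') ~= phi(x)^T phi(x')
    M = W.shape[0]
    return np.sqrt(2.0 / M) * np.cos(X @ W.T + b)

def thompson_sample_weights(Phi, y, noise=0.1):
    # With f(x) ~= phi(x)^T theta, the GP posterior over theta is Gaussian;
    # a Thompson sample is one draw theta ~ N(mu, Sigma).
    M = Phi.shape[1]
    A = Phi.T @ Phi / noise**2 + np.eye(M)
    Sigma = np.linalg.inv(A)
    mu = Sigma @ Phi.T @ y / noise**2
    return rng.multivariate_normal(mu, Sigma)

# Toy 1-D black-box objective (hypothetical, for illustration only).
f = lambda x: -np.sin(3 * x) - x**2 + 0.7 * x

M = 100            # number of Fourier features (assumed)
lengthscale = 0.5  # RBF lengthscale (assumed)
W = rng.normal(0.0, 1.0 / lengthscale, size=(M, 1))
b = rng.uniform(0.0, 2.0 * np.pi, size=M)

# A few observed evaluations of the expensive function.
X = rng.uniform(-2.0, 2.0, size=(5, 1))
y = f(X).ravel()

# One Thompson sampling step: draw a function from the posterior
# and propose the input that maximizes it over a grid.
theta = thompson_sample_weights(rff_features(X, W, b), y)
grid = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
sampled_f = rff_features(grid, W, b) @ theta
x_next = grid[np.argmax(sampled_f)]
```

In a federated variant, each of several agents would hold its own (X, y) and periodically share sampled weight vectors like `theta`, rather than raw data or full kernel matrices, which is the communication saving the abstract refers to.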


Related research

01/24/2022 · Communication-Efficient Stochastic Zeroth-Order Optimization for Federated Learning
Federated learning (FL), as an emerging edge artificial intelligence par...

07/23/2021 · Communication Efficiency in Federated Learning: Achievements and Challenges
Federated Learning (FL) is known to perform Machine Learning tasks in a ...

03/08/2023 · Model-Agnostic Federated Learning
Since its debut in 2016, Federated Learning (FL) has been tied to the in...

08/09/2023 · Efficient Bayesian Optimization with Deep Kernel Learning and Transformer Pre-trained on Multiple Heterogeneous Datasets
Bayesian optimization (BO) is widely adopted in black-box optimization p...

10/15/2021 · Evaluation of Hyperparameter-Optimization Approaches in an Industrial Federated Learning System
Federated Learning (FL) decouples model training from the need for direc...

07/01/2022 · Asynchronous Distributed Bayesian Optimization at HPC Scale
Bayesian optimization (BO) is a widely used approach for computationally...

01/26/2023 · FedHQL: Federated Heterogeneous Q-Learning
Federated Reinforcement Learning (FedRL) encourages distributed agents t...
