Achieving Linear Speedup in Non-IID Federated Bilevel Learning

02/10/2023
by   Minhui Huang, et al.

Federated bilevel optimization has received increasing attention in various emerging machine learning and communication applications. Recently, several Hessian-vector-based algorithms have been proposed to solve the federated bilevel optimization problem. However, several important properties of federated learning, such as partial client participation and linear speedup for convergence (i.e., the convergence rate and complexity improve linearly with the number of sampled clients) in the presence of non-i.i.d. datasets, remain open. In this paper, we fill these gaps by proposing a new federated bilevel algorithm, named FedMBO, with a novel client sampling scheme in the federated hypergradient estimation. We show that FedMBO achieves a convergence rate of 𝒪(1/√(nK)+1/K+√(n)/K^{3/2}) on non-i.i.d. datasets, where n is the number of participating clients in each round and K is the total number of iterations. This is the first theoretical linear-speedup result for non-i.i.d. federated bilevel optimization. Extensive experiments validate our theoretical results and demonstrate the effectiveness of the proposed method.
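To make the idea of client-sampled federated hypergradient estimation concrete, here is a minimal sketch in NumPy. It is an illustrative toy, not the FedMBO algorithm itself: the quadratic per-client inner objectives, the simple outer loss, and uniform sampling without replacement are all assumptions chosen so the implicit hypergradient has a closed form via a Hessian-inverse-vector product.

```python
import numpy as np

# Illustrative sketch only: quadratic client losses and uniform client
# sampling are assumptions, not the exact scheme from the paper.
rng = np.random.default_rng(0)
d = 5
num_clients, n_sampled = 20, 4  # total clients, clients sampled per round

# Client i holds an inner objective g_i(x, y) = 0.5 y^T A_i y - y^T B_i x,
# so its inner Hessian (d2g/dy2) is A_i and the cross term (d2g/dydx) is -B_i.
A = [np.eye(d) * (1.0 + rng.uniform(0, 1)) for _ in range(num_clients)]
B = [rng.standard_normal((d, d)) * 0.1 for _ in range(num_clients)]

def hypergradient(x, y, clients):
    """Average the implicit hypergradient over a sampled subset of clients.

    With an illustrative outer loss f(x, y) = 0.5 ||y||^2 (so df/dy = y),
    the implicit-function-theorem hypergradient for client i is
        - (d2g/dydx)^T (d2g/dy2)^{-1} df/dy = B_i^T A_i^{-1} y.
    """
    grads = []
    for i in clients:
        v = np.linalg.solve(A[i], y)   # Hessian-inverse-vector product
        grads.append(B[i].T @ v)       # cross-derivative applied to v
    return np.mean(grads, axis=0)

x = rng.standard_normal(d)
y = rng.standard_normal(d)
sampled = rng.choice(num_clients, size=n_sampled, replace=False)
hg = hypergradient(x, y, sampled)
print(hg.shape)
```

In a full federated round, each sampled client would compute its term locally (approximating the Hessian-inverse-vector product iteratively rather than with a direct solve) and the server would average only the n sampled contributions, which is where the √(nK) dependence in the rate comes from.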


Related research:

- 01/27/2021: Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning. Federated learning (FL) is a distributed machine learning architecture t...
- 02/18/2020: Distributed Non-Convex Optimization with Sublinear Speedup under Intermittent Client Availability. Federated learning is a new distributed machine learning framework, wher...
- 04/28/2022: On the Convergence of Momentum-Based Algorithms for Federated Stochastic Bilevel Optimization Problems. In this paper, we studied the federated stochastic bilevel optimization ...
- 05/30/2023: SimFBO: Towards Simple, Flexible and Communication-efficient Federated Bilevel Learning. Federated bilevel optimization (FBO) has shown great potential recently ...
- 06/09/2021: Memory-based Optimization Methods for Model-Agnostic Meta-Learning. Recently, model-agnostic meta-learning (MAML) has garnered tremendous at...
- 09/18/2023: FedLALR: Client-Specific Adaptive Learning Rates Achieve Linear Speedup for Non-IID Data. Federated learning is an emerging distributed machine learning method, e...
- 01/28/2022: FedGCN: Convergence and Communication Tradeoffs in Federated Training of Graph Convolutional Networks. Distributed methods for training models on graph datasets have recently ...
