FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning

by Elnur Gasanov et al.

Federated Learning (FL) is an increasingly popular machine learning paradigm in which multiple nodes collaboratively learn under privacy and communication constraints as well as multiple forms of heterogeneity. A persistent problem in federated learning is that it is not clear what the optimization objective should be: the standard average risk minimization of supervised learning cannot handle several major constraints specific to federated learning, such as communication adaptivity and personalization control. We identify several key desiderata for federated learning frameworks and introduce a new framework, FLIX, that takes into account the unique challenges brought by federated learning. FLIX has a standard finite-sum form, which enables practitioners to tap into the immense wealth of existing (potentially non-local) methods for distributed optimization. Through a smart initialization that does not require any communication, FLIX avoids the use of local steps yet is still provably capable of performing dissimilarity regularization on par with local methods. We give several algorithms for solving the FLIX formulation efficiently under communication constraints. Finally, we corroborate our theoretical results with extensive experimentation.
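To make the finite-sum form concrete: in the FLIX formulation each client i first computes its purely local optimum x_i* (no communication needed), and the shared model x enters each client's loss only through the mixture α_i·x + (1−α_i)·x_i*, where α_i in [0, 1] controls personalization. Below is a minimal NumPy sketch on toy quadratic losses; the data, the uniform choice α_i = 0.5, and the step size are all hypothetical illustrations, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n clients, each with a local quadratic loss
#   f_i(x) = 0.5 * ||A_i x - b_i||^2   (hypothetical synthetic data).
n, d = 5, 3
A = [rng.standard_normal((8, d)) for _ in range(n)]
b = [rng.standard_normal(8) for _ in range(n)]

def local_opt(Ai, bi):
    # Purely local training: each client finds its own minimizer x_i*
    # without any communication (a least-squares solve here).
    return np.linalg.lstsq(Ai, bi, rcond=None)[0]

x_star = [local_opt(A[i], b[i]) for i in range(n)]

# Personalization weights alpha_i in [0, 1]: alpha_i = 0 keeps the
# purely local model, alpha_i = 1 recovers the usual global objective.
alpha = np.full(n, 0.5)

def flix_grad(x):
    # Gradient w.r.t. the shared model x of the finite-sum objective
    #   (1/n) sum_i f_i(alpha_i * x + (1 - alpha_i) * x_i*),
    # where the chain rule contributes a factor alpha_i per client.
    g = np.zeros(d)
    for i in range(n):
        xi = alpha[i] * x + (1 - alpha[i]) * x_star[i]
        g += alpha[i] * A[i].T @ (A[i] @ xi - b[i])
    return g / n

# Because the objective is a standard finite sum, plain distributed
# gradient descent (or any off-the-shelf method) applies directly.
x = np.zeros(d)
for _ in range(1000):
    x -= 0.1 * flix_grad(x)
```

Each client's personalized model is then recovered as α_i·x + (1−α_i)·x_i*, so tuning α_i interpolates between fully local and fully global training.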



Related research:

Federated Learning and Wireless Communications

Federated Learning of a Mixture of Global and Local Models

FedMix: Approximation of Mixup under Mean Augmented Federated Learning

Overcoming Forgetting in Federated Learning on Non-IID Data

Accurate and Fast Federated Learning via IID and Communication-Aware Grouping

Lower Bounds and Optimal Algorithms for Personalized Federated Learning

Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning