FedADMM: A Federated Primal-Dual Algorithm Allowing Partial Participation

03/28/2022
by Han Wang, et al.

Federated learning is a framework for distributed optimization that places emphasis on communication efficiency. It follows a client-server broadcast model and is particularly appealing because of its ability to accommodate heterogeneity in client compute and storage resources, non-i.i.d. data assumptions, and data privacy. Our contribution is a new federated learning algorithm, FedADMM, for solving non-convex composite optimization problems with non-smooth regularizers. We prove convergence of FedADMM in the case where not all clients are able to participate in a given communication round, under a very general sampling model.
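The setting the abstract describes, a composite objective min_x Σ_i f_i(x) + g(x) with a non-smooth g, solved by per-client primal-dual (ADMM-style) updates while only a random subset of clients participates each round, can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the quadratic client losses, the ℓ1 regularizer, the update order, and all variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (handles the non-smooth regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Hypothetical problem: N clients, each holding a quadratic local loss
# f_i(x) = 0.5 * ||A_i x - b_i||^2, plus a shared regularizer g(x) = mu * ||x||_1.
N, d, m = 10, 5, 20
mu, rho = 0.1, 1.0
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse ground truth
A = [rng.normal(size=(m, d)) for _ in range(N)]
b = [A[i] @ x_true + 0.01 * rng.normal(size=m) for i in range(N)]

# Each client keeps a primal iterate x_i and a dual variable lam_i for the
# consensus constraint x_i = z; the server keeps the global variable z.
x = [np.zeros(d) for _ in range(N)]
lam = [np.zeros(d) for _ in range(N)]
z = np.zeros(d)

for rnd in range(200):
    # Partial participation: only a random half of the clients update this round.
    S = rng.choice(N, size=N // 2, replace=False)
    for i in S:
        # Local augmented-Lagrangian minimization (closed form for quadratics):
        # x_i = argmin f_i(x) + lam_i^T (x - z) + (rho/2) ||x - z||^2.
        x[i] = np.linalg.solve(A[i].T @ A[i] + rho * np.eye(d),
                               A[i].T @ b[i] + rho * z - lam[i])
        # Dual ascent on the consensus constraint.
        lam[i] = lam[i] + rho * (x[i] - z)
    # Server: average primal + scaled-dual messages (stale for non-participants),
    # then apply the prox of g, scaled as prox_{g/(N*rho)}.
    z = soft_threshold(np.mean([x[i] + lam[i] / rho for i in range(N)], axis=0),
                       mu / (N * rho))

print(np.round(z, 3))
```

With enough rounds, `z` approaches the sparse ground truth even though only half the clients are active per round, which is the qualitative behavior the convergence result covers.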


Related research

09/04/2023 · Composite federated learning with heterogeneous data
We propose a novel algorithm for solving the composite Federated Learnin...

02/29/2020 · Adaptive Federated Optimization
Federated learning is a distributed machine learning paradigm in which a...

01/26/2022 · Server-Side Stepsizes and Sampling Without Replacement Provably Help in Federated Optimization
We present a theoretical study of server-side optimization in federated ...

05/31/2023 · Federated Learning in the Presence of Adversarial Client Unavailability
Federated learning is a decentralized machine learning framework wherein...

08/30/2023 · Federated Two Stage Decoupling With Adaptive Personalization Layers
Federated learning has gained significant attention due to its groundbre...

09/16/2020 · FedSmart: An Auto Updating Federated Learning Optimization Mechanism
Federated learning has made an important contribution to data privacy-pr...

10/14/2022 · A Primal-Dual Algorithm for Hybrid Federated Learning
Very few methods for hybrid federated learning, where clients only hold ...
