Convergence and Accuracy Trade-Offs in Federated Learning and Meta-Learning

03/08/2021
by Zachary Charles, et al.

We study a family of algorithms, which we refer to as local update methods, generalizing many federated and meta-learning algorithms. We prove that for quadratic models, local update methods are equivalent to first-order optimization on a surrogate loss we exactly characterize. Moreover, fundamental algorithmic choices (such as learning rates) explicitly govern a trade-off between the condition number of the surrogate loss and its alignment with the true loss. We derive novel convergence rates showcasing these trade-offs and highlight their importance in communication-limited settings. Using these insights, we are able to compare local update methods based on their convergence/accuracy trade-off, not just their convergence to critical points of the empirical loss. Our results shed new light on a broad range of phenomena, including the efficacy of server momentum in federated learning and the impact of proximal client updates.
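To make the surrogate-loss viewpoint concrete, here is an illustrative derivation for the simplest single-client, full-batch case. This is a sketch under simplifying assumptions, not the paper's exact characterization, which covers the general heterogeneous setting. For a quadratic loss, k local gradient steps produce a pseudo-gradient that is itself the gradient of a surrogate quadratic:

```latex
% Illustrative single-client, full-batch case (an assumption for this
% sketch; the paper treats the general heterogeneous setting).
% For f(x) = \tfrac{1}{2}(x - x^*)^\top A (x - x^*), k local steps with
% client learning rate \gamma give
\[
x_k - x^* = (I - \gamma A)^k (x - x^*),
\]
% so the client update (pseudo-gradient) satisfies
\[
x - x_k = \bigl(I - (I - \gamma A)^k\bigr)(x - x^*) = \nabla \tilde{f}(x),
\qquad
\tilde{f}(x) = \tfrac{1}{2}(x - x^*)^\top \bigl(I - (I - \gamma A)^k\bigr)(x - x^*).
\]
% Increasing \gamma k pushes the surrogate eigenvalues
% 1 - (1 - \gamma\lambda_i)^k toward 1 (better conditioning), while in the
% heterogeneous multi-client case it pulls the surrogate minimizer away
% from the minimizer of the true loss.
```

Below is a minimal Python sketch of one algorithm in this family, in the FedAvg style. The names and defaults (local_update_round, client_lr, server_lr, local_steps) and the synthetic clients are assumptions for illustration, not the paper's code. Raising local_steps or client_lr improves the conditioning of the implicit surrogate while shifting its minimizer away from that of the true loss, which is the trade-off the abstract describes.

```python
# Minimal sketch of a local update method (FedAvg-style) on quadratic
# client losses f_i(x) = 0.5 * (x - x_i*)^T A_i (x - x_i*).
# Illustrative only: names and defaults are assumptions, not the
# paper's implementation.
import numpy as np

def local_update_round(x, clients, client_lr=0.1, server_lr=1.0, local_steps=5):
    """One communication round: each client runs `local_steps` gradient
    steps from the server model x; the server averages the updates."""
    deltas = []
    for A, x_star in clients:
        x_local = x.copy()
        for _ in range(local_steps):
            grad = A @ (x_local - x_star)   # gradient of the client quadratic
            x_local = x_local - client_lr * grad
        deltas.append(x - x_local)          # client pseudo-gradient
    # Server step against the averaged pseudo-gradient.
    return x - server_lr * np.mean(deltas, axis=0)

# Two synthetic, heterogeneous quadratic clients (hypothetical data).
clients = [
    (np.diag([1.0, 10.0]), np.array([1.0, 0.0])),
    (np.diag([2.0, 5.0]), np.array([0.0, 1.0])),
]
x = np.zeros(2)
for _ in range(100):
    x = local_update_round(x, clients)
# Converges to the minimizer of the implicit surrogate, which differs
# from the true minimizer (1/3, 1/3) because the clients are heterogeneous.
print(x)
```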


Related Research

On the Outsized Importance of Learning Rates in Local Update Methods (07/02/2020)
We study a family of algorithms, which we refer to as local update metho...

Achieving Linear Convergence in Federated Learning under Objective and Systems Heterogeneity (02/25/2021)
We consider a standard federated learning architecture where a group of ...

Memory-based Optimization Methods for Model-Agnostic Meta-Learning (06/09/2021)
Recently, model-agnostic meta-learning (MAML) has garnered tremendous at...

On Data Efficiency of Meta-learning (01/30/2021)
Meta-learning has enabled learning statistical models that can be quickl...

Global Update Guided Federated Learning (04/08/2022)
Federated learning protects data privacy and security by exchanging mode...

Fast Federated Learning by Balancing Communication Trade-Offs (05/23/2021)
Federated Learning (FL) has recently received a lot of attention for lar...

FedSplit: An algorithmic framework for fast federated optimization (05/11/2020)
Motivated by federated learning, we consider the hub-and-spoke model of ...