Device Heterogeneity in Federated Learning: A Superquantile Approach

02/25/2020
by Yassine Laguel, et al.

We propose a federated learning framework to handle heterogeneous client devices that do not conform to the population data distribution. The approach hinges on a parameterized superquantile-based objective, where the parameter ranges over levels of conformity. We present an optimization algorithm and establish its convergence to a stationary point. We show how to implement the algorithm practically using secure aggregation, interleaving iterations of the usual federated averaging method with device filtering. We conclude with numerical experiments on neural networks as well as linear models on tasks from computer vision and natural language processing.
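The abstract does not reproduce the objective, so as a point of reference here is the standard superquantile (conditional value-at-risk) definition, under the assumption that the paper's parameter θ plays the role of the conformity level: the objective averages the per-device loss Z over its worst (1 − θ) tail.

```latex
% Standard superquantile (CVaR) of a random loss Z at level \theta:
% the mean of the quantile function Q_t(Z) over the upper tail t > \theta.
\[
  \bar{Q}_{\theta}(Z) \;=\; \frac{1}{1-\theta}\int_{\theta}^{1} Q_t(Z)\,dt,
  \qquad
  Q_t(Z) \;=\; \inf\{\, z \in \mathbb{R} : \mathbb{P}(Z \le z) \ge t \,\}.
\]
```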

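To make "federated averaging interleaved with device filtering" concrete, below is a minimal sketch of one training round in Python. Everything in it (ToyClient, filtered_fedavg_round, the quantile-based filter, the toy data) is an illustrative assumption rather than the paper's implementation, and the secure-aggregation layer is omitted.

```python
import numpy as np

class ToyClient:
    """Toy linear-regression client holding a private dataset (X, y)."""
    def __init__(self, X, y):
        self.X, self.y = X, y
        self.num_samples = len(y)

    def evaluate(self, w):
        # Mean squared error of the current global model on local data.
        return float(np.mean((self.X @ w - self.y) ** 2))

    def local_train(self, w, lr=0.1, steps=5):
        # A few local gradient steps, as in federated averaging.
        w = w.copy()
        for _ in range(steps):
            w -= lr * 2 * self.X.T @ (self.X @ w - self.y) / self.num_samples
        return w

def filtered_fedavg_round(w, clients, theta):
    """One round: filter to the tail of devices whose loss exceeds the
    theta-quantile (the tail the superquantile objective averages over),
    then take a sample-weighted federated-averaging step over them."""
    losses = np.array([c.evaluate(w) for c in clients])
    cutoff = np.quantile(losses, theta)               # filtering threshold
    tail = [c for c, l in zip(clients, losses) if l >= cutoff]
    updates = [c.local_train(w) for c in tail]
    sizes = np.array([c.num_samples for c in tail], dtype=float)
    return sum((s / sizes.sum()) * u for s, u in zip(sizes, updates))

# Usage: two conforming devices and one with much noisier local data;
# theta = 0.5 focuses each round on the worst half of the devices.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
clients = []
for noise in (0.1, 0.1, 2.0):
    X = rng.normal(size=(50, 2))
    clients.append(ToyClient(X, X @ w_true + rng.normal(scale=noise, size=50)))

w = np.zeros(2)
for _ in range(20):
    w = filtered_fedavg_round(w, clients, theta=0.5)
print(w)  # approaches w_true while still weighting the hard device
```

The filter keeps every device at or above the cutoff, so at least one device always participates; in a deployment the same effect would have to be achieved without the server seeing raw losses, which is where the paper's secure-aggregation interleaving comes in.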

Related research:

- Federated Learning for Mobile Keyboard Prediction (11/08/2018)
  We train a recurrent neural network language model using a distributed, ...

- FedHeN: Federated Learning in Heterogeneous Networks (07/07/2022)
  We propose a novel training recipe for federated learning with heterogen...

- Robust Aggregation for Federated Learning (12/31/2019)
  We present a robust aggregation approach to make federated learning robu...

- On the Convergence of Federated Optimization in Heterogeneous Networks (12/14/2018)
  The burgeoning field of federated learning involves training machine lea...

- Deep Unfolding-based Weighted Averaging for Federated Learning under Heterogeneous Environments (12/23/2022)
  Federated learning is a collaborative model training method by iterating...

- Improving Federated Relational Data Modeling via Basis Alignment and Weight Penalty (11/23/2020)
  Federated learning (FL) has attracted increasing attention in recent yea...

- Resource-Constrained Federated Learning with Heterogeneous Labels and Models (11/06/2020)
  Various IoT applications demand resource-constrained machine learning me...
