Adaptive Differential Filters for Fast and Communication-Efficient Federated Learning

04/09/2022
by   Daniel Becking, et al.

Federated learning (FL) scenarios inherently generate a large communication overhead by frequently transmitting neural network updates between clients and the server. To minimize this cost, introducing sparsity in conjunction with differential updates is a commonly used technique. However, sparse model updates can slow convergence or unintentionally skip certain update aspects, e.g., learned features, if error accumulation is not properly addressed. In this work, we propose a new scaling method operating at the granularity of convolutional filters which 1) compensates for highly sparse updates in FL processes, 2) adapts the local models to new data domains by enhancing some features in the filter space while diminishing others, and 3) promotes additional sparsity in the updates, achieving higher compression ratios, i.e., savings in the overall data transfer. Compared to unscaled updates and previous work, experimental results on different computer vision tasks (Pascal VOC, CIFAR10, Chest X-Ray) and neural networks (ResNets, MobileNets, VGGs) in uni-, bidirectional, and partial-update FL settings show that the proposed method improves the performance of the central server model, converges faster, and reduces the total amount of transmitted data by a factor of up to 377.
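The mechanism the abstract describes — a sparse differential update with error feedback, rescaled at the granularity of convolutional filters — can be sketched roughly as follows. This is a minimal illustration under assumptions, not the paper's implementation: the top-k magnitude sparsifier, the energy-matching per-filter scale, and all function names here are hypothetical stand-ins for the actual method.

```python
import numpy as np

def sparsify_topk(delta, sparsity=0.9):
    """Keep only the largest-magnitude entries of a differential update."""
    flat = np.abs(delta).ravel()
    k = max(1, int(flat.size * (1.0 - sparsity)))
    thresh = np.partition(flat, -k)[-k]        # k-th largest magnitude
    return delta * (np.abs(delta) >= thresh)

def scaled_sparse_update(w_local, w_global, residual, sparsity=0.9):
    """One client step: differential update + error feedback + filter scaling.

    w_local, w_global: conv weight tensors of shape (out_ch, in_ch, kh, kw).
    residual: accumulated error from previously dropped update components.
    """
    delta = (w_local - w_global) + residual     # error feedback (accumulation)
    sparse_delta = sparsify_topk(delta, sparsity)
    new_residual = delta - sparse_delta         # carry dropped mass forward

    # Hypothetical per-filter scale: match each filter's transmitted energy
    # to the energy of its dense update, so strongly changed filters are
    # enhanced and weakly changed ones diminished.
    scales = np.ones(delta.shape[0])
    for f in range(delta.shape[0]):
        sent = np.linalg.norm(sparse_delta[f])
        full = np.linalg.norm(delta[f])
        if sent > 0:
            scales[f] = full / sent
    scaled = sparse_delta * scales[:, None, None, None]
    return scaled, new_residual
```

Only the sparse, scaled tensor (and, in practice, its nonzero positions plus one scale per filter) would be transmitted; the residual stays on the client and is folded into the next round's update.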

Related research:

- Federated Learning With Quantized Global Model Updates (06/18/2020)
- Time-Correlated Sparsification for Communication-Efficient Federated Learning (01/21/2021)
- Slashing Communication Traffic in Federated Learning by Transmitting Clustered Model Updates (05/10/2021)
- Federated Learning with Erroneous Communication Links (01/31/2022)
- FSL: Federated Supermask Learning (10/08/2021)
- Communication Optimization in Large Scale Federated Learning using Autoencoder Compressed Weight Updates (08/12/2021)
- Network Adaptive Federated Learning: Congestion and Lossy Compression (01/11/2023)
