HideNseek: Federated Lottery Ticket via Server-side Pruning and Sign Supermask

06/09/2022
by Anish K. Vallapuram, et al.

Federated learning alleviates the privacy risk in distributed learning by transmitting only the local model updates to the central server. However, it faces challenges including statistical heterogeneity of clients' datasets and resource constraints of client devices, which severely impact training performance and user experience. Prior works have tackled these challenges by combining personalization with model compression schemes such as quantization and pruning. However, the pruning is data-dependent and must therefore be performed on the client side, incurring considerable computational cost. Moreover, the pruning typically trains a binary supermask ∈ {0, 1}, which significantly limits model capacity yet yields no computational benefit. Consequently, training incurs high computational cost and long convergence times without commensurate gains in model performance. In this work, we propose HideNseek, which employs one-shot data-agnostic pruning at initialization to obtain a subnetwork based on the weights' synaptic saliency. Each client then optimizes a sign supermask ∈ {-1, +1} that multiplies the unpruned weights, allowing faster convergence at the same compression rates as the state of the art. Empirical results on three datasets demonstrate that, compared to state-of-the-art methods, HideNseek improves inference accuracy by up to 40.6% while reducing communication cost and training time by up to 39.7% and 46.8%, respectively.
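The abstract describes two mechanisms: server-side, data-agnostic pruning at initialization scored by synaptic saliency, and client-side training of a sign supermask over frozen, unpruned weights. Below is a minimal PyTorch sketch of both ideas, not the paper's actual implementation: it assumes a SynFlow-style saliency score and a straight-through estimator for the sign, and all names (synaptic_saliency_masks, SignSTE, SignMaskedLinear, keep_ratio) are illustrative.

import torch
import torch.nn as nn

# Hypothetical sketch of the server-side, data-agnostic pruning step.
# Weights are scored by synaptic saliency |dR/dw * w| (in the style of
# SynFlow), where R is the network's output on an all-ones input with
# every weight replaced by its absolute value, so no client data is needed.
def synaptic_saliency_masks(model: nn.Module, input_shape, keep_ratio: float):
    # Linearize the network: replace each weight by its absolute value.
    orig_signs = {name: torch.sign(p.data) for name, p in model.named_parameters()}
    for p in model.parameters():
        p.data.abs_()

    # One forward/backward pass on an all-ones input.
    model(torch.ones(1, *input_shape)).sum().backward()
    scores = {name: (p.grad * p.data).abs()
              for name, p in model.named_parameters() if p.grad is not None}

    # One-shot pruning: keep the globally top-scoring fraction of weights.
    flat = torch.cat([s.flatten() for s in scores.values()])
    threshold = torch.topk(flat, max(1, int(keep_ratio * flat.numel()))).values.min()
    masks = {name: (s >= threshold).float() for name, s in scores.items()}

    # Restore the original weights and clear the gradients.
    for name, p in model.named_parameters():
        p.data.mul_(orig_signs[name])
        p.grad = None
    return masks

# sign() in the forward pass, identity (straight-through) gradient in the
# backward pass, so the real-valued latent scores remain trainable.
class SignSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, scores):
        return torch.sign(scores)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

# Client-side layer: the weights stay frozen at initialization and only
# the sign supermask (via its latent scores) is optimized locally.
class SignMaskedLinear(nn.Module):
    def __init__(self, in_features, out_features, mask):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features),
                                   requires_grad=False)
        self.register_buffer("mask", mask)   # binary pruning mask from the server
        self.scores = nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x):
        signs = SignSTE.apply(self.scores)   # in {-1, +1} (0 only at exactly 0)
        return nn.functional.linear(x, signs * self.weight * self.mask)

Under this sketch, the pruning masks are computed once on the server and broadcast; each round, clients only optimize and transmit the sign supermask, roughly one bit per unpruned weight, which is consistent with the communication savings the abstract reports.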


