Multi-Server Secure Aggregation with Unreliable Communication Links

04/15/2023
by Kai Liang, et al.

In many distributed learning setups such as federated learning (FL), client nodes at the edge use individually collected data to compute local gradients and send them to a central master server. The master server aggregates the received gradients and broadcasts the aggregate to all clients, which then use it to update the global model. In this paper, we consider multi-server federated learning with secure aggregation and unreliable communication links. We first define a threat model under Shannon's information-theoretic security framework and propose a novel scheme called Lagrange Coding with Mask (LCM), which divides the servers into groups and combines coding and masking techniques. LCM achieves a tunable trade-off between the uplink and downlink communication loads by adjusting the number of servers in each group. Furthermore, we derive lower bounds on the uplink and downlink communication loads, and prove that LCM achieves the optimal uplink communication load, which is independent of the number of colluding clients.
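The abstract describes LCM only at a high level, but the two ingredients it names, additive masking and Lagrange (polynomial) coding across servers, can be illustrated concretely. Below is a minimal Python sketch, under assumed parameters, of clients masking their gradients and secret-sharing the masks to multiple servers via Lagrange interpolation, so that the servers jointly recover only the aggregate. All names and parameters (FIELD, lagrange_shares, the 3-client/4-server setup) are illustrative assumptions, not the paper's actual construction.

    # Toy sketch of mask-based secure aggregation over a prime field.
    # Names and parameters are hypothetical, not taken from the paper.
    import random

    FIELD = 2**31 - 1  # a Mersenne prime; the paper's field choice may differ

    def lagrange_shares(secret, num_servers, threshold):
        """Split `secret` into shares: evaluations at x = 1..num_servers of a
        random degree-(threshold-1) polynomial with constant term `secret`."""
        coeffs = [secret] + [random.randrange(FIELD) for _ in range(threshold - 1)]
        return {x: sum(c * pow(x, k, FIELD) for k, c in enumerate(coeffs)) % FIELD
                for x in range(1, num_servers + 1)}

    def interpolate_at_zero(points):
        """Recover the secret (the polynomial's value at 0) from (x, y) shares
        via Lagrange interpolation; inverses use Fermat's little theorem."""
        total = 0
        for xi, yi in points.items():
            num, den = 1, 1
            for xj in points:
                if xj != xi:
                    num = num * (-xj) % FIELD
                    den = den * (xi - xj) % FIELD
            total = (total + yi * num * pow(den, FIELD - 2, FIELD)) % FIELD
        return total

    # Demo: 3 clients, 4 servers, reconstruction threshold 2.
    clients = [7, 11, 42]                 # toy scalar "gradients"
    masked, mask_shares = [], []
    for g in clients:
        z = random.randrange(FIELD)       # per-client additive mask
        masked.append((g + z) % FIELD)    # a server sees only g + z
        mask_shares.append(lagrange_shares(z, num_servers=4, threshold=2))

    # Each server sums the mask shares it holds; by linearity these are
    # shares of sum(z_i), so only the total mask is ever reconstructed.
    summed = {x: sum(s[x] for s in mask_shares) % FIELD for x in (1, 2)}
    aggregate = (sum(masked) - interpolate_at_zero(summed)) % FIELD
    assert aggregate == sum(clients)      # aggregate recovered, inputs hidden

By linearity of polynomial evaluation, the per-server sums of mask shares are themselves shares of the total mask, which is why the servers can unmask the aggregate without learning any individual gradient. LCM's grouping of the servers, its uplink/downlink trade-off, and its tolerance of unreliable links go beyond this toy example.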


Related research

03/24/2022
SwiftAgg+: Achieving Asymptotically Optimal Communication Loads in Secure Aggregation for Federated Learning
We propose SwiftAgg+, a novel secure aggregation protocol for federated ...

11/29/2022
Multi-Server Over-the-Air Federated Learning
In this work, we propose a communication-efficient two-layer federated l...

12/16/2022
Coded Distributed Computing for Hierarchical Multi-task Learning
In this paper, we consider a hierarchical distributed multi-task learnin...

10/23/2020
Throughput-Optimal Topology Design for Cross-Silo Federated Learning
Federated learning usually employs a client-server architecture where an...

10/26/2020
Optimal Client Sampling for Federated Learning
It is well understood that client-master communication can be a primary ...

06/21/2023
Timely Asynchronous Hierarchical Federated Learning: Age of Convergence
We consider an asynchronous hierarchical federated learning (AHFL) setti...

09/14/2021
Fast Federated Edge Learning with Overlapped Communication and Computation and Channel-Aware Fair Client Scheduling
We consider federated edge learning (FEEL) over wireless fading channels...
