Federated Coded Matrix Inversion

01/09/2023
by Neophytos Charalambides, et al.

Federated learning (FL) is a decentralized paradigm for training on data distributed across client devices. Coded computing (CC) mitigates straggling workers in a centralized computing network by using erasure-coding techniques. In this work we propose approximating the inverse of a data matrix whose data is generated by the clients, as in the FL paradigm, while remaining resilient to stragglers. To do so, we build on a CC method based on gradient coding, and modify it so that the coordinator does not need access to the clients' local data, the network is not centralized, and the communications that take place are secure against potential eavesdroppers.
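The idea of combining gradient coding with matrix inversion can be illustrated with a minimal sketch. This is not the authors' exact scheme: it assumes the inverse is approximated by gradient descent on f(B) = ||AB - I||_F^2, whose gradient decomposes into per-block partial gradients, and it uses a simple fractional-repetition gradient code (each row block of A replicated on two workers) for straggler resilience. All names, the replication factor, and the step-size choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4                                   # matrix size, number of data blocks
M = rng.standard_normal((n, n))
A = M @ M.T / n + np.eye(n)                   # well-conditioned SPD "data" matrix
I = np.eye(n)
blocks = np.array_split(np.arange(n), k)      # partition of A's rows

def partial_grad(B, rows):
    """Partial gradient of f(B) = ||A B - I||_F^2 contributed by one row block."""
    R = A[rows] @ B - I[rows]
    return 2.0 * A[rows].T @ R

# Fractional-repetition gradient code (illustrative): 2k workers, each row
# block stored on two workers, so the full gradient (the sum of the k partial
# gradients) is recoverable whenever at least one replica of every block responds.
assignment = [(w, blocks[w % k]) for w in range(2 * k)]

def coded_gradient(B, stragglers):
    """Recover the full gradient from the non-straggling workers."""
    got = {}
    for w, rows in assignment:
        if w in stragglers:
            continue
        key = tuple(rows)
        if key not in got:                    # first responding replica wins
            got[key] = partial_grad(B, rows)
    assert len(got) == k, "too many stragglers to recover the gradient"
    return sum(got.values())

# Gradient descent toward A^{-1}; step 1/L with L = 2 * sigma_max(A)^2.
B = np.zeros((n, n))
step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)
for _ in range(1000):
    down = {int(rng.integers(2 * k))}         # one random straggler per round
    B -= step * coded_gradient(B, down)
```

Because the partial gradients sum exactly to the full gradient, the iterates are identical to uncoded gradient descent despite a straggler dropping out each round, and B converges to A^{-1}.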


Related research:

- 10/06/2022: DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing
- 02/23/2023: Coded Matrix Computations for D2D-enabled Linearized Federated Learning
- 07/07/2020: Coded Computing for Federated Learning at the Edge
- 05/31/2022: Secure Federated Clustering
- 01/21/2022: Orthonormal Sketches for Secure Coded Regression
- 08/08/2023: Iterative Sketching for Secure Coded Regression
- 11/12/2020: Coded Computing for Low-Latency Federated Learning over Wireless Edge Networks
