Neural Tangent Kernel Empowered Federated Learning

10/07/2021
by Kai Yue, et al.

Federated learning (FL) is a privacy-preserving paradigm where multiple participants jointly solve a machine learning problem without sharing raw data. Unlike traditional distributed learning, a unique characteristic of FL is statistical heterogeneity, namely, data distributions across participants are different from each other. Meanwhile, recent advances in the interpretation of neural networks have seen a wide use of neural tangent kernel (NTK) for convergence and generalization analyses. In this paper, we propose a novel FL paradigm empowered by the NTK framework. The proposed paradigm addresses the challenge of statistical heterogeneity by transmitting update data that are more expressive than those of the traditional FL paradigms. Specifically, sample-wise Jacobian matrices, rather than model weights/gradients, are uploaded by participants. The server then constructs an empirical kernel matrix to update a global model without explicitly performing gradient descent. We further develop a variant with improved communication efficiency and enhanced privacy. Numerical results show that the proposed paradigm can achieve the same accuracy while reducing the number of communication rounds by an order of magnitude compared to federated averaging.
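To make the workflow concrete, below is a minimal sketch of one round of the NTK-style exchange described in the abstract, under simplifying assumptions: a single-output model, MSE loss, and the linearization f(x; w) ≈ f(x; w0) + J(x)(w - w0). Each client uploads its sample-wise Jacobian and residuals; the server stacks them, builds the empirical kernel K = J Jᵀ, and obtains the weight update equivalent to many steps of gradient descent on the linearized model in closed form via the eigendecomposition of K, rather than iterating explicitly. All names (client_update, server_update, etc.) are illustrative and not taken from the paper's code; the finite-difference Jacobian is only a stand-in for automatic differentiation.

```python
# Hypothetical sketch of one NTK-based FL round (not the authors' implementation).
import numpy as np

def client_update(model_fn, w0, x_batch, y_batch, eps=1e-4):
    """Client side: per-sample Jacobian J (n x p), predictions f0, and labels.
    Finite differences stand in for autodiff in this toy example."""
    n, p = x_batch.shape[0], w0.size
    f0 = np.array([model_fn(x, w0) for x in x_batch])
    J = np.zeros((n, p))
    for j in range(p):
        w_pert = w0.copy()
        w_pert[j] += eps
        f_pert = np.array([model_fn(x, w_pert) for x in x_batch])
        J[:, j] = (f_pert - f0) / eps
    return J, f0, y_batch

def server_update(w0, uploads, lr=0.02, steps=200):
    """Server side: build the empirical kernel K = J J^T and apply the
    closed-form effect of `steps` gradient-descent iterations on the
    linearized model, so no explicit weight-space descent is run."""
    J = np.vstack([u[0] for u in uploads])
    f0 = np.concatenate([u[1] for u in uploads])
    y = np.concatenate([u[2] for u in uploads])
    K = J @ J.T                          # empirical NTK matrix (n_total x n_total)
    evals, evecs = np.linalg.eigh(K)
    evals = np.clip(evals, 1e-10, None)  # K is PSD; guard tiny/negative eigenvalues
    # sum_{s<steps} lr * (1 - lr*lambda)^s = (1 - (1 - lr*lambda)^steps) / lambda,
    # i.e. the accumulated GD update per kernel eigenmode (needs lr < 2/lambda_max).
    coef = (1.0 - (1.0 - lr * evals) ** steps) / evals
    delta_w = J.T @ (evecs @ (coef * (evecs.T @ (y - f0))))
    return w0 + delta_w

# Toy usage: a linear model f(x; w) = w @ x shared by two synthetic clients.
rng = np.random.default_rng(0)
w_true, w0 = rng.normal(size=3), np.zeros(3)
uploads = []
for _ in range(2):
    x = rng.normal(size=(8, 3))
    uploads.append(client_update(lambda xi, w: w @ xi, w0, x, x @ w_true))
w_new = server_update(w0, uploads)   # recovers w_true for this linear toy problem
```

In this sketch the per-client payload is the n × p Jacobian block rather than a p-dimensional gradient, which is what the abstract means by transmitting more expressive update data; the paper's communication-efficient variant compresses this payload, which is omitted here.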

