Neural Tangent Kernel Empowered Federated Learning

by Kai Yue, et al.

Federated learning (FL) is a privacy-preserving paradigm in which multiple participants jointly solve a machine learning problem without sharing raw data. Unlike traditional distributed learning, a unique characteristic of FL is statistical heterogeneity, namely, that data distributions differ across participants. Meanwhile, recent advances in the interpretation of neural networks have seen wide use of the neural tangent kernel (NTK) for convergence and generalization analyses. In this paper, we propose a novel FL paradigm empowered by the NTK framework. The proposed paradigm addresses the challenge of statistical heterogeneity by transmitting update data that are more expressive than those of traditional FL paradigms. Specifically, participants upload sample-wise Jacobian matrices rather than model weights or gradients. The server then constructs an empirical kernel matrix to update the global model without explicitly performing gradient descent. We further develop a variant with improved communication efficiency and enhanced privacy. Numerical results show that the proposed paradigm can achieve the same accuracy while reducing the number of communication rounds by an order of magnitude compared to federated averaging.
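The abstract's core mechanism can be sketched concretely: under NTK linearization, if each client uploads its sample-wise Jacobian J_i, predictions, and labels, the server can stack them into J, form the empirical kernel K = J Jᵀ, and evolve the residual in closed form (r_t = e^{-ηKt} r_0 for squared loss) instead of running explicit gradient descent. The sketch below is a minimal illustration of that idea, not the paper's implementation; the function names and the eigendecomposition-based closed-form update are assumptions for exposition.

```python
import numpy as np

def client_update_data(model_fn, jacobian_fn, w, X, y):
    """Hypothetical client payload: sample-wise Jacobian J_i = df/dw,
    current predictions, and labels -- instead of weights/gradients."""
    return jacobian_fn(w, X), model_fn(w, X), y

def server_ntk_update(payloads, lr=1.0, steps=100, ridge=1e-6):
    """Server-side sketch: stack client Jacobians, build the empirical
    NTK K = J J^T, and apply the linearized-training closed form for
    squared loss rather than iterating gradient descent."""
    J = np.vstack([p[0] for p in payloads])        # (n, d) stacked Jacobians
    f0 = np.concatenate([p[1] for p in payloads])  # (n,) initial predictions
    y = np.concatenate([p[2] for p in payloads])   # (n,) labels
    K = J @ J.T                                    # empirical kernel, (n, n)
    n = K.shape[0]
    # Linearized dynamics: residual r_t = exp(-lr*K*t) r_0, hence
    # delta_w = J^T K^{-1} (I - exp(-lr*K*steps)) (y - f0).
    evals, evecs = np.linalg.eigh(K + ridge * np.eye(n))
    decay = 1.0 - np.exp(-lr * evals * steps)
    r0 = y - f0
    alpha = evecs @ (decay / evals * (evecs.T @ r0))
    return J.T @ alpha                             # global weight update
```

For a linear model f(w, X) = Xw the Jacobian is simply X, so the update reduces to (regularized) kernel regression on the pooled client data, which makes the closed form easy to sanity-check.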




