Improved Information Theoretic Generalization Bounds for Distributed and Federated Learning

02/04/2022
by L. P. Barnes, et al.

We consider information-theoretic bounds on expected generalization error for statistical learning problems in a networked setting. In this setting, there are K nodes, each with its own independent dataset, and the models from each node have to be aggregated into a final centralized model. We consider both simple averaging of the models as well as more complicated multi-round algorithms. We give upper bounds on the expected generalization error for a variety of problems, such as those with Bregman divergence or Lipschitz continuous losses, that demonstrate an improved dependence of 1/K on the number of nodes. These "per node" bounds are in terms of the mutual information between the training dataset and the trained weights at each node, and are therefore useful in describing the generalization properties inherent to having communication or privacy constraints at each node.
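To put the "per node" structure in context: the classical information-theoretic generalization bound of Xu and Raginsky states that if the loss is σ-sub-Gaussian and a learner maps a dataset S of n i.i.d. samples to a hypothesis W, then

\[
\bigl|\mathbb{E}[\mathrm{gen}(S, W)]\bigr| \;\le\; \sqrt{\frac{2\sigma^{2}\, I(S; W)}{n}}.
\]

Schematically, and only as an illustrative paraphrase (the paper's exact theorems, constants, and conditions vary by result), the distributed bounds described above combine per-node terms with a 1/K weight: if node k trains weights W_k on its own dataset S_k of n samples and the server averages \(\bar{W} = \frac{1}{K}\sum_{k} W_k\), then

\[
\bigl|\mathbb{E}[\mathrm{gen}]\bigr| \;\lesssim\; \frac{1}{K} \sum_{k=1}^{K} \sqrt{\frac{2\sigma^{2}\, I(S_k; W_k)}{n}},
\]

so a communication or privacy constraint that caps each I(S_k; W_k) enters the final bound attenuated by the averaging step.

The one-shot "simple averaging" aggregation itself is easy to picture in code. The following is a minimal Python sketch under assumptions of our own (the linear-regression task and all function and variable names are illustrative, not the paper's setup): K nodes each fit a model on an independent local dataset, and the server averages the resulting weight vectors.

import numpy as np

def local_fit(X, y):
    # Ordinary least-squares fit at a single node (illustrative local learner).
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
K, n, d = 10, 50, 5          # K nodes, n samples per node, d features
w_true = rng.normal(size=d)  # ground-truth weights shared across nodes

# Each node k draws its own independent dataset S_k and trains W_k locally.
local_weights = []
for _ in range(K):
    X = rng.normal(size=(n, d))
    y = X @ w_true + rng.normal(scale=0.5, size=n)
    local_weights.append(local_fit(X, y))

# One-shot aggregation: the centralized model is the simple average of the W_k.
w_bar = np.mean(local_weights, axis=0)
print("aggregation error:", np.linalg.norm(w_bar - w_true))

Since the K local estimation errors are independent and zero-mean in this setup, averaging shrinks the error of the aggregated model relative to any single W_k, which is the intuition behind bounds that improve with K.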
