An Information-Theoretic Analysis for Transfer Learning: Error Bounds and Applications

07/12/2022
by Xuetong Wu, et al.

Transfer learning, or domain adaptation, is concerned with machine learning problems in which training and testing data come from possibly different probability distributions. In this work, we give an information-theoretic analysis of the generalization error and excess risk of transfer learning algorithms, following a line of work initiated by Russo and Xu. Our results suggest, perhaps as expected, that the Kullback-Leibler (KL) divergence D(μ||μ') plays an important role in the characterizations, where μ and μ' denote the distributions of the training and testing data, respectively. Specifically, we provide generalization error upper bounds for the empirical risk minimization (ERM) algorithm when data from both distributions are available in the training phase. We further apply the analysis to approximate ERM methods such as the Gibbs algorithm and stochastic gradient descent. We then generalize the mutual information bound using ϕ-divergence and the Wasserstein distance; these generalizations lead to tighter bounds and can handle the case in which μ is not absolutely continuous with respect to μ'. Furthermore, we apply a new set of techniques to obtain an alternative upper bound that gives a fast (and optimal) learning rate for some learning problems. Finally, inspired by the derived bounds, we propose the InfoBoost algorithm, in which the importance weights for source and target data are adjusted adaptively in accordance with information measures. Empirical results show the effectiveness of the proposed algorithm.
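For context, the Russo and Xu line of analysis bounds the expected generalization error through the mutual information I(W;S) between the learned hypothesis W and the training sample S. The first display below is that standard bound for a σ-subgaussian loss; the second is only a schematic sketch of how a mismatch term involving D(μ||μ') can enter in the transfer setting, not the paper's exact statement (constants and weighting are illustrative).

    % Standard mutual information bound (Russo--Xu / Xu--Raginsky): if the loss
    % \ell(w, Z) is \sigma-subgaussian for every w and S consists of n i.i.d. samples,
    \[
      \bigl|\mathbb{E}\bigl[\operatorname{gen}(\mu, P_{W\mid S})\bigr]\bigr|
        \;\le\; \sqrt{\frac{2\sigma^{2}\, I(W;S)}{n}} .
    \]
    % Schematic transfer variant (illustrative only): with mismatched training and
    % testing distributions, an extra term driven by D(\mu \| \mu') typically appears, e.g.
    \[
      \bigl|\mathbb{E}[\operatorname{gen}]\bigr|
        \;\lesssim\; \sqrt{2\sigma^{2}\Bigl(D(\mu \,\|\, \mu') + \tfrac{I(W;S)}{n}\Bigr)} .
    \]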
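The InfoBoost algorithm reweights source and target examples adaptively using information measures; that update rule is specific to the paper and is not reproduced here. As a minimal sketch of the underlying idea of importance-weighted ERM over pooled source and target data, the following hypothetical Python snippet (the names alpha, X_src, X_tgt, etc. are placeholders, not the paper's notation) fits a weighted classifier with scikit-learn:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def weighted_erm(X_src, y_src, X_tgt, y_tgt, alpha=0.5):
        """Importance-weighted ERM on pooled source and target data.

        alpha in [0, 1] is the total weight placed on the source sample.
        The paper's InfoBoost adapts per-sample weights via information
        measures; that rule is not reproduced here.
        """
        X = np.vstack([X_src, X_tgt])
        y = np.concatenate([y_src, y_tgt])
        # Spread weight alpha over source points and (1 - alpha) over target points.
        w = np.concatenate([
            np.full(len(y_src), alpha / len(y_src)),
            np.full(len(y_tgt), (1.0 - alpha) / len(y_tgt)),
        ])
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X, y, sample_weight=w)
        return clf

Sweeping alpha trades off the bias from the source/target mismatch (the D(μ||μ') term) against the variance reduction from the larger pooled sample, which is the trade-off the derived bounds quantify.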


Related research

- Information-theoretic analysis for transfer learning (05/18/2020)
- Information-Theoretic Bounds on Transfer Generalization Gap Based on Jensen-Shannon Divergence (10/13/2020)
- Multicalibrated Partitions for Importance Weights (03/10/2021)
- On the tightness of information-theoretic bounds on generalization error of learning algorithms (03/26/2023)
- Fast Rate Generalization Error Bounds: Variations on a Theme (05/06/2022)
- Transfer Meta-Learning: Information-Theoretic Bounds and Information Meta-Risk Minimization (11/04/2020)
- Stochastic Chaining and Strengthened Information-Theoretic Generalization Bounds (01/28/2022)
