Information-Theoretic Understanding of Population Risk Improvement with Model Compression

01/27/2019
by   Yuheng Bu, et al.

We show that model compression can improve the population risk of a pre-trained model, by studying the tradeoff between the decrease in the generalization error and the increase in the empirical risk with model compression. We first prove that model compression reduces an information-theoretic bound on the generalization error; this allows for an interpretation of model compression as a regularization technique to avoid overfitting. We then characterize the increase in empirical risk with model compression using rate distortion theory. These results imply that the population risk could be improved by model compression if the decrease in generalization error exceeds the increase in empirical risk. We show through a linear regression example that such a decrease in population risk due to model compression is indeed possible. Our theoretical results further suggest that the Hessian-weighted K-means clustering compression approach can be improved by regularizing the distance between the clustering centers. We provide experiments with neural networks to support our theoretical assertions.


