Compression based bound for non-compressed network: unified generalization error analysis of large compressible deep neural network

09/25/2019
by Taiji Suzuki, et al.

One of the biggest issues in deep learning theory is explaining generalization ability despite huge model sizes. Classical learning theory suggests that overparameterized models should overfit, yet the large deep models used in practice avoid overfitting, which the classical approaches do not explain well. Several attempts have been made to resolve this issue; among them, the compression-based bound is one of the most promising. However, the compression-based bound applies only to the compressed network, not to the non-compressed original network. In this paper, we give a unified framework that converts compression-based bounds into bounds for the non-compressed original network. The resulting bound achieves an even better rate than the one for the compressed network by improving the bias term. With this unified framework, we obtain a data-dependent generalization error bound that gives a tighter evaluation than data-independent ones.
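To illustrate the kind of bound being converted, a generic compression-based bound from the literature has roughly the following shape. This is an illustrative sketch under stated assumptions, not this paper's exact statement: suppose the original network $f$ compresses to $\hat{f}$ with $d$ parameters, each taking one of $q$ discrete values, while changing outputs by at most $\epsilon$ on the training data. Then, with high probability,

```latex
% Illustrative generic compression-based bound (not this paper's statement):
% R is the population risk, \widehat{R}_n the empirical risk on n samples.
R(\hat{f}) \;\le\; \widehat{R}_n(f) \;+\; \epsilon
  \;+\; O\!\left(\sqrt{\frac{d \log q}{n}}\right).
```

Note that the left-hand side controls only the compressed network $\hat{f}$, not the original $f$; bridging that gap for the non-compressed network is exactly what the framework described in the abstract addresses.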


Related research

04/16/2018: Compressibility and Generalization in Large-Scale Deep Learning
Modern neural networks are highly overparameterized, with capacity to su...

11/24/2022: PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization
While there has been progress in developing non-vacuous generalization b...

08/26/2018: Spectral-Pruning: Compressing deep neural network via spectral analysis
The model size of deep neural network is getting larger and larger to re...

06/15/2021: Compression Implies Generalization
Explaining the surprising generalization performance of deep neural netw...

01/30/2023: Compression, Generalization and Learning
A compression function is a map that slims down an observational set int...

11/17/2020: Analyzing and Mitigating Compression Defects in Deep Learning
With the proliferation of deep learning methods, many computer vision pr...

10/15/2019: REVE: Regularizing Deep Learning with Variational Entropy Bound
Studies on generalization performance of machine learning algorithms und...
