Compression-Based Regularization with an Application to Multi-Task Learning

11/19/2017
by Matías Vera, et al.

This paper investigates, from information-theoretic grounds, a learning problem based on the principle that any regularity in a given dataset can be exploited to extract compact features from the data, i.e., using fewer bits than needed to fully describe the data itself, in order to build meaningful representations of the relevant content (multiple labels). We begin by introducing the noisy lossy source coding paradigm with the log-loss fidelity criterion, which provides the fundamental tradeoffs between the cross-entropy loss (average risk) and the information rate of the features (model complexity). This approach allows an information-theoretic formulation of the multi-task learning (MTL) problem, a supervised learning framework in which the prediction models for several related tasks are learned jointly from common representations to achieve better generalization performance. We then present an iterative algorithm for computing the optimal tradeoffs and prove its global convergence under suitable conditions. An important property of this algorithm is that it provides a natural safeguard against overfitting, because it minimizes the average risk while accounting for a penalty induced by the model complexity. Remarkably, empirical results illustrate that there exists an optimal information rate minimizing the excess risk, which depends on the nature and the amount of available training data. An application to hierarchical text categorization is also investigated, extending previous works.
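The tradeoff described in the abstract, between the average log-loss (cross-entropy) H(Y|U) of a representation U and its information rate I(X;U), can be explored numerically. The sketch below is a minimal illustration, assuming a single task with a known joint distribution p(x, y) and an information-bottleneck-style alternating-minimization (Blahut-Arimoto) iteration for the equivalent objective I(X;U) - beta * I(Y;U); it is not the authors' exact algorithm, and the function name, the trade-off parameter beta, and the fixed number of representation values are illustrative assumptions.

import numpy as np

def ib_tradeoff(p_xy, n_features, beta, n_iter=300, seed=0):
    # Sketch (not the paper's exact algorithm): alternating updates for
    # min_{p(u|x)} I(X;U) - beta * I(Y;U), which trades the rate I(X;U)
    # against the cross-entropy H(Y|U) since H(Y|U) = H(Y) - I(Y;U).
    rng = np.random.default_rng(seed)
    nx, ny = p_xy.shape
    p_x = p_xy.sum(axis=1)                    # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]         # conditional p(y|x)

    # random soft encoder p(u|x); rows sum to one
    q_u_given_x = rng.random((nx, n_features))
    q_u_given_x /= q_u_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        q_u = p_x @ q_u_given_x               # marginal p(u)
        q_xu = q_u_given_x * p_x[:, None]     # joint p(x, u)
        q_y_given_u = (q_xu.T @ p_y_given_x) / q_u[:, None]   # decoder p(y|u)

        # KL(p(y|x) || p(y|u)) for every (x, u) pair
        log_ratio = (np.log(p_y_given_x[:, None, :] + 1e-12)
                     - np.log(q_y_given_u[None, :, :] + 1e-12))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)

        # encoder update: p(u|x) proportional to p(u) * exp(-beta * KL)
        q_u_given_x = q_u[None, :] * np.exp(-beta * kl)
        q_u_given_x /= q_u_given_x.sum(axis=1, keepdims=True)

    # report the rate I(X;U) and the average log-loss H(Y|U)
    q_u = p_x @ q_u_given_x
    rate = (q_u_given_x * p_x[:, None]
            * (np.log(q_u_given_x + 1e-12) - np.log(q_u[None, :] + 1e-12))).sum()
    q_uy = (q_u_given_x * p_x[:, None]).T @ p_y_given_x       # joint p(u, y)
    q_y_given_u = q_uy / q_u[:, None]
    cross_entropy = -(q_uy * np.log(q_y_given_u + 1e-12)).sum()
    return q_u_given_x, rate, cross_entropy

Sweeping beta traces an empirical rate-versus-risk curve; in the spirit of the abstract's observation, the rate that minimizes the excess risk on held-out data would then be located along this curve rather than at either extreme.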

Related research

06/06/2017
Classifying Documents within Multiple Hierarchical Datasets using Multi-Task Learning
Multi-task learning (MTL) is a supervised learning paradigm in which the...

05/17/2016
Recurrent Neural Network for Text Classification with Multi-Task Learning
Neural network based methods have obtained great progress on a variety o...

07/10/2017
A Generalized Recurrent Neural Architecture for Text Classification with Multi-Task Learning
Multi-task learning leverages potential correlations among related tasks...

12/22/2018
Universal Supervised Learning for Individual Data
Universal supervised learning is considered from an information theoreti...

01/27/2019
Information-Theoretic Understanding of Population Risk Improvement with Model Compression
We show that model compression can improve the population risk of a pre-...

12/16/2022
Coded Distributed Computing for Hierarchical Multi-task Learning
In this paper, we consider a hierarchical distributed multi-task learnin...

05/31/2018
Minimax Learning for Remote Prediction
The classical problem of supervised learning is to infer an accurate pre...
