The Information Complexity of Learning Tasks, their Structure and their Distance

04/05/2019
by Alessandro Achille, et al.

We introduce an asymmetric distance in the space of learning tasks, and a framework to compute their complexity. These concepts are foundational to the practice of transfer learning, ubiquitous in Deep Learning, whereby a parametric model is pre-trained for one task and then fine-tuned for another. The framework we develop is intrinsically non-asymptotic, capturing the finite nature of the training dataset, yet it allows distinguishing learning from memorization. It encompasses, as special cases, classical notions from Kolmogorov complexity and from Shannon and Fisher Information. However, unlike some of those frameworks, it can be applied easily to large-scale models and real-world datasets. It is the first framework to explicitly account for the optimization scheme, which plays a crucial role in Deep Learning, when measuring complexity and information.
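As a rough illustration of one ingredient named in the abstract, the sketch below estimates the diagonal of the Fisher Information of a small classifier's weights, whose trace is sometimes used as a scalar proxy for the complexity of what the network has learned. The model, the toy data, and the Monte-Carlo estimator are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming a PyTorch classifier and toy data; it is not
# the paper's method, only an illustration of the Fisher Information of the
# weights, one of the classical notions the framework encompasses.
import torch
import torch.nn as nn
import torch.nn.functional as F

def diagonal_fisher(model, inputs):
    """Monte-Carlo estimate of the diagonal Fisher Information.

    For each input we sample a label from the model's own predictive
    distribution and accumulate squared gradients of the log-likelihood;
    in expectation this equals the diagonal of the Fisher matrix.
    """
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x in inputs:
        model.zero_grad()
        logits = model(x.unsqueeze(0))
        # Sample y ~ p_w(y | x) rather than using a ground-truth label:
        # this yields the Fisher, not the empirical information.
        y = torch.multinomial(F.softmax(logits, dim=-1), 1).squeeze(1)
        loss = F.cross_entropy(logits, y)
        loss.backward()
        for n, p in model.named_parameters():
            fisher[n] += p.grad.detach() ** 2
    return {n: f / len(inputs) for n, f in fisher.items()}

if __name__ == "__main__":
    torch.manual_seed(0)
    # Hypothetical toy model and data, stand-ins for a real task.
    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))
    data = torch.randn(32, 8)
    fisher = diagonal_fisher(model, data)
    # The total trace serves as a crude scalar complexity proxy.
    print(sum(f.sum().item() for f in fisher.values()))
```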


Related research

10/04/2018  The Dynamics of Differential Learning I: Information-Dynamics and Task Reachability
We study the topology of the space of learning tasks, which is critical ...

03/22/2017  Knowledge Transfer for Melanoma Screening with Deep Learning
Knowledge transfer impacts the performance of deep learning -- the state...

02/28/2017  Borrowing Treasures from the Wealthy: Deep Transfer Learning through Selective Joint Fine-tuning
Deep neural networks require a large amount of labeled training data dur...

03/16/2023  Learning for Amalgamation: A Multi-Source Transfer Learning Framework For Sentiment Classification
Transfer learning plays an essential role in Deep Learning, which can re...

09/14/2023  Efficiently Robustify Pre-trained Models
A recent trend in deep learning algorithms has been towards training lar...
