A picture of the space of typical learnable tasks

10/31/2022
by Rahul Ramesh, et al.

We develop a technique to analyze the representations learned by deep networks when they are trained on different tasks using supervised, meta-, and contrastive learning. We visualize these representations using an isometric embedding of the space of probabilistic models into a lower-dimensional space, i.e., an embedding that preserves pairwise distances. We discover the following surprising phenomena, which shed light upon the structure of the space of learnable tasks: (1) the manifold of probabilistic models trained on different tasks using different representation-learning methods is effectively low-dimensional; (2) supervised learning on one task yields a surprising amount of progress on seemingly dissimilar tasks, and this progress is larger when the training task has diverse classes; (3) the structure of the space of tasks indicated by our analysis is consistent with parts of the Wordnet phylogenetic tree; (4) fine-tuning a model on a sub-task does not change its representation much if the model was trained for a large number of epochs; (5) episodic meta-learning algorithms eventually fit models similar to those obtained by supervised learning, even if the two traverse different trajectories during training; (6) contrastive learning methods trained on different datasets learn similar representations. We study these phenomena using classification tasks constructed from the CIFAR-10 and ImageNet datasets.


Related research

05/02/2023 · The Training Process of Many Deep Networks Explores the Same Low-Dimensional Manifold
We develop information-geometric techniques to analyze the trajectories ...

02/05/2023 · On the Role of Contrastive Representation Learning in Adversarial Robustness: An Empirical Study
Self-supervised contrastive learning has solved one of the significant o...

09/16/2022 · MetaMask: Revisiting Dimensional Confounder for Self-Supervised Learning
As a successful approach to self-supervised learning, contrastive learni...

09/19/2023 · Graph Contrastive Learning Meets Graph Meta Learning: A Unified Method for Few-shot Node Tasks
Graph Neural Networks (GNNs) have become popular in Graph Representation...

05/21/2023 · Many or Few Samples? Comparing Transfer, Contrastive and Meta-Learning in Encrypted Traffic Classification
The popularity of Deep Learning (DL), coupled with network traffic visib...

02/27/2023 · Analyzing Populations of Neural Networks via Dynamical Model Embedding
A core challenge in the interpretation of deep neural networks is identi...

10/04/2018 · The Dynamics of Differential Learning I: Information-Dynamics and Task Reachability
We study the topology of the space of learning tasks, which is critical ...
