Manifold Characteristics That Predict Downstream Task Performance

05/16/2022
by Ruan van der Merwe, et al.

Pretraining methods are typically compared by evaluating the accuracy of linear classifiers, by transfer learning performance, or by visually inspecting lower-dimensional projections of the representation manifold (RM). We show that the differences between methods can be understood in more detail by investigating the RM directly. To this end, we propose a framework and a new metric for measuring and comparing RMs. We also investigate and report the RM characteristics of various pretraining methods. These characteristics are measured by applying progressively larger local alterations to the input data, using white-noise injections and Projected Gradient Descent (PGD) adversarial attacks, and then tracking each datapoint's representation. For every datapoint we compute the total distance its representation moves and the relative change in that distance between successive alterations. We show that self-supervised methods learn an RM in which alterations lead to large but constant-size changes, indicating a smoother RM than that of fully supervised methods. We then combine these measurements into a single metric, the Representation Manifold Quality Metric (RMQM), in which larger values indicate larger and less variable step sizes, and we show that RMQM correlates positively with performance on downstream tasks.
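
Below is a minimal sketch of the white-noise part of the measurement procedure described in the abstract. The encoder, the noise magnitudes, the use of Euclidean distance between successive perturbed representations, and the final aggregation formula are all illustrative assumptions, not the paper's exact definitions; the PGD-based variant is omitted because it depends on model gradients.

```python
import numpy as np

def rm_step_measurements(encoder, x, noise_levels, rng=None):
    """Track how one datapoint's representation moves under progressively
    larger white-noise alterations of the input.

    encoder      : callable mapping an input array to a representation vector
                   (stand-in for a pretrained feature extractor; assumed here)
    x            : a single input sample, e.g. an image as a float array
    noise_levels : increasing noise standard deviations, e.g. [0.01, 0.05, 0.1]
    """
    rng = np.random.default_rng() if rng is None else rng

    z_prev = encoder(x)  # representation of the clean input
    step_sizes = []
    for sigma in noise_levels:
        x_alt = x + rng.normal(0.0, sigma, size=x.shape)   # white-noise injection
        z_alt = encoder(x_alt)
        step_sizes.append(np.linalg.norm(z_alt - z_prev))  # distance moved this step
        z_prev = z_alt

    step_sizes = np.asarray(step_sizes)
    total_distance = step_sizes.sum()
    # relative change in distance between successive alterations
    rel_changes = np.abs(np.diff(step_sizes)) / (step_sizes[:-1] + 1e-12)
    return total_distance, rel_changes


def rmqm_like_score(total_distance, rel_changes):
    """Toy aggregate in the spirit of RMQM: reward large total movement and
    penalise variable step sizes. The paper's exact formula may differ."""
    return total_distance / (1.0 + rel_changes.mean())


# Purely illustrative usage with a dummy "encoder":
# encoder = lambda x: x.reshape(-1)[:128]
# total, rel = rm_step_measurements(encoder, np.random.rand(32, 32, 3), [0.01, 0.05, 0.1])
# score = rmqm_like_score(total, rel)
```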

