Nonequilibrium thermodynamics of self-supervised learning

06/16/2021
by Domingos S. P. Salazar, et al.

Self-supervised learning (SSL) of energy-based models has an intuitive relation to equilibrium thermodynamics because the softmax layer, which maps energies to probabilities, is a Gibbs distribution. But in what sense is SSL itself a thermodynamic process? We show that some SSL paradigms behave as a thermodynamic composite system, formed by representations and self-labels, in contact with a nonequilibrium reservoir. Moreover, this system is subjected to the usual thermodynamic cycles, such as adiabatic expansion and isochoric heating, resulting in a generalized Gibbs ensemble (GGE). In this picture, we show that learning acts as a demon that operates in cycles, using feedback measurements to extract negative work from the system. As applications, we examine several SSL algorithms in light of this idea.
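The abstract's starting point, that a softmax over energies is a Gibbs distribution, can be illustrated directly. The following minimal Python sketch is not taken from the paper; the energy values, the inverse temperature `beta`, and the function name `gibbs_from_energies` are illustrative assumptions. It only shows that softmax applied to -beta * E reproduces the Boltzmann weights p_i = exp(-beta * E_i) / Z.

```python
import numpy as np

def gibbs_from_energies(energies, beta=1.0):
    """Map a vector of energies to probabilities via a softmax.

    This is exactly a Gibbs (Boltzmann) distribution at inverse
    temperature beta: p_i = exp(-beta * E_i) / Z.
    """
    logits = -beta * np.asarray(energies, dtype=float)
    logits -= logits.max()          # shift for numerical stability (cancels in the ratio)
    weights = np.exp(logits)
    return weights / weights.sum()  # normalize by the partition function Z

# Example: lower-energy states get higher probability, and a larger beta
# (i.e., a lower temperature) sharpens the distribution.
energies = [1.0, 2.0, 4.0]
print(gibbs_from_energies(energies, beta=1.0))
print(gibbs_from_energies(energies, beta=5.0))
```

In this reading, the temperature-like parameter controls how sharply the probabilities concentrate on low-energy configurations, which is the hook the paper uses to treat the softmax layer thermodynamically.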
