Invariant-equivariant representation learning for multi-class data

02/08/2019
by Ilya Feige, et al.

Representations learnt through deep neural networks tend to be highly informative, but opaque in terms of what information they learn to encode. We introduce an approach to probabilistic modelling that learns to represent data with two separate deep representations: an invariant representation that encodes the information of the class to which the data belongs, and an equivariant representation that encodes the symmetry transformation defining the particular data point within the class manifold (equivariant in the sense that the representation varies naturally with symmetry transformations). This approach is based primarily on the strategic routing of data through the two latent variables, and thus is conceptually transparent, easy to implement, and in principle generally applicable to any data composed of discrete classes of continuous distributions (e.g. objects in images, topics in language, individuals in behavioural data). We demonstrate qualitatively compelling representation learning and competitive quantitative performance, in both supervised and semi-supervised settings, versus comparable modelling approaches in the literature with little fine-tuning.
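The routing idea described in the abstract can be illustrated with a minimal VAE-style sketch. This is not the authors' exact architecture; all layer sizes, names, and design choices below are illustrative assumptions. The key point it demonstrates is the split into two latent variables: a discrete, class-identifying (invariant) code y and a continuous (equivariant) code z for the within-class transformation, with the decoder reconstructing the input from both jointly.

```python
# Hypothetical sketch of invariant/equivariant latent routing (illustrative,
# not the paper's exact model). Dimensions and layer choices are assumptions.
import torch
import torch.nn as nn

class InvariantEquivariantVAE(nn.Module):
    def __init__(self, x_dim=784, n_classes=10, z_dim=16, h_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        # Invariant head: class logits (discrete latent y encodes class identity)
        self.class_head = nn.Linear(h_dim, n_classes)
        # Equivariant head: Gaussian parameters for the transformation latent z
        self.mu_head = nn.Linear(h_dim, z_dim)
        self.logvar_head = nn.Linear(h_dim, z_dim)
        # Decoder reconstructs x from the concatenation of (y, z)
        self.decoder = nn.Sequential(
            nn.Linear(n_classes + z_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.backbone(x)
        y_logits = self.class_head(h)                 # invariant: class code
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterization trick: sample z differentiably
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        y = torch.softmax(y_logits, dim=-1)           # soft class assignment
        x_hat = self.decoder(torch.cat([y, z], dim=-1))
        return x_hat, y_logits, mu, logvar
```

In a supervised setting the class head can be trained against labels, while in the semi-supervised case unlabelled data contributes only through the reconstruction and KL terms; either way, class information is forced through y and everything else through z.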

Related research:

- Equivariant Representation Learning via Class-Pose Decomposition (07/07/2022): We introduce a general method for learning representations that are equi...
- Equivariant Hamiltonian Flows (09/30/2019): This paper introduces equivariant hamiltonian flows, a method for learni...
- Probabilistic symmetry and invariant neural networks (01/18/2019): In an effort to improve the performance of deep neural networks in data-...
- Discrete Variational Autoencoders (09/07/2016): Probabilistic models with discrete latent variables naturally capture da...
- How does the degree of novelty impact semi-supervised representation learning for novel class retrieval? (08/17/2022): Supervised representation learning with deep networks tends to overfit t...
- Interpretable Sentence Representation with Variational Autoencoders and Attention (05/04/2023): In this thesis, we develop methods to enhance the interpretability of re...
- A Separation Principle for Control in the Age of Deep Learning (11/09/2017): We review the problem of defining and inferring a "state" for a control ...
