No Representation without Transformation

12/09/2019
by Giorgio Giannone, et al.

We propose to extend Latent Variable Models with a simple idea: learn to encode not only samples but also transformations of those samples. As a result, the latent space is populated not only by embeddings but also by higher-order objects that map between these embeddings. We show how a hierarchical graphical model can be used to enforce desirable algebraic properties of such latent mappings. These mappings in turn structure the latent space and can therefore substantially affect downstream tasks solved in that space. We demonstrate this impact in a set of experiments and also show that the representation of these latent mappings reflects interpretable properties.
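To make the idea concrete, here is a minimal toy sketch of what "encoding a transformation" can mean. All names and the additive parameterization of the latent mapping are illustrative assumptions, not the paper's actual model: a sample `x` and its transformed version are both embedded, and the latent mapping is the object that carries one embedding to the other, with simple algebraic properties (invertibility) holding by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # hypothetical latent dimensionality

def encode(x):
    # Stand-in encoder: a fixed linear map. In the paper this would be a
    # learned inference network, not a hard-coded matrix.
    W = np.arange(dim * dim).reshape(dim, dim) / 10.0
    return W @ x

x = rng.normal(size=dim)
t_x = x + 1.0  # a simple input-space transformation: shift every coordinate by 1

z_x, z_tx = encode(x), encode(t_x)

# Model the latent mapping m as an additive vector (one simple choice of
# "higher-order object"; the paper's parameterization may differ). It is
# inferred so that applying it to z_x recovers z_tx.
m = z_tx - z_x

# Algebraic sanity checks of the kind a hierarchical model can enforce:
# m moves z_x to z_tx, and composing m with its inverse is the identity.
assert np.allclose(z_x + m, z_tx)
assert np.allclose((z_x + m) - m, z_x)
```

With an additive parameterization, composition of mappings is vector addition and inversion is negation, which is what makes the algebraic constraints trivial to check here; a learned model would have to be trained to respect them.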

