Relative representations enable zero-shot latent space communication

09/30/2022
by Luca Moschella et al.

Neural networks embed the geometric structure of a data manifold lying in a high-dimensional space into latent representations. Ideally, the distribution of the data points in the latent space should depend only on the task, the data, the loss, and other architecture-specific constraints. However, factors such as the random weight initialization, training hyperparameters, or other sources of randomness in the training phase may induce incoherent latent spaces that hinder any form of reuse. Nevertheless, we empirically observe that, under the same data and modeling choices, distinct latent spaces typically differ by an unknown quasi-isometric transformation; that is, the pairwise distances between the encodings are approximately preserved from one space to the other. In this work, we propose to adopt pairwise similarities as an alternative data representation that can be used to enforce the desired invariance without any additional training. We show how neural architectures can leverage these relative representations to guarantee, in practice, invariance to latent isometries, effectively enabling latent space communication: from zero-shot model stitching to latent space comparison across diverse settings. We extensively validate the generalization capability of our approach on different datasets, spanning various modalities (images, text, graphs), tasks (e.g., classification, reconstruction), and architectures (e.g., CNNs, GCNs, transformers).
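
The construction behind these relative representations is simple enough to sketch. Below is a minimal NumPy illustration, not the authors' reference code (the function and variable names are ours): each sample is re-described by its cosine similarities to a fixed set of anchor samples, which makes the resulting representation invariant to rotations, reflections, and rescalings of the absolute latent space.

    import numpy as np

    def relative_representation(embeddings, anchor_embeddings):
        """Re-encode each sample as its cosine similarities to the anchors.

        embeddings:        (n, d) encoder outputs for the data points
        anchor_embeddings: (k, d) encoder outputs for the anchor set
        returns:           (n, k) matrix; row i = cos(x_i, a_1), ..., cos(x_i, a_k)
        """
        # L2-normalize both sets so a plain dot product equals cosine similarity.
        x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        a = anchor_embeddings / np.linalg.norm(anchor_embeddings, axis=1, keepdims=True)
        return x @ a.T

    # Sanity check: the relative representation is unchanged when the whole
    # latent space is rotated/reflected by a random orthogonal map Q.
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(100, 64))                   # stand-in absolute latents
    A = Z[:10]                                       # a fixed subset serves as anchors
    Q, _ = np.linalg.qr(rng.normal(size=(64, 64)))   # random orthogonal transform
    assert np.allclose(relative_representation(Z, A),
                       relative_representation(Z @ Q, A @ Q))

Because downstream layers consume these similarities rather than raw coordinates, two networks trained with different seeds can produce compatible representations as long as they share the anchors, which is what enables the zero-shot stitching described above.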

Related research

03/01/2023 · Bootstrapping Parallel Anchors for Relative Representations
The use of relative representations for latent embeddings has shown pote...

12/26/2017 · Zero-Shot Learning via Latent Space Encoding
Zero-Shot Learning (ZSL) is typically achieved by resorting to a class s...

04/17/2018 · Learning Sparse Latent Representations with the Deep Copula Information Bottleneck
Deep latent variable models are powerful tools for representation learni...

07/12/2021 · Structured Latent Embeddings for Recognizing Unseen Classes in Unseen Domains
The need to address the scarcity of task-specific annotated data has res...

10/29/2020 · Latent Space Oddity: Exploring Latent Spaces to Design Guitar Timbres
We introduce a novel convolutional network architecture with an interpre...

08/16/2021 · Structure-Aware Feature Generation for Zero-Shot Learning
Zero-Shot Learning (ZSL) targets at recognizing unseen categories by lev...

03/28/2019 · Toroidal AutoEncoder
Enforcing distributions of latent variables in neural networks is an act...
