Bootstrapping Parallel Anchors for Relative Representations

03/01/2023
by Irene Cannistraci, et al.

The use of relative representations for latent embeddings has shown potential in enabling latent space communication and zero-shot model stitching across a wide range of applications. Nevertheless, relative representations rely on a set of parallel anchors being given as input, which can be impractical to obtain in certain scenarios. To overcome this limitation, we propose an optimization-based method to discover new parallel anchors from a limited number of seeds. Our approach can be used to find semantic correspondences between different domains, align their relative spaces, and achieve competitive results on several tasks.
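To make the underlying idea concrete, the sketch below illustrates how a relative representation works: each embedding is re-expressed as its cosine similarities to a small set of anchor embeddings, which makes the representation invariant to rotations of the latent space. This is a minimal illustration under simplifying assumptions (synthetic data, an orthogonal transform standing in for two independently trained models), not the authors' implementation; the function and variable names are hypothetical.

```python
import numpy as np

def relative_representation(X, anchors):
    """Map absolute embeddings X (n, d) to cosine similarities with
    anchor embeddings (k, d), giving an (n, k) relative representation."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    An = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return Xn @ An.T

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))              # embeddings from "model A"
Q, _ = np.linalg.qr(rng.normal(size=(16, 16)))
Y = X @ Q                                   # same data in "model B"'s rotated space

anchor_idx = [0, 1, 2, 3, 4]                # parallel anchors: the same items in both spaces
rel_a = relative_representation(X, X[anchor_idx])
rel_b = relative_representation(Y, Y[anchor_idx])

# The relative spaces coincide even though the absolute ones differ,
# which is what enables latent space communication between the models.
print(np.allclose(rel_a, rel_b))  # True
```

The paper's contribution addresses the step this toy example takes for granted: obtaining the parallel anchors themselves, which it bootstraps from a few seed correspondences via optimization rather than assuming they are given.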


Related research:

- Relative representations enable zero-shot latent space communication (09/30/2022)
- Structured Latent Embeddings for Recognizing Unseen Classes in Unseen Domains (07/12/2021)
- Learning Multilingual Word Embeddings in Latent Metric Space: A Geometric Approach (08/27/2018)
- Renderers are Good Zero-Shot Representation Learners: Exploring Diffusion Latents for Metric Learning (06/19/2023)
- ZeroNLG: Aligning and Autoencoding Domains for Zero-Shot Multimodal and Multilingual Natural Language Generation (03/11/2023)
- Learning an Artificial Language for Knowledge-Sharing in Multilingual Translation (11/02/2022)
