Learning the Space of Deep Models

06/10/2022
by Gianluca Berardi, et al.

Embedding large but redundant data, such as images or text, into a hierarchy of lower-dimensional spaces is one of the key features of representation learning approaches, which nowadays provide state-of-the-art solutions to problems once believed hard or impossible to solve. In this work, in a plot twist with a strong meta aftertaste, we show that trained deep models are as redundant as the data they are optimized to process, and that it is therefore possible to use deep learning models to embed deep learning models. In particular, we show that representation learning can be used to learn a fixed-size, low-dimensional embedding space of trained deep models, and that such a space can be explored by interpolation or optimization to attain ready-to-use models. We find that it is possible to learn an embedding space of multiple instances of the same architecture as well as of multiple architectures. We address image classification and neural representation of signals, showing how our embedding space can be learnt so as to capture the notions of performance and 3D shape, respectively. In the Multi-Architecture setting, we also show that an embedding trained only on a subset of architectures can generate already-trained instances of architectures never seen instantiated at training time.
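The abstract does not spell out the architecture used to embed trained models, so the snippet below is only a minimal, hypothetical sketch of the general idea: treat each trained model as a flattened weight vector, train a plain autoencoder over a collection of such vectors to obtain a fixed-size, low-dimensional embedding space, and then explore that space by interpolating between embeddings and decoding back to a ready-to-use weight vector. All names, dimensions, and the MLP design are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): an autoencoder over
# flattened weight vectors of trained models, plus latent-space interpolation.
import torch
import torch.nn as nn

WEIGHT_DIM = 10_000   # assumed length of a flattened parameter vector
LATENT_DIM = 64       # assumed size of the learned embedding space

class WeightAutoencoder(nn.Module):
    def __init__(self, weight_dim: int, latent_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(weight_dim, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, weight_dim),
        )

    def forward(self, w: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(w))

def train(model_zoo: torch.Tensor, epochs: int = 100) -> WeightAutoencoder:
    """Fit the autoencoder on a (hypothetical) batch of flattened trained weights,
    shape (num_models, WEIGHT_DIM)."""
    ae = WeightAutoencoder(WEIGHT_DIM, LATENT_DIM)
    opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
    for _ in range(epochs):
        recon = ae(model_zoo)                          # reconstruct every weight vector
        loss = nn.functional.mse_loss(recon, model_zoo)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return ae

def interpolate(ae: WeightAutoencoder, w_a: torch.Tensor, w_b: torch.Tensor,
                alpha: float = 0.5) -> torch.Tensor:
    """Decode a convex combination of two embeddings into a new weight vector."""
    with torch.no_grad():
        z = alpha * ae.encoder(w_a) + (1 - alpha) * ae.encoder(w_b)
        return ae.decoder(z)
```

In such a setup, the decoded vector would still have to be reshaped and loaded back into the target architecture (e.g. via `load_state_dict`) before it could be evaluated; how the paper actually parameterizes, trains, and samples its embedding space is described in the full text.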
