Gradient flow encoding with distance optimization adaptive step size

05/11/2021
by Kyriakos Flouris et al.

The autoencoder model uses an encoder to map data samples to a lower-dimensional latent space and a decoder to map the latent representations back to the data space. Implicitly, it relies on the encoder to approximate the inverse of the decoder network, so that samples can be mapped to the latent space and back faithfully. This approximation may lead to sub-optimal latent representations. In this work, we investigate a decoder-only method, gradient flow encoding (GFE), that uses gradient flow to encode data samples in the latent space. The gradient flow is defined with respect to a given decoder and finds the optimal latent representation for any given sample through optimisation, eliminating the need for an approximate inversion through an encoder. Implementing the gradient flow as an ordinary differential equation (ODE), we leverage the adjoint method to train a given decoder. We further show empirically that the costly integrals in the adjoint method may not be entirely necessary. Additionally, we propose a second-order ODE variant of the method, which approximates Nesterov's accelerated gradient descent and converges faster per iteration. Commonly used ODE solvers can be quite sensitive to the integration step size, depending on the stiffness of the ODE. To overcome this sensitivity for gradient flow encoding, we use an adaptive solver that prioritises minimising the loss at each integration step. We assess the proposed method in comparison to the autoencoding model. In our experiments, GFE showed much higher data efficiency than the autoencoding model, which can be crucial for data-scarce applications.
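For intuition, the encoding step solves the gradient flow dz/dt = -∇_z L(z) with L(z) = ||D(z) - x||², where D is the decoder and x the data sample; the second-order variant adds a damped acceleration term, d²z/dt² + γ(t) dz/dt = -∇_z L(z), whose continuous limit is known to recover Nesterov's accelerated gradient for a suitable damping γ(t). The following is a minimal PyTorch sketch of the first-order flow with a loss-minimising adaptive step (explicit Euler with backtracking); the name gfe_encode and the accept/reject rule are illustrative assumptions, not the paper's reference implementation.

import torch

def gfe_encode(decoder, x, z0, n_steps=100, h0=0.1):
    """Encode x by integrating dz/dt = -grad_z L(z), L(z) = ||decoder(z) - x||^2,
    via explicit Euler steps whose size is adapted so that every accepted step
    decreases the loss (a simple stand-in for the paper's adaptive solver)."""
    z = z0.clone().requires_grad_(True)
    h = h0
    for _ in range(n_steps):
        loss = ((decoder(z) - x) ** 2).sum()
        (grad,) = torch.autograd.grad(loss, z)
        # Backtracking: shrink the trial step until the loss decreases.
        while h > 1e-8:
            z_trial = (z - h * grad).detach().requires_grad_(True)
            with torch.no_grad():
                trial_loss = ((decoder(z_trial) - x) ** 2).sum()
            if trial_loss < loss:
                z = z_trial
                h *= 1.5   # accept; grow the next trial step
                break
            h *= 0.5       # reject; shrink and retry
    return z.detach()

# Hypothetical usage: a toy MLP decoder from a 2-D latent to 784-D data.
decoder = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 784))
x = torch.randn(1, 784)
z = gfe_encode(decoder, x, z0=torch.zeros(1, 2))

Training the decoder additionally requires differentiating through this encoding, which the adjoint method makes memory-efficient; the sketch above detaches between steps and therefore performs the encoding only.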

Related research

07/17/2023 · Complexity Matters: Rethinking the Latent Space for Generative Modeling
In generative modeling, numerous successful approaches leverage a low-di...

07/10/2019 · Interpretable Deep Learning Model for the Detection and Reconstruction of Dysarthric Speech
This paper proposed a novel approach for the detection and reconstructio...

08/22/2022 · Learning Low Bending and Low Distortion Manifold Embeddings: Theory and Applications
Autoencoders, which consist of an encoder and a decoder, are widely used...

03/20/2019 · OCGAN: One-class Novelty Detection Using GANs with Constrained Latent Representations
We present a novel model called OCGAN for the classical problem of one-c...

03/29/2021 · Bayesian Attention Networks for Data Compression
The lossless data compression algorithm based on Bayesian Attention Netw...

09/10/2020 · Self-Supervised Annotation of Seismic Images using Latent Space Factorization
Annotating seismic data is expensive, laborious and subjective due to th...

05/22/2023 · MacLaSa: Multi-Aspect Controllable Text Generation via Efficient Sampling from Compact Latent Space
Multi-aspect controllable text generation aims to generate fluent senten...
