NAS-X: Neural Adaptive Smoothing via Twisting

08/28/2023
by Dieterich Lawson, et al.

We present Neural Adaptive Smoothing via Twisting (NAS-X), a method for learning and inference in sequential latent variable models based on reweighted wake-sleep (RWS). NAS-X works with both discrete and continuous latent variables, and leverages smoothing sequential Monte Carlo (SMC) to fit a broader range of models than traditional RWS methods. We test NAS-X on discrete and continuous tasks and find that it substantially outperforms previous variational and RWS-based methods in inference and parameter recovery.
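
To make the "smoothing SMC via twisting" idea concrete, below is a minimal, illustrative sketch of a twisted bootstrap particle filter on a toy linear-Gaussian state-space model. Everything here is an assumption for illustration: the model parameters (A, Q, C, R), the hand-coded one-step-lookahead twist, and all variable names. It is not the authors' implementation; NAS-X learns neural twisting functions and combines the resulting smoothing-SMC weights with RWS-style updates.

```python
# Minimal sketch of twisted SMC on a 1-D linear-Gaussian SSM (illustrative only).
# Model (assumed for this toy example): z_t = A z_{t-1} + N(0, Q), x_t = C z_t + N(0, R).
import numpy as np

rng = np.random.default_rng(0)
A, Q, C, R = 0.9, 0.1, 1.0, 0.5   # toy parameters, not from the paper
T, K = 25, 64                      # time steps, particles

# Simulate a synthetic trajectory and observations from the toy model.
z_true = np.zeros(T)
x_obs = np.zeros(T)
for t in range(T):
    z_true[t] = A * (z_true[t - 1] if t > 0 else 0.0) + rng.normal(0, np.sqrt(Q))
    x_obs[t] = C * z_true[t] + rng.normal(0, np.sqrt(R))

def log_normal(x, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def log_twist(z, t):
    # Hand-coded one-step-lookahead twist: log p(x_{t+1} | z_t).
    # In NAS-X this role is played by a learned neural twisting function.
    if t + 1 >= T:
        return np.zeros_like(z)            # no twist at the final step
    pred_mean = C * A * z
    pred_var = C ** 2 * Q + R              # exact one-step predictive variance here
    return log_normal(x_obs[t + 1], pred_mean, pred_var)

# Twisted bootstrap particle filter with multinomial resampling.
particles = rng.normal(0, np.sqrt(Q), size=K)
prev_twist = np.zeros(K)
log_Z = 0.0
for t in range(T):
    if t > 0:
        particles = A * particles + rng.normal(0, np.sqrt(Q), size=K)
    curr_twist = log_twist(particles, t)
    # Incremental weight: observation likelihood times the twist ratio.
    log_w = log_normal(x_obs[t], C * particles, R) + curr_twist - prev_twist
    log_Z += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(K, size=K, p=w)       # resample proportionally to the weights
    particles, prev_twist = particles[idx], curr_twist[idx]

print(f"Twisted-SMC log-marginal-likelihood estimate: {log_Z:.2f}")
```

Because the twist at each step accounts for future observations, the particle approximation targets the smoothing distribution rather than the filtering distribution; in NAS-X the twist is learned, and the resulting smoothing-SMC weights drive RWS-style updates of the model and proposal parameters.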


Related research

- Revisiting Reweighted Wake-Sleep (05/26/2018): Discrete latent-variable models, while applicable in a variety of settin...
- DVAE++: Discrete Variational Autoencoders with Overlapping Transformations (02/14/2018): Training of discrete latent variable models remains challenging because ...
- Discrete Latent Variable Representations for Low-Resource Text Classification (06/11/2020): While much work on deep latent variable models of text uses continuous l...
- Speeding up NAS with Adaptive Subset Selection (11/02/2022): A majority of recent developments in neural architecture search (NAS) ha...
- BRP-NAS: Prediction-based NAS using GCNs (07/16/2020): Neural architecture search (NAS) enables researchers to automatically ex...
- Mind the Gap when Conditioning Amortised Inference in Sequential Latent-Variable Models (01/18/2021): Amortised inference enables scalable learning of sequential latent-varia...
- Continuous-Discrete Filtering and Smoothing on Submanifolds of Euclidean Space (04/17/2020): In this paper the issue of filtering and smoothing in continuous discret...
