Sequential Variational Autoencoders for Collaborative Filtering

11/25/2018
by Noveen Sachdeva, et al.

Variational autoencoders have proven successful in domains such as computer vision and speech processing. Their adoption for modeling user preferences is still largely unexplored, although it has recently begun to gain attention in the literature. In this work, we propose a model that extends variational autoencoders by exploiting the rich information present in a user's past preference history. We introduce a recurrent version of the VAE: instead of passing a subset of the whole history to the encoder with no regard for temporal dependencies, we pass the consumption sequence through a recurrent neural network. At each time-step of the RNN, the output is fed through a series of fully connected layers that model the probability distribution over the most likely future preferences. We show that handling temporal information is crucial for improving the accuracy of the VAE: our model beats the current state of the art by substantial margins, because the recurrent encoder captures temporal dependencies in the user's consumption sequence while keeping the fundamentals of variational autoencoders intact.
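For concreteness, the sketch below shows how such a recurrent VAE over consumption sequences could be wired up. It is a minimal illustration in PyTorch under assumed design choices (a GRU encoder, per-time-step Gaussian latents, a multinomial decoder over the item catalog, and a KL annealing weight beta); it is not the authors' released implementation, and names such as SequentialVAE and svae_loss are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SequentialVAE(nn.Module):
    # Recurrent VAE sketch: an RNN encodes the consumption sequence, and at each
    # time-step fully connected layers produce the variational posterior, from
    # which a latent code is sampled and decoded into a distribution over items.
    def __init__(self, num_items, embed_dim=256, rnn_dim=200, latent_dim=64):
        super().__init__()
        self.item_embedding = nn.Embedding(num_items, embed_dim)
        # Recurrent encoder: consumes the user's item sequence in temporal order.
        self.encoder_rnn = nn.GRU(embed_dim, rnn_dim, batch_first=True)
        # Fully connected layers mapping each RNN state to the posterior
        # parameters (mean and log-variance of the latent code).
        self.to_mu = nn.Linear(rnn_dim, latent_dim)
        self.to_logvar = nn.Linear(rnn_dim, latent_dim)
        # Decoder: maps the latent code to logits over all items, modeling the
        # most likely future preferences at each time-step.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, rnn_dim), nn.Tanh(),
            nn.Linear(rnn_dim, num_items),
        )

    def forward(self, item_seq):
        # item_seq: (batch, seq_len) integer ids of previously consumed items.
        h, _ = self.encoder_rnn(self.item_embedding(item_seq))  # (B, T, rnn_dim)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample a latent code per time-step.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        logits = self.decoder(z)                                # (B, T, num_items)
        return logits, mu, logvar


def svae_loss(logits, targets, mu, logvar, beta=0.2):
    # Multinomial log-likelihood of the future items plus the KL term,
    # annealed by beta as is common in VAE-based recommenders.
    log_probs = F.log_softmax(logits, dim=-1)
    nll = -(targets * log_probs).sum(dim=-1).mean()
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1).mean()
    return nll + beta * kl
```

In such a setup, item_seq would hold the user's past item ids in consumption order and targets a multi-hot vector of the items consumed after each time-step, so the decoder's per-time-step distribution is scored against the user's actual future preferences.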

