
Deep Variational Autoencoder with Shallow Parallel Path for Top-N Recommendation (VASP)

by Vojtěch Vančura, et al.

The recently introduced EASE algorithm presents a simple and elegant way to solve the top-N recommendation task. In this paper, we introduce Neural EASE, which further improves the performance of this algorithm by incorporating techniques for training modern neural networks. There is also growing interest in the recsys community in utilizing variational autoencoders (VAE) for this task. We introduce FLVAE, a deep autoencoder that benefits from multiple non-linear layers without an information bottleneck while not overfitting towards the identity. We show how to train FLVAE in parallel with Neural EASE, achieving state-of-the-art performance on the MovieLens 20M dataset and competitive results on the Netflix Prize dataset.
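For context, the shallow path builds on the closed-form EASE model (Steck, 2019), which learns an item-item weight matrix B minimizing ||X − XB||² + λ||B||² subject to a zero diagonal (to avoid the trivial identity solution). Below is a minimal NumPy sketch of that original closed-form solution, not the Neural EASE variant introduced in this paper; the toy interaction matrix and the regularization value are illustrative assumptions.

```python
import numpy as np

def ease(X, lam=0.5):
    """Closed-form EASE (Steck, 2019): B_ij = -P_ij / P_jj for i != j,
    where P = (X^T X + lam*I)^{-1}; diag(B) is constrained to zero."""
    G = X.T @ X + lam * np.eye(X.shape[1])  # regularized Gram matrix
    P = np.linalg.inv(G)
    B = P / (-np.diag(P))        # scale each column j by -1/P_jj
    np.fill_diagonal(B, 0.0)     # enforce the zero-diagonal constraint
    return B

# Toy binary user-item interaction matrix (4 users x 3 items).
X = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)

B = ease(X)
scores = X @ B  # per-user item scores; rank them for top-N recommendation
```

Because B is obtained in closed form, this shallow model trains in a single linear-algebra step, which is what makes it attractive as a parallel path alongside a deep VAE.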




Conditioned Variational Autoencoder for top-N item recommendation

In this paper, we propose a Conditioned Variational Autoencoder to impro...

Variational Information Bottleneck on Vector Quantized Autoencoders

In this paper, we provide an information-theoretic interpretation of the...

Linearizing Visual Processes with Convolutional Variational Autoencoders

This work studies the problem of modeling non-linear visual processes by...

Deep Feature Consistent Variational Autoencoder

We present a novel method for constructing Variational Autoencoder (VAE)...

Insider Detection using Deep Autoencoder and Variational Autoencoder Neural Networks

Insider attacks are one of the most challenging cybersecurity issues for...

Representation Learning using Graph Autoencoders with Residual Connections

Graph autoencoders are very efficient at embedding graph-based complex d...

Variational Autoencoding the Lagrangian Trajectories of Particles in a Combustion System

We introduce a deep learning method to simulate the motion of particles ...