
Mixed Membership Recurrent Neural Networks

by   Ghazal Fazelnia, et al.
Capital One
Columbia University

Models for sequential data such as the recurrent neural network (RNN) often implicitly assume a fixed time interval between observations in a sequence and do not account for group-level effects when multiple sequences are observed. We propose a model for grouped sequential data based on the RNN that accounts for varying time intervals between observations by learning a group-level base parameter to which each sequence can revert. Our approach is motivated by the mixed membership framework, and we show how it can be used for dynamic topic modeling in which the distribution over topics (not the topics themselves) evolves in time. We demonstrate our approach on a dataset of 3.4 million online grocery shopping orders made by 206K customers.
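The core mechanism described above, a hidden state that reverts toward a group-level base parameter with strength depending on the elapsed time between observations, can be sketched as a single recurrent step. The parameterization below (exponential decay with rate `lam`, a learned base state `h_base`) is a hypothetical illustration, not the paper's exact formulation:

```python
import numpy as np

def time_decay_step(h, x, dt, h_base, W_h, W_x, b, lam=0.5):
    """One RNN step with time-aware reversion to a group-level base state.

    Before the usual recurrent update, the previous hidden state h is
    interpolated toward h_base; the longer the gap dt since the last
    observation, the closer the state moves to the base.
    (Hypothetical parameterization for illustration.)
    """
    decay = np.exp(-lam * dt)                 # dt large -> decay near 0 -> revert to base
    h_adj = decay * h + (1.0 - decay) * h_base
    return np.tanh(W_h @ h_adj + W_x @ x + b)
```

A sequence with long gaps between orders would thus repeatedly pull each customer's state back toward the shared group-level behavior, while closely spaced observations let the sequence-specific dynamics dominate.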
