MOI-Mixer: Improving MLP-Mixer with Multi-Order Interactions in Sequential Recommendation

08/17/2021
by Hojoon Lee, et al.

Successful sequential recommendation systems rely on accurately capturing the user's short-term and long-term interests. Although Transformer-based models have achieved state-of-the-art performance in the sequential recommendation task, they generally require memory and time quadratic in the sequence length, making it difficult to extract the long-term interests of users. On the other hand, Multi-Layer Perceptron (MLP)-based models, with their linear memory and time complexity, have recently shown results competitive with Transformers on various tasks. Given the availability of massive user behavior histories, the linear memory and time complexity of MLP-based models make them a promising alternative for the sequential recommendation task. To this end, we adopted MLP-based models for sequential recommendation but consistently observed that, despite their computational benefits, they underperform Transformer-based models. Through experiments, we found that introducing explicit high-order interactions into MLP layers mitigates this performance gap. In response, we propose the Multi-Order Interaction (MOI) layer, which can express an arbitrary order of interactions within its inputs while maintaining the memory and time complexity of an MLP layer. By replacing the MLP layer with the MOI layer, our model achieves performance comparable to Transformer-based models while retaining the computational benefits of MLP-based models.
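The abstract does not give the exact form of the MOI layer, but one common way to inject explicit high-order interactions into an MLP block, while staying linear in sequence length, is to combine several linear projections of the input with element-wise (Hadamard) products: each extra product raises the interaction order by one. The sketch below is a hypothetical illustration of that idea in NumPy, not the paper's actual implementation; the function name `moi_layer` and its signature are assumptions.

```python
import numpy as np

def moi_layer(x, weights, order=2):
    """Hypothetical sketch of a multi-order interaction layer.

    A plain MLP block applies one linear map and a nonlinearity. Here
    the input is instead projected `order` times, and the projections
    are combined with element-wise products, producing explicit
    order-`order` feature interactions. Cost stays O(n) in the
    sequence length n, with O(d^2) parameters per projection.
    """
    h = x @ weights[0]
    for W in weights[1:order]:
        h = h * (x @ W)  # each Hadamard product raises the interaction order by one
    return h

rng = np.random.default_rng(0)
n, d = 8, 16                          # sequence length, hidden dimension
x = rng.normal(size=(n, d))
weights = [rng.normal(size=(d, d)) for _ in range(3)]

out2 = moi_layer(x, weights, order=2)  # pairwise (second-order) interactions
out3 = moi_layer(x, weights, order=3)  # third-order interactions
```

With `order=1` this reduces to an ordinary linear layer, which is one way to see that the construction strictly generalizes the MLP layer it replaces.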


Related research:

- 07/23/2018: Recurrent Neural Networks for Long and Short-Term Sequential Recommendation
  Recommender systems objectives can be broadly characterized as modeling ...

- 03/11/2023: AutoMLP: Automated MLP for Sequential Recommendations
  Sequential recommender systems aim to predict users' next interested ite...

- 02/28/2022: Filter-enhanced MLP is All You Need for Sequential Recommendation
  Recently, deep neural networks such as RNN, CNN and Transformer have bee...

- 12/08/2022: Denoising Self-attentive Sequential Recommendation
  Transformer-based sequential recommenders are very powerful for capturin...

- 05/30/2023: Client: Cross-variable Linear Integrated Enhanced Transformer for Multivariate Long-Term Time Series Forecasting
  Long-term time series forecasting (LTSF) is a crucial aspect of modern s...

- 07/26/2023: Integrating Offline Reinforcement Learning with Transformers for Sequential Recommendation
  We consider the problem of sequential recommendation, where the current ...

- 02/22/2019: Towards Neural Mixture Recommender for Long Range Dependent User Sequences
  Understanding temporal dynamics has proved to be highly valuable for acc...
