OptEmbed: Learning Optimal Embedding Table for Click-through Rate Prediction

08/09/2022
by Fuyuan Lyu, et al.

Learning the embedding table plays a fundamental role in click-through rate (CTR) prediction, affecting both model performance and memory usage. The embedding table is a two-dimensional tensor whose axes correspond to the number of feature values and the embedding dimension, respectively. To learn an efficient and effective embedding table, recent works either assign different embedding dimensions to feature fields, reduce the number of embeddings, or mask individual embedding table parameters. However, none of these approaches yields an optimal embedding table. First, assigning various embedding dimensions still requires a large amount of memory because of the vast number of features in the dataset. Second, reducing the number of embeddings usually degrades performance, which is intolerable in CTR prediction. Finally, pruning embedding parameters leads to a sparse embedding table, which is hard to deploy. To this end, we propose OptEmbed, an optimal embedding table learning framework that provides a practical and general method for finding an optimal embedding table for various base CTR models. Specifically, we prune redundant embeddings according to the importance of the corresponding features using learnable pruning thresholds. Furthermore, we treat each assignment of embedding dimensions to fields as a single candidate architecture. To search the optimal embedding dimensions efficiently, we design a uniform embedding dimension sampling scheme that trains all candidate architectures equally, so that architecture-related parameters and learnable thresholds are trained simultaneously in one supernet. We then propose an evolutionary search method based on the supernet to find the optimal embedding dimension for each field. Experiments on public datasets show that OptEmbed learns a compact embedding table while further improving model performance.
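The first mechanism described above, pruning redundant embeddings via learnable thresholds, can be illustrated with a short PyTorch sketch. This is a minimal sketch only: it assumes the keep-or-prune mask is obtained by comparing each embedding vector's L1 norm with a learnable per-field threshold and trained with a straight-through estimator; the class name `PrunedEmbedding`, the initialization, and the L1 criterion are illustrative assumptions, not taken from the paper's released code.

```python
import torch
import torch.nn as nn


class PrunedEmbedding(nn.Module):
    """Embedding table whose rows are pruned by learnable per-field thresholds."""

    def __init__(self, field_dims, embed_dim):
        super().__init__()
        self.embedding = nn.Embedding(sum(field_dims), embed_dim)
        nn.init.xavier_uniform_(self.embedding.weight)
        # offset of each field inside the flattened feature-value index space
        offsets = torch.tensor([0] + list(field_dims[:-1])).cumsum(dim=0)
        self.register_buffer("offsets", offsets)
        # one learnable pruning threshold per field (illustrative initialisation)
        self.threshold = nn.Parameter(torch.zeros(len(field_dims)))

    def forward(self, x):
        # x: (batch, num_fields) feature-value ids, local to each field
        emb = self.embedding(x + self.offsets)       # (batch, num_fields, dim)
        norm = emb.abs().sum(dim=-1)                 # L1 norm of each embedding
        soft = torch.sigmoid(norm - self.threshold)  # differentiable surrogate
        hard = (norm >= self.threshold).float()      # 0/1 keep-or-prune mask
        # straight-through estimator: hard mask in forward, soft gradient in backward
        mask = hard.detach() - soft.detach() + soft
        return emb * mask.unsqueeze(-1)
```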

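The second mechanism, the uniform embedding dimension sampling used to train the supernet, can be sketched as follows: at each training step every field draws an effective dimension uniformly at random, and only that many coordinates of its embeddings are kept, so all candidate dimension assignments are trained equally; the evolutionary search then evaluates fixed masks of this form on validation data. The function name and the "keep the leading coordinates" convention below are assumptions for illustration.

```python
import torch


def sample_dimension_mask(num_fields, embed_dim, device="cpu"):
    """Uniformly sample one field-wise dimension mask for a supernet step.

    For each field, an effective dimension d is drawn uniformly from
    {1, ..., embed_dim}; the mask keeps the first d coordinates of that
    field's embeddings and zeroes the rest.
    """
    dims = torch.randint(1, embed_dim + 1, (num_fields,), device=device)
    positions = torch.arange(embed_dim, device=device)           # (embed_dim,)
    return (positions.unsqueeze(0) < dims.unsqueeze(1)).float()  # (num_fields, embed_dim)


# Usage during supernet training (illustrative):
#   emb  = pruned_embedding(x)                        # (batch, num_fields, embed_dim)
#   mask = sample_dimension_mask(emb.size(1), emb.size(2), emb.device)
#   emb  = emb * mask.unsqueeze(0)                    # candidate architecture for this step
```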
