Alleviating Cold-start Problem in CTR Prediction with A Variational Embedding Learning Framework

01/17/2022
by   Xiaoxiao Xu, et al.

We propose a general Variational Embedding Learning Framework (VELF) to alleviate the severe cold-start problem in CTR prediction. VELF addresses cold start by mitigating the overfitting caused by data sparsity in two ways: learning probabilistic embeddings, and incorporating trainable, regularized priors that exploit the rich side information of cold-start users and advertisements (ads). The two techniques are naturally integrated into a variational inference framework, forming an end-to-end training process. Extensive empirical tests on benchmark datasets demonstrate the advantages of the proposed VELF. Moreover, extended experiments confirm that our parameterized and regularized priors generalize better than traditional fixed priors.
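The two ingredients named in the abstract — a probabilistic (Gaussian) embedding per ID sampled via the reparameterization trick, and a trainable prior parameterized by side information rather than a fixed N(0, I) — can be sketched minimally as below. This is an illustrative sketch only, not the paper's implementation: the linear map from side features to prior parameters and all shapes are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_gaussian(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians, summed over dims."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

d, s = 8, 4  # hypothetical embedding dim and side-feature dim

# Posterior parameters learned per user/ad ID (here: random stand-ins).
mu_q = rng.normal(size=d)
logvar_q = np.zeros(d)

# Trainable prior conditioned on side information (e.g. user attributes);
# a single linear map is an assumption for illustration.
W_mu = rng.normal(size=(s, d)) * 0.1
W_lv = rng.normal(size=(s, d)) * 0.1
side = rng.normal(size=s)
mu_p, logvar_p = side @ W_mu, side @ W_lv

# Reparameterization trick: sample an embedding to feed the CTR network,
# keeping the sampling step differentiable w.r.t. mu_q and logvar_q.
eps = rng.normal(size=d)
z = mu_q + np.exp(0.5 * logvar_q) * eps

# The KL term, added to the CTR loss, pulls sparse cold-start IDs toward
# their side-information prior instead of letting them overfit.
kl = kl_gaussian(mu_q, logvar_q, mu_p, logvar_p)
print(z.shape, kl)
```

For a cold-start ID with no clicks at all, the posterior stays close to its side-information prior, so the sampled embedding is informed by attributes rather than noise.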
