Rethinking Position Bias Modeling with Knowledge Distillation for CTR Prediction

04/01/2022
by   Congcong Liu, et al.

Click-through rate (CTR) prediction is of great importance in real-world online ads systems. One challenge for the CTR prediction task is capturing the real interests of users from their clicked items, which are inherently biased by the positions at which items are presented: items at earlier positions tend to obtain higher CTRs. A popular line of existing work explicitly estimates position bias, either by result randomization, which is expensive and inefficient, or by inverse propensity weighting (IPW), which relies heavily on the quality of the propensity estimation. Another common solution is to model position as a feature during offline training and simply adopt a fixed value or dropout tricks when serving. However, this training-inference inconsistency can lead to sub-optimal performance. Furthermore, post-click information such as position values is informative yet under-exploited in CTR prediction.

This work proposes a simple yet efficient knowledge distillation framework to alleviate the impact of position bias and leverage position information to improve CTR prediction. We demonstrate the performance of the proposed method on a real-world production dataset and in online A/B tests, achieving significant improvements over competing baseline models. The proposed method has been deployed in a real-world online ads system, serving the main traffic of one of the world's largest e-commerce platforms.
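The abstract does not give the exact architecture, but the core idea — a teacher that sees the position feature during training, and a position-free student trained against the teacher's soft predictions so that nothing position-dependent is needed at serving time — can be sketched as follows. This is a minimal illustration with synthetic data and plain logistic models; the feature dimensions, bias shape, learning rate, and the mixing weight `alpha` are all assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic logged data: item features X, display position pos, click label y.
n, d = 2000, 5
X = rng.normal(size=(n, d))
pos = rng.integers(0, 10, size=n)

# Observed clicks are biased toward earlier positions: a click requires the
# user to both examine the slot and like the item (a simple examination model).
true_logit = X @ rng.normal(size=d)
examine_prob = 1.0 / (1.0 + pos)
y = (rng.random(n) < sigmoid(true_logit) * examine_prob).astype(float)

# Teacher: trained with position as an extra feature (available offline only).
Xt = np.hstack([X, (pos / 10.0).reshape(-1, 1)])
wt = np.zeros(d + 1)
for _ in range(500):  # plain gradient descent on binary cross-entropy
    p = sigmoid(Xt @ wt)
    wt -= 0.1 * Xt.T @ (p - y) / n

# Student: sees features only, trained on a mix of the hard click labels and
# the teacher's soft predictions (the distillation signal).
alpha = 0.5  # assumed mixing weight between hard labels and teacher outputs
soft = sigmoid(Xt @ wt)
target = (1 - alpha) * y + alpha * soft
ws = np.zeros(d)
for _ in range(500):
    p = sigmoid(X @ ws)
    ws -= 0.1 * X.T @ (p - target) / n

# At serving time only the student is used, so no position value is required.
serve_ctr = sigmoid(X @ ws)
```

The design point this illustrates is that the training-inference inconsistency of position-as-feature models disappears: the student never takes a position input, so nothing has to be faked or dropped out at serving time.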

Related research

06/10/2021 · Deep Position-wise Interaction Network for CTR Prediction
Click-through rate (CTR) prediction plays an important role in online ad...

07/29/2023 · Click-Conversion Multi-Task Model with Position Bias Mitigation for Sponsored Search in eCommerce
Position bias, the phenomenon whereby users tend to focus on higher-rank...

07/11/2019 · Privileged Features Distillation for E-Commerce Recommendations
Features play an important role in most prediction tasks of e-commerce r...

11/11/2022 · PILE: Pairwise Iterative Logits Ensemble for Multi-Teacher Labeled Distillation
Pre-trained language models have become a crucial part of ranking system...

08/11/2022 · Self-Knowledge Distillation via Dropout
To boost the performance, deep neural networks require deeper or wider n...

05/10/2023 · Improving position bias estimation against sparse and skewed dataset with item embedding
Estimating position bias is a well-known challenge in Learning to Rank (...

11/24/2020 · DADNN: Multi-Scene CTR Prediction via Domain-Aware Deep Neural Network
Click-through rate (CTR) prediction is a core task in advertising systems...
