KDSM: An uplift modeling framework based on knowledge distillation and sample matching

03/06/2023
by Chang Sun, et al.

Uplift modeling aims to estimate the treatment effect on individuals and is widely applied on e-commerce platforms to target persuadable customers and maximize the return of marketing activities. Among existing uplift modeling methods, tree-based methods generalize well when fitting the increment, while neural-network-based models excel at predicting absolute values with high precision; these complementary advantages have not been fully explored or combined. Moreover, the lack of counterfactual sample pairs is the root challenge in uplift modeling. In this paper, we propose an uplift modeling framework based on Knowledge Distillation and Sample Matching (KDSM). The teacher model is the uplift decision tree (UpliftDT), whose structure is exploited to construct counterfactual sample pairs, and the pairwise incremental prediction is treated as an additional objective for the student model. Following the idea of multitask learning, the student model can achieve better generalization and even surpass the teacher. Extensive offline experiments validate the universality of different teacher-student combinations and the superiority of KDSM over the baselines. In online A/B testing, the cost of each incremental room night is reduced by 6.5%.
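The sample-matching step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `leaf_of` is a hypothetical stand-in for the trained UpliftDT teacher's leaf assignment, and the greedy within-leaf pairing is an assumption about how treated and control samples are matched. The pairwise increment (treated outcome minus control outcome) then serves as the pseudo-label for the student's auxiliary objective.

```python
from collections import defaultdict

def match_counterfactual_pairs(samples, leaf_of):
    """Pair treated and control samples that fall in the same leaf
    of a fitted uplift decision tree.

    samples: iterable of (features, treatment_flag, outcome),
             where treatment_flag is 1 (treated) or 0 (control).
    leaf_of: hypothetical callable mapping features to a leaf id,
             standing in for the trained UpliftDT teacher.

    Returns a list of (treated_features, control_features, increment),
    where increment = treated_outcome - control_outcome is the
    pairwise incremental pseudo-label for the student model.
    """
    # Bucket samples by teacher leaf, split by treatment group.
    buckets = defaultdict(lambda: {0: [], 1: []})
    for x, t, y in samples:
        buckets[leaf_of(x)][t].append((x, y))

    pairs = []
    for groups in buckets.values():
        # Greedily pair each treated sample with a control sample
        # from the same leaf; leftover samples remain unmatched.
        for (xt, yt), (xc, yc) in zip(groups[1], groups[0]):
            pairs.append((xt, xc, yt - yc))
    return pairs

# Toy usage: a single split at x[0] > 0.5 plays the role of the tree.
samples = [
    ((0.2,), 1, 1.0),  # treated, leaf 0
    ((0.3,), 0, 0.0),  # control, leaf 0
    ((0.9,), 1, 1.0),  # treated, leaf 1
    ((0.8,), 0, 1.0),  # control, leaf 1
]
pairs = match_counterfactual_pairs(samples, lambda x: int(x[0] > 0.5))
```

Because both members of a pair share a leaf, they are similar under the teacher's own partitioning of the feature space, which is what makes the pair an approximate counterfactual for the student's pairwise objective.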

