Curriculum Modeling the Dependence among Targets with Multi-task Learning for Financial Marketing

04/25/2023
by Yunpeng Weng, et al.

Multi-task learning for real-world applications often involves tasks with a logical sequential dependence. In online marketing, for example, the cascade behavior pattern of impression → click → conversion is typically modeled as multiple tasks in a multi-task manner; in existing works, the sequential dependence between tasks is captured either by an explicitly defined function or by implicitly transferred information. These methods alleviate the data sparsity problem for long-path sequential tasks, since positive feedback becomes sparser along the task sequence. However, error accumulation and negative transfer remain severe problems for downstream tasks. In particular, at the early stage of training, the parameters of upstream tasks have not yet converged, so the information transferred to downstream tasks is misleading. In this paper, we propose a prior information merged model (PIMM), which explicitly models the logical dependence among tasks with a novel prior information merged (PIM) module and learns multiple sequentially dependent tasks in a curriculum manner. Specifically, during training the PIM module uses a soft sampling strategy to randomly select either the true label or the prediction of the upstream task and transfer it to the downstream task. Following an easy-to-difficult curriculum paradigm, we dynamically adjust the sampling probability so that the downstream task receives effective information throughout training. Offline experimental results on both public and product datasets verify that PIMM outperforms state-of-the-art baselines. Moreover, we deploy PIMM on a large-scale FinTech platform, and online experiments also demonstrate its effectiveness.
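The scheduled transfer described in the abstract resembles scheduled sampling: early in training the downstream task is fed the upstream ground-truth label, and the probability of using the upstream prediction instead grows as training proceeds. Below is a minimal PyTorch sketch of this idea for a two-task click → conversion cascade; the class name `PIMGate`, the per-sample Bernoulli mask, and the linear decay schedule are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class PIMGate(nn.Module):
    """Hypothetical gate that feeds the downstream task either the upstream
    ground-truth label or the upstream prediction, chosen per sample."""

    def forward(self, upstream_pred, upstream_label, p_label, training=True):
        if not training:
            # No labels at inference time: always pass the prediction.
            return upstream_pred
        # Soft sampling: per-sample Bernoulli mask with probability p_label
        # of transferring the true label instead of the prediction.
        mask = (torch.rand_like(upstream_pred) < p_label).float()
        return mask * upstream_label + (1.0 - mask) * upstream_pred


def label_probability(step, total_steps, p_min=0.0):
    """Easy-to-difficult curriculum: start by transferring mostly true labels
    (easy) and decay toward the model's own upstream predictions (difficult).
    A linear decay is assumed here; any monotone schedule would do."""
    return max(p_min, 1.0 - step / total_steps)


# Toy usage for a click -> conversion cascade.
click_pred = torch.sigmoid(torch.randn(8, 1))          # upstream (click) output
click_label = torch.randint(0, 2, (8, 1)).float()      # upstream ground truth
p = label_probability(step=100, total_steps=1000)      # p_label = 0.9 here
conversion_input = PIMGate()(click_pred, click_label, p)  # fed downstream
```

At inference time no labels are available, so the gate always forwards the upstream prediction, which presumably matches how the cascade is served online.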


Related research

05/18/2021 · Modeling the Sequential Dependence among Audience Multi-step Conversions with Multi-task Learning in Targeted Display Advertising
In most real-world large-scale online applications (e.g., e-commerce or ...

01/06/2023 · Task Aware Feature Extraction Framework for Sequential Dependence Multi-Task Learning
Multi-task learning (MTL) has been successfully implemented in many real...

05/18/2020 · Efficient Image Gallery Representations at Scale Through Multi-Task Learning
Image galleries provide a rich source of diverse information about a pro...

09/19/2022 · Effective Adaptation in Multi-Task Co-Training for Unified Autonomous Driving
Aiming towards a holistic understanding of multiple downstream tasks sim...

08/31/2023 · CL-MAE: Curriculum-Learned Masked Autoencoders
Masked image modeling has been demonstrated as a powerful pretext task f...

06/17/2021 · Unsupervised Path Representation Learning with Curriculum Negative Sampling
Path representations are critical in a variety of transportation applica...

05/12/2016 · Learning the Curriculum with Bayesian Optimization for Task-Specific Word Representation Learning
We use Bayesian optimization to learn curricula for word representation ...
