Contextual Multi-Armed Bandits for Causal Marketing

10/02/2018
by Neela Sawant, et al.

This work explores a causal contextual multi-armed bandit approach to automated marketing, in which we estimate and optimize the causal (incremental) effects of treatments. Focusing on the causal effect improves return on investment (ROI) by targeting only persuadable customers, i.e., those who would not have taken the action organically. Our approach draws on the strengths of causal inference, uplift modeling, and multi-armed bandits: it optimizes causal treatment effects rather than raw outcomes, and it incorporates counterfactual generation into data collection. Following results from uplift modeling, we optimize over the incremental business metric. Multi-armed bandit methods allow us to scale to multiple treatments and to perform off-policy evaluation on logged data. The Thompson sampling strategy in particular enables exploration of treatments across similar customer contexts and the materialization of counterfactual outcomes. Preliminary offline experiments on a retail fashion marketing dataset show the merits of our proposal.
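The selection rule the abstract describes, Thompson sampling that acts on the sampled incremental effect rather than the raw outcome, can be sketched with a simple non-contextual Beta-Bernoulli model. The arm probabilities, round count, and the convention that arm 0 is an untreated control are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): arm 0 is the "no treatment"
# control; arms 1..3 are marketing treatments. The causal (incremental)
# effect of arm k is true_p[k] - true_p[0].
true_p = np.array([0.10, 0.11, 0.16, 0.12])
n_arms = len(true_p)

# Beta(1, 1) priors on each arm's conversion rate.
alpha = np.ones(n_arms)
beta = np.ones(n_arms)

for _ in range(5000):
    # Thompson sampling: draw one posterior sample per arm, then act
    # greedily on the *sampled incremental effect* (sampled rate minus
    # the sampled control rate). The control arm always scores zero,
    # so it is chosen whenever no treatment looks persuasive.
    theta = rng.beta(alpha, beta)
    arm = int(np.argmax(theta - theta[0]))
    reward = rng.binomial(1, true_p[arm])  # observed conversion
    alpha[arm] += reward
    beta[arm] += 1 - reward

# Posterior-mean uplift of each treatment relative to control.
est = alpha / (alpha + beta)
uplift = est[1:] - est[0]
```

Because the policy scores arms by sampled uplift, it naturally abstains (picks the control) when no treatment is believed to beat the organic rate. A contextual version, as in the paper, would replace the per-arm Beta posteriors with posteriors over models of customer features.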


Related research

- Multi-armed bandit approach to password guessing (06/29/2020)
- Counterfactual Contextual Multi-Armed Bandit: a Real-World Application to Diagnose Apple Diseases (02/08/2021)
- Online Inference for Advertising Auctions (08/22/2019)
- Pure Exploration of Causal Bandits (06/16/2022)
- Chronological Causal Bandits (12/03/2021)
- Optimising Individual-Treatment-Effect Using Bandits (10/16/2019)
- Online Residential Demand Response via Contextual Multi-Armed Bandits (03/07/2020)
