Online Auctions and Multi-scale Online Learning

05/26/2017
by Sébastien Bubeck, et al.

We consider revenue maximization in online auctions and pricing. A seller sells an identical item in each period to a new buyer, or a new set of buyers. For the online posted-pricing problem, we show regret bounds that scale with the best fixed price rather than with the range of the values. We also show regret bounds that are almost scale-free and match the offline sample complexity, when comparing to a benchmark that requires a lower bound on the market share. These results are obtained by generalizing the classical learning-from-experts and multi-armed bandit problems to their multi-scale versions, in which the reward of each action lies in a different range and the regret with respect to a given action scales with that action's own range rather than with the maximum range.
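To make the multi-scale experts setting concrete, here is a minimal Python sketch of one natural instantiation: a multiplicative-weights update in which each expert receives a learning rate inversely proportional to its own reward range. This is an illustrative assumption, not the algorithm from the paper (which requires further ingredients to obtain the stated guarantees); the names `scales` and `eta` and the simulated uniform rewards are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): K experts, where expert i's
# reward at every round lies in its own known range [0, scales[i]].
K, T = 3, 5000
scales = np.array([1.0, 10.0, 100.0])

# Multi-scale idea: each expert's learning rate shrinks with its own
# range, so a large-range expert cannot inflate the regret measured
# against a small-range expert.
eta = np.sqrt(np.log(K) / T) / scales

log_w = np.zeros(K)                # log-weights for numerical stability
cum_alg = 0.0                      # algorithm's cumulative reward
cum_exp = np.zeros(K)              # each expert's cumulative reward

for _ in range(T):
    p = np.exp(log_w - log_w.max())
    p /= p.sum()
    r = rng.uniform(0.0, scales)   # expert i's reward drawn in [0, scales[i]]
    i = rng.choice(K, p=p)         # play an expert sampled from the weights
    cum_alg += r[i]
    cum_exp += r
    log_w += eta * r               # per-expert multiplicative update

for i in range(K):
    print(f"regret vs expert {i} (range {scales[i]:6.1f}): "
          f"{cum_exp[i] - cum_alg:10.1f}")
```

The design point the sketch isolates: with a single learning rate tuned to the largest range, the regret against every expert would scale with that maximum range, whereas giving expert i a learning rate proportional to 1/c_i is what allows the regret against it to scale with its own range c_i.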


Related research

04/24/2022 · Complete Policy Regret Bounds for Tallying Bandits
Policy regret is a well-established notion of measuring the performance ...

06/08/2021 · Scale Free Adversarial Multi Armed Bandits
We consider the Scale-Free Adversarial Multi Armed Bandit (MAB) problem, ...

02/09/2018 · Make the Minority Great Again: First-Order Regret Bound for Contextual Bandits
Regret bounds in online learning compare the player's performance to L^*...

07/09/2018 · Dynamic Pricing with Finitely Many Unknown Valuations
Motivated by posted price auctions where buyers are grouped in an unknow...

02/01/2023 · Uniswap Liquidity Provision: An Online Learning Approach
Decentralized Exchanges (DEXs) are new types of marketplaces leveraging ...

04/20/2021 · Joint Online Learning and Decision-making via Dual Mirror Descent
We consider an online revenue maximization problem over a finite time ho...

07/27/2023 · Learning in Repeated Multi-Unit Pay-As-Bid Auctions
Motivated by Carbon Emissions Trading Schemes, Treasury Auctions, and Pr...
