
Settling the Sample Complexity of Single-parameter Revenue Maximization

by Chenghao Guo, et al.

This paper settles the sample complexity of single-parameter revenue maximization by showing matching upper and lower bounds, up to a poly-logarithmic factor, for all families of value distributions that have been considered in the literature. The upper bounds are unified under a novel framework, which builds on the strong revenue monotonicity of Devanur, Huang, and Psomas (STOC 2016) and an information-theoretic argument. This is fundamentally different from previous approaches, which rely either on constructing an ϵ-net of the mechanism space, explicitly or implicitly via statistical learning theory, or on learning an approximately accurate version of the virtual values. To our knowledge, this is the first time information-theoretic arguments have been used to prove sample complexity upper bounds, rather than lower bounds. Our lower bounds are also unified, under a meta construction of hard instances.
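To make the learning problem concrete: in the simplest single-parameter setting, one observes i.i.d. samples from a single bidder's value distribution and must choose a posted price (reserve) whose expected revenue is near-optimal. The sketch below is a toy empirical-revenue-maximization baseline under that assumption; `empirical_reserve` is an illustrative helper, not the strong-revenue-monotonicity framework this paper develops.

```python
import random


def empirical_reserve(samples):
    """Return the price maximizing empirical revenue p * Pr_hat[v >= p].

    Toy baseline for single-bidder posted pricing from samples; NOT the
    paper's algorithm. The empirical-optimal price is always attained at
    one of the sample values, so we only need to search over those.
    """
    n = len(samples)
    best_price, best_revenue = 0.0, 0.0
    for p in sorted(set(samples)):
        # Empirical revenue: price times the empirical sale probability.
        revenue = p * sum(1 for v in samples if v >= p) / n
        if revenue > best_revenue:
            best_price, best_revenue = p, revenue
    return best_price, best_revenue


# With values drawn from U[0, 1], the true optimal reserve is 0.5 with
# expected revenue 0.25; with enough samples the estimate lands nearby.
random.seed(0)
samples = [random.uniform(0.0, 1.0) for _ in range(2000)]
price, revenue = empirical_reserve(samples)
```

Sample complexity asks how large `samples` must be, as a function of the target accuracy ϵ and the assumed distribution family (e.g. regular or bounded-support), before such an estimate is guaranteed to be within ϵ of optimal; this paper pins down that dependence up to poly-logarithmic factors.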

