Complexity-Free Generalization via Distributionally Robust Optimization

06/21/2021
by   Henry Lam, et al.

Established approaches to obtaining generalization bounds in data-driven optimization and machine learning mostly build on solutions from empirical risk minimization (ERM), which depend crucially on the functional complexity of the hypothesis class. In this paper, we present an alternative route to obtaining these bounds, for solutions of distributionally robust optimization (DRO), a recent data-driven optimization framework based on worst-case analysis and the notion of an ambiguity set to capture statistical uncertainty. In contrast to the hypothesis class complexity in ERM, our DRO bounds depend on the ambiguity set geometry and its compatibility with the true loss function. Notably, when using maximum mean discrepancy as a DRO distance metric, our analysis implies, to the best of our knowledge, the first generalization bound in the literature that depends solely on the true loss function, entirely free of any complexity measures or bounds on the hypothesis class.
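The worst-case formulation the abstract refers to can be sketched in standard DRO notation (the symbols below are generic placeholders for illustration, not notation taken from the paper): given n samples forming the empirical distribution P̂_n, DRO replaces the ERM objective with a maximization over an ambiguity set of distributions near P̂_n,

```latex
\min_{\theta \in \Theta} \;
\max_{Q \,:\, d(Q, \hat{P}_n) \le \epsilon} \;
\mathbb{E}_{Z \sim Q}\!\left[\ell(\theta, Z)\right],
```

where d is a statistical distance defining the ambiguity set (e.g., the maximum mean discrepancy mentioned in the abstract) and ε controls the set's size.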


Related research

04/09/2019 · Hypothesis Set Stability and Generalization
We present an extensive study of generalization for data-dependent hypot...

06/24/2023 · Smoothed f-Divergence Distributionally Robust Optimization: Exponential Rate Efficiency and Complexity-Free Calibration
In data-driven optimization, sample average approximation is known to su...

03/16/2023 · Distributionally Robust Optimization using Cost-Aware Ambiguity Sets
We present a novel class of ambiguity sets for distributionally robust o...

06/07/2021 · Encoding-dependent generalization bounds for parametrized quantum circuits
A large body of recent work has begun to explore the potential of parame...

12/03/2022 · Hedging against Complexity: Distributionally Robust Optimization with Parametric Approximation
Empirical risk minimization (ERM) and distributionally robust optimizati...

05/30/2014 · Generalization Bounds for Learning with Linear, Polygonal, Quadratic and Conic Side Knowledge
In this paper, we consider a supervised learning setting where side know...

05/19/2012 · New Analysis and Algorithm for Learning with Drifting Distributions
We present a new analysis of the problem of learning with drifting distr...
