Multi-Observation Regression

02/27/2018
by Rafael Frongillo, et al.

Recent work introduced loss functions that measure the error of a prediction based on multiple simultaneous observations or outcomes. In this paper, we explore the theoretical and practical questions that arise when using such multi-observation losses for regression on data sets of (x,y) pairs. When a loss depends on only one observation, the average empirical loss decomposes by applying the loss to each pair, but for the multi-observation case, empirical loss is not even well-defined, and the possibility of statistical guarantees is unclear without several (x,y) pairs with exactly the same x value. We propose four algorithms formalizing the concept of empirical risk minimization for this problem, two of which have statistical guarantees in settings allowing both slow and fast convergence rates, but which are outperformed empirically by the other two. Empirical results demonstrate the practicality of these algorithms in low-dimensional settings, while lower bounds demonstrate intrinsic difficulty in higher dimensions. Finally, via both lower bounds and simulations, we demonstrate the potential benefit of the algorithms over natural baselines that use traditional single-observation losses.
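To make the contrast in the abstract concrete, here is a minimal sketch in our own notation (not necessarily the paper's). With a standard single-observation loss \ell, the empirical risk over a data set \{(x_i, y_i)\}_{i=1}^{n} decomposes into a sum over pairs,

    \hat{R}(f) = \frac{1}{n} \sum_{i=1}^{n} \ell\bigl( f(x_i),\, y_i \bigr),

whereas a multi-observation loss evaluates a single prediction against m simultaneous outcomes, \ell\bigl( f(x),\, y_1, \ldots, y_m \bigr), so the analogous average over the data is only directly defined when several observations share (at least approximately) the same x value.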
