Fast and Robust Rank Aggregation against Model Misspecification

05/29/2019
by Yuangang Pan, et al.

In rank aggregation, preferences from different users are summarized into a total order under the homogeneous data assumption. In practice, however, preference data rarely satisfy this assumption, so model misspecification arises, and existing rank aggregation methods compensate by incorporating noise models. These methods all rely on specific noise-model assumptions and therefore cannot handle the agnostic noise encountered in the real world. In this paper, we propose CoarsenRank, which rectifies the underlying data distribution directly and aligns it with the homogeneous data assumption without involving any noise model. To this end, we define a neighborhood of the data distribution over which Bayesian inference of CoarsenRank is performed, so the resultant posterior enjoys robustness against model misspecification. Further, we derive a tractable closed-form solution for CoarsenRank, making it computationally efficient. Experiments on real-world datasets show that CoarsenRank is fast and robust, achieving consistent improvements over baseline methods.
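The central idea above, performing inference over a neighborhood of the data distribution rather than fitting the observed data exactly, can be illustrated with a coarsened (tempered) posterior. The sketch below is a minimal, hypothetical illustration only: it assumes a Plackett-Luce preference model and a tempering weight `zeta` in (0, 1], neither of which is taken from the paper, and it is not CoarsenRank's closed-form solution. It simply shows how down-weighting a possibly misspecified likelihood yields an aggregate ranking that is less sensitive to noisy preferences.

```python
import numpy as np

# Illustrative sketch of a coarsened ("power") posterior for rank aggregation
# under a Plackett-Luce model. Not the paper's implementation; it only shows
# how tempering the likelihood by zeta in (0, 1] loosens the fit to possibly
# misspecified preference data.

def plackett_luce_loglik(scores, rankings):
    """Log-likelihood of full rankings under a Plackett-Luce model.

    scores   : (n_items,) latent utility scores
    rankings : list of rankings, each a sequence of item indices ordered
               from most to least preferred
    """
    ll = 0.0
    for ranking in rankings:
        remaining = np.array(ranking)
        for pos in range(len(remaining) - 1):
            s = scores[remaining[pos:]]
            # log-probability that the item at `pos` beats all remaining items
            ll += scores[remaining[pos]] - np.log(np.sum(np.exp(s)))
    return ll

def coarsened_log_posterior(scores, rankings, zeta, prior_scale=1.0):
    """Tempered log-posterior: zeta < 1 down-weights the likelihood,
    mimicking inference over a neighborhood of the empirical data
    distribution rather than the data itself."""
    log_prior = -0.5 * np.sum(scores ** 2) / prior_scale ** 2  # Gaussian prior
    return zeta * plackett_luce_loglik(scores, rankings) + log_prior

# Example: three items and a few (possibly noisy) rankings.
rankings = [[0, 1, 2], [0, 2, 1], [1, 0, 2]]
theta = np.zeros(3)

# Crude random-search MAP estimate, for illustration only.
rng = np.random.default_rng(0)
best = coarsened_log_posterior(theta, rankings, zeta=0.5)
for _ in range(2000):
    cand = theta + 0.1 * rng.standard_normal(3)
    val = coarsened_log_posterior(cand, rankings, zeta=0.5)
    if val > best:
        theta, best = cand, val

print("estimated item scores:", theta)
print("aggregated order (best to worst):", np.argsort(-theta))
```

Setting zeta = 1 recovers standard MAP inference on the observed rankings, while smaller values correspond to a looser fit, which is the intuition behind robustness to model misspecification.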


