
Aggregating Incomplete and Noisy Rankings
We consider the problem of learning the true ordering of a set of alternatives from largely incomplete and noisy rankings. We introduce a natural generalization of both the classical Mallows model of ranking distributions and the extensively studied model of noisy pairwise comparisons. Our selective Mallows model outputs a noisy ranking on any given subset of alternatives, based on an underlying Mallows distribution. Assuming a sequence of subsets in which each pair of alternatives appears frequently enough, we obtain strong, asymptotically tight upper and lower bounds on the sample complexity of learning the underlying complete ranking, as well as the identities and the ranking of the top-k alternatives, from selective Mallows rankings. Moreover, building on the work of (Braverman and Mossel, 2009), we show how to efficiently compute the maximum likelihood complete ranking from selective Mallows rankings.
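To make the selective Mallows model concrete, the sketch below samples from it under one plausible reading of the abstract: the ranking returned for a subset is an ordinary Mallows sample whose center is the restriction of the underlying complete ranking to that subset. The function names, the spread parameterization P(pi) proportional to exp(-beta * d_KT(pi, center)), and the use of the repeated insertion method are our assumptions for illustration, not details taken from the paper.

```python
import math
import random


def sample_mallows(center, beta, rng=random):
    """Sample a ranking from a Mallows distribution with the given
    center and spread beta, via the repeated insertion method (RIM).
    P(pi) is proportional to exp(-beta * kendall_tau(pi, center))."""
    phi = math.exp(-beta)
    ranking = []
    for i, item in enumerate(center):
        # Insert the i-th item of the center at position j (0-indexed,
        # j in 0..i) with probability proportional to phi ** (i - j).
        weights = [phi ** (i - j) for j in range(i + 1)]
        r = rng.random() * sum(weights)
        j, acc = 0, weights[0]
        while acc < r:
            j += 1
            acc += weights[j]
        ranking.insert(j, item)
    return ranking


def sample_selective_mallows(center, beta, subset, rng=random):
    """Selective Mallows sample on `subset` (assumed semantics):
    a Mallows ranking centered at the restriction of `center`
    to the requested subset of alternatives."""
    restricted = [x for x in center if x in subset]
    return sample_mallows(restricted, beta, rng)
```

For example, `sample_selective_mallows(list(range(6)), 0.5, {1, 3, 5})` returns a noisy ordering of just the alternatives 1, 3, 5; as beta grows, the sample concentrates on the center's restriction [1, 3, 5].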