Computational and Statistical Tradeoffs in Learning to Rank

08/22/2016
by Ashish Khetan, et al.
University of Illinois at Urbana-Champaign

For massive and heterogeneous modern datasets, it is of fundamental interest to provide guarantees on the accuracy of estimation when computational resources are limited. In the application of learning to rank, we provide a hierarchy of rank-breaking mechanisms ordered by the complexity of the sketch of the data they generate. This allows the number of data points collected to be gracefully traded off against the computational resources available, while guaranteeing the desired level of accuracy. Theoretical guarantees on the proposed generalized rank-breaking implicitly provide such trade-offs, which can be explicitly characterized under certain canonical scenarios on the structure of the data.
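
The central object of the abstract, rank-breaking, can be illustrated with a small sketch. The snippet below is only a conceptual illustration, not the paper's generalized rank-breaking estimator: the hypothetical helper `break_ranking` shows how a single full ranking can be broken into ordered sub-comparisons whose size is capped by a complexity parameter, so that a larger cap keeps more of the information in each ranking while raising the cost of the downstream computation, which is the data-versus-computation trade-off the paper quantifies.

```python
# Illustrative sketch of rank-breaking (assumed helper, not the paper's
# exact estimator): each full ranking is "broken" into small ordered
# comparisons whose size is capped by `max_order`. The cap trades the
# amount of information retained per ranking against the computational
# cost of processing the resulting sketch.

from itertools import combinations


def break_ranking(ranking, max_order):
    """Break one full ranking (best-to-worst list of item ids) into
    ordered sub-comparisons of size at most `max_order`.

    ranking   : list of item ids, e.g. [3, 0, 2, 1]
    max_order : size cap on each extracted comparison (2 = pairwise only)
    Returns a list of tuples, each a small ordered comparison that is
    consistent with the input ranking.
    """
    comparisons = []
    for size in range(2, max_order + 1):
        for positions in combinations(range(len(ranking)), size):
            comparisons.append(tuple(ranking[p] for p in positions))
    return comparisons


# Example: a single ranking over four items.
full = [3, 0, 2, 1]
print(len(break_ranking(full, 2)))  # 6  (pairwise sketch: cheap, loses higher-order information)
print(len(break_ranking(full, 3)))  # 10 (6 pairs + 4 triples: richer sketch, costlier to process)
```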


Related research

01/21/2016 · Data-driven Rank Breaking for Efficient Rank Aggregation
Rank aggregation systems collect ordinal preferences from individuals to...

11/23/2019 · Managing Collaboration in Heterogeneous Swarms of Robots with Blockchains
One of the key challenges in the collaboration within heterogeneous mult...

10/08/2021 · LCS: Learning Compressible Subspaces for Adaptive Network Compression at Inference Time
When deploying deep learning models to a device, it is traditionally ass...

03/05/2020 · Linear-Time Parameterized Algorithms with Limited Local Resources
We propose a new (theoretical) computational model for the study of mass...

03/31/2018 · Fundamental Resource Trade-offs for Encoded Distributed Optimization
Dealing with the sheer size and complexity of today's massive data sets ...

06/21/2019 · Trade-offs in Large-Scale Distributed Tuplewise Estimation and Learning
The development of cluster computing frameworks has allowed practitioner...

11/21/2017 · Generating Analytic Insights on Human Behaviour using Image Processing
This paper proposes a method to track human figures in physical spaces a...