Backward-Compatible Prediction Updates: A Probabilistic Approach

07/02/2021
by Frederik Träuble, et al.

When machine learning systems meet real-world applications, accuracy is only one of several requirements. In this paper, we assay a complementary perspective originating from the increasing availability of pre-trained and regularly improving state-of-the-art models. While new, improved models develop at a fast pace, downstream tasks vary more slowly or stay constant. Assume that we have a large unlabelled data set for which we want to maintain accurate predictions. Whenever a new and presumably better ML model becomes available, we encounter two problems: (i) given a limited budget, which data points should be re-evaluated using the new model?; and (ii) if the new predictions differ from the current ones, should we update? Problem (i) is about compute cost, which matters for very large data sets and models. Problem (ii) is about maintaining consistency of the predictions, which can be highly relevant for downstream applications; our demand is to avoid negative flips, i.e., changing correct to incorrect predictions. In this paper, we formalize the Prediction Update Problem and present an efficient probabilistic approach as an answer to the above questions. In extensive experiments on standard classification benchmark data sets, we show that our method outperforms alternative strategies along key metrics for backward-compatible prediction updates.
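The two sub-problems from the abstract can be illustrated with a minimal sketch. This is not the paper's probabilistic method; it is a hypothetical baseline, assuming softmax outputs from both models: (i) spend the re-evaluation budget on the points where the current model is least confident, and (ii) accept a new prediction only when the new model is more confident than the old one, as a crude guard against negative flips.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all numbers are illustrative): 100 unlabelled points,
# 5 classes, and a budget of 20 re-evaluations with the new model.
n_points, n_classes, budget = 100, 5, 20

# Softmax outputs of the currently deployed model (rows sum to 1).
old_probs = rng.dirichlet(np.ones(n_classes), size=n_points)
old_conf = old_probs.max(axis=1)
preds = old_probs.argmax(axis=1)

# Problem (i): given a limited budget, re-evaluate the points whose
# current prediction is least confident (lowest max class probability).
to_reeval = np.argsort(old_conf)[:budget]

# Query the new model only on the selected points (simulated here).
new_probs = rng.dirichlet(np.ones(n_classes), size=budget)

# Problem (ii): update a prediction only where the new model is more
# confident than the old one, to reduce the risk of negative flips.
accept = new_probs.max(axis=1) > old_conf[to_reeval]
preds[to_reeval[accept]] = new_probs.argmax(axis=1)[accept]
```

A smarter selection rule would weigh expected accuracy gain against the expected number of negative flips, which is where a probabilistic treatment of both models' reliabilities comes in.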

