Streamlining Evaluation with ir-measures

11/26/2021
by Sean MacAvaney et al.

We present ir-measures, a new tool that makes it convenient to calculate a diverse set of evaluation measures used in information retrieval. Rather than implementing its own measure calculations, ir-measures provides a common interface to a handful of evaluation tools. The necessary tools are automatically invoked (potentially multiple times) to calculate all the desired metrics, simplifying the evaluation process for the user. The tool also makes it easier for researchers to use recently-proposed measures (such as those from the C/W/L framework) alongside traditional measures, potentially encouraging their adoption.
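
As a concrete illustration of this common-interface idea, the minimal Python sketch below scores a TREC-format run with several measures through the pip-installable ir_measures package. The file names and the particular mix of measures are placeholders for illustration, not an example taken from the paper.

import ir_measures
from ir_measures import nDCG, AP, RR, P

# qrels and run in the standard TREC formats (paths are illustrative)
qrels = list(ir_measures.read_trec_qrels('qrels.trec'))
run = list(ir_measures.read_trec_run('run.trec'))

# One call computes every requested measure; ir-measures dispatches each
# measure to an appropriate backing evaluation tool behind the same interface.
print(ir_measures.calc_aggregate([nDCG@10, P@10, AP, RR], qrels, run))

# Per-query values (e.g., for significance testing)
for metric in ir_measures.iter_calc([nDCG@10], qrels, run):
    print(metric.query_id, metric.measure, metric.value)

Measure objects can also be parsed from their string names via ir_measures.parse_measure('nDCG@10'), which keeps experiment configuration files readable.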

Related research

03/28/2023 - A comment to "A General Theory of IR Evaluation Measures"
The paper "A General Theory of IR Evaluation Measures" develops a formal...

04/02/2023 - An Intrinsic Framework of Information Retrieval Evaluation Measures
Information retrieval (IR) evaluation measures are cornerstones for dete...

01/19/2022 - repro_eval: A Python Interface to Reproducibility Measures of System-oriented IR Experiments
In this work we introduce repro_eval - a tool for reactive reproducibili...

03/03/2021 - Simplified Data Wrangling with ir_datasets
Managing the data for Information Retrieval (IR) experiments can be chal...

01/31/2019 - An InfoVis Tool for Interactive Component-Based Evaluation
In this paper, we present an InfoVis tool based on Sankey diagrams for t...

12/22/2022 - Response to Moffat's Comment on "Towards Meaningful Statements in IR Evaluation: Mapping Evaluation Measures to Interval Scales"
Moffat recently commented on our previous work. Our work focused on how ...

11/01/2020 - Primer – A Tool for Testing Honeypot Measures of Effectiveness
Honeypots are a deceptive technology used to capture malicious activity....
