NeurIPS 2020 EfficientQA Competition: Systems, Analyses and Lessons Learned

by Sewon Min et al.

We review the EfficientQA competition from NeurIPS 2020. The competition focused on open-domain question answering (QA), where systems take natural language questions as input and return natural language answers. The aim of the competition was to build systems that can predict correct answers while also satisfying strict on-disk memory budgets. These budgets were designed to encourage contestants to explore the trade-off between storing large, redundant retrieval corpora and storing the parameters of large learned models. In this report, we describe the motivation and organization of the competition, review the best submissions, and analyze system predictions to inform a discussion of evaluation for open-domain QA.

