Decoder-only Large Language Models (LLMs) have demonstrated potential in...
Pretrained language models (PLMs) are today the primary model for natura...
Non-parametric, k-nearest-neighbor algorithms have recently made inroads...
The burgeoning progress in the field of Large Language Models (LLMs) her...
African languages are severely under-represented in NLP research due to ...
Multi-hop Question Answering (QA) is the task of finding the answer to a...
Access to external knowledge is essential for many natural language proc...
Query expansion is an effective approach for mitigating vocabulary misma...
Factorisation-based Models (FMs), such as DistMult, have enjoyed endurin...
Relation Extraction in the biomedical domain is challenging due to the l...
Natural language processing models often exploit spurious correlations b...
In Dynamic Adversarial Data Collection (DADC), human annotators are task...
Biological spiking neural networks (SNNs) can temporally encode informat...
Learning good representations on multi-relational graphs is essential to...
Research shows that natural language processing models are generally con...
Recent work on Open Domain Question Answering has shown that there is a ...
Adaptive Computation (AC) has been shown to be effective in improving th...
In this paper, we aim to improve abstractive dialogue summarization qual...
When primed with only a handful of training samples, very large pretrain...
Despite the availability of very large datasets and pretrained models, s...
Open-domain Question Answering models which directly leverage question-a...
We review the EfficientQA competition from NeurIPS 2020. The competition...
Most approaches to Open-Domain Question Answering consist of a light-wei...
Ideally, Open-Domain Question Answering models should exhibit a number of...
Attempts to render deep learning models interpretable, data-efficient, a...
Tracking progress in machine learning has become increasingly difficult ...
While recent efforts have shown that neural text processing models are v...
Machine reading comprehension (MRC) has received considerable attention ...
Current reading comprehension models generalise well to in-distribution ...
Innovations in annotation methodology have been a propellant for Reading...
Existing analysis work in machine reading comprehension (MRC) is largely...
Recent studies revealed that reading comprehension (RC) systems learn to...
We present an approach for automatic extraction of measured values from ...
Electronic health records (EHR) are increasingly being used for construc...
Like all sub-fields of machine learning, Bayesian Deep Learning is drive...
Grammatical error correction, like other machine learning tasks, greatly...
Many Machine Reading and Natural Language Understanding tasks require re...
We argue that extrapolation to examples outside the training space will ...
Most Reading Comprehension methods limit themselves to queries which can...
In this paper we present a novel Neural Network algorithm for conducting...
Multi-hop inference is necessary for machine learning systems to success...
In this work, we investigate several neural network architectures for fi...
In this work we propose a novel attention-based neural network model for...
We present a novel learning method for word embeddings designed for rela...
In this work, we present a novel neural network based architecture for i...