Event scenarios are often complex and involve multiple event sequences c...
Recent work has shown that large language models are capable of generati...
Knowledge about outcomes is critical for complex event understanding but...
Natural language inference (NLI) is critical for complex decision-making...
Anticipating future actions in a video is useful for many autonomous and...
Getting the most out of limited resources allows advances in natural lan...
The events in a narrative can be understood as a coherent whole via the ...
Question-answering datasets require a broad set of reasoning skills. We ...
NLP models learn sentence representations for downstream tasks by tuning...
Natural language is generated by people, yet traditional language modeli...
Can language models read biomedical texts and explain the biomedical mec...
Much of natural language processing is focused on leveraging large capac...
How can we generate concise explanations for multi-hop Reading Comprehen...
To build challenging multi-hop question answering datasets, we propose a...
Language understanding must identify the logical connections between eve...
Answering questions about why characters perform certain actions is cent...
How much information do NLP tasks really need from a transformer's atten...
Existing software-based energy measurements of NLP models are not accura...
We present BewQA, a system specifically designed to answer a class of qu...
A major challenge in fine-tuning deep learning models for automatic summ...
We introduce PerSenT, a dataset of crowd-sourced annotations of the sent...
Nearest neighbor search (NNS) has a wide range of applications in inform...
Accurate and reliable measurement of energy consumption is critical for ...
Preconditions provide a form of logical connection between events that e...
Predicting how events induce emotions in the characters of a story is ty...
The measurement of true progress in multihop question-answering has been...
Transformer-based QA models use input-wide self-attention – i.e. across ...
Early work on narrative modeling used explicit plans and goals to genera...
Multi-task learning (MTL) is a common paradigm that seeks to improve the...
Learning target side syntactic structure has been shown to improve Neura...
Question Answering (QA) naturally reduces to an entailment problem, name...
We introduce entity post-modifier generation as an instance of a collabo...
Scripts define knowledge about how everyday scenarios (such as going to ...
Predictive models over social media language have shown promise in captu...
Sentence encoders are typically trained on language modeling tasks which...
Seq2Seq based neural architectures have become the go-to architecture to...
Attention-based neural abstractive summarization systems equipped with c...
Robust and flexible event representations are important to many core are...
In this paper, we explore optimizations to run Recurrent Neural Network ...
Our goal is to answer elementary-level science questions using knowledge...