Large Language Models (LLMs) have revolutionized Natural Language Proces...
Recent works have introduced Abstract Meaning Representation (AMR) for D...
Large language models (LLMs) have a wealth of knowledge that allows them...
The Open-Domain Question Answering (ODQA) task involves retrieving and s...
Transformer models bring propelling advances in various NLP tasks, thus ...
Modern language models mostly take sub-words as input, a design that bal...
Interpreting the reasoning process from questions to answers poses a cha...
In recent years, there has been a surge of generation-based information extrac...
Transformers have made progress in miscellaneous tasks, but suffer from ...
Dialogue meaning representation formulates natural language utterance se...
Knowledge and expertise in the real world can be disjointedly owned. To ...
Named Entity Recognition (NER) is the task of identifying spans that rep...
Cycle-consistent training is widely used for jointly learning a forward ...
With the emerging branch of incorporating factual knowledge into pre-tra...
Two important tasks at the intersection of knowledge graphs and natural ...
Adversarial attacks for discrete data (such as text) have been proved sig...
In this paper, we introduce prior knowledge, multi-scale structure, ...
Sentences produced by abstractive summarization systems can be ungrammat...
The Transformer model is widely successful on many natural language proc...
Although the fully-connected attention-based model Transformer has achie...
Text generation is a fundamental building block in natural language proc...