Task-oriented semantic parsing models have achieved strong results in re...
Data efficiency, despite being an attractive characteristic, is often ch...
Modern task-oriented semantic parsing approaches typically use seq2seq t...
An effective recipe for building seq2seq, non-autoregressive, task-orien...
Task-oriented semantic parsing models typically have high resource requi...
Compressive summarization systems typically rely on a crafted set of syn...
An advantage of seq2seq abstractive summarization models is that they ge...
Task-oriented dialog models typically leverage complex neural architectu...
Natural disasters (e.g., hurricanes) affect millions of people each year...
Pre-trained Transformers are now ubiquitous in natural language processi...
The increasing computational and memory complexities of deep neural netw...
The Lottery Ticket Hypothesis suggests large, over-parameterized neural...
Insightful findings in political science often require researchers to an...