Pre-trained large language models (PLMs) underlie most new developments ...
Cross-lingual summarization consists of generating a summary in one lang...
Reliable automatic evaluation of summarization systems is challenging du...
While conditional generation models can now generate natural language we...
Modern deep models for summarization attain impressive benchmark perfor...
A typical product or place often has hundreds of reviews, and summarizat...
Abstractive summarization has enjoyed renewed interest in recent years, ...
We consider the problem of automatically generating stories in multiple ...
The availability of large, high-quality datasets has been one of the mai...
The ability to convey relevant and faithful information is critical for ...
Large language models have been shown to achieve remarkable performance ...
We propose Composition Sampling, a simple but effective method to genera...
The highly popular Transformer architecture, based on self-attention, is...
Professional summaries are written with document-level information, such...
Pre-trained transformer-based sequence-to-sequence models have become th...
We propose encoder-centric stepwise models for extractive summarization ...
It is well known that the standard likelihood training and approximate d...
The rise of neural networks, and particularly recurrent neural networks,...