Unsupervised Dual-Cascade Learning with Pseudo-Feedback Distillation for Query-based Extractive Summarization

11/01/2018
by   Haggai Roitman, et al.

We propose Dual-CES, a novel unsupervised, query-focused, multi-document extractive summarizer designed to better handle the tradeoff between saliency and focus in summarization. To this end, Dual-CES employs a two-step dual-cascade optimization approach with saliency-based pseudo-feedback distillation. Overall, Dual-CES significantly outperforms all other state-of-the-art unsupervised alternatives, and even outperforms strong supervised summarizers.
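The two-step idea in the abstract can be illustrated with a minimal sketch: a first, saliency-driven pass selects sentences whose terms are "distilled" into pseudo-feedback that expands the query, and a second, focus-driven pass re-ranks sentences against the expanded query. Everything below (the similarity measures, weights, and helper names such as `dual_cascade_summarize`) is an illustrative assumption, not the paper's actual optimization procedure.

```python
# Illustrative sketch of a dual-cascade summarizer with saliency-based
# pseudo-feedback. The scoring functions and weights are assumptions
# made for this example, not the method proposed in the paper.
from collections import Counter
import math


def tokens(text):
    # Crude tokenizer: lowercase, strip common punctuation.
    return [w.strip(".,;:!?").lower() for w in text.split()]


def cosine(a, b):
    # Cosine similarity between two term-frequency Counters.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0


def saliency_scores(sentences):
    # Saliency proxy: similarity of each sentence to the collection centroid.
    centroid = Counter()
    for s in sentences:
        centroid.update(tokens(s))
    return [cosine(Counter(tokens(s)), centroid) for s in sentences]


def dual_cascade_summarize(sentences, query,
                           k_feedback=2, k_summary=2, alpha=0.5):
    # Step 1: saliency-driven pass; its top sentences act as pseudo-feedback.
    sal = saliency_scores(sentences)
    top = sorted(range(len(sentences)),
                 key=lambda i: sal[i], reverse=True)[:k_feedback]
    # "Distill" pseudo-feedback: expand the query with salient-sentence terms.
    expanded = Counter(tokens(query))
    for i in top:
        expanded.update(tokens(sentences[i]))
    # Step 2: focus-driven pass; balance query focus against saliency.
    focus = [cosine(Counter(tokens(s)), expanded) for s in sentences]
    combined = [alpha * f + (1 - alpha) * s for f, s in zip(focus, sal)]
    order = sorted(range(len(sentences)),
                   key=lambda i: combined[i], reverse=True)
    # Return the selected sentences in original document order.
    return [sentences[i] for i in sorted(order[:k_summary])]
```

The two passes make the tradeoff explicit: `alpha` mixes query focus (from the expanded query) with raw saliency when producing the final extract.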


