Tag and Correct: Question aware Open Information Extraction with Two-stage Decoding

09/16/2020
by Martin Kuo, et al.

Question Aware Open Information Extraction (Question aware Open IE) takes a question and a passage as input and outputs an answer tuple consisting of a subject, a predicate, and one or more arguments. Each field of the answer is a natural-language word sequence extracted from the passage. Compared with a span answer, this semi-structured answer has two advantages: it is more readable and more falsifiable. There are two approaches to this problem. The first is the extractive method, which extracts candidate answers from the passage with an Open IE model and ranks them by matching against the question. It makes full use of the passage information at the extraction step, but the extraction itself is independent of the question. The second is the generative method, which uses a sequence-to-sequence model to generate answers directly. It takes the question and the passage as input simultaneously, but it generates the answer from scratch and ignores the fact that most answer words come from the passage. To let the passage guide generation, we present a two-stage decoding model consisting of a tagging decoder and a correction decoder. In the first stage, the tagging decoder tags keywords in the passage. In the second stage, the correction decoder generates the answer based on the tagged keywords. Although it has two stages, our model can be trained end to end. Compared with previous generative models, it produces better answers by generating from coarse to fine. We evaluate our model on WebAssertions (Yan et al., 2018), a Question aware Open IE dataset, and achieve a BLEU score of 59.32, outperforming previous generative methods.
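The tag-then-correct idea can be illustrated with a minimal sketch: a shared encoder reads the question and passage, a tagging head softly selects passage keywords, and a second decoder generates the answer conditioned on that selection, so gradients flow through both stages. The sketch below is an assumption-laden illustration, not the paper's implementation: the module names (TagAndCorrect), hyperparameters, LSTM backbone, and soft keyword weighting are all invented for clarity.

# Conceptual sketch of two-stage (tag-then-correct) decoding in PyTorch.
# Architecture details and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn


class TagAndCorrect(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared encoder over the concatenated question + passage tokens.
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Stage 1: tagging decoder scores each source token as keyword / not keyword.
        self.tagger = nn.Linear(2 * hid_dim, 2)
        # Stage 2: correction decoder generates the answer conditioned on a
        # keyword-weighted summary of the encoder states.
        self.decoder = nn.LSTMCell(emb_dim + 2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, input_ids, answer_ids):
        # input_ids: (batch, src_len) question + passage token ids
        # answer_ids: (batch, tgt_len) gold answer token ids (teacher forcing)
        enc, _ = self.encoder(self.embed(input_ids))           # (B, S, 2H)
        tag_logits = self.tagger(enc)                          # (B, S, 2)
        # Soft keyword mask: probability that each source token is a keyword.
        keyword_prob = tag_logits.softmax(-1)[..., 1:]         # (B, S, 1)
        context = (keyword_prob * enc).sum(1)                  # (B, 2H)

        batch = input_ids.size(0)
        h = torch.zeros(batch, self.decoder.hidden_size, device=input_ids.device)
        c = torch.zeros_like(h)
        logits = []
        for t in range(answer_ids.size(1) - 1):
            step_in = torch.cat([self.embed(answer_ids[:, t]), context], dim=-1)
            h, c = self.decoder(step_in, (h, c))
            logits.append(self.out(h))
        return tag_logits, torch.stack(logits, dim=1)          # (B,S,2), (B,T-1,V)


if __name__ == "__main__":
    model = TagAndCorrect(vocab_size=1000)
    src = torch.randint(0, 1000, (2, 20))
    tgt = torch.randint(0, 1000, (2, 8))
    tag_logits, gen_logits = model(src, tgt)
    print(tag_logits.shape, gen_logits.shape)

Because the keyword selection here is a soft weighting rather than a hard argmax, the tagging stage stays differentiable, so a combined tagging and generation loss trains both stages jointly, mirroring the end-to-end training property described in the abstract.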
