Building a Question and Answer System for the News Domain

05/12/2021
by Sandipan Basu, et al.

This project builds a Question-Answering system for the news domain, where the passages are news articles and anyone can pose a question against them. We built a span-based model with an attention mechanism, in which the model predicts the answer to a question as the positions of the start and end tokens within a paragraph. To train our model, we used the Stanford Question Answering Dataset (SQuAD 2.0) [1]. To do well on SQuAD 2.0, a system must not only answer questions when possible but also determine when no answer is supported by the paragraph and abstain from answering. Our model architecture comprises three layers: an Embedding Layer, an RNN Layer, and an Attention Layer. For the Embedding Layer, we used GloVe and the Universal Sentence Encoder. For the RNN Layer, we built several variations, including a bi-LSTM and a stacked LSTM. For the Attention Layer, we built a Context-to-Question attention mechanism and also adapted the Bidirectional Attention layer. Our best-performing model, which combines GloVe embeddings with a bi-LSTM and Context-to-Question attention, achieved an F1 score of 33.095 and an EM of 33.094. We also leveraged transfer learning and built a Transformer-based model using BERT, which achieved an F1 score of 57.513 and an EM of 49.769. We concluded that the BERT-based model is superior across all the question types we evaluated.
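To make the span-based architecture concrete, below is a minimal PyTorch sketch of a reader of this shape: an embedding layer (which would be initialized from pre-trained GloVe vectors in practice), a shared bi-LSTM encoder, and a dot-product Context-to-Question attention feeding start and end position logits. The class name, dimensions, and similarity function are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpanQAModel(nn.Module):
    """Sketch of a span-based reader: Embedding -> bi-LSTM -> C2Q attention."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
        super().__init__()
        # Embedding layer; in practice loaded from pre-trained GloVe vectors
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Shared bi-LSTM encoder for both the context and the question
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Classifiers over [context; question-aware context] features
        self.start_head = nn.Linear(4 * hidden_dim, 1)
        self.end_head = nn.Linear(4 * hidden_dim, 1)

    def forward(self, context_ids, question_ids):
        c, _ = self.encoder(self.embed(context_ids))    # (B, Tc, 2H)
        q, _ = self.encoder(self.embed(question_ids))   # (B, Tq, 2H)
        # Context-to-Question attention: each context token attends over
        # all question tokens via a dot-product similarity matrix.
        sim = torch.bmm(c, q.transpose(1, 2))           # (B, Tc, Tq)
        attn = F.softmax(sim, dim=-1)
        q_aware = torch.bmm(attn, q)                    # (B, Tc, 2H)
        features = torch.cat([c, q_aware], dim=-1)      # (B, Tc, 4H)
        # Per-position logits for the answer span's start and end tokens
        start_logits = self.start_head(features).squeeze(-1)
        end_logits = self.end_head(features).squeeze(-1)
        return start_logits, end_logits
```

Training such a model would minimize the cross-entropy between these two sets of logits and the gold start/end indices from SQuAD 2.0.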

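The BERT variant follows the standard extractive-QA recipe: the question and passage are fed as a single sequence, and the model reads off per-token start and end logits. The sketch below uses the Hugging Face transformers API; the bert-base-uncased checkpoint, question, and context are illustrative assumptions, and the model would need fine-tuning on SQuAD 2.0 (including its no-answer examples, conventionally scored against the [CLS] position) before its predictions are meaningful.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Assumed checkpoint; fine-tune on SQuAD 2.0 before real use.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

question = "Who wrote the article?"
context = "The news article was written by a staff reporter in 2021."

# Question and passage are packed into one input sequence.
inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions for the answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start:end + 1])
print(answer)
```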