Question Classification with Deep Contextualized Transformer

10/17/2019
by Haozheng Luo, et al.

The latest work on question-and-answer problems uses the Stanford parse tree. We build on this prior work and develop a new method that handles the question-and-answer problem with a deep contextualized transformer to manage some aberrant expressions. We also conduct extensive evaluations on the SQuAD and SwDA datasets and show significant improvement in QA problem classification for industry needs. We further investigate the impact of different models on the accuracy and efficiency of the answers. The results show that our new method is more effective at solving QA problems, with higher accuracy.
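
The abstract gives no implementation details, so the following is a minimal, hypothetical sketch of transformer-based question classification, assuming the HuggingFace transformers library and torch. The checkpoint name and label set are illustrative, not the authors' actual configuration; the classification head would need to be fine-tuned on labeled questions (for example, labels derived from SQuAD or SwDA) before its predictions are meaningful.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Hypothetical coarse question classes; the paper's exact label set may differ.
LABELS = ["ABBREVIATION", "ENTITY", "DESCRIPTION", "HUMAN", "LOCATION", "NUMERIC"]

# Load a pretrained contextualized encoder with a fresh classification head.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)
model.eval()

def classify_question(question: str) -> str:
    # Tokenize the question and run a single forward pass without gradients.
    inputs = tokenizer(question, return_tensors="pt", truncation=True, max_length=64)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Pick the highest-scoring class label.
    return LABELS[int(logits.argmax(dim=-1))]

if __name__ == "__main__":
    print(classify_question("Who wrote the Iliad?"))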
