Comparative Study of Machine Learning Models and BERT on SQuAD

05/22/2020
by Devshree Patel, et al.

This study provides a comparative analysis of the performance of several popular machine learning models and the BERT model on the Stanford Question Answering Dataset (SQuAD). The analysis shows that BERT, which was once state-of-the-art on SQuAD, achieves higher accuracy than the other models. However, BERT requires considerably longer execution time, even when only 100 samples are used: the gain in accuracy comes at the cost of additional training time. The preliminary machine learning models, by contrast, process the full dataset in less time, but their accuracy is lower.
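For context on how "accuracy" is typically measured on SQuAD: systems are usually scored with exact-match and token-level F1 over normalized answers. Below is a minimal sketch of those two standard metrics (this is an illustration of the conventional SQuAD scoring scheme, not the study's own evaluation code, which the abstract does not show):

```python
import re
import string
from collections import Counter

def normalize_answer(s):
    """Lowercase, strip punctuation and articles, collapse whitespace (SQuAD convention)."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in string.punctuation)
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction, ground_truth):
    """1/0 score: do the normalized strings match exactly?"""
    return normalize_answer(prediction) == normalize_answer(ground_truth)

def f1_score(prediction, ground_truth):
    """Token-overlap F1 between normalized prediction and reference."""
    pred_tokens = normalize_answer(prediction).split()
    gt_tokens = normalize_answer(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gt_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gt_tokens)
    return 2 * precision * recall / (precision + recall)
```

On the full dataset, both metrics are averaged over all questions, taking the maximum score over the provided ground-truth answers for each question.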
