A Syntax Aware BERT for Identifying Well-Formed Queries in a Curriculum Framework

08/21/2022
by   Avinash Madasu, et al.

A well-formed query is defined as a query that is formulated in the manner of an inquiry, with correct interrogatives, spelling and grammar. While identifying well-formed queries is an important task, few works have attempted to address it. In this paper we propose applying a transformer-based language model, Bidirectional Encoder Representations from Transformers (BERT), to this task. We further imbue BERT with part-of-speech information, inspired by earlier works. Furthermore, we train the model in multiple curriculum settings to improve performance, experimenting with the Baby Steps and One Pass curriculum learning techniques. The proposed architecture performs exceedingly well on the task. The best approach achieves an accuracy of 83.93, outperforming the previous state-of-the-art of 75.0 and approaching the approximate human upper bound of 88.4.
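The sketch below illustrates the general idea of fine-tuning BERT as a binary well-formedness classifier under a Baby Steps curriculum; it is a minimal, hypothetical example, not the paper's exact recipe. The toy queries, the length-based difficulty proxy, and the hyperparameters are assumptions, and the paper's part-of-speech augmentation of BERT is not reproduced here.

    # Hypothetical sketch: BERT fine-tuning for query well-formedness with a
    # Baby Steps curriculum. Toy data and the length-based difficulty proxy
    # are illustrative assumptions, not the paper's setup.
    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from transformers import BertTokenizerFast, BertForSequenceClassification

    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2  # well-formed vs. not well-formed
    ).to(device)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    # Toy examples standing in for a query well-formedness dataset.
    queries = [
        "what is the capital of france",
        "capital france what",
        "how many moons does jupiter have",
        "jupiter moons number",
    ]
    labels = [1, 0, 1, 0]  # 1 = well-formed, 0 = not well-formed

    # Difficulty proxy (assumption): shorter queries are treated as easier.
    order = sorted(range(len(queries)), key=lambda i: len(queries[i].split()))
    n_buckets = 2
    bucket_size = (len(order) + n_buckets - 1) // n_buckets
    buckets = [order[i:i + bucket_size] for i in range(0, len(order), bucket_size)]

    model.train()
    seen = []
    for stage, bucket in enumerate(buckets, start=1):
        # Baby Steps: each stage trains on the new bucket plus all easier data.
        seen.extend(bucket)
        enc = tokenizer([queries[i] for i in seen], padding=True,
                        truncation=True, return_tensors="pt")
        ds = TensorDataset(enc["input_ids"], enc["attention_mask"],
                           torch.tensor([labels[i] for i in seen]))
        for epoch in range(2):  # a couple of passes per curriculum stage
            for input_ids, attention_mask, y in DataLoader(ds, batch_size=2, shuffle=True):
                out = model(input_ids=input_ids.to(device),
                            attention_mask=attention_mask.to(device),
                            labels=y.to(device))
                out.loss.backward()
                optimizer.step()
                optimizer.zero_grad()
        print(f"finished curriculum stage {stage} on {len(seen)} examples")

Under the One Pass variant, each difficulty bucket would instead be visited exactly once in order of increasing difficulty, without re-training on earlier buckets, whereas Baby Steps accumulates them as shown above.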
