A Syntax Aware BERT for Identifying Well-Formed Queries in a Curriculum Framework

08/21/2022
by Avinash Madasu, et al.

A well-formed query is defined as a query formulated in the manner of an inquiry, with correct interrogatives, spelling, and grammar. While identifying well-formed queries is an important task, few works have attempted to address it. In this paper we apply a transformer-based language model, Bidirectional Encoder Representations from Transformers (BERT), to this task. Inspired by earlier works, we further imbue BERT with part-of-speech information. We also train the model in multiple curriculum settings to improve performance, experimenting with the Baby Steps and One Pass curriculum learning techniques. The proposed architecture performs exceedingly well on the task: the best approach achieves an accuracy of 83.93, outperforming the previous state of the art of 75.0 and approaching the approximate human upper bound of 88.4.
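The abstract names two concrete techniques: injecting part-of-speech (POS) information into BERT and training under a Baby Steps curriculum. The following is a minimal PyTorch sketch of how such a setup could look, not the authors' implementation: the fusion strategy (concatenating an averaged POS-tag embedding with BERT's pooled output) and the `difficulty` scores driving the curriculum are assumptions, since the abstract does not specify either.

```python
# Hypothetical sketch: POS-augmented BERT classifier trained with Baby Steps.
# The POS fusion method and difficulty scores are assumptions, not the paper's.
import torch
import torch.nn as nn
from transformers import BertModel

class SyntaxAwareBert(nn.Module):
    def __init__(self, num_pos_tags: int, pos_dim: int = 32):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.pos_emb = nn.Embedding(num_pos_tags, pos_dim)
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden + pos_dim, 2)  # well-formed vs. not

    def forward(self, input_ids, attention_mask, pos_ids):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.pooler_output                   # [batch, hidden]
        pos = self.pos_emb(pos_ids).mean(dim=1)      # average POS embeddings
        return self.classifier(torch.cat([pooled, pos], dim=-1))

def baby_steps(model, batches, difficulty, stages=4, epochs_per_stage=1):
    """Baby Steps: rank batches easy-to-hard, train on a growing prefix."""
    ranked = [b for _, b in sorted(zip(difficulty, batches), key=lambda x: x[0])]
    opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
    loss_fn = nn.CrossEntropyLoss()
    for stage in range(1, stages + 1):
        # Cumulative subset: each stage re-includes all easier batches.
        subset = ranked[: len(ranked) * stage // stages]
        for _ in range(epochs_per_stage):
            for ids, mask, pos_ids, labels in subset:
                opt.zero_grad()
                loss_fn(model(ids, mask, pos_ids), labels).backward()
                opt.step()
```

By contrast, a One Pass curriculum would partition the ranked batches into disjoint buckets and train on each bucket exactly once in order, without revisiting earlier (easier) data.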
