Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge

08/09/2016
by   Nicholas Locascio, et al.

This paper explores the task of translating natural language queries into regular expressions that embody their meaning. In contrast to prior work, the proposed neural model does not rely on domain-specific crafting, learning to translate directly from a parallel corpus. To fully explore the potential of neural models, the authors propose a methodology for collecting a large corpus of (regular expression, natural language) pairs. The resulting model achieves a performance gain of 19.6% over previous state-of-the-art models.
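The task described above, mapping a natural-language description to a regular expression, hinges on the fact that two syntactically different regexes can be semantically identical (the paper evaluates candidates by DFA equivalence rather than string match). A simplified stand-in for that check is to compare the full-match behavior of a predicted regex and a reference regex on sample strings. The phrases, regexes, and helper below are illustrative assumptions, not drawn from the paper's actual corpus or code:

```python
import re

# Illustrative (natural language, regex) pair in the spirit of the
# parallel corpus the abstract describes; these are assumptions,
# not examples from the actual dataset.
reference = r"[a-z]+"        # "lines containing only lowercase letters"
predicted = r"[a-z][a-z]*"   # a syntactically different but equivalent model output

def behaviorally_equal(rx1, rx2, samples):
    """Approximate semantic equivalence: identical full-match
    behavior on a set of sample strings (a weaker, cheaper proxy
    for the DFA-equivalence check used in the paper)."""
    p1, p2 = re.compile(rx1), re.compile(rx2)
    return all(
        (p1.fullmatch(s) is None) == (p2.fullmatch(s) is None)
        for s in samples
    )

samples = ["abc", "ABC", "a1b", "", "hello"]
print(behaviorally_equal(reference, predicted, samples))  # True
```

Because this check only samples strings, it can report equivalence for regexes that differ on untested inputs; full DFA equivalence avoids that by comparing the minimized automata directly.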


Related research

- Sketch-Driven Regular Expression Generation from Natural Language and Examples (08/16/2019): Recent systems for converting natural language descriptions into regular...
- Learning from Uncurated Regular Expressions (06/14/2022): Significant work has been done on learning regular expressions from a se...
- Exploiting Domain Knowledge via Grouped Weight Sharing with Application to Text Categorization (02/08/2017): A fundamental advantage of neural models for NLP is their ability to lea...
- As Cool as a Cucumber: Towards a Corpus of Contemporary Similes in Serbian (05/20/2016): Similes are natural language expressions used to compare unlikely things...
- Idiomatic Expression Identification using Semantic Compatibility (10/19/2021): Idiomatic expressions are an integral part of natural language and const...
- Reducing a Set of Regular Expressions and Analyzing Differences of Domain-specific Statistic Reporting (11/24/2022): Due to the large amount of daily scientific publications, it is impossib...
- A Neural Model for Regular Grammar Induction (09/23/2022): Grammatical inference is a classical problem in computational learning t...
