Neural Bi-Lexicalized PCFG Induction

05/31/2021
by Songlin Yang et al.

Neural lexicalized PCFGs (L-PCFGs) have been shown to be effective in grammar induction. However, to reduce computational complexity, they make a strong independence assumption on the generation of the child word, thereby ignoring bilexical dependencies. In this paper, we propose an approach to parameterizing L-PCFGs without making such implausible independence assumptions. Our approach directly models bilexical dependencies while also reducing both the learning and representation complexities of L-PCFGs. Experimental results on the English WSJ dataset confirm the effectiveness of our approach in improving both running speed and unsupervised parsing performance.
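To make the independence assumption concrete: in a lexicalized PCFG, a binary rule has the form A[h] -> B[h] C[h'], where the parent's head word h is inherited by one child and the other child introduces a new head word h'. The factorization below is a minimal sketch of the issue the abstract describes; the exact conditioning sets are an assumption here, not taken from this page. The full rule probability contains a bilexical factor coupling h and h',

$$
p\bigl(A[h] \to B[h]\;C[h']\bigr) \;=\; p(B, C \mid A, h)\;\cdot\; p\bigl(h' \mid A, B, C, h\bigr),
$$

and the independence assumption made by earlier neural L-PCFGs for tractability amounts to generating the child word from the child nonterminal alone,

$$
p\bigl(h' \mid A, B, C, h\bigr) \;\approx\; p\bigl(h' \mid C\bigr),
$$

which drops the interaction between the head word h and the child word h'. Directly modeling that interaction is what "bilexical dependencies" refers to above.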

Related research

06/24/2019
Compound Probabilistic Context-Free Grammars for Grammar Induction
We study a formalization of the grammar induction problem that models se...

07/29/2020
The Return of Lexical Dependencies: Neural Lexicalized PCFGs
In this paper we demonstrate that context free grammar (CFG) based metho...

05/24/2017
Matroids Hitting Sets and Unsupervised Dependency Grammar Induction
This paper formulates a novel problem on graphs: find the minimal subset...

09/10/2018
Depth-bounding is effective: Improvements and evaluation of unsupervised PCFG induction
There have been several recent attempts to improve the accuracy of gramm...

04/28/2021
PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols
Probabilistic context-free grammars (PCFGs) with neural parameterization...

09/23/2022
A Neural Model for Regular Grammar Induction
Grammatical inference is a classical problem in computational learning t...

10/15/2020
Montague Grammar Induction
We propose a computational modeling framework for inducing combinatory c...
