Neural Bi-Lexicalized PCFG Induction

05/31/2021
by Songlin Yang, et al.

Neural lexicalized PCFGs (L-PCFGs) have been shown to be effective in grammar induction. However, to reduce computational complexity, they make a strong independence assumption on the generation of the child word, thereby ignoring bilexical dependencies. In this paper, we propose an approach to parameterizing L-PCFGs without making implausible independence assumptions. Our approach directly models bilexical dependencies while also reducing the learning and representation complexities of L-PCFGs. Experimental results on the English WSJ dataset confirm the effectiveness of our approach in improving both running speed and unsupervised parsing performance.
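
The abstract's central claim, modeling bilexical dependencies without paying the full vocabulary-squared cost, is typically realized with a low-rank factorization of the head-word/child-word interaction. The sketch below is illustrative only and is not the paper's actual parameterization: it scores p(child word | head word) through a rank-r bottleneck so that memory and compute scale as O(|V|·r) rather than O(|V|²). All names here (U, W, bilexical_logits, the sizes V and r) are hypothetical.

```python
import numpy as np

# Hypothetical sizes: vocabulary V and a small rank r << V.
V, r = 10_000, 64
rng = np.random.default_rng(0)

# Low-rank factors standing in for neural parameterizations of
# head-word and child-word representations (hypothetical names).
U = rng.normal(size=(V, r))  # head-word factors, shape (V, r)
W = rng.normal(size=(V, r))  # child-word factors, shape (V, r)

def bilexical_logits(head_ids):
    """Scores over all child words for each head word.

    U[head_ids] @ W.T costs O(len(head_ids) * V * r) and never
    materializes the full V x V bilexical table, which is the
    point of the low-rank decomposition.
    """
    return U[head_ids] @ W.T  # shape: (len(head_ids), V)

def child_word_probs(head_ids):
    """Normalize scores into p(child word | head word) per head."""
    logits = bilexical_logits(head_ids)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

probs = child_word_probs(np.array([3, 17]))
assert np.allclose(probs.sum(axis=1), 1.0)
```

In a full L-PCFG the factors U and W would be produced by neural networks and combined with nonterminal-rule scores inside the inside algorithm; the point of the sketch is only that the V × V bilexical table never has to be stored explicitly.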

Related research

06/24/2019 · Compound Probabilistic Context-Free Grammars for Grammar Induction
We study a formalization of the grammar induction problem that models se...

07/29/2020 · The Return of Lexical Dependencies: Neural Lexicalized PCFGs
In this paper we demonstrate that context free grammar (CFG) based metho...

05/24/2017 · Matroids, Hitting Sets, and Unsupervised Dependency Grammar Induction
This paper formulates a novel problem on graphs: find the minimal subset...

04/28/2021 · PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols
Probabilistic context-free grammars (PCFGs) with neural parameterization...

09/10/2018 · Depth-bounding is effective: Improvements and evaluation of unsupervised PCFG induction
There have been several recent attempts to improve the accuracy of gramm...

09/23/2022 · A Neural Model for Regular Grammar Induction
Grammatical inference is a classical problem in computational learning t...

03/10/2022 · Realizing Implicit Computational Complexity
This abstract aims at presenting an ongoing effort to apply a novel typi...