
PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols

04/28/2021
by   Songlin Yang, et al.

Probabilistic context-free grammars (PCFGs) with neural parameterization have been shown to be effective in unsupervised phrase-structure grammar induction. However, due to the cubic computational complexity of PCFG representation and parsing, previous approaches cannot scale up to a relatively large number of (nonterminal and preterminal) symbols. In this work, we present a new parameterization form of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the symbol number and therefore allows us to use a much larger number of symbols. We further use neural parameterization for the new form to improve unsupervised parsing performance. We evaluate our model across ten languages and empirically demonstrate the effectiveness of using more symbols. Our code: https://github.com/sustcsonglin/TN-PCFG
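To make the complexity claim concrete, the sketch below contrasts the standard inside recursion, which contracts a full binary-rule tensor T[a, b, c] at cubic cost in the number of symbols m, with the same recursion routed through a rank-r CP decomposition T[a, b, c] ≈ Σ_r U[a, r] V[b, r] W[c, r], whose per-span cost is O(m·r) and therefore at most quadratic in m when r = O(m). This is an illustrative toy over unnormalized scores, not the paper's implementation; the names U, V, W, and the factor shapes are assumptions, and the neural parameterization is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
m, r, n = 8, 4, 5  # symbols, tensor rank, sentence length (toy sizes)

# Hypothetical low-rank factors standing in for the binary-rule tensor:
# T[a, b, c] ~= sum_r U[a, r] * V[b, r] * W[c, r]  (CP decomposition)
U, V, W = (rng.random((m, r)) for _ in range(3))
T_full = np.einsum('ar,br,cr->abc', U, V, W)  # materialized only to check

def inside_full(term, T):
    """Naive inside pass: each span contracts T, O(m^3) per split point."""
    n = term.shape[0]
    s = np.zeros((n, n + 1, m))
    for i in range(n):
        s[i, i + 1] = term[i]                  # preterminal scores
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):          # split point
                s[i, j] += np.einsum('abc,b,c->a', T, s[i, k], s[k, j])
    return s

def inside_lowrank(term, U, V, W):
    """Same recursion through the factors: O(m * r) per split point."""
    n = term.shape[0]
    s = np.zeros((n, n + 1, m))
    for i in range(n):
        s[i, i + 1] = term[i]
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                left = s[i, k] @ V             # project children into rank space
                right = s[k, j] @ W
                s[i, j] += U @ (left * right)  # recombine, never forming T
    return s

term = rng.random((n, m))                      # toy preterminal scores
dense = inside_full(term, T_full)
lowrank = inside_lowrank(term, U, V, W)
assert np.allclose(dense, lowrank)             # identical span scores
```

The two passes agree exactly because the low-rank path just reassociates the contraction Σ_{b,c} T[a,b,c]·s_left[b]·s_right[c] through the factors, which is where the saving in the symbol dimension comes from.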


Related research

- 12/18/2022: Unsupervised Discontinuous Constituency Parsing with Mildly Context-Sensitive Grammars. "We study grammar induction with mildly context-sensitive grammars for un..."
- 07/29/2020: The Return of Lexical Dependencies: Neural Lexicalized PCFGs. "In this paper we demonstrate that context free grammar (CFG) based metho..."
- 10/30/2020: Lake symbols for island parsing. "Context: An island parser reads an input text and builds the parse (or a..."
- 05/31/2021: Neural Bi-Lexicalized PCFG Induction. "Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar ..."
- 07/20/2018: On Euclidean Methods for Cubic and Quartic Jacobi Symbols. "We study the bit complexity of two methods, related to the Euclidean alg..."
- 03/03/2021: An Empirical Study of Compound PCFGs. "Compound probabilistic context-free grammars (C-PCFGs) have recently est..."
- 04/23/2013: Learning Visual Symbols for Parsing Human Poses in Images. "Parsing human poses in images is fundamental in extracting critical visu..."