A logical word embedding for learning grammar

04/28/2023
by Sean Deyo, et al.

We introduce the logical grammar embedding (LGE), a model inspired by pregroup grammars and categorial grammars that enables unsupervised inference of lexical categories and syntactic rules from a corpus of text. LGE produces comprehensible output summarizing its inferences, has a completely transparent process for producing novel sentences, and can learn from as few as a hundred sentences.
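For readers unfamiliar with the formalisms named above, the sketch below illustrates the pregroup-grammar reduction that inspires LGE: each word carries a type built from basic types and their adjoints, and a word sequence is grammatical when adjacent adjoint pairs contract away, leaving only the sentence type s. This is a minimal illustration of the underlying grammar formalism, not the authors' LGE implementation; the lexicon and function names are hypothetical.

# A simple type is a pair (base, k): k = 0 is the base type, k = -1 its
# left adjoint (x^l), k = +1 its right adjoint (x^r).
# Contraction rule: (a, k)(a, k+1) -> empty, i.e. x^l x -> 1 and x x^r -> 1.

# Hypothetical toy lexicon, chosen only to illustrate the mechanism.
LEXICON = {
    "the":   [("n", 0), ("n", -1)],   # determiner: n n^l
    "dog":   [("n", 0)],              # noun: n
    "barks": [("n", 1), ("s", 0)],    # intransitive verb: n^r s
}

def reduces_to_sentence(words):
    """Greedy left-to-right contraction with a stack; returns True if the
    concatenated word types reduce to the single sentence type s."""
    stack = []
    for word in words:
        for base, k in LEXICON[word]:
            if stack and stack[-1] == (base, k - 1):
                stack.pop()            # contraction (a, k-1)(a, k) -> empty
            else:
                stack.append((base, k))
    return stack == [("s", 0)]

print(reduces_to_sentence(["the", "dog", "barks"]))   # True
print(reduces_to_sentence(["dog", "the", "barks"]))   # False

The greedy stack suffices for this toy lexicon; LGE's contribution is inferring the type assignments themselves from raw text rather than fixing them by hand.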

Related research

Learning grammar with a divide-and-concur neural network (01/18/2022)
We implement a divide-and-concur iterative projection approach to contex...

A CCG-based Compositional Semantics and Inference System for Comparatives (10/02/2019)
Comparative constructions play an important role in natural language inf...

Paracompositionality, MWEs and Argument Substitution (05/22/2018)
Multi-word expressions, verb-particle constructions, idiomatically combi...

Constructive Type-Logical Supertagging with Self-Attention Networks (05/31/2019)
We propose a novel application of self-attention networks towards gramma...

Logical Inferences with Comparatives and Generalized Quantifiers (05/16/2020)
Comparative constructions pose a challenge in Natural Language Inference...

Generative Statistical Models with Self-Emergent Grammar of Chord Sequences (08/07/2017)
Generative statistical models of chord sequences play crucial roles in m...

Dual Mechanism Priming Effects in Hindi Word Order (10/25/2022)
Word order choices during sentence production can be primed by preceding...
