Expectation-based Minimalist Grammars

09/28/2021
by Cristiano Chesi, et al.

Expectation-based Minimalist Grammars (e-MGs) are simplified versions of the (Conflated) Minimalist Grammars, (C)MGs, formalized by Stabler (Stabler, 1997, 2011, 2013), and of Phase-based Minimalist Grammars, PMGs (Chesi, 2005, 2007; Stabler, 2011). The crucial simplification consists in driving structure building solely by lexically encoded categorial top-down expectations. The commitment to a top-down derivation (as in e-MGs and PMGs, as opposed to (C)MGs, Chomsky, 1995; Stabler, 2011) allows us to define a core derivation that should be the same in both parsing and generation (Momma & Phillips, 2018).
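The idea of driving structure building purely by lexically encoded categorial expectations can be pictured with a small sketch. The Python snippet below is a hypothetical toy illustration, not the paper's formalism: the `Item` class, the toy lexicon, and the `derive` function are assumptions of this sketch. Each lexical item realizes a category and lists the categories it expects; the derivation expands pending expectations top-down, left to right, until every expectation is satisfied by a lexical item.

```python
# Hypothetical toy sketch of an expectation-driven, top-down derivation.
# The lexicon, item structure, and expansion strategy are illustrative
# assumptions, not the e-MG definitions given in the paper.

from dataclasses import dataclass, field


@dataclass
class Item:
    form: str                                       # phonological form ("" for silent heads)
    category: str                                   # category the item realizes
    expects: list = field(default_factory=list)     # categories the item expects next


# Toy lexicon: the silent C head expects T; T expects a subject D and a V;
# V expects an object D; D expects an N; N items expect nothing.
LEXICON = {
    "C": [Item("", "C", ["T"])],
    "T": [Item("", "T", ["D", "V"])],
    "V": [Item("sees", "V", ["D"])],
    "D": [Item("the", "D", ["N"])],
    "N": [Item("cat", "N", []), Item("dog", "N", [])],
}


def derive(root="C", max_items=10):
    """Expand categorial expectations top-down, left to right,
    collecting the forms of the lexical items that satisfy them."""
    pending, output = [root], []
    while pending and len(output) < max_items:
        expected = pending.pop(0)        # leftmost pending expectation
        item = LEXICON[expected][0]      # pick a satisfying lexical item (first match)
        if item.form:
            output.append(item.form)
        # the chosen item's own expectations become the next pending ones
        pending = item.expects + pending
    return output


print(" ".join(derive()))                # prints "the cat sees the cat"
```

Because lexical choice here is deterministic (always the first matching item), the same expectation-expansion loop can be read as a generator; a parser would instead use the input string to decide which lexical item satisfies each pending expectation, leaving the derivation steps themselves unchanged.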
