Expectation-based Minimalist Grammars

by Cristiano Chesi et al.

Expectation-based Minimalist Grammars (e-MGs) are simplified versions of the (Conflated) Minimalist Grammars, (C)MGs, formalized by Stabler (Stabler, 1997, 2011, 2013), and of Phase-based Minimalist Grammars, PMGs (Chesi, 2005, 2007; Stabler, 2011). The crucial simplification consists of driving structure building solely by lexically encoded categorial top-down expectations. The commitment to a top-down derivation (as in e-MGs and PMGs, as opposed to (C)MGs; Chomsky, 1995; Stabler, 2011) allows us to define a core derivation that should be the same in both parsing and generation (Momma & Phillips, 2018).
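The core idea, that each lexical item encodes the categories it expects and that the derivation proceeds top-down by satisfying the most recent pending expectation, can be sketched roughly as follows. This is only an illustrative toy, not the authors' e-MG formalism: the lexicon, the category labels, and the null complementizer `C0` are invented for the example.

```python
# Toy sketch of an expectation-driven top-down derivation.
# Lexicon, categories, and the null complementizer C0 are illustrative
# assumptions, not the e-MG formalism itself.

LEXICON = {
    "C0":    {"cat": "C", "expects": ["D", "V"]},  # clause: subject, then verb
    "the":   {"cat": "D", "expects": ["N"]},       # determiner expects a noun
    "dog":   {"cat": "N", "expects": []},
    "barks": {"cat": "V", "expects": []},
}

def derive(words):
    """Succeed iff the words, consumed left to right, each satisfy the
    most recently introduced pending categorial expectation."""
    stack = ["C"]  # the root expectation
    for w in words:
        if not stack:
            return False                 # a word with nothing left to expect
        item = LEXICON[w]
        if item["cat"] != stack.pop():
            return False                 # category mismatch
        # new expectations are satisfied first (depth-first expansion)
        stack.extend(reversed(item["expects"]))
    return not stack                     # all expectations must be discharged

print(derive(["C0", "the", "dog", "barks"]))  # True
print(derive(["C0", "the", "barks"]))         # False
```

Because the same expectation stack is consulted whether words are being recognized or emitted, a loop of this shape can serve both parsing and generation, which is the symmetry the abstract alludes to.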



