Expectation-based Minimalist Grammars

09/28/2021
by Cristiano Chesi et al.

Expectation-based Minimalist Grammars (e-MGs) are simplified versions of the (Conflated) Minimalist Grammars, (C)MGs, formalized by Stabler (1997, 2011, 2013), and of Phase-based Minimalist Grammars, PMGs (Chesi, 2005, 2007; Stabler, 2011). The crucial simplification consists in driving structure building solely through lexically encoded categorial top-down expectations. The commitment to a top-down derivation (as in e-MGs and PMGs, as opposed to (C)MGs; Chomsky, 1995; Stabler, 2011) allows us to define a core derivation that should be the same in both parsing and generation (Momma & Phillips, 2018).
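The full formalization is in the paper, but the core intuition of expectation-driven structure building can be illustrated with a toy sketch. Below is a minimal Python sketch, assuming a simplified encoding in which each lexical item carries a category plus an ordered list of the categories it expects, and a single LIFO stack of pending expectations controls the left-to-right, top-down derivation. The lexicon, category labels, and control structure here are illustrative assumptions, not the paper's exact formalism (which, among other things, also handles movement).

```python
from dataclasses import dataclass

@dataclass
class LexItem:
    phon: str               # phonetic form
    cat: str                # category this item realizes
    expects: tuple = ()     # categories this item expects, in order

# Hypothetical toy lexicon for the fragment "eats the cat".
LEXICON = [
    LexItem("eats", "V", ("D",)),  # verb expecting a determiner phrase
    LexItem("the", "D", ("N",)),   # determiner expecting a noun
    LexItem("cat", "N"),           # noun with no further expectations
]

def derive(words, root="V"):
    """Consume `words` left to right, discharging pending categorial
    expectations top-down (last expectation introduced, first satisfied)."""
    expectations = [root]          # the derivation starts from a root expectation
    for w in words:
        if not expectations:
            return False           # a word remains but nothing is expected
        expected = expectations.pop()
        item = next((li for li in LEXICON
                     if li.phon == w and li.cat == expected), None)
        if item is None:
            return False           # the word cannot satisfy the expectation
        expectations.extend(reversed(item.expects))  # its expectations become new goals
    return not expectations        # success iff every expectation was discharged

print(derive(["eats", "the", "cat"]))  # True under this toy encoding
print(derive(["the", "cat"]))          # False: the root V expectation is unmet
```

Read one way, the same expectation-discharging loop checks a given word sequence (parsing); read the other way, it enumerates words that satisfy the next pending expectation (generation). This is, informally, the sense in which a top-down core derivation can be shared between the two tasks.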

