
Equiprobable mappings in weighted constraint grammars

07/12/2019
by   Arto Anttila, et al.

We show that MaxEnt (Maximum Entropy grammar) is so rich that it can distinguish between any two different mappings: there always exists a nonnegative constraint weight vector that assigns them different MaxEnt probabilities. Stochastic Harmonic Grammar (HG), by contrast, does admit equiprobable mappings, and we give a complete formal characterization of them. We compare these different predictions of the two frameworks on a test case of Finnish stress.
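The claim rests on the standard MaxEnt definition, under which a candidate mapping's probability is proportional to the exponential of its negated weighted constraint-violation sum. The sketch below illustrates this computation; the constraint count, violation profiles, and candidate names are made up for illustration and are not from the paper.

```python
import math

def maxent_probs(violations, weights):
    """MaxEnt probability: P(y) is proportional to exp(-sum_k w_k * C_k(y)),
    where C_k(y) is candidate y's violation count for constraint k."""
    scores = {y: math.exp(-sum(w * c for w, c in zip(weights, profile)))
              for y, profile in violations.items()}
    z = sum(scores.values())  # normalization constant over the candidate set
    return {y: s / z for y, s in scores.items()}

# Two hypothetical mappings with different violation profiles
# over two constraints.
violations = {"mapping_a": [1, 0], "mapping_b": [0, 1]}
weights = [2.0, 1.0]  # a nonnegative weight vector

probs = maxent_probs(violations, weights)
```

With these (illustrative) unequal weights, the two mappings receive different probabilities, in line with the paper's general result that distinct mappings can always be told apart by some nonnegative weight vector.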

