
Learning with Partially Ordered Representations

06/19/2019
by Jane Chandlee, et al.
Stony Brook University
Haverford College
lis-lab.fr
Rutgers University

This paper examines the characterization and learning of grammars defined with enriched representational models. Model-theoretic approaches to formal language theory traditionally assume that each position in a string belongs to exactly one unary relation. We consider unconventional string models in which positions can have multiple, shared properties, which are arguably useful in many applications. We show that the structures given by these models are partially ordered, and we present a learning algorithm that exploits this ordering relation to effectively prune the hypothesis space. We prove that this learning algorithm, which takes positive examples as input, finds the most general grammar that covers the data.
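To illustrate the kind of learner the abstract describes, below is a minimal, hypothetical sketch in Python of a positive-data learner over a partially ordered space of "factors" (small structures whose positions carry sets of properties). It is not the paper's algorithm: the representation (tuples of property sets), the subsumption test, the enumeration order, and the toy consonant/vowel sample are all assumptions made for illustration. The sketch forbids the most general factors that never appear in the sample and prunes any candidate already covered by a forbidden factor, which mirrors how a partial order can cut down the hypothesis space.

# Hypothetical sketch of a positive-data learner over a partially ordered
# space of factors; NOT the paper's exact algorithm. A word or factor is a
# tuple of frozensets of properties; factor g is at least as general as f
# when g matches inside f position-by-position as a subset of properties.

from itertools import combinations, product

def occurs_in(general, specific):
    # True if some contiguous span of `specific` satisfies `general`,
    # i.e. every required property set is a subset of the span's sets.
    k = len(general)
    return any(
        all(general[i] <= specific[j + i] for i in range(k))
        for j in range(len(specific) - k + 1)
    )

def attested(factor, sample):
    # A factor is attested if it occurs in some positive example.
    return any(occurs_in(factor, word) for word in sample)

def learn(sample, properties, max_len):
    # Scan candidate factors roughly from most to least general; keep the
    # unattested ones as forbidden, and prune any candidate that a kept
    # (more general) forbidden factor already covers.
    position_sets = [frozenset(c)
                     for n in range(1, len(properties) + 1)
                     for c in combinations(sorted(properties), n)]
    grammar = []
    for length in range(1, max_len + 1):
        for factor in product(position_sets, repeat=length):
            if any(occurs_in(g, factor) for g in grammar):
                continue                 # redundant: already forbidden
            if not attested(factor, sample):
                grammar.append(factor)   # never seen in the data: forbid it
    return grammar

# Toy usage with two properties, C(onsonant) and V(owel); each position of a
# word is the set of properties it has.
sample = [
    (frozenset({"C"}), frozenset({"V"}), frozenset({"C"}), frozenset({"V"})),
    (frozenset({"V"}), frozenset({"C"}), frozenset({"V"})),
]
for forbidden in learn(sample, {"C", "V"}, max_len=2):
    print([sorted(p) for p in forbidden])

On this toy sample the sketch forbids, for example, consonant-consonant and vowel-vowel sequences, since no positive example contains them, while more specific factors are pruned rather than stored.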

