
Learning with Partially Ordered Representations

by Jane Chandlee, et al.
Stony Brook University
Haverford College
Rutgers University

This paper examines the characterization and learning of grammars defined with enriched representational models. Model-theoretic approaches to formal language theory traditionally assume that each position in a string belongs to exactly one unary relation. We consider unconventional string models in which positions can have multiple, shared properties, which are arguably useful in many applications. We show that the structures given by these models are partially ordered, and we present a learning algorithm that exploits this ordering relation to effectively prune the hypothesis space. We prove that this learning algorithm, which takes positive examples as input, finds the most general grammar that covers the data.
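The pruning idea described in the abstract can be illustrated with a small sketch. The following Python code is an assumption-laden simplification, not the paper's actual algorithm: positions are modeled as feature sets, a k-factor is a sequence of feature sets ordered by pointwise subset containment, and a bottom-up search returns the most general factors absent from the positive data, pruning every specialization of a factor already known to be forbidden. The feature names and data are hypothetical.

```python
FEATURES = ["voiced", "nasal", "vowel"]  # hypothetical property inventory

def windows(word, k):
    """All contiguous k-position windows of a word (a tuple of frozensets)."""
    return [word[i:i + k] for i in range(len(word) - k + 1)]

def matches(factor, window):
    """A factor matches a window if each factor position's features are
    contained in the corresponding window position. Containment is the
    partial order that makes the pruning below sound."""
    return all(f <= w for f, w in zip(factor, window))

def occurs(factor, data, k):
    """Does the factor match some window of some word in the data?"""
    return any(matches(factor, w) for word in data for w in windows(word, k))

def specializations(factor):
    """Immediate successors in the partial order: add one feature to
    one position, yielding strictly less general factors."""
    for i, pos in enumerate(factor):
        for feat in FEATURES:
            if feat not in pos:
                yield factor[:i] + (pos | {feat},) + factor[i + 1:]

def learn(data, k):
    """Breadth-first search from the most general k-factor downward.
    A factor absent from the data becomes a forbidden constraint, and
    all of its specializations are pruned; a factor that occurs is
    refined instead. A sketch of the pruning idea only."""
    grammar, seen = [], set()
    frontier = [tuple(frozenset() for _ in range(k))]  # most general factor
    while frontier:
        factor = frontier.pop(0)
        # skip duplicates and anything subsumed by a learned constraint
        if factor in seen or any(matches(g, factor) for g in grammar):
            continue
        seen.add(factor)
        if not occurs(factor, data, k):
            grammar.append(factor)          # forbidden: prune below it
        else:
            frontier.extend(specializations(factor))
    return grammar
```

On positive data in which, say, no position is simultaneously a vowel and voiced, the learner returns the factor banning that co-occurrence rather than any of its more specific variants, which is the "most general grammar covering the data" behavior the paper proves for its algorithm.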



