Learning with Partially Ordered Representations

06/19/2019
by Jane Chandlee, et al.

This paper examines the characterization and learning of grammars defined with enriched representational models. Model-theoretic approaches to formal language theory traditionally assume that each position in a string belongs to exactly one unary relation. We consider unconventional string models in which positions can have multiple, shared properties, which are arguably useful in many applications. We show that the structures given by these models are partially ordered, and we present a learning algorithm that exploits this ordering relation to effectively prune the hypothesis space. We prove that this learning algorithm, which takes positive examples as input, finds the most general grammar that covers the data.
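The pruning idea described in the abstract can be illustrated with a toy sketch. This is not the paper's algorithm, and the property names (`vowel`, `high`) and function names are made up for illustration: each position is a set of unary properties, a factor is more general than another when its property sets are pointwise contained in the other's, and once a general factor is found to be forbidden by the positive data, all of its specializations can be pruned from the search.

```python
from itertools import product, combinations

# Hypothetical property inventory; positions are frozensets of
# properties, so one position may satisfy several at once.
PROPS = ("vowel", "high")

def subsumes(gen, spec):
    """gen is at least as general as spec: same length, and each of
    gen's property sets is contained in the corresponding one of spec."""
    return len(gen) == len(spec) and all(g <= s for g, s in zip(gen, spec))

def observed_factors(data, k):
    """All contiguous length-k factors occurring in the positive data."""
    out = set()
    for word in data:
        for i in range(len(word) - k + 1):
            out.add(tuple(word[i:i + k]))
    return out

def matched(candidate, observed):
    """A candidate factor is matched if it subsumes some observed factor."""
    return any(subsumes(candidate, o) for o in observed)

def minimal_forbidden(data, k, props=PROPS):
    """Enumerate candidate k-factors from most general to most specific;
    return the minimal unmatched (forbidden) ones. Specializations of an
    already-forbidden factor are pruned via the subsumption order."""
    obs = observed_factors(data, k)
    # All property subsets a position could have, most general first.
    cells = sorted((frozenset(c) for r in range(len(props) + 1)
                    for c in combinations(props, r)), key=len)
    grammar = []
    for cand in sorted(product(cells, repeat=k),
                       key=lambda f: sum(len(p) for p in f)):
        if any(subsumes(g, cand) for g in grammar):
            continue  # pruned: a more general forbidden factor covers it
        if not matched(cand, obs):
            grammar.append(cand)
    return grammar
```

For example, on a single word whose positions are `∅`, `{vowel}`, `∅` (no `high` position ever occurs), `minimal_forbidden(data, 1)` returns only the factor `({high},)`: the most general constraint consistent with the data, with its specialization `({vowel, high},)` pruned rather than checked.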
