Entropy Bounds for Grammar-Based Tree Compressors

01/10/2019
by Danny Hucke, et al.

The definition of k^th-order empirical entropy of strings is extended to node-labelled binary trees. A suitable binary encoding of tree straight-line programs (which have previously been used for grammar-based tree compression) is shown to yield binary tree encodings whose size is bounded by the k^th-order empirical entropy plus lower-order terms. This generalizes recent results for grammar-based string compression to grammar-based tree compression.
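For readers unfamiliar with the string-side notion being extended, the sketch below computes the standard k^th-order empirical entropy H_k(w) of a string w: for each length-k context u, the symbols following occurrences of u are collected, and H_k(w) is the length-weighted average of the zeroth-order entropies of these distributions. The function names and example string are illustrative and not from the paper; the paper's contribution is to replace string contexts with contexts in a node-labelled binary tree.

```python
from collections import Counter, defaultdict
from math import log2


def empirical_entropy_0(counts):
    """Zeroth-order empirical entropy (bits per symbol) of a frequency table."""
    n = sum(counts.values())
    if n == 0:
        return 0.0
    return sum(c * log2(n / c) for c in counts.values()) / n


def empirical_entropy_k(w, k):
    """k-th order empirical entropy H_k(w) of a string w.

    For every length-k context u occurring in w, collect the symbols that
    follow u; H_k(w) is the weighted average of the zeroth-order entropies
    of these follower distributions, normalized by |w|.
    """
    if k == 0:
        return empirical_entropy_0(Counter(w))
    followers = defaultdict(Counter)
    for i in range(len(w) - k):
        context = w[i:i + k]
        followers[context][w[i + k]] += 1
    return sum(sum(c.values()) * empirical_entropy_0(c)
               for c in followers.values()) / len(w)


# A highly repetitive string has small higher-order entropy:
w = "abababababababab"
print(empirical_entropy_k(w, 0))  # 1.0 bit per symbol
print(empirical_entropy_k(w, 1))  # 0.0: each symbol is determined by its predecessor
```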
