The Next 700 Program Transformers

08/25/2021
by Geoffrey Hamilton, et al.

In this paper, we describe a hierarchy of program transformers in which the transformer at each level of the hierarchy builds on top of those at lower levels. The program transformer at level 1 of the hierarchy corresponds to positive supercompilation, and that at level 2 corresponds to distillation. We prove that the transformers at each level terminate. We then consider the speedups that can be obtained at each level in the hierarchy, and try to characterise the improvements that can be made.
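To make the notion of speedups at different levels concrete, the following is a minimal Haskell sketch, not taken from the paper: it contrasts naive list reversal with the accumulator-passing version that a level-2 (distillation-like) transformer can derive. Positive supercompilation at level 1 is generally limited to constant-factor speedups, whereas this kind of rewrite is a superlinear improvement. The function names nrev, rev, and go are illustrative.

    -- Illustrative sketch (not from the paper) of the improvements the
    -- abstract refers to.

    -- Quadratic-time naive reverse: each step re-traverses the result with (++).
    nrev :: [a] -> [a]
    nrev []       = []
    nrev (x : xs) = nrev xs ++ [x]

    -- Linear-time accumulator version: the kind of program a distillation-level
    -- transformer can produce from nrev by eliminating the repeated appends.
    rev :: [a] -> [a]
    rev xs = go xs []
      where
        go []       acc = acc
        go (y : ys) acc = go ys (y : acc)

    main :: IO ()
    main = print (nrev [1 .. 10 :: Int], rev [1 .. 10 :: Int])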

