Dataflow Matrix Machines and V-values: a Bridge between Programs and Neural Nets

12/20/2017
by Michael Bukatin, et al.

Dataflow matrix machines generalize neural nets by replacing streams of numbers with streams of vectors (or other kinds of linear streams admitting a notion of linear combination of several streams) and adding a few more changes on top of that, namely arbitrary input and output arities for activation functions, countable-sized networks with a finite, dynamically changeable active part capable of unbounded growth, and a very expressive self-referential mechanism.

While recurrent neural networks are Turing-complete, they form an esoteric programming platform, not conducive to practical general-purpose programming. Dataflow matrix machines are more suitable as a general-purpose programming platform, although it remains to be seen whether this platform can be made fully competitive with more traditional programming platforms currently in use. At the same time, dataflow matrix machines retain the key property of recurrent neural networks: programs are expressed via matrices of real numbers, and continuous changes to those matrices produce arbitrarily small variations in the programs associated with those matrices.

Spaces of vector-like elements are of particular importance in this context. In particular, we focus on the vector space V of finite linear combinations of strings, which can also be understood as the vector space of finite prefix trees with numerical leaves, the vector space of "mixed rank tensors", or the vector space of recurrent maps. This space, and a family of spaces of vector-like elements derived from it, are sufficiently expressive to cover all cases of interest we are currently aware of, and allow a compact and streamlined version of dataflow matrix machines based on a single space of vector-like elements and variadic neurons. We call elements of these spaces V-values. Their role in our context is somewhat similar to the role of S-expressions in Lisp.
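The abstract describes V-values as finite prefix trees with numerical leaves, forming a vector space. The following is a minimal sketch of that idea, assuming a nested-dict encoding (string keys label tree edges, numbers sit at the leaves); all names here are illustrative and not taken from any implementation accompanying the paper.

```python
from numbers import Number

def scale(v, a):
    """Scalar multiple a * v of a V-value (nested dict with numeric leaves)."""
    if isinstance(v, Number):
        return a * v
    return {key: scale(subtree, a) for key, subtree in v.items()}

def add(v, w):
    """Sum of two V-values; positions shared by both trees must agree in kind
    (leaf with leaf, subtree with subtree)."""
    if isinstance(v, Number) and isinstance(w, Number):
        return v + w
    out = dict(v)
    for key, subtree in w.items():
        out[key] = add(out[key], subtree) if key in out else subtree
    return out

def linear_combination(coeffs, values):
    """Finite linear combination of V-values, the operation a matrix of real
    coefficients applies when mixing the output streams of neurons."""
    total = scale(values[0], coeffs[0])
    for a, v in zip(coeffs[1:], values[1:]):
        total = add(total, scale(v, a))
    return total

# Example: 1.0*"xy" + 2.0*"z" as a prefix tree, combined with 3.0*"z".
v = {"x": {"y": 1.0}, "z": 2.0}
w = {"z": 3.0}
print(linear_combination([2.0, -1.0], [v, w]))  # {'x': {'y': 2.0}, 'z': 1.0}
```

Because linear combinations like this are well defined, streams of V-values are linear streams in the sense of the abstract, and continuous changes to the coefficient matrix yield continuous changes in the combined streams.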

