The Space of Functions Computed By Deep Layered Machines

04/19/2020
by Alexander Mozeika et al.

We study the space of Boolean functions computed by random layered machines, including deep neural networks and Boolean circuits. Investigating recurrent and layered feed-forward architectures, we find that both realize the same space of functions. We show that, depending on the initial conditions and the computing elements used, the entropy of the Boolean functions computed by deep layered machines either increases or decreases monotonically with growing depth, and we characterize the space of functions computed in the large-depth limit.
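
To make the central quantity concrete, the following is a minimal numerical sketch, not the paper's analytic method: it samples toy layered machines of width 4 built from random sign gates (an illustrative choice of computing element), records the Boolean truth table each machine realizes, and estimates the entropy of the resulting distribution over functions at several depths. All names (sample_machine, evaluate, function_entropy) and parameters are hypothetical.

import numpy as np
from collections import Counter
from itertools import product

rng = np.random.default_rng(0)

def sample_machine(width, depth):
    # One random layered machine = one fixed list of Gaussian weight matrices.
    return [rng.standard_normal((width, width)) for _ in range(depth)]

def evaluate(machine, x):
    # Propagate a +/-1 input vector through the machine's sign-gate layers.
    s = np.asarray(x, dtype=float)
    for W in machine:
        s = np.sign(W @ s)
        s[s == 0] = 1.0  # break ties so every gate output stays Boolean
    return int(s[0])     # read the computed function off a single output node

def function_entropy(width, depth, n_machines=2000):
    # Entropy (in bits) of the empirical distribution over computed functions.
    inputs = list(product([-1, 1], repeat=width))
    counts = Counter()
    for _ in range(n_machines):
        machine = sample_machine(width, depth)
        truth_table = tuple(evaluate(machine, x) for x in inputs)
        counts[truth_table] += 1
    p = np.array(list(counts.values()), dtype=float) / n_machines
    return float(-(p * np.log2(p)).sum())

for depth in (1, 2, 4, 8):
    print(f"depth {depth}: entropy ~ {function_entropy(4, depth):.2f} bits")

Tracking how the printed entropy changes with depth in such a toy model mirrors, in spirit, the monotonic growth or decay of function-space entropy that the abstract describes; the actual paper characterizes this behavior analytically rather than by sampling.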
