Deep neural networks as nested dynamical systems

11/01/2021
by David I. Spivak et al.

There is an analogy that is often made between deep neural networks and actual brains, suggested by the nomenclature itself: the "neurons" in deep neural networks should correspond to neurons (or nerve cells, to avoid confusion) in the brain. We claim, however, that this analogy doesn't even type check: it is structurally flawed. In agreement with the slightly glib summary of Hebbian learning as "cells that fire together wire together", this article makes the case that the analogy should be different. Since the "neurons" in deep neural networks manage the changing weights, they are more akin to the synapses in the brain; instead, it is the wires in deep neural networks that are more like nerve cells, in that they are what cause the information to flow. The intuition that nerve cells seem like more than mere wires is exactly right, and is justified by a precise category-theoretic analogy which we will explore in this article. Throughout, we will continue to highlight the error in equating artificial neurons with nerve cells by leaving "neuron" in quotes or by calling them artificial neurons. We will first explain how to view deep neural networks as nested dynamical systems with a very restricted sort of interaction pattern, and then explain a more general sort of interaction for dynamical systems that is useful throughout engineering, but which fails to adapt to changing circumstances. As mentioned, an analogy is then forced upon us by the mathematical formalism in which they are both embedded. We call the resulting encompassing generalization deeply interacting learning systems: they have complex interaction as in control theory, but adaptation to circumstances as in deep neural networks.
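To make the "nested dynamical systems" reading concrete, the following is a minimal Python sketch of one way to read a feedforward layer as a dynamical system. It is our own illustration, not the paper's category-theoretic construction: each system carries an internal state together with an update rule and a readout; a dense layer is the special case whose state is its weight matrix (the "synapses"), while the readout is the information that flows along the wires; and wiring two systems in series is the restricted interaction pattern of a feedforward network. The names DynSys, series, and layer, and the toy Hebbian-style update rule, are hypothetical choices made for this example.

import numpy as np

class DynSys:
    """Discrete-time dynamical system: an internal state, an update rule, and a readout."""
    def __init__(self, state, update, readout):
        self.state = state      # e.g. the weight matrix of a layer
        self.update = update    # update(state, x) -> new state (adaptation)
        self.readout = readout  # readout(state, x) -> output (information flow along the wires)

    def step(self, x):
        y = self.readout(self.state, x)           # signal passes through
        self.state = self.update(self.state, x)   # state (weights) adapts
        return y

def series(f, g):
    """Wire the output of f into the input of g: a very restricted interaction pattern."""
    def update(state, x):
        sf, sg = state
        y = f.readout(sf, x)
        return (f.update(sf, x), g.update(sg, y))
    def readout(state, x):
        sf, sg = state
        return g.readout(sg, f.readout(sf, x))
    return DynSys((f.state, g.state), update, readout)

def layer(n_in, n_out, lr=0.01, rng=np.random.default_rng(0)):
    """A toy dense layer: its state is the weight matrix, updated by a Hebbian-style rule."""
    W = rng.standard_normal((n_out, n_in)) * 0.1
    readout = lambda W, x: np.tanh(W @ x)
    update = lambda W, x: W + lr * np.outer(np.tanh(W @ x), x)  # "fire together, wire together"
    return DynSys(W, update, readout)

net = series(layer(4, 8), layer(8, 2))   # a two-layer network as a nested dynamical system
x = np.ones(4)
for _ in range(3):
    print(net.step(x))   # the outputs drift as the internal weights adapt

Nothing in this sketch depends on the series wiring being a simple chain; replacing series with a more general interconnection (feedback loops, shared signals) while keeping the adaptive state updates is, informally, the direction the article's deeply interacting learning systems take.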


