Learners' languages

03/01/2021
by David I. Spivak, et al.

In "Backprop as functor", the authors show that the fundamental elements of deep learning – gradient descent and backpropagation – can be conceptualized as a strong monoidal functor 𝐏𝐚𝐫𝐚(𝐄𝐮𝐜)→𝐋𝐞𝐚𝐫𝐧 from the category of parameterized Euclidean spaces to that of learners, a category developed explicitly to capture parameter update and backpropagation. It was soon realized that there is an isomorphism 𝐋𝐞𝐚𝐫𝐧≅𝐏𝐚𝐫𝐚(𝐒𝐋𝐞𝐧𝐬), where 𝐒𝐋𝐞𝐧𝐬 is the symmetric monoidal category of simple lenses as used in functional programming. In this note, we observe that 𝐒𝐋𝐞𝐧𝐬 is a full subcategory of 𝐏𝐨𝐥𝐲, the category of polynomial functors in one variable, via the functor A↦ Ay^A. Using the fact that (𝐏𝐨𝐥𝐲,⊗) is monoidal closed, we show that a map A→ B in 𝐏𝐚𝐫𝐚(𝐒𝐋𝐞𝐧𝐬) has a natural interpretation in terms of dynamical systems (more precisely, generalized Moore machines) whose interface is the internal-hom type [Ay^A,By^B]. Finally, we review the fact that the category p-𝐂𝐨𝐚𝐥𝐠 of dynamical systems on any p∈𝐏𝐨𝐥𝐲 forms a topos, and consider the logical propositions that can be stated in its internal language. We give gradient descent as an example, and we conclude by discussing some directions for future work.
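To make the lens picture concrete, here is a minimal sketch (not from the paper; the names `compose` and `fst_lens` are illustrative) of a simple lens A → B as a pair of functions, get : A → B and put : A × B → A. Under the embedding A ↦ Ay^A, such a pair is exactly a map Ay^A → By^B in 𝐏𝐨𝐥𝐲: get acts on positions, put on directions.

```python
def compose(l1, l2):
    """Compose two simple lenses, each given as a (get, put) pair.

    get reads the focus; put writes an updated focus back into the source.
    Composition runs get forward and threads put backward, mirroring how
    forward passes and backpropagation compose in the category of learners.
    """
    get1, put1 = l1
    get2, put2 = l2
    get = lambda a: get2(get1(a))
    put = lambda a, c: put1(a, put2(get1(a), c))
    return (get, put)

# Example lens: focus on the first component of a pair.
# get projects out the first component; put replaces it, keeping the rest.
fst_lens = (lambda ab: ab[0], lambda ab, a: (a, ab[1]))
```

For instance, composing `fst_lens` with itself focuses the innermost first component of a nested pair: reading `((1, 2), 3)` yields `1`, and writing `9` back yields `((9, 2), 3)`.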


