Learners' languages
In "Backprop as functor", the authors show that the fundamental elements of deep learning – gradient descent and backpropagation – can be conceptualized as a strong monoidal functor 𝐏𝐚𝐫𝐚(𝐄𝐮𝐜) → 𝐋𝐞𝐚𝐫𝐧 from the category of parameterized Euclidean spaces to that of learners, a category developed explicitly to capture parameter update and backpropagation. It was soon realized that there is an isomorphism 𝐋𝐞𝐚𝐫𝐧 ≅ 𝐏𝐚𝐫𝐚(𝐒𝐋𝐞𝐧𝐬), where 𝐒𝐋𝐞𝐧𝐬 is the symmetric monoidal category of simple lenses as used in functional programming. In this note, we observe that 𝐒𝐋𝐞𝐧𝐬 is a full subcategory of 𝐏𝐨𝐥𝐲, the category of polynomial functors in one variable, via the functor A ↦ Ay^A. Using the fact that (𝐏𝐨𝐥𝐲, ⊗) is monoidal closed, we show that a map A → B in 𝐏𝐚𝐫𝐚(𝐒𝐋𝐞𝐧𝐬) has a natural interpretation in terms of dynamical systems (more precisely, generalized Moore machines) whose interface is the internal-hom type [Ay^A, By^B]. Finally, we review the fact that the category p-𝐂𝐨𝐚𝐥𝐠 of dynamical systems on any p ∈ 𝐏𝐨𝐥𝐲 forms a topos, and consider the logical propositions that can be stated in its internal language. We give gradient descent as an example, and we conclude by discussing some directions for future work.
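The simple lenses mentioned in the abstract can be made concrete with a short sketch. A simple lens from Ay^A to By^B is a pair of maps: a forward "get" A → B and a backward "put" A × B → A, and lenses compose by running forward passes left-to-right and threading updates back in reverse. The snippet below is an illustrative toy, not taken from the paper: the class name `SimpleLens` and the one-parameter linear model used to mimic a gradient-descent step are our own hypothetical choices.

```python
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

A = TypeVar("A")
B = TypeVar("B")

@dataclass
class SimpleLens(Generic[A, B]):
    """A simple lens Ay^A -> By^B: a forward pass and a backward pass."""
    get: Callable[[A], B]        # forward: observe B from A
    put: Callable[[A, B], A]     # backward: update A given feedback in B

    def then(self, other: "SimpleLens") -> "SimpleLens":
        # Composition of lenses: forward passes compose directly;
        # the backward pass first pushes feedback through `other`
        # (at the intermediate point self.get(a)), then through `self`.
        return SimpleLens(
            get=lambda a: other.get(self.get(a)),
            put=lambda a, c: self.put(a, other.put(self.get(a), c)),
        )

# Toy gradient-descent flavor (hypothetical example): the state A is a
# single parameter p of a linear model p * x on a fixed input x; the
# backward pass receives dLoss/dprediction and applies the chain rule.
x, lr = 2.0, 0.1
step = SimpleLens(
    get=lambda p: p * x,                 # prediction
    put=lambda p, g: p - lr * g * x,     # p - lr * dLoss/dp
)

pred = step.get(1.5)          # prediction 3.0
p_new = step.put(1.5, -2.0)   # one update step with dLoss/dpred = -2.0
```

This is only meant to make the phrase "simple lenses as used in functional programming" tangible; the paper's point is that such a lens is exactly a map Ay^A → By^B of polynomial functors.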