Holomorphic feedforward networks

05/09/2021
by Michael R. Douglas, et al.

The feedforward neural network (FFN) is a very popular model in machine learning: it can approximate general functions and mitigate the curse of dimensionality. Here we introduce FFNs which represent sections of holomorphic line bundles on complex manifolds, and ask some questions about their approximating power. We also explain formal similarities between the standard approach to supervised learning and the problem of finding numerical Ricci flat Kähler metrics, which allow ideas to be carried between the two problems.
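As a point of reference for the abstract's starting point, a feedforward network is simply a composition of affine maps and pointwise nonlinearities. The sketch below is a minimal one-hidden-layer FFN in NumPy; the widths, the tanh activation, and the random initialization are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a feedforward network (FFN):
# an affine map, a pointwise nonlinearity, then another affine map.
# Dimensions and activation are illustrative choices.
import numpy as np

def ffn(x, W1, b1, W2, b2):
    """One-hidden-layer FFN: x -> W2 @ tanh(W1 @ x + b1) + b2."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 2, 16, 1          # hypothetical sizes
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))
b2 = rng.normal(size=d_out)

y = ffn(np.array([0.5, -0.3]), W1, b1, W2, b2)
```

Universal-approximation results say that, with enough hidden units, such compositions can approximate continuous functions on compact sets; the paper's holomorphic variant constrains the architecture so the network represents sections of a holomorphic line bundle instead of an arbitrary real function.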


