Algorithmic Complexities in Backpropagation and Tropical Neural Networks

01/03/2021
by Ozgur Ceyhan, et al.

In this note, we propose a novel technique to reduce the algorithmic complexity of neural network training by using matrices of tropical real numbers instead of matrices of real numbers. Since tropical arithmetic replaces multiplication with addition, and addition with max, we theoretically achieve constant factors in the training-phase time complexity that are several orders of magnitude better. The fact that we can replace the field of real numbers with the tropical semiring of real numbers and still achieve the same classification results via neural networks follows from deep results in topology and analysis, which we verify in this note. We then explore artificial neural networks in terms of tropical arithmetic and tropical algebraic geometry, and introduce multi-layered tropical neural networks as universal approximators. After giving a tropical reformulation of the backpropagation algorithm, we verify that its algorithmic complexity is substantially lower than that of the usual backpropagation, since tropical arithmetic is free of the cost of ordinary multiplication.
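The core substitution the abstract describes, multiplication becoming addition and addition becoming max, can be sketched as a max-plus matrix product and a tropical layer. This is a minimal illustration of the arithmetic only, not the paper's implementation; the function names and the elementwise-max bias convention are assumptions for the sake of the example.

```python
import numpy as np

def tropical_matmul(A, B):
    """Max-plus (tropical) matrix product:
    (A (x) B)[i, j] = max_k (A[i, k] + B[k, j]).
    Ordinary multiplication is replaced by addition,
    and ordinary addition by max."""
    # Broadcast rows of A against columns of B, then reduce with max.
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

def tropical_layer(x, W, b):
    """A tropical 'affine' layer (illustrative): max-plus product
    with the weights W, then tropical 'addition' (elementwise max)
    with the bias b."""
    return np.maximum(tropical_matmul(x, W), b)
```

Since each entry of the product costs only additions and comparisons, the per-entry work avoids floating-point multiplications entirely, which is the source of the constant-factor savings the note claims.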


Related research

- 11/25/2015, Neural GPUs Learn Algorithms: Learning an algorithm from examples is a fundamental problem that has be...
- 05/07/2018, Computational Complexity of Space-Bounded Real Numbers: In this work we study the space complexity of computable real numbers re...
- 06/12/2020, AlgebraNets: Neural networks have historically been built layerwise from the set of f...
- 02/12/2021, Min-Max-Plus Neural Networks: We present a new model of neural networks called Min-Max-Plus Neural Net...
- 10/03/2022, Limitations of neural network training due to numerical instability of backpropagation: We study the training of deep neural networks by gradient descent where ...
- 08/07/2023, Solving Falkner-Skan type equations via Legendre and Chebyshev Neural Blocks: In this paper, a new deep-learning architecture for solving the non-line...
- 09/03/2021, Dive into Layers: Neural Network Capacity Bounding using Algebraic Geometry: The empirical results suggest that the learnability of a neural network ...
