Approximation Capabilities of Neural Networks using Morphological Perceptrons and Generalizations

07/16/2022
by   William Chang, et al.

Standard artificial neural networks (ANNs) use sum-product (multiply-accumulate) node operations followed by a memoryless nonlinear activation, and are known to be universal function approximators. Previously proposed morphological perceptrons replace sum-product node processing with max-sum operations and have promising properties for circuit implementations. In this paper we show that these max-sum ANNs do not have universal approximation capabilities. Furthermore, we consider proposed signed-max-sum and max-star-sum generalizations of morphological ANNs and show that these variants also lack universal approximation capabilities. We contrast these variants with log-number system (LNS) implementations, which likewise avoid multiplications but do exhibit universal approximation capabilities.
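The contrast between these node operations can be sketched in plain Python. This is an illustrative sketch, not the paper's definitions: the function names are hypothetical, max* is taken here to be the Jacobian logarithm max*(a, b) = log(e^a + e^b), and the LNS trick is shown only as "a multiply in the linear domain is an add in the log domain".

```python
import math

def sum_product_node(x, w, b):
    # Standard MAC node: inner product of inputs and weights, plus a bias.
    return sum(xi * wi for xi, wi in zip(x, w)) + b

def max_sum_node(x, w):
    # Morphological (max-plus) node: additions inside, max outside.
    # No multiplications are performed.
    return max(xi + wi for xi, wi in zip(x, w))

def max_star(a, b):
    # Jacobian logarithm: max*(a, b) = log(e^a + e^b)
    #                                = max(a, b) + log(1 + e^(-|a - b|))
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_star_sum_node(x, w):
    # Max-star-sum node: the hard max is replaced by the smooth max*.
    acc = x[0] + w[0]
    for xi, wi in zip(x[1:], w[1:]):
        acc = max_star(acc, xi + wi)
    return acc

def lns_product(log_x, log_w):
    # Log-number system: a linear-domain multiply x * w becomes an
    # addition of the operands' logarithms.
    return log_x + log_w
```

Note that max_star_sum_node computes exactly log(sum of e^(x_i + w_i)), i.e. a log-sum-exp of the shifted inputs, which is why it is a smooth relaxation of the max-sum node.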


Related research

- Approximation capabilities of neural networks on unbounded domains (10/21/2019)
- Parametrized Convex Universal Approximators for Decision-Making Problems (01/17/2022)
- Achieve the Minimum Width of Neural Networks for Universal Approximation (09/23/2022)
- Dense Morphological Network: An Universal Function Approximator (01/01/2019)
- Improving Sparse Associative Memories by Escaping from Bogus Fixed Points (08/27/2013)
- The Max-Product Algorithm Viewed as Linear Data-Fusion: A Distributed Detection Scenario (09/20/2019)
- A Massively Parallel Associative Memory Based on Sparse Neural Networks (03/28/2013)
