Bipolar Morphological Neural Networks: Convolution Without Multiplication

11/05/2019
by Elena Limonova, et al.

In this paper we introduce a novel bipolar morphological neuron and a bipolar morphological layer model. The models use only addition, subtraction, and maximum operations inside the neuron, with exponent and logarithm as activation functions for the layer. Unlike previously introduced morphological neural networks, the proposed models approximate the classical computations and show better recognition results. We also propose a layer-by-layer approach to train bipolar morphological networks, which can be further developed into an incremental approach for separate neurons to reach higher accuracy. Neither approach requires special training algorithms, and both can use a variety of gradient descent methods. To demonstrate the efficiency of the proposed model, we consider classical convolutional neural networks and convert their pre-trained convolutional layers to bipolar morphological layers. Experiments on the recognition of MNIST and MRZ symbols show only a moderate decrease in accuracy after conversion and training, so the bipolar neuron model can provide faster inference and be very useful in mobile and embedded systems.
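The multiplication-free idea in the abstract can be illustrated with a small sketch: a dot product is approximated by splitting inputs and weights into positive and negative parts, adding logarithms, taking a maximum, and exponentiating. This is a hedged illustration of the general approximation described above, not the paper's exact layer; the helper name `bm_neuron` is hypothetical.

```python
import numpy as np

def bm_neuron(x, w):
    # Hypothetical sketch of a bipolar morphological neuron:
    # approximate sum_i x_i * w_i by exp(max_i(ln x_i + ln w_i)),
    # split by sign so only nonnegative values reach the logarithm.
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    xp, xn = np.maximum(x, 0), np.maximum(-x, 0)  # positive/negative parts of input
    wp, wn = np.maximum(w, 0), np.maximum(-w, 0)  # positive/negative parts of weights

    def term(a, b):
        # exp(max_i(ln a_i + ln b_i)) equals max_i(a_i * b_i) for a, b >= 0;
        # np.log maps zeros to -inf, so zero entries drop out of the maximum.
        with np.errstate(divide="ignore"):
            return np.exp(np.max(np.log(a) + np.log(b)))

    # matching-sign pairs contribute positively, mixed-sign pairs negatively
    return term(xp, wp) - term(xp, wn) - term(xn, wp) + term(xn, wn)
```

When one product dominates each sign group, the approximation is exact: `bm_neuron([1, 4], [2, -1])` gives -2.0, matching the true dot product 1*2 + 4*(-1). Only addition, maximum, logarithm, and exponent are used, which is what makes hardware-friendly, multiplication-free inference possible.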


Related research

- ResNet-like Architecture with Low Hardware Requirements (09/15/2020)
  One of the most computationally intensive parts in modern recognition sy...

- Deep Morphological Neural Networks (09/04/2019)
  Mathematical morphology is a theory and technique to collect features li...

- Transfer Learning with Sparse Associative Memories (04/04/2019)
  In this paper, we introduce a novel layer designed to be used as the out...

- Training Hybrid Classical-Quantum Classifiers via Stochastic Variational Optimization (01/21/2022)
  Quantum machine learning has emerged as a potential practical applicatio...

- Going beyond p-convolutions to learn grayscale morphological operators (02/19/2021)
  Integrating mathematical morphology operations within deep neural networ...

- Quantum Neuron Selection: Finding High Performing Subnetworks With Quantum Algorithms (02/12/2023)
  Gradient descent methods have long been the de facto standard for traini...

- A Hybrid Quantum-Classical Approach based on the Hadamard Transform for the Convolutional Layer (05/27/2023)
  In this paper, we propose a novel Hadamard Transform (HT)-based neural n...
