Two Bicomplex Least Mean Square (BLMS) algorithms

09/24/2022
by Daniel Alpay et al.

We introduce and study new gradient operators in the complex and bicomplex settings, inspired by the well-known Least Mean Square (LMS) algorithm invented in 1960 by Widrow and Hoff for the Adaptive Linear Neuron (ADALINE). These gradient operators are used to formulate new learning rules for Bicomplex Least Mean Square (BLMS) algorithms. This approach extends both the classical real and complex LMS algorithms.
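For context on the baseline being generalized, below is a minimal sketch of the classical complex LMS (Widrow-Hoff) update w <- w + mu * conj(e) * x, which the paper extends to the bicomplex setting. The filter length, step size, and noise-free synthetic signal model are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of the classical complex LMS update that BLMS generalizes.
# The filter length, step size, and signal model below are assumptions
# chosen for illustration; they do not come from the paper.
import numpy as np

rng = np.random.default_rng(0)

n_taps = 4   # assumed filter length
mu = 0.05    # assumed step size (learning rate)

# Unknown system the adaptive filter should identify.
w_true = rng.standard_normal(n_taps) + 1j * rng.standard_normal(n_taps)

w = np.zeros(n_taps, dtype=complex)  # adaptive weight vector
for _ in range(5000):
    x = rng.standard_normal(n_taps) + 1j * rng.standard_normal(n_taps)
    d = np.vdot(w_true, x)           # desired response w_true^H x
    y = np.vdot(w, x)                # filter output w^H x
    e = d - y                        # a-priori estimation error
    w = w + mu * np.conj(e) * x      # complex LMS rule: w <- w + mu e* x

print(np.linalg.norm(w - w_true))    # near zero after adaptation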

Related research

10/09/2021 | A Novel Quantum Calculus-based Complex Least Mean Square Algorithm (q-CLMS)
In this research, a novel adaptive filtering algorithm is proposed for c...

01/23/2014 | Kernel Least Mean Square with Adaptive Kernel Size
Kernel adaptive filters (KAF) are a class of powerful nonlinear filters ...

08/12/2019 | BGD-based Adam algorithm for time-domain equalizer in PAM-based optical interconnects
To the best of our knowledge, for the first time, we propose adaptive mo...

09/20/2016 | Distributed Adaptive Learning of Graph Signals
The aim of this paper is to propose distributed strategies for adaptive ...

06/10/2021 | Investigating Alternatives to the Root Mean Square for Adaptive Gradient Methods
Adam is an adaptive gradient method that has experienced widespread adop...

10/16/2019 | Root Mean Square Layer Normalization
Layer normalization (LayerNorm) has been successfully applied to various...

11/24/2020 | Acceleration of Cooperative Least Mean Square via Chebyshev Periodical Successive Over-Relaxation
A distributed algorithm for least mean square (LMS) can be used in distr...
