A stochastic behavior analysis of stochastic restricted-gradient descent algorithm in reproducing kernel Hilbert spaces

10/14/2014
by Masa-aki Takizawa, et al.

This paper presents a stochastic behavior analysis of a kernel-based stochastic restricted-gradient descent method. The restricted gradient gives the steepest ascent direction within the so-called dictionary subspace. The analysis characterizes the transient and steady-state performance under the mean-squared-error criterion, and it establishes stability conditions in the mean and mean-square sense. The present study builds on the analysis of the kernel normalized least mean square (KNLMS) algorithm initially proposed by Chen et al. Simulation results validate the analysis.
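To make the setting concrete, the following is a minimal sketch of a KNLMS-style, dictionary-restricted stochastic-gradient update of the kind analyzed in the paper. The Gaussian kernel, the fixed pre-selected dictionary, the step size, and the regularization constant are illustrative assumptions, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=0.5):
    # Gaussian (RBF) kernel; the bandwidth is an arbitrary illustrative choice.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * bandwidth ** 2))

def restricted_gradient_update(dictionary, weights, x, d, step_size=0.1, eps=1e-2):
    """One normalized stochastic-gradient step confined to the dictionary subspace.

    dictionary : list of stored input vectors whose kernel functions span the subspace
    weights    : coefficients of the current estimate within that subspace
    x, d       : new input vector and desired response
    """
    # Kernel evaluations between the new input and every dictionary atom.
    k = np.array([gaussian_kernel(x, c) for c in dictionary])
    # A priori estimation error of the current filter output.
    e = d - weights @ k
    # Stochastic gradient of the instantaneous squared error, restricted to the
    # dictionary subspace, with KNLMS-style normalization (eps avoids division by zero).
    weights = weights + step_size * e * k / (eps + k @ k)
    return weights, e

# Toy usage: identify the nonlinear map d = sin(3x) from streaming samples.
rng = np.random.default_rng(0)
dictionary = [rng.uniform(-1, 1, size=1) for _ in range(20)]  # fixed, pre-selected atoms
weights = np.zeros(len(dictionary))
for _ in range(500):
    x = rng.uniform(-1, 1, size=1)
    weights, err = restricted_gradient_update(dictionary, weights, x, np.sin(3 * x[0]))
```

In practice the dictionary is usually constructed online, e.g., with the coherence criterion used in Chen et al.'s KNLMS; this sketch keeps it fixed to isolate the gradient update itself.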


Related research:
- 08/08/2021 - Mean-square Analysis of the NLMS Algorithm
- 06/22/2013 - Online dictionary learning for kernel LMS. Analysis and forward-backward splitting algorithm
- 08/25/2016 - Transient performance analysis of zero-attracting LMS
- 09/09/2019 - A Complete Transient Analysis for the Incremental LMS Algorithm
- 10/30/2018 - Exact Expectation Analysis of the Deficient-Length LMS Algorithm
- 02/10/2022 - On One-Bit Quantization
- 04/13/2023 - Multi-kernel Correntropy-based Orientation Estimation of IMUs: Gradient Descent Methods
