A stochastic behavior analysis of stochastic restricted-gradient descent algorithm in reproducing kernel Hilbert spaces

by   Masa-aki Takizawa, et al.

This paper presents a stochastic behavior analysis of a kernel-based stochastic restricted-gradient descent method. The restricted gradient gives the steepest-ascent direction within the so-called dictionary subspace. The analysis characterizes the transient and steady-state performance under the mean-squared-error criterion, and also provides stability conditions in the mean and mean-square sense. The present study builds on the analysis of the kernel normalized least-mean-square (KNLMS) algorithm originally proposed by Chen et al. Simulation results validate the analysis.
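The KNLMS baseline referenced above can be sketched in a few lines: the filter maintains a dictionary of kernelized centers, computes the a-priori error against the current expansion, grows the dictionary via a coherence test, and applies a normalized update of the coefficients within the dictionary subspace. The Gaussian kernel width, step size, regularization, and coherence threshold below are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def gauss_kernel(x, centers, sigma=0.5):
    # Gaussian kernel between a single input x and each row of `centers`.
    return np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * sigma ** 2))

def knlms(inputs, targets, eta=0.5, eps=1e-2, delta=0.9, sigma=0.5):
    """Minimal KNLMS sketch with a coherence-based dictionary.

    eta:   step size, eps: regularization, delta: coherence threshold,
    sigma: Gaussian kernel width (all assumed values for illustration).
    """
    dictionary = [inputs[0]]        # dictionary of kernel centers
    alpha = np.zeros(1)             # expansion coefficients
    errors = []
    for x, d in zip(inputs, targets):
        centers = np.asarray(dictionary)
        h = gauss_kernel(x, centers, sigma)   # kernelized input vector
        e = d - h @ alpha                     # a-priori estimation error
        errors.append(e)
        if np.max(h) <= delta:                # coherence check: admit new center
            dictionary.append(x)
            alpha = np.append(alpha, 0.0)
            h = np.append(h, 1.0)             # kappa(x, x) = 1 for the Gaussian kernel
        # normalized coefficient update restricted to the dictionary subspace
        alpha = alpha + (eta / (eps + h @ h)) * e * h
    return np.asarray(errors), np.asarray(dictionary), alpha
```

On a simple nonlinear regression task (e.g. learning `sin(3x)` from streaming samples), the squared a-priori error of this sketch decays over time while the dictionary size stays bounded by the coherence test, which is the regime the transient and steady-state analysis in the paper addresses.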
