
A New Noise-Assistant LMS Algorithm for Preventing the Stalling Effect
In this paper, we introduce a new algorithm to deal with the stalling effect in the LMS algorithm used in adaptive filters. We modify the update rule for the tap-weight vector by adding noise produced by a noise generator. The properties of the proposed method are established in two novel theorems. As shown, the resulting algorithm, called Added-Noise LMS (ANLMS), improves the conventional LMS algorithm's resistance to the stalling effect. We calculate the probability of an update when the added noise is white Gaussian. We also analyze the convergence of the proposed method and prove that, when the added noise is uniformly distributed, its convergence rate equals that of the LMS algorithm in the expected-value sense. Finally, we show that the computational complexity of the proposed algorithm is linear, the same as that of the conventional LMS algorithm.
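The core idea of the abstract, adding small zero-mean noise to the tap-weight update so the filter cannot stall on vanishingly small gradient steps, can be sketched as follows. This is a minimal illustration under assumed parameter names (`mu`, `noise_amp`); it is not the paper's exact formulation, and it uses uniform added noise, the case for which the abstract states the expected-value convergence result.

```python
import numpy as np

def anlms_update(w, x, d, mu, noise_amp, rng):
    """One tap-weight update of LMS with added noise (ANLMS-style sketch).

    The zero-mean uniform dither keeps the weights moving even when the
    gradient term mu * e * x alone would be too small to produce an update
    (the stalling effect, e.g. under fixed-point quantization).
    Parameter names here are illustrative, not taken from the paper.
    """
    e = d - w @ x                                        # a priori error
    dither = rng.uniform(-noise_amp, noise_amp, size=w.shape)
    return w + mu * e * x + dither, e

# Toy system-identification run: estimate w_true from input/output pairs.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.1])
w = np.zeros(3)
for _ in range(5000):
    x = rng.standard_normal(3)
    d = w_true @ x                                       # noiseless desired signal
    w, e = anlms_update(w, x, d, mu=0.01, noise_amp=1e-4, rng=rng)
print(np.round(w, 2))
```

With a small noise amplitude the weights converge close to `w_true`, at the cost of a small extra misadjustment proportional to the dither power; per iteration the update remains O(M) in the filter length, matching the linear complexity the abstract claims.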