Study of Sparsity-Aware Subband Adaptive Filtering Algorithms with Adjustable Penalties

by Y. Yu, et al.

We propose two sparsity-aware normalized subband adaptive filter (NSAF) algorithms, derived by applying gradient descent to a combination of the original NSAF cost function and an l1-norm penalty on the filter coefficients. The l1-norm penalty exploits the sparsity of the system in the coefficient update, improving performance when identifying sparse systems. Compared with prior work, the proposed algorithms have lower computational complexity with comparable performance. We devise statistical models for these sparsity-aware NSAF algorithms in the mean-square sense, covering both their transient and steady-state behaviors. This analysis relies on the vectorization argument and on a paraunitary assumption imposed on the analysis filter banks, and therefore does not restrict the input signal to a Gaussian or any other particular distribution. In addition, we propose an adaptive adjustment of the intensity parameter of the sparsity-attraction term. Finally, simulation results on sparse system identification demonstrate the effectiveness of our theoretical results.
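The paper's algorithms operate in subbands through analysis filter banks, which a short snippet cannot reproduce; as a simplified fullband illustration of the same mechanism, the sketch below shows how an l1-norm penalty adds a zero-attracting term, -rho*sign(w), to an NLMS-style update. All parameter values and the test system are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse unknown system: 64 taps, only 4 of them non-zero.
M = 64
w_true = np.zeros(M)
w_true[[5, 20, 33, 50]] = [0.8, -0.5, 0.3, -0.2]

# White input and noisy desired signal.
N = 5000
x = rng.standard_normal(N)
d = np.convolve(x, w_true)[:N] + 1e-3 * rng.standard_normal(N)

mu = 0.5      # step size
delta = 1e-4  # regularization in the normalized denominator
rho = 5e-5    # intensity of the l1 (zero-attracting) term

w = np.zeros(M)
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]       # input regressor
    e = d[n] - u @ w                   # a priori error
    w += mu * e * u / (u @ u + delta)  # normalized gradient-descent step
    w -= rho * np.sign(w)              # zero attractor from the l1 penalty
# w now closely approximates w_true, with near-zero values off the support
```

The zero attractor shrinks inactive taps toward zero every iteration, which is what speeds up convergence on sparse systems; the adaptive intensity rule proposed in the paper would replace the fixed `rho` here.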






