
Contraction Principle based Robust Iterative Algorithms for Machine Learning

10/05/2013
by   Rangeet Mitra, et al.

Iterative algorithms are ubiquitous in data mining; widely known examples include the least mean square (LMS) algorithm and the backpropagation algorithm for neural networks. Our contribution in this paper is an improvement upon these iterative algorithms in terms of their respective performance metrics and robustness. This improvement is achieved by a new scaling factor that multiplies the error term. Our analysis shows that, in essence, we are minimizing the corresponding LASSO cost function, which is the reason for the increased robustness. We also give closed-form expressions for the number of iterations required for convergence and for the MSE floor of the original cost function at a minimum targeted value of the L1 norm. As a concluding theme, building on the stochastic subgradient algorithm, we compare the well-known Dantzig selector with our contraction-principle-based algorithm. Through these simulations we attempt to show the optimality of our approach for any widely used parent iterative optimization problem.
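The abstract describes modifying an LMS-style error update so that the iteration effectively minimizes a LASSO (L1-regularized) cost via a stochastic subgradient. One standard instance of this idea is the zero-attracting LMS update, sketched below as an illustration; the step size `mu`, the regularization weight `rho`, and the function name are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def za_lms(X, d, mu=0.01, rho=1e-4, n_epochs=1):
    """Zero-attracting LMS sketch: stochastic subgradient descent on the
    LASSO-style cost  (1/2) E[(d - w.x)^2] + rho * ||w||_1.

    X : (n_samples, n_features) input vectors
    d : (n_samples,) desired outputs
    """
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for x, d_k in zip(X, d):
            e = d_k - w @ x          # instantaneous error
            w += mu * e * x          # usual LMS gradient step
            w -= rho * np.sign(w)    # L1 subgradient ("zero attractor")
    return w

# Toy usage: identify a sparse system from noiseless observations.
rng = np.random.default_rng(0)
w_true = np.array([1.0, 0.0, -0.5, 0.0])
X = rng.standard_normal((2000, 4))
d = X @ w_true
w_hat = za_lms(X, d, mu=0.05, rho=1e-4, n_epochs=3)
```

The `rho * np.sign(w)` term is the subgradient of the L1 penalty: it pulls small coefficients toward zero (promoting sparsity and robustness) while introducing only a small bias, on the order of `rho / mu`, in the nonzero taps.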


09/04/2015

l1-norm Penalized Orthogonal Forward Regression

A l1-norm penalized orthogonal forward regression (l1-POFR) algorithm is...
06/13/2019

Non-convex optimization via strongly convex majorization-minimization

In this paper, we introduce a class of nonsmooth nonconvex least square ...
12/23/2022

On a fixed-point continuation method for a convex optimization problem

We consider a variation of the classical proximal-gradient algorithm for...
01/07/2020

Efficient ML Direction of Arrival Estimation assuming Unknown Sensor Noise Powers

This paper presents an efficient method for computing maximum likelihood...
02/04/2019

Study of Robust Distributed Diffusion RLS Algorithms with Side Information for Adaptive Networks

This work develops robust diffusion recursive least squares algorithms t...
10/01/2012

Sparse LMS via Online Linearized Bregman Iteration

We propose a version of least-mean-square (LMS) algorithm for sparse sys...
02/14/2019

On Many-to-Many Mapping Between Concordance Correlation Coefficient and Mean Square Error

The concordance correlation coefficient (CCC) is one of the most widely ...