Contraction Principle based Robust Iterative Algorithms for Machine Learning

10/05/2013
by Rangeet Mitra, et al.

Iterative algorithms are ubiquitous in the field of data mining. Widely known examples of such algorithms are the least mean squares (LMS) algorithm and the backpropagation algorithm for neural networks. Our contribution in this paper is an improvement upon these iterative algorithms in terms of their respective performance metrics and robustness. This improvement is achieved by a new scaling factor that multiplies the error term. Our analysis shows that, in essence, we are minimizing the corresponding LASSO cost function, which is the reason for the increased robustness. We also give closed-form expressions for the number of iterations needed for convergence and for the MSE floor of the original cost function at a minimum targeted value of the L1 norm. As a concluding theme based on the stochastic subgradient algorithm, we compare the well-known Dantzig selector with our contraction-principle-based algorithm. Through these simulations we attempt to show the optimality of our approach for any widely used parent iterative optimization problem.
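To make the general idea concrete, the Python sketch below illustrates the kind of update the abstract describes: an LMS-style stochastic iteration whose error term is multiplied by a scaling factor, combined with a stochastic subgradient step on an L1 penalty so that the iteration effectively works on a LASSO-type cost. This is a minimal illustration only, not the paper's algorithm; the function name lasso_lms_sketch and the parameters mu, lam, and scale are hypothetical, and the paper's actual scaling rule may differ.

import numpy as np

def lasso_lms_sketch(X, d, mu=0.01, lam=0.1, scale=0.5, n_epochs=10):
    """Illustrative LMS-style update with a stochastic-subgradient L1 term.

    NOTE: `scale` (the factor multiplying the error term) and `lam`
    (the L1 weight) are assumed parameters used here for illustration.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_epochs):
        for x, target in zip(X, d):
            err = target - x @ w                         # instantaneous error
            grad = -scale * err * x + lam * np.sign(w)   # subgradient of a LASSO-type cost
            w -= mu * grad                               # stochastic subgradient step
    return w

# Usage: recover a sparse weight vector from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
w_true = np.zeros(20)
w_true[:3] = [1.0, -0.5, 0.25]
d = X @ w_true + 0.01 * rng.standard_normal(200)
w_hat = lasso_lms_sketch(X, d)

The L1 subgradient term is what drives small coefficients toward zero, which is one intuitive way to see why an LMS-type iteration on a LASSO cost gains robustness over the plain squared-error version.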
