Robust Hierarchical-Optimization RLS Against Sparse Outliers

by Konstantinos Slavakis, et al.

This paper fortifies the recently introduced hierarchical-optimization recursive least squares (HO-RLS) against outliers that sporadically contaminate linear-regression models. Outliers are modeled as nuisance variables and estimated jointly with the linear filter/system variables via a sparsity-inducing, (non-)convexly regularized least-squares task. The proposed outlier-robust HO-RLS builds on steepest-descent directions with a constant step size (learning rate), needs no matrix inversion (lemma), accommodates colored nominal noise of known correlation matrix, has a small computational footprint, and offers theoretical guarantees, in a probabilistic sense, for the convergence of the system estimates to the solutions of a hierarchical-optimization problem: minimize a convex loss, which models a priori knowledge about the unknown system, over the set of minimizers of the classical ensemble LS loss. Extensive numerical tests on synthetically generated data, in both stationary and non-stationary scenarios, showcase notable improvements of the proposed scheme over state-of-the-art techniques.
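The abstract's core idea, treating outliers as sparse nuisance variables estimated jointly with the system via a sparsity-regularized least-squares task solved by constant-step-size steepest descent, can be sketched in a simplified batch (non-recursive) form. The sketch below is a hypothetical illustration using a convex (l1) regularizer and proximal-gradient steps; the function names and the step-size/regularization values are assumptions, and the paper's actual HO-RLS recursion, hierarchical constraint, and colored-noise handling are not reproduced here.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1 norm; it zeroes small entries,
    # which is what makes the outlier estimates sparse.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def robust_ls_sketch(X, y, mu=0.005, lam=1.0, n_iter=2000):
    # Illustrative sketch (not the paper's algorithm): jointly estimate the
    # filter w and the sparse outlier vector o by proximal steepest descent
    # with a constant step size mu on
    #   (1/2) * ||X @ w + o - y||^2 + lam * ||o||_1 .
    # mu, lam, and n_iter are assumed illustrative values.
    n, d = X.shape
    w = np.zeros(d)   # filter/system estimate
    o = np.zeros(n)   # nuisance (outlier) estimates
    for _ in range(n_iter):
        r = X @ w + o - y                         # residual of contaminated model
        w = w - mu * (X.T @ r)                    # plain gradient step in w
        o = soft_threshold(o - mu * r, mu * lam)  # prox-gradient step in o
    return w, o
```

With a step size below the reciprocal smooth-part Lipschitz constant, entries of `o` at uncontaminated samples are driven to zero by the thresholding, while contaminated samples absorb the large residuals, so the LS fit of `w` is no longer dragged by the outliers.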






