AdaBoost and robust one-bit compressed sensing

05/05/2021
by Geoffrey Chinot, et al.

This paper studies binary classification in robust one-bit compressed sensing with adversarial errors. The model is assumed to be overparameterized, and the parameter of interest is assumed to be effectively sparse. AdaBoost is considered, and, through its relation to the max-ℓ_1-margin classifier, risk bounds are derived. In particular, this provides an explanation of why interpolating adversarial noise can be harmless for classification problems. Simulations illustrate the presented theory.
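The setting described above can be sketched in a small simulation. The following is an illustrative example, not the paper's exact experimental setup: labels are one-bit (sign) measurements of a Gaussian design under an effectively sparse parameter, a small fraction of labels is adversarially flipped, and AdaBoost is run with coordinate-sign stumps h_j(x) = sign(x_j), so that the aggregated classifier is linear in sign(x) and connects to the max-ℓ_1-margin classifier. All dimensions, the noise level, and the number of boosting rounds are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparameterized one-bit compressed sensing setup (illustrative choices):
n, p, s = 200, 500, 3            # n samples, p >> n features, s-sparse target
beta = np.zeros(p)
beta[:s] = 1.0                   # effectively sparse parameter of interest
X = rng.standard_normal((n, p))
y = np.sign(X @ beta)            # one-bit measurements
flip = rng.random(n) < 0.05      # 5% adversarial-style sign flips
y[flip] *= -1

# AdaBoost over coordinate-sign stumps h_j(x) = sign(x_j) and their negations.
S = np.sign(X)                   # all stump predictions, shape (n, p)
w = np.full(n, 1.0 / n)          # sample weights
coef = np.zeros(p)               # aggregated linear coefficients
for t in range(300):
    miss = (S != y[:, None])                 # (n, p) mistake indicators
    raw = w @ miss                           # weighted error of each stump
    errs = np.minimum(raw, 1.0 - raw)        # best of stump and its negation
    j = int(np.argmin(errs))                 # weak learner with smallest error
    sgn = 1.0 if raw[j] <= 0.5 else -1.0     # use -sign(x_j) if that is better
    err = np.clip(errs[j], 1e-10, 0.5 - 1e-10)
    alpha = 0.5 * np.log((1.0 - err) / err)  # standard AdaBoost step size
    h = sgn * S[:, j]
    coef[j] += sgn * alpha
    w *= np.exp(-alpha * y * h)              # reweight misclassified samples up
    w /= w.sum()

# Generalization check on fresh, noiseless data from the same model.
X_test = rng.standard_normal((2000, p))
y_test = np.sign(X_test @ beta)
acc = np.mean(np.sign(np.sign(X_test) @ coef) == y_test)
print(f"test accuracy: {acc:.3f}")
```

Even though boosting is run long enough to fit (and typically interpolate) the corrupted training labels, the test accuracy on clean data remains well above chance, which is the qualitative phenomenon the paper's risk bounds explain.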


Related research:

- One-Bit ExpanderSketch for One-Bit Compressed Sensing (11/11/2017): Is it possible to obliviously construct a set of hyperplanes H such that...
- Binary Sparse Bayesian Learning Algorithm for One-bit Compressed Sensing (05/08/2018): In this letter, a binary sparse Bayesian learning (BSBL) algorithm is pr...
- Robust one-bit compressed sensing with partial circulant matrices (12/17/2018): We present optimal sample complexity estimates for one-bit compressed se...
- Parameter instability regimes for sparse proximal denoising programs (10/29/2018): Compressed sensing theory explains why Lasso programs recover structured...
- Compressed Sensing Measurement of Long-Range Correlated Noise (05/26/2021): Long-range correlated errors can severely impact the performance of NISQ...
- Multivariate Copula Spatial Dependency in One Bit Compressed Sensing (11/25/2017): In this letter, the problem of sparse signal reconstruction from one bit...
- Generative Model Adversarial Training for Deep Compressed Sensing (06/20/2021): Deep compressed sensing assumes the data has sparse representation in a ...
