Signal Decomposition Using Masked Proximal Operators

02/18/2022
by Bennet E. Meyers et al.

We consider the well-studied problem of decomposing a vector time series signal into components with different characteristics, such as smooth, periodic, nonnegative, or sparse. We propose a simple and general framework in which the components are defined by loss functions (which include constraints), and the signal decomposition is carried out by minimizing the sum of losses of the components (subject to the constraints). When each loss function is the negative log-likelihood of a density for the signal component, our method coincides with maximum a posteriori probability (MAP) estimation; but it also includes many other interesting cases. We give two distributed optimization methods for computing the decomposition, which find the optimal decomposition when the component class loss functions are convex, and are good heuristics when they are not. Both methods require only the masked proximal operator of each of the component loss functions, a generalization of the well-known proximal operator that handles missing entries in its argument. Both methods are distributed, i.e., handle each component separately. We derive tractable methods for evaluating the masked proximal operators of some loss functions that, to our knowledge, have not appeared in the literature.
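To make the notion of a masked proximal operator concrete, here is a minimal sketch for one simple case: the sum-of-squares loss φ(x) = ‖x‖²/2 with a Boolean mask marking the known entries. This is an illustrative example, not the authors' implementation; the function name, the choice of loss, and the parameter ρ are assumptions for the sketch. For this loss the masked prox has a closed form: unknown entries go to zero, and known entries shrink toward v.

```python
import numpy as np

def masked_prox_sum_squares(v, known, rho=1.0):
    """Masked proximal operator of phi(x) = ||x||^2 / 2.

    Solves  argmin_x  phi(x) + (rho/2) * sum_{i known} (x_i - v_i)^2,
    i.e., the fit term is applied only on the known entries of v.
    For unknown entries the objective reduces to x_i^2 / 2, so the
    minimizer there is 0; for known entries it is rho*v_i / (1 + rho).
    """
    x = np.zeros_like(v, dtype=float)
    x[known] = rho * v[known] / (1.0 + rho)
    return x
```

For example, with `v = [2, -4, 6]` and the middle entry missing, the result is `[1, 0, 3]` at ρ = 1: the missing entry is driven to the loss's unconstrained minimizer, while the observed entries are averaged between the loss and the data. More complex component losses (periodic, sparse, nonnegative) replace the closed form with a small convex solve, but the masked structure is the same.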


