Forward-Backward Greedy Algorithms for General Convex Smooth Functions over A Cardinality Constraint

12/31/2013
by Ji Liu, et al.

We consider forward-backward greedy algorithms for solving sparse feature selection problems with general convex smooth functions. A state-of-the-art greedy method, the forward-backward greedy algorithm (FoBa-obj), requires solving a large number of optimization problems and is therefore not scalable to large problems. The FoBa-gdt algorithm, which uses gradient information for feature selection at each forward iteration, significantly improves the efficiency of FoBa-obj. In this paper, we systematically analyze the theoretical properties of both forward-backward greedy algorithms. Our main contributions are: 1) we derive better theoretical bounds than existing analyses of FoBa-obj for general smooth convex functions; 2) we show that FoBa-gdt achieves the same theoretical performance as FoBa-obj under the same condition, the restricted strong convexity condition; our new bounds are consistent with the known bounds for the special case of least squares and fill a previously existing theoretical gap for general convex smooth functions; 3) we show that the restricted strong convexity condition is satisfied if the number of independent samples is more than k̅ log d, where k̅ is the sparsity number and d is the dimension of the variable; 4) we apply FoBa-gdt (with the conditional random field objective) to the sensor selection problem for human indoor activity recognition, and our results show that FoBa-gdt outperforms other methods, including those based on forward greedy selection and L1-regularization.
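At a high level, a forward step adds the coordinate with the largest gradient magnitude of the objective, and a backward step prunes selected features whose removal barely increases the objective. The following is a minimal sketch of such a gradient-based forward-backward scheme, instantiated for a least-squares objective; the function name `foba_gdt`, the `backward_tol` parameter, and the stopping rule are illustrative assumptions and not the paper's exact algorithm.

```python
import numpy as np

def foba_gdt(X, y, k_max, backward_tol=0.5):
    """Sketch of gradient-based forward-backward greedy selection
    (in the spirit of FoBa-gdt) for f(w) = 0.5 * ||Xw - y||^2 / n.
    Forward: add the coordinate with the largest gradient magnitude.
    Backward: drop a feature if removing it raises the objective by
    less than backward_tol times the last forward gain (an assumed rule)."""
    n, d = X.shape
    support = []

    def fit(S):
        # Least squares restricted to support S: objective value and weights.
        if not S:
            return 0.5 * np.mean(y ** 2), np.zeros(0)
        w, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
        r = X[:, S] @ w - y
        return 0.5 * np.mean(r ** 2), w

    obj, w = fit(support)
    while len(support) < k_max:
        # --- forward step: pick the largest-magnitude gradient entry ---
        r = (X[:, support] @ w - y) if support else -y
        grad = X.T @ r / n
        grad[support] = 0.0          # ignore already-selected features
        j = int(np.argmax(np.abs(grad)))
        new_obj, new_w = fit(support + [j])
        gain = obj - new_obj
        if gain <= 1e-10:            # no further progress: stop
            break
        support.append(j)
        obj, w = new_obj, new_w
        # --- backward step: remove features that barely help ---
        while len(support) > 1:
            losses = []
            for idx in range(len(support)):
                S = support[:idx] + support[idx + 1:]
                o, _ = fit(S)
                losses.append(o - obj)
            i = int(np.argmin(losses))
            if losses[i] >= backward_tol * gain:
                break
            del support[i]
            obj, w = fit(support)
    return support, w
```

On a noiseless synthetic problem with a small true support, this sketch recovers the correct features because each forward step fits the restricted least-squares problem exactly; the backward step matters mainly when an early greedy pick turns out redundant once better features enter.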


