An Algorithm for Graph-Fused Lasso Based on Graph Decomposition

08/06/2019
by   Feng Yu, et al.

This work proposes a new algorithm for solving the graph-fused lasso (GFL), a method for parameter estimation that operates under the assumption that the signal tends to be locally constant over a predefined graph structure. The proposed method applies the alternating direction method of multipliers (ADMM) and is based on a decomposition of the objective function into two components. While ADMM has been widely used for this problem, existing approaches such as the network lasso decompose the objective into the loss function and the total variation penalty. In contrast, this work decomposes the objective so that one component is the loss function plus part of the total variation penalty, and the other component is the remaining total variation penalty. Compared with the network lasso algorithm, this method has a smaller per-iteration computational cost and converges faster in most numerical simulations.
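To make the baseline concrete, the following is a minimal sketch of the standard loss/penalty ADMM split that the abstract attributes to existing approaches such as the network lasso (not the paper's proposed decomposition). It solves min_β ½‖y − β‖² + λ Σ_{(i,j)∈E} |β_i − β_j| by introducing an auxiliary variable z = Dβ, where D is the edge-difference matrix; the graph, penalty level, and stopping rule below are illustrative assumptions.

```python
import numpy as np

def admm_gfl(y, edges, lam, rho=1.0, n_iter=200):
    """Loss/penalty-split ADMM for the graph-fused lasso:
        minimize 0.5*||y - beta||^2 + lam * sum_{(i,j) in E} |beta_i - beta_j|.
    This is the standard (network-lasso-style) decomposition, shown for
    illustration; it is NOT the decomposition proposed in the paper."""
    n, m = len(y), len(edges)
    # Edge-difference matrix D: row for edge (i, j) has +1 at i and -1 at j,
    # so (D @ beta)[e] = beta_i - beta_j.
    D = np.zeros((m, n))
    for e, (i, j) in enumerate(edges):
        D[e, i], D[e, j] = 1.0, -1.0
    z = np.zeros(m)   # auxiliary variable, targets D @ beta
    u = np.zeros(m)   # scaled dual variable
    A = np.eye(n) + rho * D.T @ D  # fixed linear system for the beta-update
    for _ in range(n_iter):
        # beta-update: quadratic subproblem (loss component)
        beta = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Db = D @ beta
        # z-update: soft-thresholding (total variation penalty component)
        v = Db + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u = u + Db - z
    return beta

# Toy example: noisy piecewise-constant signal on a path graph.
rng = np.random.default_rng(0)
truth = np.array([0.0] * 10 + [3.0] * 10)
y = truth + 0.3 * rng.standard_normal(20)
edges = [(i, i + 1) for i in range(19)]
beta = admm_gfl(y, edges, lam=2.0)
```

The beta-update requires solving a linear system whose matrix involves DᵀD (the graph Laplacian plus identity); per the abstract, the paper's alternative decomposition moves part of the penalty into this subproblem to lower the per-iteration cost.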


