Estimating Networks With Jumps

12/17/2010
by Mladen Kolar, et al.

We study the problem of estimating a temporally varying coefficient and varying structure (VCVS) graphical model underlying nonstationary time series data, such as the social states of interacting individuals or microarray expression profiles of gene networks, as opposed to the i.i.d. data from a time-invariant model widely considered in the current literature on structural estimation. In particular, we consider the scenario in which the model evolves in a piecewise-constant fashion. We propose a procedure that minimizes the so-called TESLA loss (temporally smoothed L1-regularized regression), which jointly estimates the partition boundaries of the VCVS model and the coefficients of the sparse precision matrix on each block of the partition. A highly scalable proximal gradient method is proposed to solve the resulting convex optimization problem, and conditions for sparsistent estimation and the convergence rates of both the partition boundaries and the network structure are established for the first time for such estimators.
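The proximal gradient method referenced in the abstract alternates a gradient step on the smooth part of the loss with a proximal step for the nonsmooth L1 penalty. The minimal Python sketch below (not the authors' code; all names are hypothetical) illustrates this scheme on a plain lasso objective, where the proximal operator reduces to elementwise soft-thresholding. The full TESLA loss additionally includes a temporal fusion penalty across time blocks, whose proximal operator is more involved and is omitted here.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise proximal operator of t * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_lasso(X, y, lam, step=None, n_iter=1000):
    """Proximal gradient (ISTA) for 0.5 * ||y - X b||^2 + lam * ||b||_1.

    Simplified sketch: the TESLA objective also fuses coefficients
    across adjacent time blocks; only the L1 prox step is shown.
    """
    n, p = X.shape
    if step is None:
        # 1/L, where L = ||X||_2^2 is the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the smooth loss
        b = soft_threshold(b - step * grad, step * lam)  # prox step
    return b
```

With a well-conditioned design, the iterates set the coefficients outside the true support exactly to zero, which is the mechanism behind the sparsistency guarantees discussed above.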

Related Research

03/01/2020 · Estimating Multiple Precision Matrices with Cluster Fusion Regularization
We propose a penalized likelihood framework for estimating multiple prec...

05/09/2012 · Group Sparse Priors for Covariance Estimation
Recently it has become popular to learn sparse Gaussian graphical models...

11/14/2019 · Estimation of dynamic networks for high-dimensional nonstationary time series
This paper is concerned with the estimation of time-varying networks for...

12/16/2014 · Estimation of Large Covariance and Precision Matrices from Temporally Dependent Observations
We consider the estimation of large covariance and precision matrices fr...

05/19/2019 · Estimating variances in time series linear regression models using empirical BLUPs and convex optimization
We propose a two-stage estimation method of variance components in time ...

03/15/2018 · Proximal SCOPE for Distributed Sparse Learning: Better Data Partition Implies Faster Convergence Rate
Distributed sparse learning with a cluster of multiple machines has attr...

05/27/2021 · Lattice partition recovery with dyadic CART
We study piece-wise constant signals corrupted by additive Gaussian nois...
