
Minimax Rates of Estimation for Sparse PCA in High Dimensions
We study sparse principal components analysis in the high-dimensional se...

Rademacher complexity of stationary sequences
We show how to control the generalization error of time series models wh...

Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
The first part of this paper is devoted to the decision-theoretic analys...

Sharper bounds for uniformly stable algorithms
Generalization bounds for stable algorithms are a classical question ...

Finite-time Identification of Stable Linear Systems: Optimality of the Least-Squares Estimator
We provide a new finite-time analysis of the estimation error of stable ...

Active Learning for Identification of Linear Dynamical Systems
We propose an algorithm to actively estimate the parameters of a linear ...

Minimax Estimation of Partially-Observed Vector AutoRegressions
To understand the behavior of large dynamical systems like transportatio...
Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification
We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson's small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that more unstable linear systems are easier to estimate. This stands in contrast to mixing-time-based arguments, which suggest that unstable systems are more difficult to estimate. We generalize our technique to provide bounds for a more general class of linear response time series.
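The estimator analyzed above can be illustrated with a minimal NumPy sketch (not the authors' code): simulate a stable linear system x_{t+1} = A* x_t + w_t, then recover A* by regressing each state on its predecessor via OLS over a single trajectory. The dimension, horizon, and ground-truth matrix below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a stable ground-truth system (spectral radius 0.9),
# observed along one trajectory driven by isotropic Gaussian noise.
d, T = 3, 2000
A_star = 0.9 * np.eye(d)

X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = A_star @ X[t] + rng.standard_normal(d)

# OLS from a single trajectory: A_hat = argmin_A sum_t ||x_{t+1} - A x_t||^2.
# lstsq solves past @ B ~= future with B = A.T, so we transpose the solution.
past, future = X[:-1], X[1:]
B, *_ = np.linalg.lstsq(past, future, rcond=None)
A_hat = B.T

# Operator-norm estimation error; it shrinks as the trajectory length T grows.
print(np.linalg.norm(A_hat - A_star, 2))
```

For the trajectory length used here the error is small; the abstract's point is that this rate is attained without mixing-time arguments, and that less stable systems (spectral radius closer to or above 1) excite the regressors more strongly rather than hurting the estimate.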