Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification

by Max Simchowitz et al.

We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson's small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that more unstable linear systems are easier to estimate. This is qualitatively different from the picture suggested by mixing-time arguments, under which more unstable systems appear harder to estimate. We generalize our technique to provide bounds for a more general class of linear response time series.
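To make the setup concrete, here is a minimal sketch (not taken from the paper) of the OLS estimator the abstract analyzes: a single trajectory of a linear system x_{t+1} = A x_t + w_t is simulated, and A is recovered by least squares on consecutive state pairs. The system matrix, dimensions, and trajectory length below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 3, 2000
A = 0.9 * np.eye(d)  # hypothetical stable system matrix (illustrative)

# Simulate one trajectory x_{t+1} = A x_t + w_t with i.i.d. Gaussian noise.
X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = A @ X[t] + rng.standard_normal(d)

# OLS: minimize sum_t ||x_{t+1} - A_hat x_t||^2 over A_hat.
# lstsq solves X_past @ B ≈ X_next for B = A_hat^T, hence the transpose.
X_past, X_next = X[:-1], X[1:]
A_hat = np.linalg.lstsq(X_past, X_next, rcond=None)[0].T

err = np.linalg.norm(A_hat - A)  # estimation error shrinks as T grows
print(err)
```

The paper's point is about the rate at which `err` decays with the trajectory length T and how that rate depends on the spectrum of A; this snippet only illustrates the estimator itself.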


