Dimension-free Bounds for Sums of Independent Matrices and Simple Tensors via the Variational Principle
We consider deviation inequalities for sums of independent d by d random matrices, as well as rank-one random tensors. Our focus is on the non-isotropic case and on bounds that do not depend explicitly on the dimension d, but rather on the effective rank. In an elementary and unified manner, we show the following results: 1) A deviation bound for sums of independent positive semi-definite matrices of any rank. This result generalizes the dimension-free bound of Koltchinskii and Lounici [Bernoulli, 23(1): 110-133, 2017] on the sample covariance matrix in the sub-Gaussian case. 2) A dimension-free version of the bound of Adamczak, Litvak, Pajor and Tomczak-Jaegermann [J. Amer. Math. Soc., 23(2): 535-561, 2010] on the sample covariance matrix in the log-concave case. 3) Dimension-free bounds for the operator norm of sums of rank-one random tensors formed either by sub-Gaussian or by log-concave random vectors. This complements the result of Guédon and Rudelson [Adv. in Math., 208: 798-823, 2007]. 4) A non-isotropic version of the result of Alesker [Geom. Asp. of Funct. Anal., 77: 1-4, 1995] on the deviation of the norm of sub-exponential random vectors. 5) A dimension-free lower tail bound for sums of positive semi-definite matrices with heavy-tailed entries, sharpening the bound of Oliveira [Prob. Th. and Rel. Fields, 166: 1175-1194, 2016]. Our approach is based on the duality formula between entropy and moment generating functions. In contrast to the known proofs of dimension-free bounds, we avoid Talagrand's majorizing measure theorem, as well as generic chaining bounds for empirical processes. Some of our tools were pioneered by O. Catoni and co-authors in the context of robust statistical estimation.
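For context, the duality formula between entropy and moment generating functions referred to above is, in its standard Donsker-Varadhan form (the paper's exact formulation may differ),

\[
\log \mathbb{E}_P\big[e^{f}\big] \;=\; \sup_{Q \ll P}\Big\{\, \mathbb{E}_Q[f] - \mathrm{KL}(Q \,\|\, P) \,\Big\},
\]

where the supremum runs over probability measures Q absolutely continuous with respect to P. Likewise, the effective rank of a covariance matrix \(\Sigma\), which replaces the dimension d in the bounds above, is standardly defined as

\[
\mathbf{r}(\Sigma) \;=\; \frac{\operatorname{tr}(\Sigma)}{\|\Sigma\|},
\]

with \(\|\Sigma\|\) the operator norm; one always has \(\mathbf{r}(\Sigma) \le d\), and it can be much smaller when the spectrum of \(\Sigma\) decays quickly.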