Asymptotics of sums of regression residuals under multiple orderings of regressors
We prove theorems on the Gaussian asymptotics of an empirical bridge built from the residuals of a linear model under multiple orderings of the regressors. We study testing of the hypothesis that the components of a random vector satisfy a linear model: one component is a linear combination of the others up to an error that is independent of those other components. Observations of independent copies of the random vector are ordered, in turn, in ascending order of several of its components. The result is a sequence of higher-dimensional vectors consisting of induced order statistics (concomitants) corresponding to the different orderings. For this sequence of vectors, without assuming a linear model for the components, we prove a lemma on weak convergence of the distributions of an appropriately centered and normalized process to a centered Gaussian process with almost surely continuous trajectories. Assuming a linear relationship among the components, standard least squares estimates are used to compute the regression residuals, that is, the differences between observed response values and those predicted by the linear model. We prove a theorem on weak convergence of the suitably normalized process of regression residuals to a centered Gaussian process. We then prove an analogous convergence theorem for the empirical bridge, a self-centered and self-normalized process of the regression residuals.
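The construction described above (ordering observations by one regressor, fitting least squares, and forming a self-centered, self-normalized partial-sum process of the residuals) can be illustrated with a short numerical sketch. The helper `empirical_bridge` below is hypothetical and not taken from the paper; it assumes a standard definition of the empirical bridge as the partial-sum process of residuals, centered by its final value and scaled by the residual standard deviation times the square root of the sample size. The paper's exact centering and normalization may differ.

```python
import numpy as np

def empirical_bridge(X, y, order_col=0):
    """Hypothetical sketch of the empirical bridge of regression residuals.

    Orders the observations by one regressor (so y becomes a sequence of
    induced order statistics, i.e. concomitants), fits OLS, and returns the
    self-centered, self-normalized partial-sum process of the residuals.
    The exact normalization used in the paper may differ.
    """
    n = X.shape[0]
    # Order all observations by the chosen regressor.
    idx = np.argsort(X[:, order_col])
    Xo, yo = X[idx], y[idx]

    # Ordinary least squares fit (with intercept) and residuals:
    # residuals = observed responses minus model predictions.
    A = np.column_stack([np.ones(n), Xo])
    beta, *_ = np.linalg.lstsq(A, yo, rcond=None)
    resid = yo - A @ beta

    # Self-centered, self-normalized partial sums of the ordered residuals.
    s = np.cumsum(resid)
    t = np.arange(1, n + 1) / n
    sigma = resid.std(ddof=A.shape[1])
    return t, (s - t * s[-1]) / (sigma * np.sqrt(n))

# Example usage with synthetic data satisfying a linear model;
# repeating with order_col=1 gives the bridge for a second ordering.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + rng.normal(size=500)
t, bridge = empirical_bridge(X, y, order_col=0)
```

Under the linear-model hypothesis, the resulting trajectory is expected to behave like the centered Gaussian limit process described in the theorems, which is what makes the statistic usable for hypothesis testing.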