Compressed Subspace Learning Based on Canonical Angle Preserving Property
A standard way to tackle the challenging task of learning from high-dimensional data is to exploit its underlying low-dimensional structure. The Union of Subspaces (UoS) model is a popular and powerful way to describe such structure: it assumes that the data lie in a union of low-dimensional subspaces. Extracting useful information from the UoS structure of data is the task of the emerging field of subspace learning. In this paper, we investigate how random projection, an efficient and widely used method for dimensionality reduction, distorts the UoS structure of data. The fine details of UoS structure are described in terms of canonical angles (also known as principal angles) between subspaces, a well-known characterization of the relative position of two subspaces by a sequence of angles. We prove that a random projection with the so-called Johnson-Lindenstrauss (JL) property approximately preserves the canonical angles between subspaces. Since canonical angles completely determine the relative position of subspaces, our result indicates that random projection approximately preserves the structure of a union of subspaces. Inspired by this result, we propose the framework of Compressed Subspace Learning (CSL), which extracts useful information from the UoS structure of data in a greatly reduced dimension and thus offers lower computational cost and memory requirements. We demonstrate the effectiveness of CSL on various subspace-related tasks, including subspace visualization, active subspace detection, and subspace clustering.
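As a quick numerical illustration (not code from the paper), the following NumPy sketch computes canonical angles via the standard recipe, taking the singular values of the product of orthonormal bases, and compares the angles between two random subspaces before and after a Gaussian random projection, a classical JL embedding. The dimensions n, d, and m are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def canonical_angles(A, B):
    """Canonical (principal) angles between the column spans of A and B.

    Orthonormalize each basis via QR; the singular values of Q_A^T Q_B
    are the cosines of the canonical angles.
    """
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Two random d-dimensional subspaces of R^n, spanned by Gaussian bases.
n, d, m = 1000, 5, 50
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))

# A Gaussian matrix scaled by 1/sqrt(m) is a standard JL embedding.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

print("original :", np.degrees(canonical_angles(A, B)))
print("projected:", np.degrees(canonical_angles(Phi @ A, Phi @ B)))
```

With a moderate sketch size m, the projected angles track the original ones closely, consistent with the approximate preservation result stated above.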