Projective Decomposition and Matrix Equivalence up to Scale

01/04/2019
by Max Robinson, et al.

A data matrix may be seen simply as a means of organizing observations into rows (e.g., by measured object) and columns (e.g., by measured variable) so that the observations can be analyzed with mathematical tools. As a mathematical object, a matrix defines a linear mapping between points representing weighted combinations of its rows (the row vector space) and points representing weighted combinations of its columns (the column vector space). From this perspective, a data matrix defines a relationship between the information that labels its rows and the information that labels its columns, and numerical methods are used to analyze this relationship. A first step is to normalize the data, transforming each observation from scales convenient for measurement to a common scale on which addition and multiplication can meaningfully combine the different observations. For example, z-transformation rescales every variable to a common scale (standardized deviation from an expected value), but it ignores scale differences between measured objects. Here we develop the concepts and properties of projective decomposition, which applies the same normalization strategy to both rows and columns by separating the matrix into row- and column-scaling factors and a scale-normalized matrix. We show that the different scalings of the same scale-normalized matrix form an equivalence class, and we call the scale-normalized, canonical member of that class its scale-invariant form, which preserves all pairwise relative ratios. Projective decomposition therefore provides a means of normalizing the broad class of ratio-scale data, in which relative ratios are of primary interest, onto a common scale without altering the ratios of interest, while simultaneously accounting for scale effects in both organizations of the matrix values. Both of these properties distinguish it from z-transformation.
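The abstract does not spell out how the row- and column-scaling factors are computed, so the sketch below is only an illustration of the idea: it factors a positive matrix A as diag(r) @ W @ diag(c), where W is scale-normalized in the sense that every row and column has (approximately) unit root-mean-square magnitude. The alternating RMS normalization (a Sinkhorn-style iteration), the function name projective_decompose, and the unit-RMS convention are assumptions made for this example, not necessarily the paper's algorithm.

```python
import numpy as np

def projective_decompose(A, n_iter=100, tol=1e-10):
    """Illustrative sketch only (assumed scheme, not the paper's stated method).

    Factor A (no all-zero rows or columns) as diag(r) @ W @ diag(c), where W is
    scale-normalized: each row and column of W has approximately unit RMS.
    Entrywise ratios W[i,j]*W[k,l] / (W[i,l]*W[k,j]) match those of A, since W
    differs from A only by row and column rescaling.
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    r = np.ones(m)
    c = np.ones(n)
    W = A.copy()
    for _ in range(n_iter):
        # Rescale each row of W to unit RMS; absorb the factor into r.
        row_rms = np.sqrt(np.mean(W**2, axis=1))
        W /= row_rms[:, None]
        r *= row_rms
        # Rescale each column of W to unit RMS; absorb the factor into c.
        col_rms = np.sqrt(np.mean(W**2, axis=0))
        W /= col_rms[None, :]
        c *= col_rms
        # Stop once both passes leave the scales essentially unchanged.
        if np.allclose(row_rms, 1.0, atol=tol) and np.allclose(col_rms, 1.0, atol=tol):
            break
    return r, W, c

# Usage: the factors reconstruct A exactly at every iteration.
A = np.abs(np.random.randn(5, 3)) + 0.1
r, W, c = projective_decompose(A)
print(np.allclose(A, np.diag(r) @ W @ np.diag(c)))  # True (up to numerical error)
```

Two matrices that yield the same W under this scheme differ only by row and column scalings, which is the sense in which W serves as a canonical, scale-invariant representative of its equivalence class.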
