Numerical Matrix Decomposition and its Modern Applications: A Rigorous First Course

by Jun Lu, et al.

Matrix decomposition has become a core technology in machine learning, largely due to the development of the backpropagation algorithm for fitting neural networks. The sole aim of this survey is to give a self-contained introduction to the concepts and mathematical tools of numerical linear algebra and matrix analysis, so that matrix decomposition techniques and their applications can be introduced seamlessly in subsequent sections. We are, however, unable to cover all the useful and interesting results concerning matrix decomposition within this scope, e.g., a separate analysis of Euclidean, Hermitian, and Hilbert spaces. We refer the reader to the linear algebra literature for a more detailed introduction to these related fields; some excellent examples include Trefethen and Bau III (1997); Strang (2009); Golub and Van Loan (2013); Beck (2017); Gallier and Quaintance (2017); Boyd and Vandenberghe (2018); Strang (2019); van de Geijn and Myers (2020); Strang (2021). This survey is primarily a summary of the purpose and significance of important matrix decomposition methods, e.g., LU, QR, and SVD, and, most importantly, of the origin and complexity of these methods, which shed light on their modern applications. As this is a decomposition-based survey, we introduce the related background where it is needed. The mathematical prerequisite is a first course in linear algebra; other than this modest background, the development is self-contained, with rigorous proofs provided throughout.

Keywords: Matrix decomposition, Computing process, Complexity, Floating point operations (flops), Low-rank approximation, Pivot, LU decomposition for nonzero leading principal minors, CR decomposition, CUR/Skeleton decomposition, Biconjugate decomposition, Coordinate transformation, Hessenberg decomposition.
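As a quick illustration of the three workhorse factorizations named above (LU, QR, and SVD), the following sketch computes each of them with SciPy and verifies that it reconstructs the original matrix. The matrix A is an arbitrary example chosen here for illustration, not one taken from the survey.

```python
import numpy as np
from scipy.linalg import lu, qr, svd

# An arbitrary small example matrix.
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

# LU decomposition with partial pivoting: A = P @ L @ U,
# with P a permutation, L unit lower triangular, U upper triangular.
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# QR decomposition: A = Q @ R, with Q orthogonal and R upper triangular.
Q, R = qr(A)
assert np.allclose(Q @ R, A)

# Singular value decomposition: A = Us @ diag(s) @ Vt,
# with Us, Vt orthogonal and s the nonnegative singular values.
Us, s, Vt = svd(A)
assert np.allclose(Us @ np.diag(s) @ Vt, A)
```

Each assertion checks the defining identity of the corresponding factorization; the survey develops the computing process and flop counts behind each of these routines.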






