# Numerical Matrix Decomposition and its Modern Applications: A Rigorous First Course

Matrix decomposition has become a core technology in machine learning, largely due to the development of the back-propagation algorithm for fitting neural networks. The sole aim of this survey is to give a self-contained introduction to the concepts and mathematical tools of numerical linear algebra and matrix analysis, in order to seamlessly introduce matrix decomposition techniques and their applications in subsequent sections. However, we clearly realize that we cannot cover all the useful and interesting results concerning matrix decomposition within the limited scope of this discussion, e.g., a separate analysis of Euclidean, Hermitian, and Hilbert spaces. We refer the reader to the linear algebra literature for a more detailed introduction to the related fields; some excellent examples include Trefethen and Bau III (1997); Strang (2009); Golub and Van Loan (2013); Beck (2017); Gallier and Quaintance (2017); Boyd and Vandenberghe (2018); Strang (2019); van de Geijn and Myers (2020); Strang (2021). This survey is primarily a summary of the purpose and significance of important matrix decomposition methods, e.g., LU, QR, and SVD, and, most importantly, of the origin and complexity of these methods, which shed light on their modern applications. Again, this is a decomposition-based survey, so we introduce the related background when it is needed. The only mathematical prerequisite is a first course in linear algebra; beyond this modest background, the development is self-contained, with rigorous proofs provided throughout.

Keywords: Matrix decomposition, Computing process, Complexity, Floating point operations (flops), Low-rank approximation, Pivot, LU decomposition for nonzero leading principal minors, CR decomposition, CUR/Skeleton decomposition, Biconjugate decomposition, Coordinate transformation, Hessenberg decomposition.
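To make the named decompositions concrete, here is a minimal sketch (not part of the survey itself) that computes the LU, QR, and SVD factorizations of a small matrix with NumPy and SciPy, and verifies that each product reconstructs the original matrix. The example matrix `A` is an arbitrary choice for illustration.

```python
import numpy as np
from scipy.linalg import lu

# Arbitrary small example matrix (illustrative assumption, not from the survey).
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

# LU decomposition with partial pivoting: SciPy returns A = P @ L @ U,
# where P is a permutation matrix, L is unit lower triangular, U is upper triangular.
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# QR decomposition: A = Q @ R, with Q orthonormal columns and R upper triangular.
Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)

# Singular value decomposition: A = U_s @ diag(s) @ Vt,
# with singular values s sorted in descending order.
U_s, s, Vt = np.linalg.svd(A)
assert np.allclose(U_s @ np.diag(s) @ Vt, A)
```

Each factorization trades the original matrix for structured factors (triangular, orthogonal, or diagonal), which is precisely what makes these decompositions useful for solving linear systems, least squares, and low-rank approximation.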
