What is an Orthonormal Vector?
A vector is said to be normal if it has a length of one. Two vectors are said to be orthogonal if they're at right angles to each other (their dot product is zero). A set of vectors is said to be orthonormal if they are all normal, and each pair of vectors in the set is orthogonal.
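To make the definition concrete, here is a minimal sketch in Python (using NumPy as an assumed dependency) that tests a set of vectors for orthonormality:

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """True if every vector has length 1 and each pair is orthogonal."""
    V = np.array(vectors, dtype=float)
    # V @ V.T collects every pairwise dot product; for an orthonormal set
    # it equals the identity matrix (1s on the diagonal, 0s elsewhere).
    return np.allclose(V @ V.T, np.eye(len(V)), atol=tol)

print(is_orthonormal([[1, 0, 0], [0, 1, 0]]))  # True
print(is_orthonormal([[1, 1, 0], [0, 1, 0]]))  # False: not unit length, not orthogonal
```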
Orthonormal vectors are usually used as a basis for a vector space. Establishing an orthonormal basis for data makes calculations significantly easier; for example, the length of a vector is simply the square root of the sum of the squares of the coordinates of that vector relative to some orthonormal basis.
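As a quick illustration of that length formula, this sketch (with assumed example values) computes a vector's coordinates relative to an orthonormal basis of R^2, then recovers its length from them:

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
b1 = np.array([1, 1]) / np.sqrt(2)
b2 = np.array([1, -1]) / np.sqrt(2)

v = np.array([3.0, 4.0])
coords = np.array([v @ b1, v @ b2])  # coordinates relative to the basis

print(np.sqrt(np.sum(coords ** 2)))  # 5.0, from the coordinates alone
print(np.linalg.norm(v))             # 5.0, the same length computed directly
```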
QR Decomposition
A QR decomposition of a real square matrix A is a factorization of A into two matrices Q and R such that:
- A = QR
- Q is an orthogonal matrix
- R is an upper-triangular matrix
(If A is a complex square matrix, Q will instead be a unitary matrix; rectangular matrices also admit a QR decomposition, with the entries of R below the main diagonal still zero.)
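As a quick check of these properties, here is a small sketch using NumPy's np.linalg.qr (the example matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])

Q, R = np.linalg.qr(A)

print(np.allclose(A, Q @ R))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))  # Q is orthogonal (Q^T Q = I)
print(np.allclose(R, np.triu(R)))       # R is upper-triangular
```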
There are a number of methods for computing a QR decomposition of a matrix, including the Gram-Schmidt process, Householder transformations, and Givens rotations. Each has its trade-offs: Gram-Schmidt is the easiest to understand and implement but can be numerically unstable; Householder transformations are the usual choice for dense matrices because of their stability; and Givens rotations, which zero out one entry at a time, are well suited to sparse matrices.
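As an illustration, here is a minimal sketch of QR decomposition via the modified Gram-Schmidt process. It is chosen for readability rather than robustness, and it assumes the columns of A are linearly independent; a production implementation would more likely use Householder transformations.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR decomposition of a real matrix with linearly independent columns."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        # Subtract the components of this column along the already-computed
        # orthonormal vectors...
        for i in range(j):
            R[i, j] = Q[:, i] @ v
            v -= R[i, j] * Q[:, i]
        # ...then normalize what remains to get the next basis vector.
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R
```

Running it against the same checks as above, `Q, R = gram_schmidt_qr(A)` should satisfy `np.allclose(A, Q @ R)` whenever the columns of A are independent.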
QR decomposition is often used to solve the linear least squares problem. It's also the basis of an eigenvalue-finding algorithm, aptly named the QR algorithm (although ironically, the modern form of the algorithm doesn't actually involve calculating a QR decomposition!)
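To show the least-squares connection, here is a sketch that fits a line to a few made-up data points. Since Q has orthonormal columns, minimizing ||Ax - b|| reduces to solving the triangular system Rx = Q^T b:

```python
import numpy as np

# Fit a line y = c0 + c1*t to some (assumed) sample points.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])  # design matrix

Q, R = np.linalg.qr(A)           # "thin" QR: Q is 4x2, R is 2x2
x = np.linalg.solve(R, Q.T @ b)  # solve the triangular system Rx = Q^T b

print(x)  # intercept and slope of the best-fit line
```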