What is an Eigenvalue?
An eigenvalue, normally denoted by the lowercase Greek letter lambda (λ), is a number such that when a linear operator is applied to a vector, the vector's line of action is unchanged but the vector itself is scaled or reversed in direction. This linear operator is generally a square matrix, meaning it has the same number of rows as columns, and the vector whose line of action is unchanged is a special vector called an eigenvector. Mathematically, the eigenvalue is the number by which the eigenvector can be multiplied to produce the same result as multiplying the matrix by that vector, as shown in Equation 1.
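In symbols, this defining relation (Equation 1) reads:

```latex
A\mathbf{x} = \lambda \mathbf{x} \tag{1}
```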
Where A is the square matrix, λ is the eigenvalue and x is the eigenvector. The eigenvalues of A are calculated by moving all terms to one side and factoring out the eigenvector x (Equation 2). Notice there is now an identity matrix, denoted I, multiplied by λ. Setting the determinant of the terms within the parentheses to zero (Equation 3) and solving the resulting polynomial in λ, called the characteristic equation, gives the eigenvalues. There will be as many eigenvalues, counted with multiplicity, as there are rows (or columns) in A. Details of how to calculate the determinant of a matrix can be found in a linear algebra textbook.
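The rearrangement and determinant condition described above (Equations 2 and 3) can be written as:

```latex
(A - \lambda I)\mathbf{x} = \mathbf{0} \tag{2}
```

```latex
\det(A - \lambda I) = 0 \tag{3}
```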
Practical Uses of Eigenvalues
– Calculating the eigenvalues of a matrix is a prerequisite to determining its eigenvectors and eigenspaces.
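As a quick numerical illustration, the sketch below computes the eigenvalues and eigenvectors of a small example matrix (chosen arbitrarily here) with NumPy and checks the defining relation Ax = λx for each pair:

```python
import numpy as np

# A hypothetical 2x2 matrix used purely for illustration;
# its characteristic equation (lambda - 5)(lambda - 2) = 0
# gives eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A x = lambda x for each pair.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(sorted(eigenvalues))
```

Note that a 2×2 matrix yields two eigenvalues, matching the number of rows, as described above.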
Eigenvalues and Eigenvectors in Machine Learning
In machine learning, it is important to choose features that represent large numbers of data points and carry a lot of information. Selecting the features that best represent the data and eliminating less useful ones is an example of dimensionality reduction. We can use eigenvalues and eigenvectors to identify the dimensions that are most useful and prioritize our computational resources toward them.
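One common way this idea is applied is principal component analysis (PCA): eigendecompose the covariance matrix of the data, then keep only the directions with the largest eigenvalues, since those directions account for the most variance. The sketch below uses synthetic data invented for this example, with one deliberately redundant feature:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 samples, 3 features, where the third
# feature is nearly a copy of the first (a redundant dimension).
X = rng.normal(size=(200, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=200)

# Center the data and form its covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# eigh suits symmetric matrices such as a covariance matrix;
# it returns real eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Keep the two directions with the largest eigenvalues -- the
# dimensions carrying the most variance -- and project onto them.
top2 = eigenvectors[:, np.argsort(eigenvalues)[::-1][:2]]
X_reduced = Xc @ top2

print(X_reduced.shape)
```

Because the third feature is almost a linear copy of the first, one eigenvalue of the covariance matrix is close to zero, and dropping that direction loses very little information.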