Eigenvalues and Eigenvectors
For a matrix transformation $A$, a non-zero vector $\mathbf{v}$ is called its eigenvector if $A\mathbf{v} = \lambda\mathbf{v}$ for some scalar $\lambda$. This means that applying the matrix transformation to the vector only scales the vector. The corresponding value of $\lambda$ for $\mathbf{v}$ is an eigenvalue of $A$.
The matrix transformation $A$ acts on the eigenvector $\mathbf{v}$, scaling it by a factor of the eigenvalue $\lambda$.[1]
Matrix Transformation Example
To understand how matrix transformations act geometrically, it is important to understand their effects on individual vectors.
Consider the matrix transformation

$$A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}.$$

This corresponds to the map $\mathbb{R}^2 \to \mathbb{R}^2$ given by $(x, y) \mapsto (2x, 3y)$. The action of $A$ is to scale the $x$-axis by a factor of 2 and the $y$-axis by a factor of 3. In particular, this is equivalent to requiring that $A$ send $(1, 0) \mapsto (2, 0)$ and $(0, 1) \mapsto (0, 3)$. Assuming this, linearity of $A$ implies that

$$A\begin{pmatrix} x \\ y \end{pmatrix} = xA\begin{pmatrix} 1 \\ 0 \end{pmatrix} + yA\begin{pmatrix} 0 \\ 1 \end{pmatrix} = x\begin{pmatrix} 2 \\ 0 \end{pmatrix} + y\begin{pmatrix} 0 \\ 3 \end{pmatrix} = \begin{pmatrix} 2x \\ 3y \end{pmatrix}.$$
Thus, it was possible to identify a basis for $\mathbb{R}^2$ on which $A$ acts very simply, by scaling. From this knowledge, one can deduce how $A$ works on every vector in $\mathbb{R}^2$.
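This can also be checked numerically. The following is a minimal sketch using NumPy (an assumed dependency, not named in the text); it confirms that the eigenvalues of the diagonal matrix above are 2 and 3, with the standard basis vectors as eigenvectors:

```python
import numpy as np

# The diagonal matrix that scales the x-axis by 2 and the y-axis by 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

print(eigenvalues)    # [2. 3.]
print(eigenvectors)   # columns are (1, 0) and (0, 1), the standard basis
```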
The natural question arising from this discussion is whether it is possible to find such a basis of eigenvectors for every matrix. At the very least, when it is possible to do so, the matrix at hand may be considered especially well-behaved. For instance, the spectral theorem of linear algebra states that whenever $A$ is a symmetric matrix, i.e. $A^{T} = A$, there is a basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.
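To illustrate the spectral theorem, here is a small sketch (assuming NumPy; the symmetric matrix below is an arbitrary example, not one from the text). `np.linalg.eigh`, which is specialized to symmetric matrices, returns an orthonormal basis of eigenvectors:

```python
import numpy as np

# An arbitrary symmetric matrix (S equals its own transpose).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric (Hermitian) matrices.
eigenvalues, Q = np.linalg.eigh(S)

print(eigenvalues)                            # [1. 3.]
print(np.allclose(Q.T @ Q, np.eye(2)))        # True: the eigenvector columns are orthonormal
print(np.allclose(S @ Q, Q * eigenvalues))    # True: each column is scaled by its eigenvalue
```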
Computing Eigenvalues and Eigenvectors
It is not too difficult to compute eigenvalues and their corresponding eigenvectors when the matrix transformation at hand has a clear geometric interpretation. As examples, consider the diagonal matrix $A$ discussed above and the reflection matrix below:
Consider the reflection matrix transformation $B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$, which reflects a vector across the $x$-axis. Find the eigenvectors and the corresponding eigenvalues of $B$.
The vectors that get scaled after a reflection across the $x$-axis are either parallel to the $x$-axis, i.e. in the span of $(1, 0)$, or parallel to the $y$-axis, i.e. in the span of $(0, 1)$.
In the first case, the vector is not changed at all:

$$B\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 \\ 0 \end{pmatrix},$$

and therefore the eigenvalue for $(1, 0)$ is $\lambda = 1$.
In the second case, after applying the transformation, the length of the vector remains the same, but the direction reverses:

$$B\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ -1 \end{pmatrix} = -1 \cdot \begin{pmatrix} 0 \\ 1 \end{pmatrix},$$

and therefore the eigenvalue for $(0, 1)$ is $\lambda = -1$.
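The same conclusion can be verified directly in code. This is a quick sketch (assuming NumPy) that applies the reflection matrix from the problem to the two candidate eigenvectors:

```python
import numpy as np

# Reflection across the x-axis.
B = np.array([[1.0,  0.0],
              [0.0, -1.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

print(B @ e1)   # [1. 0.]   -> unchanged, eigenvalue  1
print(B @ e2)   # [ 0. -1.] -> reversed,  eigenvalue -1
```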
For arbitrary matrices, however, a more general procedure is necessary.
Let $A$ be an $n$-by-$n$ matrix, so that it corresponds to a transformation $\mathbb{R}^n \to \mathbb{R}^n$. If $\lambda$ is an eigenvalue of $A$, then there is a nonzero vector $\mathbf{v}$ such that $A\mathbf{v} = \lambda\mathbf{v}$. Rearranging this equation shows that $(A - \lambda I)\mathbf{v} = \mathbf{0}$, where $I$ denotes the $n$-by-$n$ identity matrix. This implies the null space of the matrix $A - \lambda I$ is nonzero, so $A - \lambda I$ has determinant zero.
Note that for every matrix $A$ and every scalar $\lambda$, the zero vector trivially satisfies $A\mathbf{0} = \lambda\mathbf{0}$. Generally, one is only concerned with the nonzero eigenvectors associated to an eigenvalue, so convention dictates that $\lambda$ is considered an eigenvalue of $A$ only when the null space of $A - \lambda I$ is nonzero; equivalently, when $(x - \lambda)$ divides $\det(A - xI)$.
Accordingly, any eigenvalue of $A$ must be a root of the polynomial $\det(A - xI)$. This is called the characteristic polynomial of $A$. Observe that this implies $A$ has only finitely many eigenvalues (in fact, at most $n$ eigenvalues).
In computations, the characteristic polynomial is extremely useful. To determine the eigenvalues of a matrix $A$, one solves for the roots of $\det(A - xI)$, and then checks if each root is an eigenvalue.
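This procedure can be sketched in NumPy (an assumption on tooling; `np.poly` returns the coefficients of a matrix's characteristic polynomial and `np.roots` finds the roots of a polynomial):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the monic characteristic polynomial det(xI - A),
# which equals det(A - xI) up to sign, from the highest power of x down.
coefficients = np.poly(A)
print(coefficients)              # [ 1. -4.  3.], i.e. x^2 - 4x + 3

# The roots of the characteristic polynomial are the candidate eigenvalues.
print(np.roots(coefficients))    # 3 and 1 (order may vary)

# In practice, np.linalg.eigvals computes the eigenvalues directly and more stably.
print(np.linalg.eigvals(A))      # 3 and 1 (order may vary)
```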
Consider the matrix

$$A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}.$$

Compute its nonzero eigenvalues and their corresponding eigenvectors.

The characteristic polynomial of $A$ may be computed as

$$\det(A - xI) = \det\begin{pmatrix} 1 - x & 1 & 1 \\ 1 & 1 - x & 1 \\ 1 & 1 & 1 - x \end{pmatrix} = -x^3 + 3x^2.$$

This factors as $-x^2(x - 3)$, so the possible eigenvalues of $A$ are $0$ and $3$.
To prove $3$ is an eigenvalue of $A$, it suffices to exhibit a nonzero vector $\mathbf{v}$ with $A\mathbf{v} = 3\mathbf{v}$. This means that there exists a vector $\mathbf{v} = (x, y, z)$ that satisfies

$$\begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = 3\begin{pmatrix} x \\ y \\ z \end{pmatrix}.$$

Note that $\mathbf{v} = (1, 1, 1)$ does the trick. Similarly, we find eigenvectors corresponding to $0$ by finding the null space of

$$A - 0 \cdot I = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}.$$

This gives just one equation: $x + y + z = 0$. Two vectors that are linearly independent and satisfy this equation are $(1, -1, 0)$ and $(0, 1, -1)$.
Solving the system of linear equations corresponding to $A\mathbf{v} = 3\mathbf{v}$ shows that any eigenvector satisfying this equation is a multiple of $(1, 1, 1)$. Similarly, solving the system corresponding to $A\mathbf{v} = \mathbf{0}$ demonstrates that every eigenvector satisfying this equation is a linear combination of $(1, -1, 0)$ and $(0, 1, -1)$. Thus, we have found all the eigenvectors of $A$.
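A quick numerical check of this example is possible with NumPy; this sketch assumes the all-ones matrix $A$ used above:

```python
import numpy as np

A = np.ones((3, 3))   # the all-ones matrix from the example

eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.round(eigenvalues, 10))   # eigenvalues 3, 0, 0 (in some order)

# Verify the hand-computed eigenvectors.
print(np.allclose(A @ np.array([1.0, 1.0, 1.0]), 3 * np.array([1.0, 1.0, 1.0])))  # True
print(np.allclose(A @ np.array([1.0, -1.0, 0.0]), 0))                             # True
print(np.allclose(A @ np.array([0.0, 1.0, -1.0]), 0))                             # True
```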
Connections with Trace and Determinant
Suppose $A$ is an $n$-by-$n$ square matrix with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, counted with multiplicity. Then
- the sum of its eigenvalues is equal to the trace of $A$;
- the product of its eigenvalues is equal to the determinant of $A$.
The proof of these properties requires the investigation of the characteristic polynomial of $A$, which is found by taking the determinant of $A - xI$. We have

$$p(x) = \det(A - xI) = (-x)^n + \operatorname{tr}(A)\,(-x)^{n-1} + \cdots + \det(A).$$

Observe that $p(0) = \det(A)$.

Let $\lambda_1, \lambda_2, \ldots, \lambda_n$ be the roots of an $n^\text{th}$-order polynomial, so that

$$p(x) = (\lambda_1 - x)(\lambda_2 - x)\cdots(\lambda_n - x) = (-x)^n + (\lambda_1 + \lambda_2 + \cdots + \lambda_n)(-x)^{n-1} + \cdots + \lambda_1\lambda_2\cdots\lambda_n.$$

Since the eigenvalues are the roots of the characteristic polynomial, we can match the coefficients of this factored form to those of $\det(A - xI)$. Therefore, it is clear that

$$\lambda_1 + \lambda_2 + \cdots + \lambda_n = \operatorname{tr}(A)$$

and

$$\lambda_1\lambda_2\cdots\lambda_n = p(0) = \det(A).$$
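These two identities are easy to spot-check numerically. The following is a minimal sketch assuming NumPy; the matrix is a random example, not one from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))     # a random 4-by-4 matrix

eigenvalues = np.linalg.eigvals(A)  # possibly complex, counted with multiplicity

print(np.isclose(eigenvalues.sum(),  np.trace(A)))        # True: sum of eigenvalues = trace
print(np.isclose(eigenvalues.prod(), np.linalg.det(A)))   # True: product of eigenvalues = determinant
```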
Applications
In addition to their theoretical significance, eigenvalues and eigenvectors have important applications in various branches of applied mathematics, including signal processing, machine learning, and social network analysis.
For a simple application in network analysis, consider a network modeled by a finite connected graph $G$ on $n$ vertices. A computer will store the information of this graph in the form of an adjacency matrix. This is an $n$-by-$n$ matrix $M$, where $M_{ij} = 1$ if vertices $i$ and $j$ have an edge between them, and $M_{ij} = 0$ otherwise.
The eigenvalues of $M$ detect information about the structure of $G$. For instance, let $\lambda_{\min}$ and $\lambda_{\max}$ denote the smallest and largest eigenvalues of $M$, respectively. Then $G$ is bipartite if and only if $\lambda_{\min} = -\lambda_{\max}$. Accordingly, a computer could determine whether or not $G$ is bipartite by computing the eigenvalues of $M$; this is far more computationally feasible than attempting to check whether $G$ is bipartite by brute force.
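Here is a sketch of this bipartiteness test (assuming NumPy; the graph below, a path on four vertices, is an illustrative assumption rather than an example from the text):

```python
import numpy as np

# Adjacency matrix of a path graph on 4 vertices: 0 - 1 - 2 - 3 (bipartite).
M = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

eigenvalues = np.linalg.eigvalsh(M)   # M is symmetric, so its eigenvalues are real

lam_min, lam_max = eigenvalues.min(), eigenvalues.max()
is_bipartite = np.isclose(lam_min, -lam_max)
print(is_bipartite)   # True: the path graph is bipartite
```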
References
- Lantonov, L. Eigenvalue-equation. Retrieved August 24, 2016, from https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#/media/File:Eigenvalue_equation.svg