Eigenvectors and Eigenvalues
|This article/section deals with mathematical concepts appropriate for late high school or early college.|
In linear algebra, when a transformation of a space is carried out, some vectors (points in the space) are not rotated, but only extended or shrunk (moved farther from or closer to the origin). These vectors are called the eigenvectors of the transformation, and the factor by which such an eigenvector is extended or shrunk is called the eigenvalue of the transformation corresponding to that eigenvector.
Characteristic Property of Eigenvalues and Eigenvectors
An eigenvalue of a square n × n matrix A with real entries is a scalar λ such that

Av = λv

for some non-zero vector v, known as an eigenvector. The eigenvalues are the zeroes of the matrix's characteristic polynomial det(A − λI); the multiplicity of the corresponding root is called the algebraic multiplicity of the eigenvalue.
The same definition is valid for n × n matrices A over any field F: then λ ∈ F and v ∈ Fⁿ.
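The defining relation Av = λv can be checked numerically. The following NumPy sketch uses an arbitrary illustrative matrix (not one taken from this article):

```python
import numpy as np

# An illustrative symmetric matrix; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```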
The eigenvectors represent directions that are preserved by linear transformations of a vector space.
If the characteristic polynomial splits into linear factors, then the product of all the eigenvalues of a matrix, counted with their algebraic multiplicities, equals the value of the matrix's determinant. Since a matrix is invertible if and only if its determinant is non-zero, it is invertible if and only if zero is not an eigenvalue.
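Both facts can be confirmed numerically; the matrix below is an assumed example with determinant 6:

```python
import numpy as np

# Illustrative matrix; det(A) = 4*1 - (-2)*1 = 6.
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

eigenvalues = np.linalg.eigvals(A)

# The product of the eigenvalues (with multiplicity) equals det(A).
assert np.isclose(np.prod(eigenvalues), np.linalg.det(A))

# No eigenvalue is zero, so the matrix is invertible.
assert not np.any(np.isclose(eigenvalues, 0.0))
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))
```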
Eigenvectors and Eigenvalues in Physics
In physics, eigenvectors and eigenvalues play an important role. For example, solving Schrödinger's equation (one of the fundamental equations of quantum mechanics) is a problem of finding eigenvectors and eigenvalues of a very complex linear operator. Quantization (the fact that some physical quantities can take only certain discrete values: for example, an electron can have spin 1/2 or −1/2, but never 1/3 or 1/10), which was previously a postulate of quantum mechanics, can be derived from the fact that the eigenvalues of these operators come in discrete sets.
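A rough numerical illustration of a discrete spectrum is possible even without the full machinery of quantum mechanics. The sketch below assumes the simplest textbook system, a particle in a box on (0, 1) in units where ħ = 2m = 1: discretizing the Hamiltonian H = −d²/dx² by finite differences gives a matrix whose eigenvalues approximate the discrete energy levels Eₖ = (kπ)².

```python
import numpy as np

# Finite-difference sketch of the particle-in-a-box Hamiltonian
# H = -d^2/dx^2 on (0, 1) with hard walls; units hbar = 2m = 1 assumed.
n = 500                        # number of interior grid points
h = 1.0 / (n + 1)              # grid spacing
main = np.full(n, 2.0 / h**2)
off = np.full(n - 1, -1.0 / h**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# eigvalsh exploits the symmetry of H; its spectrum is a discrete set.
energies = np.linalg.eigvalsh(H)

# The lowest levels approximate E_k = (k*pi)^2, k = 1, 2, 3, ...
for k in (1, 2, 3):
    print(energies[k - 1], (k * np.pi) ** 2)
```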
Finding Eigenvalues and Eigenvectors
This section describes a method for finding the eigenvalues and eigenvectors of a matrix A.
The eigenvectors and eigenvalues satisfy the relationship Av = λv. This can be rearranged to give:

(A − λI)v = 0

where I is the identity matrix. Solutions where v = 0 are trivial and are ignored, so non-trivial solutions occur only if A − λI is a singular matrix. A singular matrix does not have an inverse, so its determinant must be 0. Setting the determinant to 0 produces det(A − λI) = 0. Expanding this gives the characteristic polynomial. This is a polynomial in λ whose degree equals the size of the matrix A (i.e. a 3×3 matrix produces a cubic (degree 3) characteristic polynomial). The solutions of this equation are the eigenvalues of A.
The eigenvectors are then found by substituting each value of λ in turn and solving the system of linear equations that results. The equations will not be linearly independent, meaning there will be an infinite number of solutions and the eigenvector will be written in terms of a parameter. This is best seen in the following example.
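The whole procedure can also be carried out symbolically. The sketch below uses SymPy (an assumed choice of library) with an arbitrary example matrix: the determinant of A − λI gives the characteristic polynomial, its roots give the eigenvalues, and the null space of A − λI gives the eigenvectors.

```python
import sympy as sp

# Illustrative matrix; any square matrix works the same way.
A = sp.Matrix([[4, -2],
               [1,  1]])
lam = sp.symbols('lambda')

# det(A - lambda*I) = 0 yields the characteristic polynomial.
char_poly = sp.expand((A - lam * sp.eye(2)).det())  # lambda**2 - 5*lambda + 6
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)    # the eigenvalues 2 and 3

# For each eigenvalue, the eigenvectors span the null space of A - lambda*I,
# so each null-space basis vector stands for a one-parameter family.
for ev in eigenvalues:
    print(ev, (A - ev * sp.eye(2)).nullspace())
```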
Consider the 2×2 matrix:

A = ( 4  −2 )
    ( 1   1 )

The eigenvalues come from solving det(A − λI) = 0, so the characteristic polynomial comes from:

det ( 4 − λ    −2   ) = (4 − λ)(1 − λ) + 2 = λ² − 5λ + 6 = 0
    ( 1      1 − λ )

This has solutions of 2 and 3, so the eigenvalues are 2 and 3. To find the eigenvectors, first consider the eigenvalue equal to 2. Substituting λ = 2 into (A − λI)v = 0 produces the equations:

2x − 2y = 0
x − y = 0

Both are just the equation x = y, so the eigenvectors are:

v = (t, t)

for any t ≠ 0. Similarly, the eigenvalue of 3 produces the equation x − 2y = 0 (twice over), so the eigenvectors are of the form:

v = (2t, t)

for any t ≠ 0.
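The worked example can be verified numerically. The sketch below takes the matrix ((4, −2), (1, 1)), one 2×2 matrix with eigenvalues 2 and 3, and checks that every scalar multiple of an eigenvector is again an eigenvector:

```python
import numpy as np

# Example matrix with eigenvalues 2 and 3.
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))   # 2 and 3, up to floating-point round-off

# Any non-zero multiple of an eigenvector is again an eigenvector:
v = np.array([1.0, 1.0])      # eigenvector for lambda = 2 (parameter t = 1)
for t in (1.0, -2.5, 7.0):
    assert np.allclose(A @ (t * v), 2 * (t * v))
```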