Section 9.1 Eigenvalues and Eigenvectors
Dissect the action of a linear transformation \(\vec{x} \mapsto A\vec{x}\) into elements which are easily visualized.
Eigenvalues and eigenvectors have applications throughout pure and applied mathematics, and appear in settings far more general than we consider here, including differential equations and continuous dynamical systems, physics, and chemistry.
Consider a linear transformation in \(\mathbb{R}^2\text{.}\) In general, such a transformation will perform some kind of stretch, rotation, and/or reflection to an input vector. However, sometimes, there are particular vectors \(\vec{x}\) for which the transformation only scales the vector, say to \(\lambda \vec{x}\) (here, \(\lambda\) is the Greek letter lambda). In \(\mathbb{R}^2\text{,}\) this means that the output vector is parallel to the input. In matrix form, this means that,
\begin{equation*}
A\vec{x} = \lambda \vec{x}
\end{equation*}
These special vectors \(\vec{x}\) are called eigenvectors for the matrix. Of course, the zero vector trivially satisfies this condition, because \(A\vec{0} = \lambda \vec{0}\) for any matrix \(A\text{,}\) and for any scalar \(\lambda\text{.}\) Thus, we are concerned with non-zero vectors \(\vec{x}\) which satisfy this equation. The scalar \(\lambda\) is called an eigenvalue.
Subsection 9.1.1 Eigenvalues and Eigenvectors
Definition 9.1.1.
Let \(A\) be an \(n \times n\) matrix. Then, a non-zero vector \(\vec{x} \in \mathbb{R}^n\) is an eigenvector, associated with eigenvalue \(\lambda\text{,}\) if
\begin{equation*}
A\vec{x} = \lambda \vec{x}
\end{equation*}
The terms eigenvalue and eigenvector come from the German word Eigenwert, meaning “proper value”.
Again, in \(\mathbb{R}^2\) (or \(\mathbb{R}^3\)), an eigenvector of \(A\) is a vector which is parallel to itself after transformation by \(A\text{.}\) More precisely, if \(\lambda > 0\text{,}\) then \(A\vec{x}\) is parallel to \(\vec{x}\text{,}\) and if \(\lambda < 0\text{,}\) then \(A\vec{x}\) is anti-parallel to \(\vec{x}\text{.}\)
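The defining relation \(A\vec{x} = \lambda \vec{x}\) can be checked numerically. The following sketch uses an illustrative matrix and vector chosen for this example (not taken from the text):

```python
import numpy as np

# A stretches or shears most vectors, but merely scales this particular one.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

x = np.array([1.0, 1.0])   # candidate eigenvector
lam = 3.0                  # candidate eigenvalue

Ax = A @ x                 # A maps (1, 1) to (3, 3) = 3 * (1, 1)
print(np.allclose(Ax, lam * x))  # True: x is an eigenvector with eigenvalue 3
```

Note that the output \(A\vec{x} = (3, 3)\) points in the same direction as \(\vec{x} = (1, 1)\text{,}\) matching the geometric description above.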
Subsection 9.1.2 Determining Eigenvalues and Eigenvectors
Let \(A\) be an \(n \times n\) matrix, and consider the eigenvectors of \(A\text{.}\) Let \(\vec{x}\) be an eigenvector of \(A\text{,}\) with eigenvalue \(\lambda\text{.}\) Then, by definition, \(A\vec{x} = \lambda \vec{x}\text{.}\) Rearranging this equation,
\begin{equation*}
A\vec{x} - \lambda \vec{x} = \vec{0}
\end{equation*}
The two terms on the left-hand side can be combined by writing \(\lambda \vec{x}\) as \(\lambda I_n \vec{x}\) (where \(I_n\) is the \(n \times n\) identity matrix), and “factoring out” \(\vec{x}\text{,}\)
\begin{equation*}
(A - \lambda I_n) \vec{x} = \vec{0}
\end{equation*}
The expression \(A - \lambda I_n\) is a matrix, so this forms a homogeneous system of linear equations. Then, recall that a homogeneous system has a non-zero solution (here, for \(\vec{x}\)) if and only if the coefficient matrix \(A - \lambda I_n\) is not invertible, which occurs precisely when its determinant is equal to 0, or
\begin{equation*}
\det{(A - \lambda I_n)} = 0
\end{equation*}
We can reverse all of this reasoning. If \(\det{(A - \lambda I_n)} = 0\) for some particular choice of \(\lambda\text{,}\) then \((A - \lambda I_n) \vec{x} = \vec{0}\) has a non-zero solution, and so this scalar \(\lambda\) and solution \(\vec{x}\) are associated eigenvalue and eigenvector.
Thus, to determine eigenvalues, solve the equation \(\det{(A - \lambda I_n)} = 0\) for \(\lambda\text{.}\) Then, for each eigenvalue \(\lambda\text{,}\) to find an associated eigenvector, solve the homogeneous system \((A - \lambda I_n) \vec{x} = \vec{0}\) for a vector \(\vec{x}\text{.}\) In summary,
Theorem 9.1.2. Determining eigenvalues and eigenvectors.
Let \(A\) be an \(n \times n\) matrix. Then, \(\lambda \in \mathbb{R}\) is an eigenvalue of \(A\) if,
\begin{equation*}
\det{(A - \lambda I_n)} = 0
\end{equation*}
and, the eigenvectors associated with \(\lambda\) are the non-zero solutions for \(\vec{x}\) in the homogeneous system,
\begin{equation*}
(A - \lambda I_n) \vec{x} = \vec{0}
\end{equation*}
Note that the matrix \(A - \lambda I_n\) is just \(A\) with \(\lambda\) subtracted from each entry on the main diagonal.
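As a sketch of this two-step procedure, consider the illustrative matrix \(A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}\) (chosen for this example). Here \(\det{(A - \lambda I_2)} = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3\text{,}\) whose roots can be found numerically:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# det(A - lam I) = (2 - lam)^2 - 1 = lam^2 - 4 lam + 3
coeffs = [1.0, -4.0, 3.0]            # coefficients of the characteristic polynomial
eigenvalues = np.roots(coeffs)       # roots of the polynomial: 3 and 1
print(np.sort(eigenvalues))          # [1. 3.]

# np.linalg.eigvals solves the same problem directly from the matrix
print(np.sort(np.linalg.eigvals(A)))  # [1. 3.]
```

Both routes agree: the eigenvalues are \(\lambda = 1\) and \(\lambda = 3\text{.}\)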
Subsection 9.1.3 The Eigenspace
For an eigenvalue \(\lambda\text{,}\) the set of all solutions to the equation,
\begin{equation*}
(A - \lambda I_n) \vec{x} = \vec{0}
\end{equation*}
is just the null space of the matrix \(A - \lambda I_n\text{.}\) In particular, this set is a subspace of \(\mathbb{R}^n\text{,}\) and is called the eigenspace of \(A\) corresponding to \(\lambda\text{.}\)
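For instance, continuing with the illustrative matrix from before, the eigenspace for \(\lambda = 3\) is the null space of \(A - 3I_2\text{,}\) and is closed under scaling, as a subspace should be:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0

M = A - lam * np.eye(2)            # A - 3I = [[-1, 1], [1, -1]]

# Any multiple of (1, 1) solves M x = 0, so the eigenspace for
# lambda = 3 is span{(1, 1)}.
x = np.array([1.0, 1.0])
print(np.allclose(M @ x, 0))       # True
print(np.allclose(M @ (5 * x), 0)) # True: scalar multiples stay in the eigenspace
```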
Subsection 9.1.4 Computing Eigenvalues
Theorem 9.1.3.
The eigenvalues of a triangular matrix are the entries on its main diagonal.
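This can be checked on an example. For an upper-triangular matrix (chosen arbitrarily here for illustration), the computed eigenvalues match the diagonal entries:

```python
import numpy as np

# Upper-triangular: the eigenvalues should be the diagonal entries 4, -2, 7.
T = np.array([[4.0, 1.0, 3.0],
              [0.0, -2.0, 5.0],
              [0.0, 0.0, 7.0]])

print(np.sort(np.linalg.eigvals(T)))  # [-2.  4.  7.]
print(np.sort(np.diag(T)))            # [-2.  4.  7.]
```

The reason is that \(T - \lambda I_n\) is also triangular, so its determinant is the product of its diagonal entries \((t_{11} - \lambda) \cdots (t_{nn} - \lambda)\text{,}\) which is zero exactly when \(\lambda\) is a diagonal entry.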
Subsection 9.1.5 The Characteristic Equation
Recall that eigenvalues are solutions \(\lambda\) to the equation \(\det{(A - \lambda I_n)} = 0\text{.}\) Evaluating a determinant involves sums and products of the entries of the matrix, which here are expressions involving \(\lambda\text{.}\) In this way, the resulting equation is a polynomial equation in \(\lambda\text{.}\) In particular, for an \(n \times n\) matrix, it is a polynomial equation of degree \(n\text{.}\)
Definition 9.1.4.
Let \(A\) be an \(n \times n\) matrix. Then, the characteristic polynomial of \(A\text{,}\) is given by,
\begin{equation*}
p_A(\lambda) = \det{(A - \lambda I_n)}
\end{equation*}
Then, eigenvalues are precisely the roots of the characteristic polynomial, and the equation \(p_A(\lambda) = 0\) is called the characteristic equation.
Polynomial equations of low degree can be solved exactly; however, for degree 5 and higher there is no general formula for the roots, so for even moderately large \(n\) (say \(n = 5\)), the characteristic equation can't be solved exactly, in general. For these and larger systems, numerical root-finding methods are used to solve for eigenvalues approximately.
Also, for this reason, eigenvalues should not be computed by hand, except possibly for \(2 \times 2\) matrices, which lead to quadratic equations. Even \(3 \times 3\) matrices lead to a cubic equation, which in general is difficult to solve by hand, unless the entries are chosen carefully.
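The two viewpoints can be compared on a small illustrative matrix (chosen for this example). NumPy's `np.poly` returns the coefficients of the characteristic polynomial (using the sign convention \(\det{(\lambda I_n - A)}\text{,}\) which has the same roots), while `np.linalg.eigvals` computes the eigenvalues directly using iterative numerical methods rather than polynomial root-finding:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# np.poly gives the coefficients of det(lam I - A) = lam^2 + 3 lam + 2,
# which has the same roots as det(A - lam I) = 0.
coeffs = np.poly(A)                   # [1., 3., 2.]
print(np.sort(np.roots(coeffs)))      # [-2. -1.]

# In practice, libraries compute eigenvalues with iterative methods
# rather than by solving the characteristic equation:
print(np.sort(np.linalg.eigvals(A)))  # [-2. -1.]
```

For matrices of any real size, the second route is the practical one; forming the characteristic polynomial explicitly is numerically fragile.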