2.5 Eigenvalues and eigenvectors
Definition 2.16 Let A be an \(n \times n\) square matrix.
- an eigenvalue of A is a number \(\lambda\) such that \(A\mathbf{v} = \lambda \mathbf{v}\) for some nonzero vector \(\mathbf{v}\).
- a nonzero vector \(\mathbf{v}\) with this property is called a \(\lambda\)-eigenvector of A, or just an eigenvector of A.
If we think of \(\lambda\) as an unknown, then \(\det (A-\lambda I_n)\) is a polynomial of degree n in \(\lambda\). This is called the characteristic polynomial \(\chi_A(\lambda)\) of A, and the last lemma shows that the roots of \(\chi_A\), that is, the solutions of \(\det (A-\lambda I_n)=0\), are exactly the eigenvalues of A.
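This can be checked numerically. As a sketch (assuming numpy is available, and using a small matrix of my own choosing rather than one from the text), `numpy.poly` returns the coefficients of the monic polynomial whose roots are the eigenvalues of a matrix; up to an overall sign convention this is the characteristic polynomial, and its roots agree with the eigenvalues:

```python
import numpy as np

# Illustrative matrix (not from the text) with eigenvalues 2 and 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# Coefficients, highest degree first, of the monic polynomial whose
# roots are the eigenvalues of A: here lambda^2 - 5*lambda + 6.
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues.
roots = np.sort(np.roots(coeffs))
eigvals = np.sort(np.linalg.eigvals(A))
print(np.allclose(roots, eigvals))  # True
```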
Once we know that a certain number \(\lambda\) is an eigenvalue of A, we can find the eigenvectors corresponding to that eigenvalue by solving the matrix equation \[\begin{equation*} (A-\lambda I_n)\mathbf{x}= \mathbf{0} \end{equation*}\] by putting the augmented matrix into RRE form or otherwise.
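Numerically, the same step amounts to computing the null space of \(A-\lambda I_n\). A minimal sketch, assuming numpy and using the SVD rather than row reduction (the function name `eigenvectors_for` and the example matrix are my own):

```python
import numpy as np

def eigenvectors_for(A, lam, tol=1e-10):
    """Basis for the null space of A - lam*I, i.e. the lam-eigenvectors."""
    n = A.shape[0]
    M = A - lam * np.eye(n)
    # Rows of Vt corresponding to (numerically) zero singular values
    # span the null space of M.
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # columns form a basis of the null space

# Illustrative matrix (not from the text); 3 is one of its eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
V = eigenvectors_for(A, 3.0)
v = V[:, 0]
print(np.allclose(A @ v, 3.0 * v))  # True: v is a 3-eigenvector
```

Row reduction to RRE form and the SVD produce the same null space; the SVD is simply the more convenient tool in floating-point arithmetic.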
Example 2.9 The characteristic polynomial of \(A= \begin{pmatrix} 1&1\\1&1 \end{pmatrix}\) is \[\begin{equation*} \det (A-\lambda I_2)= (1-\lambda)(1-\lambda)-1=\lambda^2-2\lambda=\lambda(\lambda-2) \end{equation*}\] and so the eigenvalues of A are 0 and 2. We can find the 0-eigenvectors by looking for nonzero solutions of the matrix equation \[\begin{equation*} (A-0I_2) \mathbf{x} = \mathbf{0}, \end{equation*}\] that is, \(A \mathbf{x}= \mathbf{0}\). By putting this into RRE form or just solving directly, you will find that the 0-eigenvectors are the vectors \(\begin{pmatrix} a\\-a \end{pmatrix}\) for \(a \neq 0\).
To find the 2-eigenvectors we look for nonzero solutions of the matrix equation \[\begin{equation*} (A-2I_2) \mathbf{x}=\mathbf{0} \end{equation*}\] which is \[\begin{equation*} \begin{pmatrix} -1&1\\1&-1 \end{pmatrix} \begin{pmatrix} x\\y \end{pmatrix} = \mathbf{0}. \end{equation*}\] Again by putting the augmented matrix into RRE form, or otherwise, you’ll find that the 2-eigenvectors are the vectors \(\begin{pmatrix} b\\b \end{pmatrix}\) for \(b\neq 0\).
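The hand computation in Example 2.9 can be double-checked numerically. A sketch assuming numpy: `numpy.linalg.eig` returns the eigenvalues of A together with a matrix whose columns are corresponding eigenvectors, and each column should satisfy \(A\mathbf{v} = \lambda\mathbf{v}\):

```python
import numpy as np

# The matrix from Example 2.9.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# Columns of eigvecs are eigenvectors for the corresponding entries
# of eigvals; the order of the eigenvalues is not guaranteed.
eigvals, eigvecs = np.linalg.eig(A)

# Eigenvalues are 0 and 2, up to floating-point error.
print(np.allclose(np.sort(eigvals), [0.0, 2.0]))  # True

# Each returned column really is an eigenvector: A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

The returned eigenvectors are scaled to unit length, so they are nonzero multiples of \(\begin{pmatrix} 1\\-1 \end{pmatrix}\) and \(\begin{pmatrix} 1\\1 \end{pmatrix}\), matching the answers found by row reduction.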