
Elementary Linear Algebra: For University of Lethbridge Math 1410

Section 7.2 Properties of Eigenvalues and Eigenvectors

In this section we’ll explore how the eigenvalues and eigenvectors of a matrix relate to other properties of that matrix. This section is essentially a hodgepodge of interesting facts about eigenvalues; the goal here is not to memorize various facts about matrix algebra, but to again be amazed at the many connections between mathematical concepts.
We’ll begin our investigations with an example that will give a foundation for other discoveries.

Example 7.2.1. Eigenvalues of a triangular matrix.

Let \(\tta = \bbm 1\amp 2\amp 3\\0\amp 4\amp 5\\0\amp 0\amp 6\ebm\text{.}\) Find the eigenvalues of \(\tta\text{.}\)
Solution.
To find the eigenvalues, we compute \(\det{\tta-\lda\tti}\text{:}\)
\begin{align*} \det{\tta-\lda\tti} \amp = \bvm 1-\lda \amp 2\amp 3\\0\amp 4-\lda\amp 5\\0\amp 0\amp 6-\lda \evm\\ \amp = (1-\lda)(4-\lda)(6-\lda)\text{.} \end{align*}
Since our matrix is triangular, the determinant is easy to compute; it is just the product of the diagonal elements. Therefore, we found (and factored) our characteristic polynomial very easily, and we see that we have eigenvalues of \(\lda = 1, 4\text{,}\) and 6.
This example demonstrates a wonderful fact: the eigenvalues of a triangular matrix are simply the entries on the diagonal. Finding the corresponding eigenvectors still takes some work, but finding the eigenvalues is easy.
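For readers who want to check this fact on a computer, here is a quick numerical sketch using NumPy (a choice of ours; the text itself does not use software):

```python
import numpy as np

# The triangular matrix from Example 7.2.1; its eigenvalues
# should be exactly its diagonal entries 1, 4, and 6.
A = np.array([[1., 2., 3.],
              [0., 4., 5.],
              [0., 0., 6.]])

eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # [1. 4. 6.]
```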
With that fact in the backs of our minds, let us proceed to the next example where we will come across some more interesting facts about eigenvalues and eigenvectors.

Example 7.2.2. Exploring properties of eigenvalues.

Let \(\tta = \bbm -3 \amp 15 \\ 3 \amp 9 \ebm\) and let \(\ttb = \bbm -7 \amp -2 \amp 10 \\-3 \amp 2 \amp 3\\ -6 \amp -2 \amp 9\ebm\) (as used in Examples 7.1.6 and 7.1.12, respectively). Find the following:
  1. The eigenvalues and eigenvectors of \(\tta\) and \(\ttb\)
  2. The eigenvalues and eigenvectors of \(\ttai\) and \(\ttbi\)
  3. The eigenvalues and eigenvectors of \(\ttat\) and \(\ttbt\)
  4. The trace of \(\tta\) and \(\ttb\)
  5. The determinant of \(\tta\) and \(\ttb\)
Solution.
We’ll answer each in turn.
  1. We already know the answers to these, since we did this work in previous examples, so we simply list the results.
    For \(\tta\text{,}\) we have eigenvalues \(\lda = -6\) and \(12\text{,}\) with eigenvectors
    \begin{equation*} \vx = x_2\bbm -5\\1\ebm \ \text{ and }\ x_2\bbm 1\\1\ebm\text{, respectively}\text{.} \end{equation*}
    For \(\ttb\text{,}\) we have eigenvalues \(\lda = -1,\ 2,\) and \(3\) with eigenvectors
    \begin{equation*} \vx = x_3 \bbm 3\\1\\2\ebm,\ x_3 \bbm 2\\1\\2\ebm\ \text{ and }\ x_3 \bbm 1\\0\\1\ebm \text{, respectively}\text{.} \end{equation*}
  2. We first compute the inverses of \(\tta\) and \(\ttb\text{.}\) They are:
    \begin{equation*} \ttai = \bbm -1/8 \amp 5/24 \\ 1/24 \amp 1/24 \ebm \quad \text{and} \quad \ttbi = \bbm -4 \amp 1/3 \amp 13/3 \\ -3/2 \amp 1/2 \amp 3/2 \\ -3 \amp 1/3 \amp 10/3 \ebm\text{.} \end{equation*}
    Finding the eigenvalues and eigenvectors of these matrices is not terribly hard, but it is not “easy,” either. Therefore, we omit showing the intermediate steps and go right to the conclusions.
    For \(\ttai\text{,}\) we have eigenvalues \(\lda = -1/6\) and \(1/12\text{,}\) with eigenvectors
    \begin{equation*} \vx = x_2 \bbm -5\\1\ebm\ \text{ and }\ x_2 \bbm 1\\1\ebm\text{, respectively}\text{.} \end{equation*}
    For \(\ttbi\text{,}\) we have eigenvalues \(\lda = -1\text{,}\) \(1/2\) and \(1/3\) with eigenvectors
    \begin{equation*} \vx = x_3\bbm 3\\1\\2\ebm,\ x_3\bbm 2\\1\\2\ebm \ \text{ and }\ x_3 \bbm 1\\0\\1\ebm\text{, respectively}\text{.} \end{equation*}
  3. Of course, computing the transpose of \(\tta\) and \(\ttb\) is easy; computing their eigenvalues and eigenvectors takes more work. Again, we omit the intermediate steps.
    For \(\ttat\text{,}\) we have eigenvalues \(\lda = -6\) and \(12\) with eigenvectors
    \begin{equation*} \vx = x_2 \bbm -1\\1\ebm\ \text{ and }\ x_2 \bbm 1\\5\ebm \text{, respectively}\text{.} \end{equation*}
    For \(\ttbt\text{,}\) we have eigenvalues \(\lda = -1,\ 2\) and \(3\) with eigenvectors
    \begin{equation*} \vx = x_3 \bbm -1\\0\\1\ebm,\ x_3 \bbm -1\\1\\1\ebm\ \text{ and }\ x_3\bbm 0\\-2\\1\ebm\text{, respectively}\text{.} \end{equation*}
  4. The trace of \(\tta\) is 6; the trace of \(\ttb\) is 4.
  5. The determinant of \(\tta\) is \(-72\text{;}\) the determinant of \(\ttb\) is \(-6\text{.}\)
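For readers who wish to verify the "grunt work" above by machine, the following NumPy sketch (our own addition, not part of the text) reproduces the eigenvalues, traces, and determinants:

```python
import numpy as np

# The matrices A and B from Example 7.2.2.
A = np.array([[-3., 15.],
              [ 3.,  9.]])
B = np.array([[-7., -2., 10.],
              [-3.,  2.,  3.],
              [-6., -2.,  9.]])

print(np.sort(np.linalg.eigvals(A)))                 # [-6. 12.]
print(np.sort(np.linalg.eigvals(B)))                 # [-1.  2.  3.]
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))  # close to [-1/6, 1/12]
print(np.trace(A), np.trace(B))                      # 6.0 4.0
print(np.linalg.det(A), np.linalg.det(B))            # close to -72 and -6
```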
Now that we have completed the “grunt work,” let’s analyze the results of the previous example. We are looking for any patterns or relationships that we can find.

The eigenvalues and eigenvectors of \(\tta\) and \(\ttai\).

In our example, we found that the eigenvalues of \(\tta\) are \(-6\) and 12; the eigenvalues of \(\ttai\) are \(-1/6\) and \(1/12\text{.}\) Also, the eigenvalues of \(\ttb\) are \(-1\text{,}\) 2 and 3, whereas the eigenvalues of \(\ttbi\) are \(-1\text{,}\) \(1/2\) and \(1/3\text{.}\) There is an obvious relationship here; it seems that if \(\lda\) is an eigenvalue of \(\tta\text{,}\) then \(1/\lda\) will be an eigenvalue of \(\ttai\text{.}\) We can also note that the corresponding eigenvectors matched, too.
Why is this the case? Consider an invertible matrix \(\tta\) with eigenvalue \(\lda\neq 0\) and eigenvector \(\vx\text{.}\) Then, by definition, we know that \(\tta\vx = \lda\vx\text{.}\) Now multiply both sides by \(\ttai\text{:}\)
\begin{align*} \tta\vx \amp = \lda\vx\\ \ttai\tta\vx \amp = \ttai\lda\vx\\ \vx \amp = \lda\ttai\vx\\ \frac{1}{\lda}\vx \amp = \ttai\vx \text{.} \end{align*}
We have just shown that \(\ttai\vx = \frac{1}{\lda}\vx\text{;}\) this, by definition, shows that \(\vx\) is an eigenvector of \(\ttai\) with eigenvalue \(1/\lda\text{.}\) This explains the result we saw above. Of course, this all falls apart if \(\lambda = 0\text{,}\) but this is impossible for an invertible matrix: see Theorem 7.2.5 below.
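This little argument is also easy to confirm numerically. A quick NumPy check (the library choice is ours, not the text's), using the eigenpair \(\lda=-6\text{,}\) \(\vx = (-5,1)\) of \(\tta\) from the example above:

```python
import numpy as np

A = np.array([[-3., 15.],
              [ 3.,  9.]])
Ainv = np.linalg.inv(A)

# x = (-5, 1) is an eigenvector of A with eigenvalue -6;
# the argument above predicts that A^{-1} x = (1/-6) x.
x = np.array([-5., 1.])
assert np.allclose(A @ x, -6 * x)
assert np.allclose(Ainv @ x, (1 / -6) * x)
```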

The eigenvalues and eigenvectors of \(\tta\) and \(\ttat\).

Our example showed that \(\tta\) and \(\ttat\) had the same eigenvalues but different (but somehow similar) eigenvectors; it also showed that \(\ttb\) and \(\ttbt\) had the same eigenvalues but unrelated eigenvectors. Why is this?
We can answer the eigenvalue question relatively easily; it follows from the properties of the determinant and the transpose. Recall the following two facts:
  1. \((\tta +\ttb)^T = \ttat + \ttbt\) (Theorem 6.1.9)
  2. \(\det{\tta} = \det{\ttat}\) (Theorem 6.4.12)
We find the eigenvalues of a matrix by computing the characteristic polynomial; that is, we find \(\det{\tta - \lda\tti}\text{.}\) What is the characteristic polynomial of \(\ttat\text{?}\) Consider:
\begin{align*} \det{\ttat - \lda\tti} \amp = \det{\ttat - \lda\tti^T} \quad \text{since } \tti = \tti^T\\ \amp = \det{(\tta - \lda\tti)^T} \quad \text{(Theorem 6.1.9)}\\ \amp = \det{\tta-\lda\tti} \quad \text{(Theorem 6.4.12)}\text{.} \end{align*}
So we see that the characteristic polynomial of \(\ttat\) is the same as that for \(\tta\text{.}\) Therefore they have the same eigenvalues.
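We can watch this happen numerically: NumPy's `np.poly`, given a square matrix, returns the coefficients of its characteristic polynomial, so we can compare the polynomials of \(\tta\) and \(\ttat\) directly (a sketch of ours, not part of the text):

```python
import numpy as np

A = np.array([[-3., 15.],
              [ 3.,  9.]])

# np.poly(M) returns the characteristic polynomial coefficients of a
# square matrix M, highest degree first.
print(np.poly(A))    # coefficients of x^2 - 6x - 72
print(np.poly(A.T))  # the same coefficients
```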
What about their respective eigenvectors? Is there any relationship? The simple answer is “No.”

The eigenvalues and eigenvectors of \(\tta\) and the trace.

Note that the eigenvalues of \(\tta\) are \(-6\) and 12, and the trace is 6; the eigenvalues of \(\ttb\) are \(-1\text{,}\) 2 and 3, and the trace of \(\ttb\) is 4. Do we notice any relationship?
It seems that the sum of the eigenvalues is the trace! Why is this the case?
The full answer is a bit outside the scope of this text; we can justify part of this fact, and another part we will simply state as true without justification.
Suppose \(\tta\) is an \(n\times n\) matrix with no complex eigenvalues (that is, its characteristic polynomial can be completely factored over the real numbers). When this is the case, it turns out that we can find an invertible matrix \(\ttp\) such that \(\ttpi\tta\ttp\) is an upper triangular matrix with the eigenvalues of \(\tta\) on the diagonal. (We unfortunately do not have the time to prove this, or to explain how the matrix \(\ttp\) is determined.)
Now, recall from Theorem 6.2.3 that \(\tr(\tta\ttb) = \tr(\ttb\tta)\text{.}\) Since \(\ttpi\tta\ttp\) is upper triangular with the eigenvalues on its diagonal, \(\tr(\ttpi\tta\ttp)\) is the sum of the eigenvalues; also, by Theorem 6.2.3, \(\tr(\ttpi\tta\ttp) = \tr(\ttp\ttpi\tta) = \tr(\tta)\text{.}\) Thus the trace of \(\tta\) is the sum of the eigenvalues. It turns out that this result remains true when \(\tta\) has complex eigenvalues; the proof is similar, except that the entries of the matrix \(\ttp\) will be complex numbers.
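The claim that the eigenvalues sum to the trace can be spot-checked numerically for \(\ttb\) (a NumPy sketch of ours, not part of the text's development):

```python
import numpy as np

B = np.array([[-7., -2., 10.],
              [-3.,  2.,  3.],
              [-6., -2.,  9.]])

# The eigenvalues of B are -1, 2, 3; their sum equals the trace.
print(np.linalg.eigvals(B).sum())  # close to 4
print(np.trace(B))                 # 4.0
```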

The eigenvalues and eigenvectors of \(\tta\) and the determinant.

Again, the eigenvalues of \(\tta\) are \(-6\) and 12, and the determinant of \(\tta\) is \(-72\text{.}\) The eigenvalues of \(\ttb\) are \(-1\text{,}\) 2 and 3; the determinant of \(\ttb\) is \(-6\text{.}\) It seems as though the product of the eigenvalues is the determinant.
This is indeed true; we defend it with the argument from above. We know that the determinant of a triangular matrix is the product of its diagonal elements. Given a matrix \(\tta\text{,}\) we can find \(\ttp\) such that \(\ttpi\tta\ttp\) is upper triangular with the eigenvalues of \(\tta\) on the diagonal, so \(\det{\ttpi\tta\ttp}\) is the product of the eigenvalues. Using Theorem 6.4.12, we know that \(\det{\ttpi\tta\ttp} = \det{\ttpi}\det{\tta}\det{\ttp} = \det{\ttpi\ttp}\det{\tta} = \det{\tta}\text{.}\) Thus the determinant of \(\tta\) is the product of the eigenvalues.
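Again, a quick numerical spot check (our own NumPy sketch), this time with \(\tta\text{,}\) whose eigenvalues are \(-6\) and 12:

```python
import numpy as np

A = np.array([[-3., 15.],
              [ 3.,  9.]])

# The product of the eigenvalues (-6)(12) = -72 equals det(A).
print(np.prod(np.linalg.eigvals(A)))  # close to -72
print(np.linalg.det(A))               # close to -72
```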
We summarize the results of our example with the following theorem.

Theorem 7.2.3. Properties of eigenvalues.

Let \(\tta\) be an \(n\times n\) invertible matrix. The following are true.
  1. If \(\tta\) is triangular, then the diagonal elements of \(\tta\) are the eigenvalues of \(\tta\text{.}\)
  2. If \(\lda\) is an eigenvalue of \(\tta\) with eigenvector \(\vx\text{,}\) then \(\frac{1}{\lda}\) is an eigenvalue of \(\ttai\) with eigenvector \(\vx\text{.}\)
  3. If \(\lda\) is an eigenvalue of \(\tta\text{,}\) then \(\lda\) is an eigenvalue of \(\ttat\text{.}\)
  4. The sum of the eigenvalues of \(\tta\) is equal to \(\tr(\tta)\text{,}\) the trace of \(\tta\text{.}\)
  5. The product of the eigenvalues of \(\tta\) is equal to \(\det{\tta}\text{,}\) the determinant of \(\tta\text{.}\)
There is one more concept concerning eigenvalues and eigenvectors that we will explore. We do so in the context of an example.

Example 7.2.4. Eigenvalues of a non-invertible matrix.

Find the eigenvalues and eigenvectors of the matrix \(\tta = \bbm 1\amp 2\\1\amp 2\ebm\text{.}\)
Solution.
To find the eigenvalues, we compute \(\det{\tta-\lda\tti}\text{:}\)
\begin{align*} \det{\tta-\lda\tti} \amp = \bvm 1-\lda \amp 2\\1\amp 2-\lda \evm\\ \amp = (1-\lda)(2-\lda)-2\\ \amp = \lda^2-3\lda\\ \amp = \lda(\lda-3)\text{.} \end{align*}
Our eigenvalues are therefore \(\lda = 0, 3\text{.}\)
For \(\lda = 0\text{,}\) we find the eigenvectors:
\begin{equation*} \bbm 1 \amp 2 \amp 0\\1\amp 2\amp 0\ebm \quad \overrightarrow{\text{rref}}\quad \bbm 1\amp 2\amp 0\\0\amp 0\amp 0\ebm\text{.} \end{equation*}
This shows that \(x_1 = -2x_2\text{,}\) and so our eigenvectors \(\vx\) are
\begin{equation*} \vx = x_2\bbm-2\\1\ebm\text{.} \end{equation*}
For \(\lda = 3\text{,}\) we find the eigenvectors:
\begin{equation*} \bbm-2 \amp 2 \amp 0\\ 1\amp -1\amp 0 \ebm \quad \overrightarrow{\text{rref}}\quad \bbm 1\amp -1\amp 0\\0\amp 0\amp 0\ebm\text{.} \end{equation*}
This shows that \(x_1 = x_2\text{,}\) and so our eigenvectors \(\vx\) are
\begin{equation*} \vx = x_2\bbm 1\\1\ebm\text{.} \end{equation*}
One interesting thing about the above example is that we see that 0 is an eigenvalue of \(\tta\text{;}\) we have not officially encountered this before. Does this mean anything significant?
Think about what an eigenvalue of 0 means: there exists a nonzero vector \(\vx\) such that \(\tta\vx = 0\vx = \zero\text{.}\) That is, we have a nontrivial solution to \(\tta\vx = \zero\text{.}\) We know this happens only when \(\tta\) is not invertible.
So if \(\tta\) is invertible, there is no nontrivial solution to \(\tta\vx=\zero\text{,}\) and hence 0 is not an eigenvalue of \(\tta\text{.}\) If \(\tta\) is not invertible, then there is a nontrivial solution to \(\tta\vx=\zero\text{,}\) and hence 0 is an eigenvalue of \(\tta\text{.}\) This leads us to our final addition to the Invertible Matrix Theorem.
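This connection is easy to see numerically for the matrix of Example 7.2.4 (a NumPy sketch of ours, not part of the text):

```python
import numpy as np

# The matrix from Example 7.2.4; its rows are identical, so it is
# not invertible, and correspondingly 0 is one of its eigenvalues.
A = np.array([[1., 2.],
              [1., 2.]])

print(np.linalg.det(A))               # 0.0
print(np.sort(np.linalg.eigvals(A)))  # close to [0, 3]
```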
This section is about the properties of eigenvalues and eigenvectors. Of course, we have not investigated all of the numerous properties of eigenvalues and eigenvectors; we have just surveyed some of the most common (and most important) concepts. One of the more important topics — which we will describe briefly in our final section — is the question of diagonalization. For some \(n\times n\) matrices \(A\text{,}\) it is possible to find an invertible matrix \(P\) such that \(P^{-1}AP\) is a diagonal matrix, and the diagonal entries of this matrix are precisely the eigenvalues of \(A\text{.}\) When is this possible? When is it not? To a large extent, this comes down to the question of multiplicity: when we have a repeated eigenvalue, how many independent eigenvectors are associated with it?
Finally, we have found the eigenvalues of matrices by finding the roots of the characteristic polynomial. We have limited our examples to quadratic and cubic polynomials; one might expect that for larger matrices, a computer would be used to factor the characteristic polynomial. However, in general, this is not how eigenvalues are found. Factoring high-order polynomials is too unreliable, even with a computer; round-off errors can cause unpredictable results. Also, even computing the characteristic polynomial requires computing a determinant, which is expensive (as discussed in the previous chapter).
So how are eigenvalues found? There are iterative processes that can progressively transform a matrix \(\tta\) into another matrix that is almost an upper triangular matrix (the entries below the diagonal are almost zero) where the entries on the diagonal are the eigenvalues. The more iterations one performs, the better the approximation is.
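The best-known such process is the QR iteration; practical implementations add shifts and deflation, but the bare idea can be sketched in a few lines of NumPy (our own illustration, under the assumption of real, distinct eigenvalue magnitudes, where the unshifted iteration converges):

```python
import numpy as np

def qr_iteration(A, iterations=100):
    # Repeatedly factor A_k = QR and set A_{k+1} = RQ.  Since
    # RQ = Q^T (QR) Q, each iterate is similar to A and so has the
    # same eigenvalues; for many matrices the iterates approach an
    # upper triangular matrix with the eigenvalues on the diagonal.
    Ak = np.array(A, dtype=float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return Ak

A = np.array([[-3., 15.],
              [ 3.,  9.]])
T = qr_iteration(A)
print(np.sort(np.diag(T)))  # close to [-6, 12], the eigenvalues of A
```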
These methods are so fast and reliable that some computer programs convert polynomial root finding problems into eigenvalue problems!
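NumPy's own `np.roots` works exactly this way: it builds the companion matrix of the polynomial, whose characteristic polynomial is the given polynomial, and returns that matrix's eigenvalues. A small sketch (ours), using the characteristic polynomial \(\lda^2 - 6\lda - 72\) of \(\tta\) from earlier in the section:

```python
import numpy as np

# Companion matrix of x^2 - 6x - 72: its characteristic polynomial
# is that same polynomial, so its eigenvalues are the roots -6 and 12.
C = np.array([[6., 72.],
              [1.,  0.]])

print(np.sort(np.linalg.eigvals(C)))    # close to [-6, 12]
print(np.sort(np.roots([1, -6, -72])))  # the same roots, via np.roots
```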
Most textbooks on linear algebra provide direction on exploring the above topics and give further insight into what is going on. We have mentioned all the eigenvalue and eigenvector properties in this section for the same reasons we gave in the previous section. First, knowing these properties helps us solve numerous real-world problems, and second, it is fascinating to see how rich and deep the theory of matrices is.

Exercises

Exercise Group.

A matrix \(\tta\) is given.
  1. Find the eigenvalues of \(\tta\text{,}\) and for each eigenvalue, find an eigenvector.
  2. Do the same for \(\ttat\text{.}\)
  3. Do the same for \(\ttai\text{.}\)
  4. Find \(\tr(\tta)\text{.}\)
  5. Find \(\det{\tta}\text{.}\)
Use Theorem 7.2.3 to verify your results.
1.
\(\bbm 0 \amp 4\\-1 \amp 5\ebm\)
2.
\(\bbm-2 \amp -14\\-1 \amp 3\ebm\)
3.
\(\bbm 5 \amp 30\\-1 \amp -6\ebm\)
4.
\(\bbm-4 \amp 72\\-1 \amp 13\ebm\)
5.
\(\bbm 5 \amp -9 \amp 0\\1\amp -5\amp 0\\2\amp 4\amp 3\ebm\)
6.
\(\bbm 0 \amp 25 \amp 0\\1\amp 0\amp 0\\1\amp 1\amp -3\ebm\)