Section SD: Similarity and Diagonalization

From A First Course in Linear Algebra
Version 2.22
© 2004.
Licensed under the GNU Free Documentation License.
http://linear.ups.edu/

This page, which is an unofficial excerpt of Rob Beezer's Linear Algebra textbook,
was created by AIM to demonstrate the usefulness of knowls.

This section's topic will perhaps seem out of place at first, but we will make the connection soon with eigenvalues and eigenvectors. This is also our first look at one of the central ideas of chapter R.

Similar Matrices

The notion of matrices being ``similar'' is a lot like saying two matrices are row-equivalent. Two similar matrices are not equal, but they share many important properties. This section, and later sections in chapter R, will be devoted in part to discovering just what these common properties are.

First, the main definition for this section.

Definition (Similar Matrices) Suppose $A$ and $B$ are two square matrices of size $n$. Then $A$ and $B$ are similar if there exists a nonsingular matrix of size $n$, $S$, such that $A=S^{-1} B S$.

We will say ``$A$ is similar to $B$ via $S$'' when we want to emphasize the role of $S$ in the relationship between $A$ and $B$. Also, it doesn't matter if we say $A$ is similar to $B$, or $B$ is similar to $A$. If one statement is true then so is the other, as can be seen by using $S^{-1}$ in place of $S$ (see theorem SER for the careful proof). Finally, we will refer to $S^{-1} B S$ as a similarity transformation when we want to emphasize the way $S$ changes $B$. OK, enough about language, let's build a few examples.
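Before the examples, here is a small computational sketch of definition SIM, written in Python with NumPy. (The library and the helper name `is_similar_via` are our own choices for illustration, not part of the text.) Given square matrices $A$, $B$ and a candidate $S$, it simply tests whether $A=S^{-1} B S$.

```python
import numpy as np

def is_similar_via(A, B, S, tol=1e-9):
    """Test numerically whether A = S^{-1} B S for the given matrix S."""
    # S must be nonsingular for the similarity transformation to make sense.
    if abs(np.linalg.det(S)) < tol:
        raise ValueError("S must be nonsingular")
    return np.allclose(A, np.linalg.inv(S) @ B @ S, atol=tol)
```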

Example SMS5: Similar matrices of size 5. If you wondered whether there are examples of similar matrices, it won't be hard to convince you they exist. Define

\begin{align*} B=\begin{bmatrix} -4 & 1 & -3 & -2 & 2 \\ 1 & 2 & -1 & 3 & -2 \\ -4 & 1 & 3 & 2 & 2 \\ -3 & 4 & -2 & -1 & -3 \\ 3 & 1 & -1 & 1 & -4 \end{bmatrix} && S=\begin{bmatrix} 1 & 2 & -1 & 1 & 1 \\ 0 & 1 & -1 & -2 & -1 \\ 1 & 3 & -1 & 1 & 1 \\ -2 & -3 & 3 & 1 & -2 \\ 1 & 3 & -1 & 2 & 1\\ \end{bmatrix} \end{align*}

Check that $S$ is nonsingular and then compute

\begin{align*} A&=S^{-1} B S\\ &\\ &= \begin{bmatrix} 10 & 1 & 0 & 2 & -5 \\ -1 & 0 & 1 & 0 & 0 \\ 3 & 0 & 2 & 1 & -3 \\ 0 & 0 & -1 & 0 & 1 \\ -4 & -1 & 1 & -1 & 1 \end{bmatrix} \begin{bmatrix} -4 & 1 & -3 & -2 & 2 \\ 1 & 2 & -1 & 3 & -2 \\ -4 & 1 & 3 & 2 & 2 \\ -3 & 4 & -2 & -1 & -3 \\ 3 & 1 & -1 & 1 & -4 \end{bmatrix} \begin{bmatrix} 1 & 2 & -1 & 1 & 1 \\ 0 & 1 & -1 & -2 & -1 \\ 1 & 3 & -1 & 1 & 1 \\ -2 & -3 & 3 & 1 & -2 \\ 1 & 3 & -1 & 2 & 1 \end{bmatrix}\\ &\\ &= \begin{bmatrix} -10 & -27 & -29 & -80 & -25 \\ -2 & 6 & 6 & 10 & -2 \\ -3 & 11 & -9 & -14 & -9 \\ -1 & -13 & 0 & -10 & -1 \\ 11 & 35 & 6 & 49 & 19 \end{bmatrix} \end{align*}

So by this construction, we know that $A$ and $B$ are similar.
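If you would like to check this arithmetic by machine, the computation is easy to reproduce; the sketch below assumes NumPy (our choice of tool) and simply re-enters the matrices from this example.

```python
import numpy as np

B = np.array([[-4,  1, -3, -2,  2],
              [ 1,  2, -1,  3, -2],
              [-4,  1,  3,  2,  2],
              [-3,  4, -2, -1, -3],
              [ 3,  1, -1,  1, -4]], dtype=float)
S = np.array([[ 1,  2, -1,  1,  1],
              [ 0,  1, -1, -2, -1],
              [ 1,  3, -1,  1,  1],
              [-2, -3,  3,  1, -2],
              [ 1,  3, -1,  2,  1]], dtype=float)

# S is nonsingular, so A = S^{-1} B S is defined and similar to B.
A = np.linalg.inv(S) @ B @ S
print(np.round(A).astype(int))   # should reproduce the matrix A displayed above
```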

Let's do that again.

Example SMS3: Similar matrices of size 3.   Define

\begin{align*} B=\begin{bmatrix} -13 & -8 & -4 \\ 12 & 7 & 4 \\ 24 & 16 & 7 \end{bmatrix} && S=\begin{bmatrix} 1 & 1 & 2 \\ -2 & -1 & -3 \\ 1 & -2 & 0 \end{bmatrix} \end{align*}

Check that $S$ is nonsingular and then compute

\begin{align*} A&=S^{-1} B S\\ &= \begin{bmatrix} -6 & -4 & -1 \\ -3 & -2 & -1 \\ 5 & 3 & 1 \end{bmatrix} \begin{bmatrix} -13 & -8 & -4 \\ 12 & 7 & 4 \\ 24 & 16 & 7 \end{bmatrix} \begin{bmatrix} 1 & 1 & 2 \\ -2 & -1 & -3 \\ 1 & -2 & 0 \end{bmatrix}\\ &= \begin{bmatrix} -1 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & -1 \end{bmatrix} \end{align*}

So by this construction, we know that $A$ and $B$ are similar. But before we move on, look at how pleasing the form of $A$ is. Not convinced? Then consider that several computations related to $A$ are especially easy. For example, in the spirit of example DUTM, $det(A)=(-1)(3)(-1)=3$. Similarly, the characteristic polynomial is straightforward to compute by hand, $p_{A}(x)=(-1-x)(3-x)(-1-x)=-(x-3)(x+1)^2$ and since the result is already factored, the eigenvalues are transparently $\lambda=3,\,-1$. Finally, the eigenvectors of $A$ are just the standard unit vectors (definition SUV).
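All of these claims are easy to confirm with exact symbolic arithmetic; the sketch below uses Python's SymPy, which is our choice of tool rather than anything prescribed by the text.

```python
import sympy as sp

x = sp.symbols('x')
B = sp.Matrix([[-13, -8, -4],
               [ 12,  7,  4],
               [ 24, 16,  7]])
S = sp.Matrix([[ 1,  1,  2],
               [-2, -1, -3],
               [ 1, -2,  0]])

A = S.inv() * B * S              # the diagonal matrix diag(-1, 3, -1)
print(A)
print(A.det())                   # 3, the product of the diagonal entries
pA = (A - x * sp.eye(3)).det()   # characteristic polynomial det(A - x I)
print(sp.factor(pA))             # -(x - 3)*(x + 1)**2
print(A.eigenvals())             # eigenvalue 3 (multiplicity 1), eigenvalue -1 (multiplicity 2)
```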

Properties of Similar Matrices

Similar matrices share many properties and it is these theorems that justify the choice of the word ``similar.'' First we will show that similarity is an equivalence relation. Equivalence relations are important in the study of various algebras and can always be regarded as a kind of weak version of equality. Sort of alike, but not quite equal. The notion of two matrices being row-equivalent is an example of an equivalence relation we have been working with since the beginning of the course (see exercise RREF.T11). Row-equivalent matrices are not equal, but they are a lot alike. For example, row-equivalent matrices have the same rank. Formally, an equivalence relation requires that three conditions hold: it must be reflexive, symmetric and transitive. We will illustrate these as we prove that similarity is an equivalence relation.

Theorem (Similarity is an Equivalence Relation) Suppose $A$, $B$ and $C$ are square matrices of size $n$. Then

  1. $A$ is similar to $A$. (Reflexive)
  2. If $A$ is similar to $B$, then $B$ is similar to $A$. (Symmetric)
  3. If $A$ is similar to $B$ and $B$ is similar to $C$, then $A$ is similar to $C$. (Transitive)

Proof: To see that $A$ is similar to $A$, we need only demonstrate a nonsingular matrix that effects a similarity transformation of $A$ to $A$. $I_n$ is nonsingular (since it row-reduces to the identity matrix, theorem NMRRI), and \begin{equation*} I_n^{-1} A I_n=I_nAI_n=A \end{equation*}

If we assume that $A$ is similar to $B$, then we know there is a nonsingular matrix $S$ so that $A=S^{-1} B S$ by definition SIM. By theorem MIMI, $S^{-1}$ is invertible, and by theorem NI is therefore nonsingular. So

\begin{align*} (S^{-1})^{-1}{A}{(S^{-1})}&=SAS^{-1}&&\text{theorem MIMI}\\ &=SS^{-1} B SS^{-1}&&\text{definition SIM}\\ &=\left(SS^{-1}\right)B\left(SS^{-1}\right)&&\text{theorem MMA}\\ &=I_nBI_n&&\text{definition MI}\\ &=B&&\text{theorem MMIM} \end{align*}

and we see that $B$ is similar to $A$.

Assume that $A$ is similar to $B$, and $B$ is similar to $C$. This gives us the existence of two nonsingular matrices, $S$ and $R$, such that $A=S^{-1} B S$ and $B=R^{-1} C R$, by definition SIM. (Notice that we cannot assume the same matrix works in both cases; usually $S\neq R$.) Since $S$ and $R$ are invertible, $RS$ is also invertible by theorem SS, and therefore nonsingular by theorem NI. Now

\begin{align*} (RS)^{-1} C (RS)&=S^{-1}{R^{-1} C R}{S}&&\text{theorem SS}\\ &=S^{-1}{\left(R^{-1} C R\right)}{S}&&\text{theorem MMA}\\ &=S^{-1} B S&&\text{definition SIM}\\ &=A \end{align*}

so $A$ is similar to $C$ via the nonsingular matrix $RS$.
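The transitive step is easy to see in action numerically. In the sketch below (Python with NumPy; the particular matrices are arbitrary choices of ours) we build $B$ from $C$ via $R$, build $A$ from $B$ via $S$, and then confirm that the single matrix $RS$ carries $C$ to $A$.

```python
import numpy as np

# Arbitrary 2x2 illustration; C, R, S are our own choices, with R and S nonsingular.
C = np.array([[0.0, 1.0], [-2.0, 3.0]])
R = np.array([[1.0, 2.0], [3.0, 7.0]])   # det(R) = 1
S = np.array([[2.0, 1.0], [5.0, 3.0]])   # det(S) = 1

B = np.linalg.inv(R) @ C @ R             # B is similar to C via R
A = np.linalg.inv(S) @ B @ S             # A is similar to B via S

# As in the proof, A = (RS)^{-1} C (RS), so A is similar to C via RS.
print(np.allclose(A, np.linalg.inv(R @ S) @ C @ (R @ S)))   # True
```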

Here's another theorem that tells us exactly what sorts of properties similar matrices share.

Theorem (Similar Matrices have Equal Eigenvalues) Suppose $A$ and $B$ are similar matrices. Then the characteristic polynomials of $A$ and $B$ are equal, that is, $p_{A}(x)=p_{B}(x)$.

Proof: Let $n$ denote the size of $A$ and $B$. Since $A$ and $B$ are similar, there exists a nonsingular matrix $S$, such that $A=S^{-1} B S$ (definition SIM). Then

\begin{align*} p_{A}(x)&=det(A-xI_n)&&\text{definition CP}\\ &=det(S^{-1} B S-xI_n)&&\text{definition SIM}\\ &=det(S^{-1} B S-xS^{-1} I_n S)&&\text{theorem MMIM}\\ &=det(S^{-1} B S-S^{-1}xI_nS)&&\text{theorem MMSMM}\\ &=det(S^{-1} \left(B-xI_n\right) S)&&\text{theorem MMDAA}\\ &=det(S^{-1})det(B-xI_n)det(S)&&\text{theorem DRMM}\\ &=det(S^{-1})det(S)det(B-xI_n)&&\text{property CMCN}\\ &=det(S^{-1}S)det(B-xI_n)&&\text{theorem DRMM}\\ &=det(I_n)det(B-xI_n)&&\text{definition MI}\\ &=1\, det(B-xI_n)&&\text{definition DM}\\ &=p_{B}(x)&&\text{definition CP}\\ \end{align*}
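As a concrete check of the theorem, we can compare the characteristic polynomials of the matrices $A$ and $B$ from example SMS3; the sketch below uses SymPy (again, our choice of tool).

```python
import sympy as sp

x = sp.symbols('x')
B = sp.Matrix([[-13, -8, -4], [12, 7, 4], [24, 16, 7]])
S = sp.Matrix([[1, 1, 2], [-2, -1, -3], [1, -2, 0]])
A = S.inv() * B * S                         # similar to B by construction

pA = sp.expand((A - x * sp.eye(3)).det())   # characteristic polynomial of A
pB = sp.expand((B - x * sp.eye(3)).det())   # characteristic polynomial of B
print(pA)                                   # -x**3 + x**2 + 5*x + 3
print(sp.simplify(pA - pB) == 0)            # True: the polynomials are identical
```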

So similar matrices not only have the same set of eigenvalues, but the algebraic multiplicities of these eigenvalues are also the same. However, be careful with this theorem. It is tempting to think the converse is true, and to argue that if two matrices have the same eigenvalues, then they are similar. Not so, as the following example illustrates.

Example EENS: Equal eigenvalues, not similar.   Define

\begin{align*} A&=\begin{bmatrix}1&1\\0&1\end{bmatrix} & B&=\begin{bmatrix}1&0\\0&1\end{bmatrix} \end{align*}

and check that \begin{equation*} p_{A}(x)=p_{B}(x)=1-2x+x^2=(x-1)^2 \end{equation*} and so $A$ and $B$ have equal characteristic polynomials. If the converse of theorem SMEE were true, then $A$ and $B$ would be similar. Suppose this is the case. More precisely, suppose there is a nonsingular matrix $S$ so that $A=S^{-1} B S$. Then \begin{equation*} A=S^{-1} B S=S^{-1} I_2 S=S^{-1}S=I_2 \end{equation*} Clearly $A\neq I_2$ and this contradiction tells us that the converse of theorem SMEE is false.
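The contradiction can also be seen computationally. In the SymPy sketch below (our own illustration), the characteristic polynomials agree, yet any similarity transformation applied to $B=I_2$ returns $I_2$, never $A$; the particular matrix $S$ is just one nonsingular example.

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 1], [0, 1]])
B = sp.eye(2)

# Equal characteristic polynomials ...
print(sp.factor((A - x * sp.eye(2)).det()))   # (x - 1)**2
print(sp.factor((B - x * sp.eye(2)).det()))   # (x - 1)**2

# ... yet S^{-1} B S = S^{-1} I_2 S = I_2 for every nonsingular S, so it never equals A.
S = sp.Matrix([[3, 1], [5, 2]])               # det(S) = 1, an arbitrary nonsingular choice
print(S.inv() * B * S)                        # the identity matrix, not A
```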

The subsections on Diagonalization and the Fibonacci sequence have been omitted from this demonstration.