## Subsection 9.2.4 Properties of eigenvalues and eigenvectors

No video for this unit

This unit reminds us of various properties of eigenvalues and eigenvectors through a sequence of homeworks.

###### Homework 9.2.4.1.

Show that eigenvectors are not unique.

Solution

A vector $x$ is an eigenvector of $A$ if $x \neq 0$ and $A x = \lambda x$ for some scalar $\lambda \text{.}$ For any nonzero scalar $\alpha$ we find that

\begin{equation*} A (\alpha x ) =\alpha A x = \alpha \lambda x = \lambda ( \alpha x ). \end{equation*}

Hence any (nonzero) scalar multiple of $x$ is also an eigenvector. This again demonstrates that we care about the direction of an eigenvector rather than its length.
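The computation above is easy to check numerically. The sketch below (the matrix $A$ and the scalar $\alpha$ are our own example values, not from the text) verifies that $A ( \alpha x ) = \lambda ( \alpha x )$ for an eigenpair computed with NumPy.

```python
import numpy as np

# Example matrix (our choice); its eigenpairs are computed numerically.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, X = np.linalg.eig(A)   # columns of X are eigenvectors of A
x = X[:, 0]                 # an eigenvector associated with lam[0]
alpha = -3.7                # any nonzero scalar

# A (alpha x) = alpha A x = alpha lam x = lam (alpha x), up to rounding error:
assert np.allclose(A @ (alpha * x), lam[0] * (alpha * x))
print("alpha * x is an eigenvector for the same eigenvalue", lam[0])
```

The scalar changes the length (and possibly the sign) of the vector but not the line it spans, which is the point of the homework.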

###### Homework 9.2.4.2.

Let $\lambda$ be an eigenvalue of $A$ and let ${\cal E}_\lambda( A ) = \{ x \in \C^m \vert A x = \lambda x \} \text{.}$ This is the set of all eigenvectors of $A$ associated with $\lambda$ but also includes the zero vector. Show that ${\cal E}_\lambda( A )$ is a subspace.

Solution

A set ${\cal S} \subset \Cm$ is a subspace if and only if for all $\alpha \in \C$ and $x,y \in \Cm$ two conditions hold:

• $x \in {\cal S}$ implies that $\alpha x \in {\cal S} \text{.}$

• $x, y \in {\cal S}$ implies that $x + y \in {\cal S} \text{.}$

• $x \in {\cal E}_{\lambda}( A )$ implies $\alpha x \in {\cal E}_{\lambda}( A )\text{:}$

$x \in {\cal E}_{\lambda}(A)$ means that $A x = \lambda x \text{.}$ If $\alpha \in \C$ then $\alpha A x = \alpha \lambda x$ which, by commutativity and associativity, means that $A ( \alpha x ) = \lambda ( \alpha x ) \text{.}$ Hence $(\alpha x) \in {\cal E}_{\lambda}(A) \text{.}$

• $x,y \in {\cal E}_{\lambda}( A )$ implies $x+y \in {\cal E}_{\lambda}( A )\text{:}$

\begin{equation*} A( x + y ) = A x + A y = \lambda x + \lambda y = \lambda( x + y ) . \end{equation*}

While there are an infinite number of eigenvectors associated with an eigenvalue, the fact that they form a subspace (provided the zero vector is added) means that they can be described by a finite number of vectors, namely a basis for that space.
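Since ${\cal E}_\lambda( A )$ equals the null space of $\lambda I - A \text{,}$ such a basis can be computed numerically from the SVD. The sketch below (the matrix and the tolerance are our own example choices) does so for an eigenvalue of multiplicity two.

```python
import numpy as np

# Example matrix (our choice): lambda = 1 is an eigenvalue of multiplicity two.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 1.0
M = lam * np.eye(3) - A

# Null-space basis via the SVD: right singular vectors of M whose
# singular value is (numerically) zero span N(M) = E_lambda(A).
_, s, Vh = np.linalg.svd(M)
basis = Vh[np.abs(s) < 1e-12]

for v in basis:                          # every basis vector is an eigenvector
    assert np.allclose(A @ v, lam * v)
print("dim E_lambda(A) =", len(basis))   # prints 2 for this example
```

Every eigenvector associated with $\lambda$ is then a linear combination of the (here two) basis vectors.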

###### Homework 9.2.4.3.

Let $D \in \Cmxm$ be a diagonal matrix. Give all eigenvalues of $D \text{.}$ For each eigenvalue, give a convenient eigenvector.

Solution

Let

\begin{equation*} D = \left( \begin{array}{c c c c} \delta_0 \amp 0 \amp \cdots \amp 0 \\ 0 \amp \delta_1 \amp \cdots \amp 0 \\ \vdots \amp \vdots \amp \ddots \amp \vdots \\ 0 \amp 0 \amp \cdots \amp \delta_{m-1} \end{array} \right). \end{equation*}

Then

\begin{equation*} \lambda I - D = \left( \begin{array}{c c c c} \lambda - \delta_0 \amp 0 \amp \cdots \amp 0 \\ 0 \amp \lambda - \delta_1 \amp \cdots \amp 0 \\ \vdots \amp \vdots \amp \ddots \amp \vdots \\ 0 \amp 0 \amp \cdots \amp \lambda - \delta_{m-1} \end{array} \right) \end{equation*}

is singular if and only if $\lambda = \delta_i$ for some $i \in \{ 0, \ldots , m-1 \} \text{.}$ Hence $\Lambda( D ) = \{ \delta_0, \delta_1, \ldots, \delta_{m-1} \} \text{.}$

Now,

\begin{equation*} D e_j = \mbox{ the column of } D \mbox{ indexed with } j = \delta_j e_j \end{equation*}

and hence $e_j$ is an eigenvector associated with $\delta_j \text{.}$
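A quick numerical sanity check of both claims (the diagonal entries below are our own example values):

```python
import numpy as np

# Diagonal matrix with example entries delta_j (our choice).
delta = np.array([3.0, -1.0, 0.5])
D = np.diag(delta)

# D e_j is the column of D indexed with j, which equals delta_j e_j.
for j in range(3):
    e_j = np.zeros(3)
    e_j[j] = 1.0
    assert np.allclose(D @ e_j, delta[j] * e_j)

# The spectrum is exactly the set of diagonal entries.
assert np.allclose(np.sort(np.linalg.eigvals(D)), np.sort(delta))
print("Lambda(D) =", np.sort(np.linalg.eigvals(D)))
```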

###### Homework 9.2.4.4.

Let $U \in \Cmxm$ be an upper triangular matrix. Give all eigenvalues of $U \text{.}$ For each eigenvalue, give a convenient eigenvector.

Solution

Let

\begin{equation*} U = \left( \begin{array}{c c c c} \upsilon_{0,0} \amp \upsilon_{0,1} \amp \cdots \amp \upsilon_{0,m-1} \\ 0 \amp \upsilon_{1,1} \amp \cdots \amp \upsilon_{1,m-1} \\ \vdots \amp \vdots \amp \ddots \amp \vdots \\ 0 \amp 0 \amp \cdots \amp \upsilon_{m-1,m-1} \end{array} \right). \end{equation*}

Then

\begin{equation*} \lambda I - U = \left( \begin{array}{c c c c} \lambda - \upsilon_{0,0} \amp - \upsilon_{0,1} \amp \cdots \amp - \upsilon_{0,m-1} \\ 0 \amp \lambda - \upsilon_{1,1} \amp \cdots \amp - \upsilon_{1,m-1} \\ \vdots \amp \vdots \amp \ddots \amp \vdots \\ 0 \amp 0 \amp \cdots \amp \lambda - \upsilon_{m-1,m-1} \end{array} \right) \end{equation*}

is singular if and only if $\lambda = \upsilon_{i,i}$ for some $i \in \{ 0, \ldots , m-1 \} \text{.}$ Hence $\Lambda( U ) = \{ \upsilon_{0,0}, \upsilon_{1,1}, \ldots, \upsilon_{m-1,m-1} \} \text{.}$

Let $\lambda$ be an eigenvalue of $U \text{.}$ Things get a little tricky if $\lambda$ has multiplicity greater than one. Partition

\begin{equation*} U = \left( \begin{array}{c c c} U_{00} \amp u_{01} \amp U_{02} \\ 0 \amp \upsilon_{11} \amp u_{12}^T \\ 0 \amp 0 \amp U_{22} \end{array} \right) \end{equation*}

where $\upsilon_{11} = \lambda \text{.}$ We are looking for $x \neq 0$ such that $( \lambda I - U ) x = 0$ or, partitioning $x \text{,}$

\begin{equation*} \left( \begin{array}{c c c} \upsilon_{11} I - U_{00} \amp - u_{01} \amp - U_{02} \\ 0 \amp 0 \amp - u_{12}^T \\ 0 \amp 0 \amp \upsilon_{11} I - U_{22} \end{array} \right) \left( \begin{array}{c} x_0 \\ \chi_1 \\ x_2 \end{array} \right) = \left( \begin{array}{c} 0 \\ 0 \\ 0 \end{array} \right). \end{equation*}

If we choose $x_2 = 0$ and $\chi_1 = 1 \text{,}$ then the last two equations are satisfied and the first becomes

\begin{equation*} ( \upsilon_{11} I - U_{00} ) x_0 - u_{01} = 0 \end{equation*}

and hence $x_0$ must satisfy

\begin{equation*} ( \upsilon_{11} I - U_{00} ) x_0 = u_{01}. \end{equation*}

If $\upsilon_{11} I - U_{00}$ is nonsingular, then there is a unique solution to this equation, and

\begin{equation*} \left( \begin{array}{c} ( \upsilon_{11} I - U_{00} )^{-1} u_{01} \\ 1 \\ 0 \end{array} \right) \end{equation*}

is the desired eigenvector. However, this means that the partitioning

\begin{equation*} U = \left( \begin{array}{c c c} U_{00} \amp u_{01} \amp U_{02} \\ 0 \amp \upsilon_{11} \amp u_{12}^T \\ 0 \amp 0 \amp U_{22} \end{array} \right) \end{equation*}

must be such that $\upsilon_{11}$ is the first diagonal element that equals $\lambda \text{.}$ Then no diagonal entry of $U_{00}$ equals $\lambda \text{,}$ which guarantees that $\upsilon_{11} I - U_{00}$ is nonsingular.
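The construction can be sketched in code. The helper below (its name and the example matrix are our own, not from the text) takes the index $k$ of the first diagonal entry equal to the eigenvalue, solves the triangular system arising from the first block row of $( \lambda I - U ) x = 0 \text{,}$ and assembles the eigenvector.

```python
import numpy as np

def triangular_eigvec(U, k):
    """Eigenvector of upper triangular U for eigenvalue lam = U[k, k],
    where k indexes the FIRST diagonal entry equal to lam."""
    m = U.shape[0]
    lam = U[k, k]
    U00 = U[:k, :k]          # leading k x k block, no diagonal entry equals lam
    u01 = U[:k, k]
    x = np.zeros(m)
    # lam I - U00 is nonsingular because k is the first occurrence of lam.
    x[:k] = np.linalg.solve(lam * np.eye(k) - U00, u01)
    x[k] = 1.0               # chi_1 = 1; the trailing part x_2 stays zero
    return x

# Example (our choice): eigenvalue 5 first appears on the diagonal at k = 1.
U = np.array([[2.0, 1.0, 3.0],
              [0.0, 5.0, 2.0],
              [0.0, 0.0, 5.0]])
x = triangular_eigvec(U, 1)
assert np.allclose(U @ x, 5.0 * x)   # x is an eigenvector for lam = 5
print(x)
```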

###### Homework 9.2.4.5.

Let $\lambda, \mu \in \Lambda( A ) \text{,}$ $A x = \lambda x \text{,}$ and $A y = \mu y$ with $x \neq 0$ and $y \ne 0 \text{.}$

This should be moved to after the discussion of the Schur decomposition.