# Lecture 22

## Chapter V Eigenvalues and Eigenvectors

### 5C Upper Triangular Matrices

#### Theorem 5.44

Suppose $V$ is finite dimensional and $T\in \mathscr{L}(V)$. Then $T$ has an upper triangular matrix with respect to some basis of $V$ if and only if the minimal polynomial of $T$ equals $(z-\lambda_1)\cdots(z-\lambda_m)$ for some $\lambda_1,\dots,\lambda_m\in \mathbb{F}$.

Proof:

$\implies$: easy.

$\impliedby$: Suppose the minimal polynomial of $T$ is $(z-\lambda_1)\cdots(z-\lambda_m)$. We do induction on $m$.

Base case: $m=1$. Then $T-\lambda_1 I=0$, so $T=\lambda_1 I$, and $\lambda_1 I$ has an upper triangular matrix (with respect to every basis).

Induction step: $m>1$. Suppose the result holds for smaller $m$. Let $U=\operatorname{range}(T-\lambda_m I)$. Then $U$ is invariant under $T$, so consider $T\vert_U$. Note that if $u\in U$, then $u=(T-\lambda_m I)v$ for some $v\in V$, so
$$
(T-\lambda_1 I)\cdots(T-\lambda_{m-1} I)u=(T-\lambda_1 I)\cdots(T-\lambda_m I)v=0.
$$
Thus the minimal polynomial of $T\vert_U$ divides $(z-\lambda_1)\cdots(z-\lambda_{m-1})$, so by the induction hypothesis $T\vert_U$ has an upper triangular matrix with respect to some basis of $U$.

Recall: $T$ is upper triangular with respect to a basis $v_1,\dots,v_n$ $\iff$ $Tv_k\in \operatorname{span}(v_1,\dots,v_k)$ for each $k$.

Let $u_1,\dots,u_r$ be a basis of $U$ such that $Tu_k\in \operatorname{span}(u_1,\dots,u_k)$ for each $k$ (such a basis exists because $T\vert_U$ has an upper triangular matrix). Extend to a basis $u_1,\dots,u_r,v_1,\dots,v_s$ of $V$. Then
$$
Tv_k=\bigl((T-\lambda_m I)+\lambda_m I\bigr)v_k=(T-\lambda_m I)v_k+\lambda_m v_k,
$$
and $(T-\lambda_m I)v_k\in U$ while $\lambda_m v_k\in \operatorname{span}(v_k)$, so $Tv_k\in \operatorname{span}(u_1,\dots,u_r,v_1,\dots,v_k)$.

Thus with respect to the basis $u_1,\dots,u_r,v_1,\dots,v_s$, $T$ has an upper triangular matrix:
$$
M(T)=\begin{pmatrix}
M(T\vert_U) & *\\
0 & \lambda_m \text{ on the diagonal}
\end{pmatrix}
$$

#### Corollary 5.47 (starting point for the Jordan Canonical Form)

Suppose $V$ is a finite-dimensional complex vector space and $T\in \mathscr{L}(V)$. Then $T$ has an upper triangular matrix with respect to some basis of $V$.

This follows from Theorem 5.44, because over $\mathbb{C}$ the minimal polynomial of $T$ factors into linear factors by the fundamental theorem of algebra.

Example: Suppose
$$
M(T)=\begin{pmatrix}
2&0&1\\
0&2&1\\
1&1&3
\end{pmatrix}
$$
and the minimal polynomial is $(z-2)(z-2)(z-3)$. Take $v_1=(1,-1,0)$, $v_2=(1,0,-1)$, $v_3=(-1,1,0)$. Then
$$
M(T,(v_1,v_2,v_3))=\begin{pmatrix}
2&1&0\\
0&2&0\\
0&0&3
\end{pmatrix},
$$
which is upper triangular.

### 5D Diagonalizable Operators

#### Definition 5.48

A **diagonal matrix** is a square matrix whose entries off the diagonal are all zero.

Example: $I$, $0$, $\begin{pmatrix}
2&0&0\\
0&2&0\\
0&0&3
\end{pmatrix}$

#### Definition 5.50

An operator $T\in\mathscr{L}(V)$ is **diagonalizable** if $T$ has a diagonal matrix $M(T)$ with respect to some basis of $V$.

Example: $T\in\mathscr{L}(\mathbb{F}^2)$ with
$$
M(T)=\begin{pmatrix}
3&-1\\
-1&3
\end{pmatrix}.
$$
Let $v_1=(1,-1)$ and $v_2=(1,1)$. Then $T(v_1)=(4,-4)=4v_1$ and $T(v_2)=(2,2)=2v_2$, so the eigenvalues are $2$ with eigenvector $v_2$ and $4$ with eigenvector $v_1$. The eigenvectors corresponding to $2$ are the vectors in $\operatorname{span}(v_2)\setminus\{0\}$. Since
$$
M(T,(v_1,v_2))=\begin{pmatrix}
4&0\\
0&2
\end{pmatrix},
$$
$T$ is diagonalizable.

#### Definition 5.52

Let $T\in \mathscr{L}(V)$ and $\lambda \in \mathbb{F}$. The **eigenspace** of $T$ corresponding to $\lambda$ is the subspace $E(\lambda, T)\subseteq V$ defined by
$$
E(\lambda, T)=\operatorname{null}(T-\lambda I)=\{ v\in V : Tv=\lambda v\}.
$$

Example (continuing the example above): $E(2,T)=\operatorname{span}(v_2)$, $E(4,T)=\operatorname{span}(v_1)$, $E(3,T)=\{0\}$.

#### Theorem 5.54

Suppose $T\in \mathscr{L}(V)$ and $\lambda_1,\dots,\lambda_m$ are distinct eigenvalues of $T$. Then
$$
E(\lambda_1, T)+\dots+E(\lambda_m,T)
$$
is a direct sum. In particular, if $V$ is finite dimensional, then
$$
\dim E(\lambda_1, T)+\dots+\dim E(\lambda_m,T)\leq \dim V.
$$

Proof: We need to show that if $v_k\in E(\lambda_k,T)$ for $k=1,\dots,m$ and $v_1+\dots+v_m=0$, then $v_k=0$ for each $k=1,\dots,m$; i.e., eigenvectors corresponding to distinct eigenvalues are linearly independent (Prop 5.11).
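
As a quick sanity check on the 5D example (not part of the lecture), here is a minimal numerical sketch in Python/NumPy. The matrix `A` and the vectors `v1`, `v2` are the ones from the example above; the change-of-basis step uses the standard fact that $M(T,(v_1,v_2))=P^{-1}AP$, where the columns of $P$ are $v_1,v_2$.

```python
import numpy as np

# Matrix of T with respect to the standard basis (from the example).
A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])

v1 = np.array([1.0, -1.0])   # claimed eigenvector for eigenvalue 4
v2 = np.array([1.0, 1.0])    # claimed eigenvector for eigenvalue 2

# Check T(v1) = 4 v1 and T(v2) = 2 v2.
assert np.allclose(A @ v1, 4 * v1)
assert np.allclose(A @ v2, 2 * v2)

# With respect to the basis (v1, v2), the matrix of T is diagonal:
# M(T, (v1, v2)) = P^{-1} A P, where the columns of P are v1, v2.
P = np.column_stack([v1, v2])
M = np.linalg.inv(P) @ A @ P
assert np.allclose(M, np.diag([4.0, 2.0]))

# Theorem 5.54 in this case: eigenvectors for the distinct eigenvalues
# 4 and 2 are linearly independent, so E(4,T) + E(2,T) is a direct sum
# and dim E(4,T) + dim E(2,T) <= dim V = 2.
assert np.linalg.matrix_rank(P) == 2

print("All checks passed.")
```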