# Lecture 24

## Chapter V Eigenvalues and Eigenvectors

### 5E Commuting Operators

#### Definition 5.71
* For $T,S\in\mathscr{L}(V)$, $T$ and $S$ commute if $ST=TS$.
* For $A,B\in \mathbb{F}^{n,n}$, $A$ and $B$ commute if $AB=BA$.
Examples:

* For $p,q\in \mathscr{P}(\mathbb{F})$ and $T\in\mathscr{L}(V)$, $p(T)$ and $q(T)$ commute.
* The partial derivatives $\frac{\partial}{\partial x},\frac{\partial}{\partial y}:\mathscr{P}_m(\mathbb{R}^2)\to \mathscr{P}_m(\mathbb{R}^2)$ commute: $\frac{\partial}{\partial y}\frac{\partial}{\partial x}=\frac{\partial}{\partial x}\frac{\partial}{\partial y}$.
* Diagonal matrices commute with each other (checked numerically in the sketch below).
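A quick numerical sanity check of the first and third examples, as a minimal sketch assuming NumPy is available; the specific matrices are arbitrary illustrative choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Third example: diagonal matrices commute with each other.
D1 = np.diag([1.0, 2.0, 3.0])
D2 = np.diag([4.0, 5.0, 6.0])
assert np.allclose(D1 @ D2, D2 @ D1)

# First example (matrix version): p(A) and q(A) commute for any square A,
# here with p(z) = z^2 + 2z and q(z) = 3z + 1.
A = rng.standard_normal((4, 4))
pA = A @ A + 2 * A
qA = 3 * A + np.eye(4)
assert np.allclose(pA @ qA, qA @ pA)

print("both commuting checks passed")
```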
#### Proposition 5.74

Given $S,T\in \mathscr{L}(V)$ and a basis of $V$, $S,T$ commute if and only if $M(S), M(T)$ (with respect to that basis) commute.
Proof: $ST=TS\iff M(ST)=M(TS)\iff M(S)M(T)=M(T)M(S)$.
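To tie 5.74 to the partial-derivative example above, here is a sketch (my own construction, not from the lecture): build the matrices of $\frac{\partial}{\partial x}$ and $\frac{\partial}{\partial y}$ on $\mathscr{P}_2(\mathbb{R}^2)$ with respect to the monomial basis $1,x,y,x^2,xy,y^2$, and check that the matrices commute.

```python
import numpy as np

# Monomial basis of P_2(R^2), in this order: 1, x, y, x^2, xy, y^2.
# Column j of each matrix holds the coordinates of the image of basis vector j.
Dx = np.zeros((6, 6))
Dx[0, 1] = 1   # d/dx (x)   = 1
Dx[1, 3] = 2   # d/dx (x^2) = 2x
Dx[2, 4] = 1   # d/dx (xy)  = y

Dy = np.zeros((6, 6))
Dy[0, 2] = 1   # d/dy (y)   = 1
Dy[1, 4] = 1   # d/dy (xy)  = x
Dy[2, 5] = 2   # d/dy (y^2) = 2y

# The operators commute, so by Proposition 5.74 their matrices commute.
assert np.allclose(Dx @ Dy, Dy @ Dx)
print("M(d/dx) and M(d/dy) commute")
```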
#### Proposition 5.75

Suppose $S,T\in \mathscr{L}(V)$ commute and $\lambda\in \mathbb{F}$. Then the eigenspace $E(\lambda, S)$ is invariant under $T$.
Proof:

Suppose $v\in E(\lambda, S)$. Then $S(Tv)=(ST)v=(TS)v=T(Sv)=T(\lambda v)=\lambda (Tv)$.

So $Tv$ is an eigenvector of $S$ with eigenvalue $\lambda$ (or $Tv=0$); in either case $Tv\in E(\lambda, S)$.
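A small numerical illustration of 5.75 (my own, with NumPy and arbitrary test matrices): get a commuting pair by taking polynomials in one matrix $A$, pick an eigenvector $v$ of $S$, and check that $Tv$ still lies in $E(\lambda,S)$, i.e. $S(Tv)=\lambda(Tv)$.

```python
import numpy as np

rng = np.random.default_rng(1)

# A convenient way to get a commuting pair: S and T are polynomials in the same A.
A = rng.standard_normal((5, 5))
S = A @ A + 2 * A
T = 3 * A + np.eye(5)
assert np.allclose(S @ T, T @ S)

# Take any eigenvalue/eigenvector pair of S (may be complex for a real A).
eigvals, eigvecs = np.linalg.eig(S)
lam, v = eigvals[0], eigvecs[:, 0]

# Tv is again in E(lambda, S): S(Tv) = lambda * (Tv).
Tv = T @ v
assert np.allclose(S @ Tv, lam * Tv)
print("T maps E(lambda, S) into itself")
```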
#### Theorem 5.76

Let $S,T\in \mathscr{L}(V)$ be diagonalizable operators. Then $S,T$ are **diagonalizable with respect to the same basis** (simultaneously diagonalizable) if and only if $S,T$ commute.
Proof:

$\Rightarrow$: If $S,T$ are diagonalizable with respect to the same basis, their matrices with respect to that basis are diagonal, and diagonal matrices commute, so $S,T$ commute (5.74).

$\Leftarrow$: Since $S$ is diagonalizable, $V=E(\lambda_1, S)\oplus\cdots\oplus E(\lambda_m,S)$, where $\lambda_1,\dots,\lambda_m$ are the (distinct) eigenvalues of $S$. Each $E(\lambda_k,S)$ is invariant under $T$ (5.75), so consider $T\vert_{E(\lambda_k,S)}$. By **Theorem 5.65** this operator is diagonalizable, because $T$ is diagonalizable, so we can choose a basis of $E(\lambda_k,S)$ with respect to which $T\vert_{E(\lambda_k,S)}$ has a diagonal matrix. Take the basis of $V$ given by concatenating these bases of the $E(\lambda_k,S)$: its elements are eigenvectors of both $S$ and $T$, so $S$ and $T$ are diagonalizable with respect to this basis.
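The $\Leftarrow$ direction is constructive, so here is a NumPy sketch of it (my own; the test matrices and tolerances are arbitrary): for each eigenvalue $\lambda_k$ of $S$, take an orthonormal basis of $E(\lambda_k,S)$, diagonalize $T$ restricted to that eigenspace, and concatenate the resulting common eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# A commuting, diagonalizable pair sharing a (hidden) eigenbasis P; S has a
# repeated eigenvalue, so the eigenspace step below is genuinely 2-dimensional.
P = rng.standard_normal((n, n))
S = P @ np.diag([2.0, 2.0, 5.0, 7.0]) @ np.linalg.inv(P)
T = P @ np.diag([1.0, 4.0, 4.0, 9.0]) @ np.linalg.inv(P)
assert np.allclose(S @ T, T @ S)

common_basis = []
for lam in [2.0, 5.0, 7.0]:                      # the distinct eigenvalues of S
    # Orthonormal basis B of E(lam, S) = null(S - lam*I), via the SVD.
    _, sv, Vh = np.linalg.svd(S - lam * np.eye(n))
    B = Vh[sv < 1e-8].conj().T                   # columns span the eigenspace

    # Matrix of T restricted to E(lam, S); this restriction makes sense because
    # the eigenspace is T-invariant (Proposition 5.75), and B is orthonormal.
    C = B.conj().T @ T @ B
    _, W = np.linalg.eig(C)                      # diagonalize T on the eigenspace
    common_basis.append(B @ W)                   # common eigenvectors of S and T

Q = np.hstack(common_basis)                      # a basis of V diagonalizing both
for M in (S, T):
    D = np.linalg.inv(Q) @ M @ Q
    assert np.allclose(D, np.diag(np.diag(D)))
print("S and T are diagonal with respect to the same basis")
```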
#### Proposition 5.78

Every pair of commuting operators on a finite-dimensional, nonzero, complex vector space has at least one common eigenvector.

Proof: $S$ has an eigenvalue $\lambda$, since every operator on a nonzero finite-dimensional complex vector space does. By (5.75) the eigenspace $E(\lambda,S)$ is invariant under $T$, so $T\vert_{E(\lambda,S)}$ has an eigenvector; that vector is an eigenvector of both $S$ and $T$.
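This proof is also constructive. Below is a NumPy sketch of it (my own; the helper name `common_eigenvector` and the test matrices are mine): pick an eigenvalue $\lambda$ of $S$, restrict $T$ to $E(\lambda,S)$, and take an eigenvector of that restriction.

```python
import numpy as np

def common_eigenvector(S, T, tol=1e-8):
    """Return a common eigenvector of two commuting square matrices (over C)."""
    n = S.shape[0]
    lam = np.linalg.eigvals(S)[0]                # any eigenvalue of S
    # Orthonormal basis B of E(lam, S) = null(S - lam*I), via the SVD.
    _, sv, Vh = np.linalg.svd(S - lam * np.eye(n))
    B = Vh[sv < tol * sv[0]].conj().T
    # T restricted to E(lam, S) (T-invariant by 5.75), then one of its eigenvectors.
    C = B.conj().T @ T @ B
    _, W = np.linalg.eig(C)
    return B @ W[:, 0]

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
S, T = A @ A - A, 2 * A + 3 * np.eye(5)          # a commuting pair for the demo
v = common_eigenvector(S, T)
for M in (S, T):
    Mv = M @ v
    # Mv is a scalar multiple of v, i.e. v is an eigenvector of M.
    assert np.allclose(Mv, (np.vdot(v, Mv) / np.vdot(v, v)) * v)
print("found a common eigenvector of S and T")
```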
#### Theorem 5.80

If $S,T$ are commuting operators on a finite-dimensional complex vector space, then there is a basis with respect to which both have upper-triangular matrices.
Proof:

Induction on $n=\dim V$.

$n=1$: clear.

$n>1$: use (5.78) to find $v_1$, a common eigenvector of $S$ and $T$. Decompose $V$ as $V=\operatorname{span}(v_1)\oplus W$ and define the projection $P:V\to W$, $P(av_1+w)=w$. Define $\hat{S}:W\to W$ by $\hat{S}(w)=P(S(w))$ and similarly $\hat{T}(w)=P(T(w))$; one checks that $\hat{S}$ and $\hat{T}$ commute. Now apply the inductive hypothesis to $\hat{S}$ and $\hat{T}$ to get a basis $v_2,\dots,v_n$ of $W$ with respect to which both are upper triangular; then (exercise) $S,T$ are upper triangular with respect to the basis $v_1,\dots,v_n$ of $V$.
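A NumPy sketch of one inductive step (my own construction, not from the lecture): given a common eigenvector $v_1$, complete it to an orthonormal basis; in that basis both matrices become block upper triangular, and the two lower-right blocks (the matrices of $\hat S$ and $\hat T$) still commute, so the recursion can continue. For the demo, $S$ and $T$ are polynomials in a single matrix $A$, so any eigenvector of $A$ stands in for the common eigenvector from (5.78).

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5

# Commuting pair for the demo: polynomials in one matrix A, so any eigenvector
# of A is a common eigenvector of S and T (standing in for 5.78).
A = rng.standard_normal((n, n)).astype(complex)
S = A @ A + A
T = 2 * A - 3 * np.eye(n)
v1 = np.linalg.eig(A)[1][:, 0]

# Complete v1 to an orthonormal basis: QR of [v1 | I] puts (a multiple of) v1
# in the first column of Q; the remaining columns play the role of W.
Q, _ = np.linalg.qr(np.column_stack([v1, np.eye(n)]))

for M in (S, T):
    Mb = Q.conj().T @ M @ Q          # matrix of M in the new basis
    # First column is (eigenvalue, 0, ..., 0): span(v1) is taken care of,
    # and the lower-right block is the matrix of the "hatted" operator on W.
    assert np.allclose(Mb[1:, 0], 0)

# The hatted operators commute, so the inductive hypothesis applies to them.
S_hat = (Q.conj().T @ S @ Q)[1:, 1:]
T_hat = (Q.conj().T @ T @ Q)[1:, 1:]
assert np.allclose(S_hat @ T_hat, T_hat @ S_hat)
print("inductive step verified: recurse on the (n-1) x (n-1) blocks")
```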
#### Theorem 5.81

For $V$ a finite-dimensional complex vector space and $S,T\in \mathscr{L}(V)$ commuting operators: every eigenvalue of $S+T$ is the sum of an eigenvalue of $S$ and an eigenvalue of $T$, and every eigenvalue of $S\cdot T$ is the product of an eigenvalue of $S$ and an eigenvalue of $T$.
Proof:

By (5.80), choose a basis with respect to which $S$ and $T$ both have upper-triangular matrices. The eigenvalues of an operator are the diagonal entries of its upper-triangular matrix, and for upper-triangular matrices
$$
\begin{pmatrix}
\lambda_1 & & *\\
 & \ddots & \\
0 & & \lambda_n
\end{pmatrix}+
\begin{pmatrix}
\mu_1 & & *\\
 & \ddots & \\
0 & & \mu_n
\end{pmatrix}=
\begin{pmatrix}
\lambda_1+\mu_1 & & *\\
 & \ddots & \\
0 & & \lambda_n+\mu_n
\end{pmatrix}
$$

Each eigenvalue of $S+T$ is therefore a diagonal entry $\lambda_k+\mu_k$ of the matrix on the right. Similarly, the product of two upper-triangular matrices is upper triangular with diagonal entries $\lambda_k\mu_k$, which gives the statement for $ST$.
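A final NumPy check of 5.81 (my own, with arbitrary test matrices): for a commuting pair, every eigenvalue of $S+T$ should be numerically close to some $\lambda+\mu$ with $\lambda$ an eigenvalue of $S$ and $\mu$ an eigenvalue of $T$, and likewise for $ST$ with products. (For non-commuting operators this can fail.)

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 6))
S, T = A @ A + A, 4 * A - 2 * np.eye(6)          # a commuting pair

lams = np.linalg.eigvals(S)
mus = np.linalg.eigvals(T)

def close_to_some(value, candidates, tol=1e-6):
    return np.min(np.abs(candidates - value)) < tol

# Every eigenvalue of S+T is some lambda + mu, and of S@T some lambda * mu.
sums = np.add.outer(lams, mus).ravel()
prods = np.multiply.outer(lams, mus).ravel()
assert all(close_to_some(e, sums) for e in np.linalg.eigvals(S + T))
assert all(close_to_some(e, prods) for e in np.linalg.eigvals(S @ T))
print("eigenvalues of S+T and ST match sums/products of eigenvalues of S and T")
```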