# Lecture 38
## Chapter VIII Operators on complex vector spaces
### Trace 8D
#### Definition 8.47
For a square matrix $A$, the **trace of** $A$, denoted $tr(A)$, is the sum of the diagonal entries of $A$.
#### Theorem 8.49
Suppose $A$ is an $m\times n$ matrix and $B$ is an $n\times m$ matrix. Then $tr(AB)=tr(BA)$.
Proof:
Direct computation: both sides equal $\sum_{j=1}^{m}\sum_{k=1}^{n}A_{j,k}B_{k,j}$.
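A quick numerical sanity check of this identity (illustrative only, not from lecture; the sizes $3\times 5$ are an arbitrary choice), using numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))   # A is m x n
B = rng.standard_normal((n, m))   # B is n x m

# tr(AB) is the trace of an m x m matrix, tr(BA) of an n x n matrix,
# yet the two numbers agree.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```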
#### Theorem 8.50
Suppose $T\in \mathscr{L}(V)$ and $u_1,...,u_n$ and $v_1,...,v_n$ are bases of $V$. Then
$$
tr(M(T,(u_1,...,u_n)))=tr(M(T,(v_1,...,v_n)))
$$
Proof:
Let $A=M(T,(u_1,...,u_n))$ and $B=M(T,(v_1,...,v_n))$. Then there exists an invertible matrix $C$ such that $A=CBC^{-1}$, so
$$
tr(A)=tr((CB)C^{-1})=tr(C^{-1}(CB))=tr(B)
$$
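A numerical check of this basis-invariance (illustrative only; $n=4$ and the random matrices are arbitrary choices), using numpy:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))          # generically invertible
A = C @ B @ np.linalg.inv(C)             # A = C B C^{-1}

# conjugation by C does not change the trace
assert np.isclose(np.trace(A), np.trace(B))
```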
#### Definition 8.51
Given $T\in \mathscr{L}(V)$, the **trace of** $T$, denoted $tr(T)$, is given by $tr(T)=tr(M(T))$ (well defined by Theorem 8.50, since the choice of basis does not matter).
Note: For an upper triangular matrix, the diagonal entries are the eigenvalues with multiplicity
#### Theorem 8.52
Suppose $V$ is a complex vector space and $T\in \mathscr{L}(V)$. Then $tr(T)$ is the sum of the eigenvalues of $T$, counted with multiplicity.
Proof:
Over $\mathbb{C}$, there is a basis in which $M(T)$ is upper triangular; by the note above, its diagonal entries are the eigenvalues counted with multiplicity, and their sum is $tr(T)$.
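A numerical check of Theorem 8.52 (illustrative only; a random $5\times 5$ matrix stands in for $M(T)$), using numpy:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))          # matrix of T in some basis

eigenvalues = np.linalg.eigvals(A)       # complex in general
assert np.isclose(np.trace(A), eigenvalues.sum())
```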
#### Theorem 8.54
Suppose $V$ is a complex vector space, $n=dim\ V$, and $T\in \mathscr{L}(V)$. Then the coefficient of $z^{n-1}$ in the characteristic polynomial of $T$ is $-tr(T)$.
Proof:
$(z-\lambda_1)\dots(z-\lambda_n)=z^{n}-(\lambda_1+...+\lambda_n)z^{n-1}+\dots$
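A numerical check of this coefficient (illustrative only; a random $4\times 4$ matrix is an arbitrary example), using numpy's characteristic-polynomial routine:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))

coeffs = np.poly(A)        # [1, c_{n-1}, ..., c_0] for z^n + c_{n-1} z^{n-1} + ...
assert np.isclose(coeffs[1], -np.trace(A))
```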
#### Theorem 8.56
Trace is linear, i.e. it is a linear functional on $\mathscr{L}(V)$.
Proof:
- Additivity
$tr(T+S)=tr(M(T+S))=tr(M(T)+M(S))=tr(M(T))+tr(M(S))=tr(T)+tr(S)$
- Homogeneity
$tr(cT)=tr(M(cT))=tr(cM(T))=c\,tr(M(T))=c\,tr(T)$
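A numerical check of both properties (illustrative only; the size and scalar are arbitrary choices), using numpy:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
S, T = rng.standard_normal((n, n)), rng.standard_normal((n, n))
c = 2.5

assert np.isclose(np.trace(T + S), np.trace(T) + np.trace(S))  # additivity
assert np.isclose(np.trace(c * T), c * np.trace(T))            # homogeneity
```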
#### Theorem/Example 8.10
Trace is the unique linear functional $\varphi:\mathscr{L}(V)\to \mathbb{F}$ such that $\varphi(ST)=\varphi(TS)$ and $\varphi(I)=dim\ V$.
Proof:
Let $\varphi:\mathscr{L}(V)\to \mathbb{F}$ be a linear functional such that $\varphi(ST)=\varphi(TS)$ and $\varphi(I)=n$, where $n=dim\ V$. Let $v_1,...,v_n$ be a basis of $V$ and define $P_{j,k}$ to be the operator whose matrix $M(P_{j,k})$ (with respect to this basis) has a $1$ in the $(j,k)$ entry and $0$ everywhere else. These satisfy $P_{j,k}P_{l,m}=P_{j,m}$ if $k=l$ and $P_{j,k}P_{l,m}=0$ if $k\neq l$. The $P_{j,k}$ form a basis of $\mathscr{L}(V)$, so by linearity it suffices to show $\varphi(P_{j,k})=tr(P_{j,k})=\begin{cases}1&\textup{if }j=k\\0&\textup{if }j \neq k\end{cases}$
- For $j\neq k$
$\varphi(P_{j,k})=\varphi(P_{j,j}P_{j,k})=\varphi(P_{j,k}P_{j,j})=\varphi(0)=0$
- For $j=k$
$\varphi(P_{j,j})=\varphi(P_{j,k}P_{k,j})=\varphi(P_{k,j}P_{j,k})=\varphi(P_{k,k})$ for every $k$, so all the $\varphi(P_{j,j})$ are equal.
Since $n=\varphi(I)=\varphi(P_{1,1}+...+P_{n,n})=\varphi(P_{1,1})+...+\varphi(P_{n,n})=n\,\varphi(P_{1,1})$, each $\varphi(P_{j,j})=1$.
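The product rules used above can be checked concretely (illustrative only; indices are 0-based here and $n=3$ is an arbitrary choice), using numpy:

```python
import numpy as np

n = 3
def P(j, k):
    # matrix with a 1 in entry (j, k) and 0 elsewhere
    M = np.zeros((n, n))
    M[j, k] = 1.0
    return M

j, k = 0, 2                                                  # a case with j != k
assert np.array_equal(P(j, j) @ P(j, k), P(j, k))            # P_{j,j} P_{j,k} = P_{j,k}
assert np.array_equal(P(j, k) @ P(j, j), np.zeros((n, n)))   # P_{j,k} P_{j,j} = 0
assert np.array_equal(P(j, k) @ P(k, j), P(j, j))            # P_{j,k} P_{k,j} = P_{j,j}
assert np.array_equal(P(k, j) @ P(j, k), P(k, k))            # P_{k,j} P_{j,k} = P_{k,k}
```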
#### Theorem 8.57
Suppose $V$ is a finite dimensional vector space. Then there do not exist $S,T\in \mathscr{L}(V)$ such that $ST-TS=I$. ($ST-TS$ is called the commutator of $S$ and $T$.)
Proof:
$tr(ST-TS)=tr(ST)-tr(TS)=tr(ST)-tr(ST)=0$, but $tr(I)=dim\ V\neq 0$, so $ST-TS\neq I$.
Note: **requires finite dimensional.**
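A numerical illustration of the key identity $tr(ST-TS)=0$ (illustrative only; random $4\times 4$ matrices are an arbitrary example), using numpy:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
S, T = rng.standard_normal((n, n)), rng.standard_normal((n, n))

# the commutator always has trace 0, so it can never equal I (whose trace is dim V)
assert np.isclose(np.trace(S @ T - T @ S), 0.0)
```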
## Chapter ? Multilinear Algebra and Determinants
### Determinants ?A
#### Definition ?.1
The determinant of $T\in \mathscr{L}(V)$ is the product of eigenvalues counted with multiplicity.
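A numerical check of this definition against the usual determinant (illustrative only; a random $4\times 4$ matrix stands in for $M(T)$), using numpy:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
A = rng.standard_normal((n, n))

# det equals the product of the eigenvalues, counted with multiplicity
assert np.isclose(np.linalg.det(A), np.prod(np.linalg.eigvals(A)))
```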
#### Definition ?.2
The determinant of a matrix is given by
$$
det(A)=\sum_{\sigma\in perm(n)}A_{\sigma(1),1}\cdot ...\cdot A_{\sigma(n),n}\cdot sign(\sigma)
$$
where $perm(n)$ is the set of all permutations (reorderings) of $1,...,n$, and $sign(\sigma)$ is $+1$ if $\sigma$ can be written with an even number of swaps and $-1$ if it requires an odd number.
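A direct implementation of this permutation-sum formula, compared against numpy's determinant (illustrative only; the inversion-counting `sign` helper and the $4\times 4$ example are my own choices, and the sum has $n!$ terms so this is only practical for small $n$):

```python
import itertools
import numpy as np

def sign(perm):
    # (-1)^(number of inversions), which equals the parity of swaps needed to write perm
    inversions = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
                     if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def det_by_permutations(A):
    n = A.shape[0]
    # sum over all permutations sigma of the product A[sigma(1),1] * ... * A[sigma(n),n]
    return sum(sign(sigma) * np.prod([A[sigma[i], i] for i in range(n)])
               for sigma in itertools.permutations(range(n)))

rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4))
assert np.isclose(det_by_permutations(A), np.linalg.det(A))
```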