# Lecture 11
## Chapter III Linear maps
**Assumption: $U,V,W$ are vector spaces (over $\mathbb{F}$)**
### Matrices 3C
#### Definition 3.31
Suppose $T\in \mathscr{L}(V,W)$, $v_1,...,v_n$ is a basis of $V$, and $w_1,...,w_m$ is a basis of $W$. Then $M(T)=M(T,(v_1,...,v_n),(w_1,...,w_m))$ is the $m\times n$ matrix $A$ whose entries are defined by
$$
Tv_k=A_{1,k}w_1+...+A_{m,k}w_m
$$
$$
\begin{array}{cc}
 & \begin{matrix} v_1 & v_2 & \cdots & v_n \end{matrix} \\
\begin{matrix} w_1 \\ w_2 \\ \vdots \\ w_m \end{matrix} &
\begin{pmatrix}
A_{1,1} & A_{1,2} & \cdots & A_{1,n}\\
A_{2,1} & A_{2,2} & \cdots & A_{2,n}\\
\vdots & \vdots & \ddots & \vdots\\
A_{m,1} & A_{m,2} & \cdots & A_{m,n}
\end{pmatrix}
\end{array}
$$
Examples:
* $T:\mathbb{F}^2\to \mathbb{F}^3$
$T(x,y)=(x+3y,2x+5y,7x+9y)$
$M(T)=\begin{pmatrix}
1&3\\
2&5\\
7&9
\end{pmatrix}$
* Let $D:\mathscr{P}_3(\mathbb{F})\to \mathscr{P}_2(\mathbb{F})$ be the differentiation map $Dp=p'$. With respect to the standard bases $1,x,x^2,x^3$ and $1,x,x^2$,
$M(D)=\begin{pmatrix}
0&1&0&0\\
0&0&2&0\\
0&0&0&3\\
\end{pmatrix}$
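
A small NumPy sketch (not from the lecture; the helper `D` and the coefficient-list representation of polynomials are my own choices) that builds $M(D)$ column by column, exactly as Definition 3.31 prescribes: apply the map to each basis vector and record the coordinates of the image.

```python
import numpy as np

# Sketch: M(D) for differentiation P_3(F) -> P_2(F) with the standard
# bases 1, x, x^2, x^3 and 1, x, x^2. A polynomial is stored as the
# coefficient list [a_0, a_1, ...].

def D(coeffs):
    """Differentiate a polynomial given by its coefficient list."""
    return [k * coeffs[k] for k in range(1, len(coeffs))]

n, m = 4, 3                      # dim P_3 = 4, dim P_2 = 3
M_D = np.zeros((m, n))
for k in range(n):
    e_k = [0] * n
    e_k[k] = 1                   # k-th basis vector x^k
    image = D(e_k)               # D(x^k) = k x^(k-1)
    M_D[:len(image), k] = image  # its coordinates fill column k

print(M_D)
# [[0. 1. 0. 0.]
#  [0. 0. 2. 0.]
#  [0. 0. 0. 3.]]
```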
#### Lemma 3.35
Suppose $S,T\in \mathscr{L}(V,W)$. Then $M(S+T)=M(S)+M(T)$.
#### Lemma 3.38
For all $\lambda\in \mathbb{F}$ and $T\in \mathscr{L}(V,W)$, $M(\lambda T)=\lambda M(T)$.
Together these say that $M:\mathscr{L}(V,W)\to \mathbb{F}^{m,n}$ is a linear map (where $n=\dim V$ and $m=\dim W$).
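
A quick sanity check of 3.35/3.38 (a sketch, not lecture code; `matrix_of`, `S`, and `T` are made up for illustration): build matrices of maps $\mathbb{F}^2\to\mathbb{F}^3$ from their action on the standard basis and confirm that $M$ respects addition and scalar multiplication.

```python
import numpy as np

def matrix_of(f, n, m):
    """Matrix of f : F^n -> F^m w.r.t. the standard bases (Definition 3.31)."""
    A = np.zeros((m, n))
    for k in range(n):
        e_k = np.zeros(n)
        e_k[k] = 1.0
        A[:, k] = f(e_k)          # column k = coordinates of f(e_k)
    return A

T = lambda v: np.array([v[0] + 3*v[1], 2*v[0] + 5*v[1], 7*v[0] + 9*v[1]])
S = lambda v: np.array([v[1], v[0], v[0] - v[1]])

assert np.allclose(matrix_of(lambda v: S(v) + T(v), 2, 3),
                   matrix_of(S, 2, 3) + matrix_of(T, 2, 3))   # M(S+T)=M(S)+M(T)
assert np.allclose(matrix_of(lambda v: 5 * T(v), 2, 3),
                   5 * matrix_of(T, 2, 3))                    # M(5T)=5M(T)
```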
#### Matrix multiplication
#### Definition 3.41
Suppose $A$ is an $m\times n$ matrix and $B$ is an $n\times p$ matrix. Then $AB$ is the $m\times p$ matrix whose entries are given by
$$
(AB)_{j,k}=\sum^{n}_{r=1} A_{j,r}B_{r,k}
$$
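
A direct (if slow) implementation of this formula, assuming NumPy; `matmul` is a hypothetical helper, compared against NumPy's built-in product.

```python
import numpy as np

def matmul(A, B):
    """Compute AB entry by entry from Definition 3.41."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must agree"
    AB = np.zeros((m, p))
    for j in range(m):
        for k in range(p):
            AB[j, k] = sum(A[j, r] * B[r, k] for r in range(n))
    return AB

A = np.array([[1., 3.], [2., 5.], [7., 9.]])   # 3x2
B = np.array([[1., 0., 2.], [0., 1., -1.]])    # 2x3
assert np.allclose(matmul(A, B), A @ B)
```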
#### Theorem 3.42
Suppose $T\in \mathscr{L}(U,V)$ and $S\in\mathscr{L}(V,W)$. Then $M(ST)=M(S)M(T)$ (where $\dim U=p$, $\dim V=n$, $\dim W=m$).
Proof:
Let $v_1,...,v_n$ be a basis for $V$, $w_1,...,w_m$ be a basis for $W$, and $u_1,...,u_p$ be a basis for $U$.
Let $A=M(S),B=M(T)$
Compute $M(ST)$ by **Definition 3.31**
$$
\begin{aligned}
(ST)u_k&=S(T(u_k))\\
&=S\left(\sum^n_{r=1}B_{r,k}v_r\right)\\
&=\sum^n_{r=1} B_{r,k}\,S(v_r)\\
&=\sum^n_{r=1} B_{r,k}\sum^m_{j=1}A_{j,r} w_j\\
&=\sum^m_{j=1} \left(\sum^n_{r=1}A_{j,r}B_{r,k}\right)w_j\\
\end{aligned}
$$
On the other hand, **Definition 3.31** gives $(ST)u_k=\sum^m_{j=1}(M(ST))_{j,k}\,w_j$, so comparing coefficients of $w_j$:
$$
\begin{aligned}
(M(ST))_{j,k}&=\sum^n_{r=1}A_{j,r}B_{r,k}\\
&=(AB)_{j,k}
\end{aligned}
$$
Hence
$$
M(ST)=AB=M(S)M(T)
$$
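
One way to see the theorem concretely (a sketch under the standard monomial bases, not part of the proof): take $T=D:\mathscr{P}_3\to\mathscr{P}_2$ and $S=D:\mathscr{P}_2\to\mathscr{P}_1$, so $ST$ is the second derivative, and compare $M(S)M(T)$ with $M(ST)$.

```python
import numpy as np

M_T = np.array([[0., 1., 0., 0.],
                [0., 0., 2., 0.],
                [0., 0., 0., 3.]])     # M(D) on P_3, bases 1,x,x^2,x^3 and 1,x,x^2
M_S = np.array([[0., 1., 0.],
                [0., 0., 2.]])         # M(D) on P_2, bases 1,x,x^2 and 1,x

# Matrix of the second derivative P_3 -> P_1: x^2 -> 2, x^3 -> 6x.
M_ST = np.array([[0., 0., 2., 0.],
                 [0., 0., 0., 6.]])

assert np.array_equal(M_S @ M_T, M_ST)   # M(ST) = M(S) M(T)
```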
#### Notation 3.44
Suppose $A$ is an $m\times n$ matrix
then
1. $A_{j,\cdot}$ denotes the $1\times n$ matrix consisting of row $j$ of $A$.
2. $A_{\cdot,k}$ denotes the $m\times 1$ matrix consisting of column $k$ of $A$.
#### Proposition 3.46
Suppose $A$ is an $m\times n$ matrix and $B$ is an $n\times p$ matrix. Then
$$
(AB)_{j,k}=(A_{j,\cdot})\cdot (B_{\cdot,k})
$$
Proof:
$(AB)_{j,k}=A_{j,1}B_{1,k}+...+A_{j,n}B_{n,k}$
$(A_{j,\cdot})\cdot (B_{\cdot,k})=(A_{j,\cdot})_{1,1}(B_{\cdot,k})_{1,1}+...+(A_{j,\cdot})_{1,n}(B_{\cdot,k})_{n,1}=A_{j,1}B_{1,k}+...+A_{j,n}B_{n,k}$
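
A quick numerical check of 3.46 (a sketch; random integer matrices, NumPy assumed): the $(j,k)$ entry of $AB$ equals row $j$ of $A$ times column $k$ of $B$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 4))   # 3x4
B = rng.integers(-3, 4, size=(4, 2))   # 4x2
j, k = 1, 0
assert (A @ B)[j, k] == A[j, :] @ B[:, k]
```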
#### Proposition 3.48
Suppose $A$ is an $m\times n$ matrix and $B$ is an $n\times p$ matrix, then
$$
(AB)_{\cdot,k}=A(B_{\cdot,k})
$$
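
The same kind of check for 3.48 (sketch): column $k$ of $AB$ equals $A$ times column $k$ of $B$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(3, 4))
B = rng.integers(-3, 4, size=(4, 2))
k = 1
assert np.array_equal((A @ B)[:, k], A @ B[:, k])
```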
#### Proposition 3.51
Suppose $A$ is an $m\times n$ matrix and $b=\begin{pmatrix}
b_1\\\vdots\\b_n
\end{pmatrix}$ is an $n\times 1$ matrix. Then $Ab=b_1A_{\cdot,1}+...+b_nA_{\cdot,n}$,
i.e. $Ab$ is a linear combination of the columns of $A$
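
A sketch of this proposition in NumPy: $Ab$ really is the stated combination of the columns of $A$.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-3, 4, size=(3, 4))
b = rng.integers(-3, 4, size=4)
combo = sum(b[k] * A[:, k] for k in range(A.shape[1]))   # b_1 A_{.,1} + ... + b_n A_{.,n}
assert np.array_equal(A @ b, combo)
```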
#### Proposition 3.56
Suppose $C$ is an $m\times c$ matrix and $R$ is a $c\times n$ matrix. *Putting the previous propositions together:*
1. column $k$ of $CR$ is a linear combination of the columns of $C$, with coefficients given by $R_{\cdot,k}$
2. row $j$ of $CR$ is a linear combination of the rows of $R$, with coefficients given by $C_{j,\cdot}$
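
A sketch checking both parts at once (NumPy assumed; the matrices and indices are made up).

```python
import numpy as np

rng = np.random.default_rng(3)
C = rng.integers(-3, 4, size=(3, 2))   # m x c
R = rng.integers(-3, 4, size=(2, 5))   # c x n
j, k = 1, 4

col_k = sum(R[r, k] * C[:, r] for r in range(C.shape[1]))   # combination of columns of C
row_j = sum(C[j, r] * R[r, :] for r in range(C.shape[1]))   # combination of rows of R
assert np.array_equal((C @ R)[:, k], col_k)
assert np.array_equal((C @ R)[j, :], row_j)
```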