
Math4501 Lecture 1

In many practical problems (ODEs (ordinary differential equations), PDEs (partial differential equations), systems of equations), closed-form analytical solutions are unknown.

-> resort to computational algorithms (approximation)

For example,

Deep learning classifiers

Root finding


Given

$$f(x)=\sum_{i=1}^n a_i x^i$$

for $n\geq 5$, find all roots $x\in \mathbb{R}$ of $f(x)=0$. (By the Abel-Ruffini theorem, there is no general closed-form formula in radicals for polynomial roots once the degree is $\geq 5$.)
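A minimal sketch of approximating such roots numerically, assuming `numpy` is available; the coefficients below are illustrative, not from the lecture:

```python
import numpy as np

# Hypothetical coefficients a_1, ..., a_5 for f(x) = a_1 x + ... + a_5 x^5 (illustrative only)
a = [2.0, -3.0, 0.5, 1.0, -4.0]

# np.roots expects coefficients from the highest power down to the constant term (0 here)
coeffs = a[::-1] + [0.0]

roots = np.roots(coeffs)                            # approximates all complex roots numerically
real_roots = roots[np.isclose(roots.imag, 0)].real  # keep only the real roots
print(real_roots)
```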

Investment

Invest $a$ dollars every month with return rate $r$ over $n$ months:

$$g(r)=a\sum_{i=1}^n (1+r)^i=a\left[\frac{(1+r)^{n+1}-(1+r)}{r}\right]$$

Say we want $g(r)=b$ for some target $b$. Multiplying through by $r$, this amounts to solving

$$f(r)=a(1+r)^{n+1}-a(1+r)-br=0$$

Use Newton's method to find $r$ such that $f(r)=0$.

Note that $f$ is non-linear, that is, $f(x+y)\neq f(x)+f(y)$, so an iterative method is needed.
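A minimal sketch of Newton's iteration for this $f$; the parameter values and starting guess below are illustrative assumptions, not from the lecture:

```python
def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Newton's method: repeat x <- x - f(x)/f'(x) until f(x) is close to 0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

# Illustrative values: a = 100 dollars/month, n = 120 months, target b = 20000
a, n, b = 100.0, 120, 20000.0

f  = lambda r: a * (1 + r) ** (n + 1) - a * (1 + r) - b * r
df = lambda r: a * (n + 1) * (1 + r) ** n - a - b   # derivative f'(r)

r = newton(f, df, x0=0.01)   # start from a guess of 1% per month
print(r, f(r))
```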

Let


$$
\begin{gathered}
f_1(x_1,\dots, x_m)=0\\
\vdots\\
f_m(x_1,\dots, x_m)=0
\end{gathered}
$$

be a system of $m$ equations, i.e. $\vec{f}: \mathbb{R}^m \to \mathbb{R}^m$ with $\vec{f}(\vec{x})=\vec{0}$.

If $\vec{f}$ is linear, note that


$$
\begin{aligned}
\vec{f}(\vec{x})&=\vec{f}\left(\begin{bmatrix}x_1\\ \vdots\\ x_m\end{bmatrix}\right)\\
&=\vec{f}\left(x_1\begin{bmatrix}1\\ 0\\ \vdots\\ 0\end{bmatrix}+x_2\begin{bmatrix}0\\ 1\\ \vdots\\ 0\end{bmatrix}+\cdots+x_m\begin{bmatrix}0\\ 0\\ \vdots\\ 1\end{bmatrix}\right)\\
&=x_1\vec{f}\left(\begin{bmatrix}1\\ 0\\ \vdots\\ 0\end{bmatrix}\right)+x_2\vec{f}\left(\begin{bmatrix}0\\ 1\\ \vdots\\ 0\end{bmatrix}\right)+\cdots+x_m\vec{f}\left(\begin{bmatrix}0\\ 0\\ \vdots\\ 1\end{bmatrix}\right)\\
&=A\vec{x}
\end{aligned}
$$

where $\vec{e}_i$ is the $i$-th standard basis vector (the column vectors above) and $A$ is the $m\times m$ matrix whose $i$-th column is $\vec{f}(\vec{e}_i)$.
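This is a small sketch of that fact, assuming `numpy`; the linear map `f` below is made up for illustration:

```python
import numpy as np

# A hypothetical linear map f: R^3 -> R^3, treated as a black box (illustrative only)
def f(x):
    return np.array([2.0 * x[0] + x[1],
                     x[1] - x[2],
                     3.0 * x[0] + 4.0 * x[2]])

m = 3
I = np.eye(m)
# Column i of A is f(e_i), exactly as in the derivation above
A = np.column_stack([f(I[:, i]) for i in range(m)])

x = np.random.rand(m)
print(np.allclose(f(x), A @ x))   # True: f(x) = A x for every x, by linearity
```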

Gaussian elimination (LU factorization)
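Once the system is written as $A\vec{x}=\vec{b}$, Gaussian elimination solves it; a minimal sketch using SciPy's LU routines (the matrix $A$ and right-hand side $\vec{b}$ below are illustrative):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Illustrative system A x = b (values are made up)
A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])

lu, piv = lu_factor(A)         # Gaussian elimination with partial pivoting, stored as L and U factors
x = lu_solve((lu, piv), b)     # forward and back substitution with the factors
print(x)
print(np.allclose(A @ x, b))   # True: x solves the system
```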