update
@@ -3,6 +3,8 @@
FROM node:18-alpine AS base

ENV NODE_OPTIONS="--max-old-space-size=8192"

# 1. Install dependencies only when needed
FROM base AS deps
# Check https://github.com/nodejs/docker-node/tree/b4117f9333da4138b03a546ec926ef50a31506c3#nodealpine to understand why libc6-compat might be needed.
Jenkinsfile (vendored): 1 line changed
@@ -2,6 +2,7 @@ pipeline {
environment {
registry = "trance0/notenextra"
version = "1.0"
NODE_OPTIONS = "--max-old-space-size=8192"
}

agent any
@@ -3,7 +3,7 @@ services:
build:
context: ./
dockerfile: ./Dockerfile
image: trance0/notenextra:v1.1.6
image: trance0/notenextra:v1.1.7
restart: on-failure:5
ports:
- 13000:3000
pages/CSE559A/CSE559A_L14.md (new file): 77 lines added
@@ -0,0 +1,77 @@
# CSE559A Lecture 14

## Neural Network Training

## Object Detection

AP (Average Precision) is the standard evaluation metric for object detection.
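As a rough illustration of how AP is computed (a minimal sketch, not the exact PASCAL VOC or COCO protocol; it assumes detections are already matched to ground truth as true/false positives and sorted by descending confidence):

```python
def average_precision(is_tp, num_gt):
    """Approximate area under the precision-recall curve.

    is_tp[i] is True if detection i (sorted by descending confidence)
    matches a previously unmatched ground-truth box (e.g. IoU >= 0.5);
    num_gt is the total number of ground-truth boxes for the class.
    """
    tp = fp = 0
    prev_recall = 0.0
    ap = 0.0
    for hit in is_tp:
        if hit:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / num_gt
        ap += precision * (recall - prev_recall)  # add a slice of P-R area
        prev_recall = recall
    return ap


# Example: 5 detections for one class, 3 ground-truth boxes.
print(average_precision([True, True, False, True, False], num_gt=3))  # ~0.92
```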
### Benchmarks

#### PASCAL VOC Challenge

20 challenge classes.

CNNs substantially increased the accuracy of object detection on this benchmark.

#### COCO dataset

Common Objects in Context.

Semantic segmentation: every pixel is assigned a class label.

Instance segmentation: every pixel is assigned a class label and grouped into object instances.
### Object detection: outline

Proposal generation

Object recognition

#### R-CNN

Proposal generation: use selective search to generate region proposals (roughly 2000 per image).

Use a CNN to extract features from each proposal, with linear SVMs to classify the proposals.

Use AlexNet fine-tuned on PASCAL VOC to extract features.
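A pseudocode-style sketch of the R-CNN inference pipeline described above; `selective_search`, `warp`, `cnn_features`, `svms`, and `bbox_regressors` are hypothetical placeholders supplied by the caller, not a real API:

```python
def rcnn_detect(image, selective_search, warp, cnn_features, svms, bbox_regressors):
    """Sketch of R-CNN inference: ~2000 proposals per image, one CNN
    forward pass per proposal, post-hoc SVM scoring and box regression."""
    detections = []
    for box in selective_search(image):            # ~2000 region proposals
        crop = warp(image, box, size=(224, 224))   # warp proposal to the CNN input size
        feat = cnn_features(crop)                  # AlexNet (fine-tuned on PASCAL VOC) features
        for cls, svm in svms.items():              # one linear SVM per class
            score = svm.decision_function([feat])[0]
            if score > 0:
                refined = bbox_regressors[cls].predict([feat])[0]  # refine the box
                detections.append((cls, refined, score))
    return detections  # typically followed by per-class non-maximum suppression
```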
Pros:

- Much more accurate than previous approaches
- Any deep architecture can immediately be "plugged in"

Cons:

- Not a single end-to-end trainable system
  - Fine-tune network with softmax classifier (log loss)
  - Train post-hoc linear SVMs (hinge loss)
  - Train post-hoc bounding box regressors (least squares)
- Training is slow (about 2000 CNN forward passes per image, one per proposal)
- Inference (detection) is slow
#### Fast R-CNN

Proposal generation (still selective search, as in R-CNN).

Run the CNN once over the whole image, then extract per-proposal features from the shared feature map.

##### ROI pooling and ROI alignment

ROI pooling:

- The proposal is projected onto the feature map and quantized (snapped) to the feature-map grid.
- The region is divided into a fixed grid of bins, and each bin is max-pooled, giving a fixed-size feature per proposal.

ROI alignment:

- The proposal is projected onto the feature map without quantization.
- Each bin is sampled with bilinear interpolation, which preserves localization better than ROI pooling (see the sketch below).

Use bounding box regression to refine the proposals.
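A minimal sketch contrasting the two operations with `torchvision.ops.roi_pool` and `torchvision.ops.roi_align` (the feature map, stride, and proposal box below are made up for illustration):

```python
import torch
from torchvision.ops import roi_align, roi_pool

# Fake backbone output: batch of 1, 256 channels, 50x50 feature map,
# e.g. from a 400x400 image with stride 8 (so spatial_scale = 1/8).
features = torch.randn(1, 256, 50, 50)

# One proposal in image coordinates: (batch_index, x1, y1, x2, y2).
boxes = torch.tensor([[0.0, 37.0, 53.0, 261.0, 309.0]])

# ROI pooling: the box is quantized to the feature-map grid, then each of
# the 7x7 bins is max-pooled into a fixed-size output.
pooled = roi_pool(features, boxes, output_size=(7, 7), spatial_scale=1.0 / 8)

# ROI align: no quantization; each bin is sampled with bilinear
# interpolation, which preserves localization better.
aligned = roi_align(features, boxes, output_size=(7, 7),
                    spatial_scale=1.0 / 8, sampling_ratio=2)

print(pooled.shape, aligned.shape)  # both: torch.Size([1, 256, 7, 7])
```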
@@ -16,4 +16,5 @@ export default {
CSE559A_L11: "Computer Vision (Lecture 11)",
CSE559A_L12: "Computer Vision (Lecture 12)",
CSE559A_L13: "Computer Vision (Lecture 13)",
CSE559A_L14: "Computer Vision (Lecture 14)",
}
@@ -1 +1,123 @@
# Lecture 22
# Math 4121 Lecture 22

## Continue on the Arzelà-Osgood Theorem

Proof:

Part 2: Control the integral on $\mathcal{U}$.

If $[x_i,x_{i+1}]\cap G_K\neq \emptyset$, then $\inf_{x\in [x_i,x_{i+1}]} |f_n(x)| < \frac{\alpha}{2}$ for all $n\geq K$. Denote the collection of such partition intervals by $P_1$.

The remaining partition intervals are denoted by $P_2$.

So $\ell(\mathcal{U})=\ell(P_1)+\ell(P_2)\geq c_e(G_K)+\ell(P_2)$.
This implies $\ell(P_2)\leq \frac{\alpha}{4B}$, since $\mathcal{U}$ was chosen so that $\ell(\mathcal{U})\leq c_e(G_K)+\frac{\alpha}{4B}$.
Thus, for $n\geq K$,

$$
L(P,f_n)\leq \ell(P_1)\frac{\alpha}{2}+\ell(P_2)B
$$
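To see how this estimate gives the bound on $\int_\mathcal{U} |f_n(x)|\,dx$ below, note that, assuming $B$ is the uniform bound $|f_n|\leq B$ from the hypotheses and that the $P_1$ intervals lie in $\mathcal{U}$ (so $\ell(P_1)\leq c_e(\mathcal{U})$),

$$
\ell(P_1)\frac{\alpha}{2}+\ell(P_2)B \leq c_e(\mathcal{U})\frac{\alpha}{2}+\frac{\alpha}{4B}\cdot B = c_e(\mathcal{U})\frac{\alpha}{2}+\frac{\alpha}{4}\leq c_e(\mathcal{U})\frac{\alpha}{2}+\frac{\alpha}{2}.
$$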
So

$$
\int_\mathcal{U} |f_n(x)|\, dx \leq c_e(\mathcal{U})\frac{\alpha}{2}+\frac{\alpha}{2}
$$

All in all, for all $n\geq K$,

$$
\begin{aligned}
\left\vert \int_0^1 f_n(x)\, dx\right\vert &\leq \int_0^1 |f_n(x)|\, dx\\
&\leq \int_\mathcal{U} |f_n(x)|\, dx + \int_\mathcal{C} |f_n(x)|\,dx\\
&\leq \left(c_e(\mathcal{U})\frac{\alpha}{2}+\frac{\alpha}{2}\right)+c_e(\mathcal{C})\frac{\alpha}{2}\\
&\leq \alpha
\end{aligned}
$$

QED
### Baire Category Theorem

Nowhere dense sets can be large (for example, the Cantor set is uncountable yet nowhere dense), but they cannot cover an open (or closed) interval.

#### Theorem 4.7 (Baire Category Theorem)

An open interval cannot be covered by a countable union of nowhere dense sets.

Proof:

Suppose $(0,1)\subset \bigcup_{n=1}^\infty S_n$ where each $S_n$ is nowhere dense. In particular, since $S_1$ is nowhere dense, $\exists$ a closed interval $I_1\subset (0,1)$ such that $I_1\cap S_1=\emptyset$.

Now for each $k\geq 2$, $S_k$ is not dense in $I_{k-1}$, so $\exists$ a closed interval $I_k\subsetneq I_{k-1}$ such that $I_k\cap S_k=\emptyset$; since $I_k\subset I_j$ for all $j<k$, in fact $I_k\cap S_j=\emptyset$ for all $j\leq k$.

By the nested interval property, $\exists x\in \bigcap_{n=1}^\infty I_n$.

Then $x\in (0,1)$ and $x\notin \bigcup_{n=1}^\infty S_n$.

This contradicts the assumption that $(0,1)\subset \bigcup_{n=1}^\infty S_n$.

QED

#### Definition (First Category)

A countable union of nowhere dense sets is called a set of **first category**.
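For example, $\mathbb{Q}$ is of first category: fixing an enumeration $q_1,q_2,\dots$ of the rationals,

$$
\mathbb{Q}=\bigcup_{n=1}^\infty \{q_n\},
$$

and each singleton $\{q_n\}$ is nowhere dense, so by Theorem 4.7 the rationals cannot cover any open interval.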
#### Corollary 4.8

The complement of a set of first category in $\mathbb{R}$ is dense in $\mathbb{R}$.

Proof:

Let $S$ be a set of first category. We need to show that for every interval $I$, $\exists x\in I\cap S^c$ (i.e., $\exists x\in I$ with $x\notin S$).

This is exactly the Baire Category Theorem applied to $I$: the countably many nowhere dense sets whose union is $S$ cannot cover the interval $I$.

QED

Recall that a function $f$ on $[a,b]$ is pointwise discontinuous if $\mathcal{C}=\{c\in [a,b]: f\text{ is continuous at } c\}$ is dense in $[a,b]$.

$\mathcal{D}=[a,b]\setminus \mathcal{C}$ is called the set of points of discontinuity of $f$.

#### Corollary 4.9

$f$ is pointwise discontinuous if and only if $\mathcal{D}$ is of first category.

Proof:

Part 1: If $\mathcal{D}$ is of first category, then $f$ is pointwise discontinuous.

Immediate from Corollary 4.8.

Part 2: If $f$ is pointwise discontinuous, then $\mathcal{D}$ is of first category.

Let $P_k=\{x\in [a,b]: w(f;x)\geq \frac{1}{k}\}$, where $w(f;x)$ is the oscillation of $f$ at $x$, so that $\mathcal{D}=\bigcup_{k=1}^\infty P_k$.

We need to show that each $P_k$ is nowhere dense (under the assumption that $\mathcal{C}$ is dense).

Let $I\subseteq [a,b]$ be any interval. Since $\mathcal{C}$ is dense, $\exists c\in \mathcal{C}\cap I$, and $w(f;c)=0$. By the definition of $w(f;c)$, $\exists$ an interval $J\subseteq I$ with $c\in J$ such that $w(f;J)< \frac{1}{k}$. Hence for all $x\in J$, $w(f;x)< \frac{1}{k}$, i.e. $J\cap P_k=\emptyset$.

Thus, $P_k$ is nowhere dense.

QED
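A standard example for Corollary 4.9 is Thomae's function on $[0,1]$,

$$
f(x)=\begin{cases}\frac{1}{q} & \text{if } x=\frac{p}{q}\in\mathbb{Q}\text{ in lowest terms},\\ 0 & \text{if } x\notin\mathbb{Q},\end{cases}
$$

which is continuous exactly at the irrational points: $\mathcal{C}$ is dense, so $f$ is pointwise discontinuous, and $\mathcal{D}=\mathbb{Q}\cap[0,1]$ is of first category, as the corollary asserts.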
#### Corollary 4.10

Let $\{f_n\}$ be a sequence of pointwise discontinuous functions. The set of points at which all $f_n$ are simultaneously continuous is dense (it is also uncountable).

Proof:

$$
\bigcap_{n=1}^\infty \mathcal{C}_n=\left(\bigcup_{n=1}^\infty \mathcal{D}_n\right)^c
$$

By Corollary 4.9, each $\mathcal{D}_n$ is of first category, so $\bigcup_{n=1}^\infty \mathcal{D}_n$ is of first category as well (a countable union of countably many nowhere dense sets is still a countable union of nowhere dense sets), and by Corollary 4.8 its complement is dense.

QED