$$
\begin{aligned}
H(Y|X=x)&=\sum_{y\in \mathcal{Y}} Pr(Y=y|X=x) \log_2 \frac{1}{Pr(Y=y|X=x)} \\
&=-\sum_{y\in \mathcal{Y}} Pr(Y=y|X=x) \log_2 Pr(Y=y|X=x) \\
\end{aligned}
$$
The conditional entropy $H(Y|X)$ is defined as:
$$
\begin{aligned}
H(Y|X)&=\mathbb{E}_{x\sim X}[H(Y|X=x)] \\
&=\sum_{x\in \mathcal{X}} Pr(X=x)H(Y|X=x) \\
&=-\sum_{x\in \mathcal{X}} Pr(X=x)\sum_{y\in \mathcal{Y}} Pr(Y=y|X=x) \log_2 Pr(Y=y|X=x) \\
&=-\sum_{x\in \mathcal{X}, y\in \mathcal{Y}} Pr(X=x, Y=y) \log_2 Pr(Y=y|X=x) \\
\end{aligned}
$$
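The two forms of $H(Y|X)$ above can be checked numerically. Below is a small sanity check using a toy joint distribution (invented here purely for illustration): it computes the conditional entropy once as $\sum_x Pr(x)H(Y|X=x)$ and once from the joint-sum formula, and both agree.

```python
import math

# Assumed toy joint distribution Pr(X=x, Y=y), chosen only for illustration.
joint = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal Pr(X=x).
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

def cond_entropy_via_expectation():
    # H(Y|X) = sum_x Pr(X=x) * H(Y|X=x)
    total = 0.0
    for x, p_x in px.items():
        h = 0.0
        for (xx, y), p in joint.items():
            if xx == x:
                p_cond = p / p_x  # Pr(Y=y|X=x)
                h -= p_cond * math.log2(p_cond)
        total += p_x * h
    return total

def cond_entropy_via_joint():
    # H(Y|X) = -sum_{x,y} Pr(X=x, Y=y) * log2 Pr(Y=y|X=x)
    return -sum(p * math.log2(p / px[x]) for (x, y), p in joint.items())

print(cond_entropy_via_expectation())
print(cond_entropy_via_joint())
```

Both routes give the same value, as the derivation requires.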
Notes:
$\operatorname{Pr}(s_\mathcal{Z}|m_1, \cdots, m_{t-z}) = \operatorname{Pr}(U_1,
Conclude similarly by the law of total probability.
$\operatorname{Pr}(s_\mathcal{Z}|m_1, \cdots, m_{t-z}) = \operatorname{Pr}(s_\mathcal{Z}) \implies I(S_\mathcal{Z}; M_1, \cdots, M_{t-z}) = 0$.
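The implication above (independence forces zero mutual information) is easy to verify numerically. The sketch below uses an assumed product joint distribution, not anything from these notes, and evaluates $I(S;M) = \sum_{s,m} Pr(s,m)\log_2\frac{Pr(s,m)}{Pr(s)Pr(m)}$:

```python
import math

# Assumed toy marginals; the joint is their product, so S and M are
# independent by construction (Pr(s|m) = Pr(s)).
ps = {0: 0.5, 1: 0.5}
pm = {0: 0.7, 1: 0.3}
joint = {(s, m): ps[s] * pm[m] for s in ps for m in pm}

def mutual_information(joint, ps, pm):
    # I(S; M) = sum_{s,m} Pr(s,m) * log2( Pr(s,m) / (Pr(s) * Pr(m)) )
    return sum(p * math.log2(p / (ps[s] * pm[m]))
               for (s, m), p in joint.items())

print(mutual_information(joint, ps, pm))  # every log term is log2(1) = 0
```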
### Conditional mutual information
A: Fix any $\mathcal{T} = \{i_1, \cdots, i_t\} \subseteq [n]$ of size $t$, and l
$$
\begin{aligned}
H(M) &= I(M; S_\mathcal{T}) + H(M|S_\mathcal{T}) \text{(by def. of mutual information)}\\
&= I(M; S_\mathcal{T}) \text{(since }S_\mathcal{T}\text{ suffices to decode }M\text{)}\\
&= I(M; S_{i_t}, S_\mathcal{Z}) \text{(since }S_\mathcal{T} = S_\mathcal{Z} \cup S_{i_t})\\
&= I(M; S_{i_t}|S_\mathcal{Z}) + I(M; S_\mathcal{Z}) \text{(chain rule)}\\
&= I(M; S_{i_t}|S_\mathcal{Z}) \text{(since }|\mathcal{Z}|\leq z\text{, }S_\mathcal{Z}\text{ reveals nothing about }M\text{)}\\
&= I(S_{i_t}; M|S_\mathcal{Z}) \text{(symmetry of mutual information)}\\
&= H(S_{i_t}|S_\mathcal{Z}) - H(S_{i_t}|M,S_\mathcal{Z}) \text{(def. of conditional mutual information)}\\
&\leq H(S_{i_t}|S_\mathcal{Z}) \text{(entropy is non-negative)}\\
&\leq H(S_{i_t}) \text{(conditioning reduces entropy)} \\
\end{aligned}
$$
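The chain of (in)equalities above concludes $H(M) \leq H(S_{i_t})$: each share must carry at least as much entropy as the message. The sketch below checks the key quantities on a toy 2-out-of-2 XOR scheme (an assumed example, not a scheme from these notes): a uniform 1-bit secret $m$, a uniform share $s_1$, and $s_2 = m \oplus s_1$. One share alone gives $I(M;S_1)=0$, both shares give $I(M;S_1,S_2)=H(M)$, and $H(M)\leq H(S_1)$ holds with equality.

```python
import math
from itertools import product
from collections import defaultdict

# Assumed toy scheme: Pr(m, s1, s2) with m, s1 uniform and s2 = m XOR s1.
dist = defaultdict(float)
for m, s1 in product([0, 1], repeat=2):
    dist[(m, s1, m ^ s1)] += 0.25

def entropy(coords):
    # Shannon entropy of the marginal of (m, s1, s2) on the given coordinates.
    marg = defaultdict(float)
    for outcome, p in dist.items():
        marg[tuple(outcome[i] for i in coords)] += p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

H_M = entropy([0])
H_S1 = entropy([1])
# I(M; S1) = H(M) + H(S1) - H(M, S1): one share reveals nothing.
I_one_share = H_M + H_S1 - entropy([0, 1])
# I(M; S1, S2) = H(M) + H(S1, S2) - H(M, S1, S2): both shares decode M.
I_both_shares = H_M + entropy([1, 2]) - entropy([0, 1, 2])
print(H_M, H_S1, I_one_share, I_both_shares)
```

Here the bound $H(M) \leq H(S_{i_t})$ is tight: both entropies equal one bit.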