change epsilon expression

Zheyuan Wu
2024-12-05 21:23:34 -06:00
parent d18e86852c
commit 75ef366b1c
10 changed files with 37 additions and 37 deletions

View File

@@ -32,7 +32,7 @@ $$
Let $e$ be the exponent
$$
P[p,q\gets \Pi_n;N\gets p\cdot q;e\gets \mathbb{Z}_{\phi(N)}^*;y\gets \mathbb{Z}_N^*;x\gets \mathcal{A}(N,e,y):x^e=y\mod N]<\epsilon(n)
$$
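A minimal Python sketch of this hardness experiment (the primes, $e$, and sizes here are our toy choices; real RSA moduli use primes of 1024+ bits):

```python
import math, random

# Toy instance of the RSA hardness experiment above (illustrative sizes only).
p, q = 61, 53                  # p, q <- Pi_n
N = p * q                      # N <- p * q = 3233
phi = (p - 1) * (q - 1)
e = 17                         # e in Z*_phi(N): gcd(e, phi(N)) = 1
assert math.gcd(e, phi) == 1

y = random.randrange(1, N)     # challenge y
# A generic adversary A(N, e, y) that cannot factor N is left searching
# for an e-th root of y mod N -- feasible only because N is tiny here:
x = next(c for c in range(N) if pow(c, e, N) == y)
assert pow(x, e, N) == y
```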
#### Theorem RSA Algorithm
@@ -190,7 +190,7 @@ $\mathcal{F}=\{f_i:D_i\to R_i\}_{i\in I}$
2. $(i,t)\gets Gen(1^n)$ efficient ($i\in I$ paired with $t$; $t$ is the "trapdoor info")
3. $\forall i,D_i$ can be sampled efficiently.
4. $\forall i,\forall x,f_i(x)$ can be computed in polynomial time.
5. $P[(i,t)\gets Gen(1^n);y\gets R_i:f_i(\mathcal{A}(1^n,i,y))=y]<\epsilon(n)$ (note: $\mathcal{A}$ is not given $t$)
6. (trapdoor) There is a p.p.t. $B$ such that given $i,y,t$, $B$ always finds $x$ such that $f_i(x)=y$.
#### Theorem RSA is a trapdoor
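A toy sketch of how RSA instantiates this collection (parameters are illustrative; `gen` plays $Gen(1^n)$, the index $i$ is $(N,e)$, and the trapdoor $t$ is the decryption exponent $d$):

```python
import random

def gen(p: int = 61, q: int = 53, e: int = 17):
    """Gen(1^n): returns the index i = (N, e) and the trapdoor t = d."""
    N, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)        # modular inverse (Python 3.8+)
    return (N, e), d

(N, e), d = gen()
x = random.randrange(1, N)     # sample x <- D_i
y = pow(x, e, N)               # f_i(x) is easy to compute (property 4)
# Property 6: with the trapdoor t = d, B inverts with one exponentiation.
assert pow(y, d, N) == x
```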

View File

@@ -45,7 +45,7 @@ Let $\{X_n\}_n$ and $\{Y_n\}_n$ be probability ensembles (separate of dist over
$\{X_n\}_n$ and $\{Y_n\}_n$ are computationally **indistinguishable** if for all non-uniform p.p.t. adversaries $D$ ("distinguishers")
$$
|P[x\gets X_n:D(x)=1]-P[y\gets Y_n:D(y)=1]|<\epsilon(n)
$$
this basically means that the probability of finding any pattern separating the two ensembles is negligible.
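A sketch of estimating a distinguisher's advantage empirically (the ensembles, the distinguisher, and all names here are ours, purely for illustration):

```python
import random

def advantage(sample_x, sample_y, D, trials: int = 100_000) -> float:
    """Estimate |P[x <- X_n : D(x)=1] - P[y <- Y_n : D(y)=1]|."""
    hits_x = sum(D(sample_x()) for _ in range(trials))
    hits_y = sum(D(sample_y()) for _ in range(trials))
    return abs(hits_x - hits_y) / trials

n = 16
uniform = lambda: random.getrandbits(n)       # X_n: uniform n-bit strings
biased = lambda: random.getrandbits(n) | 1    # Y_n: lowest bit forced to 1
D = lambda t: t & 1                           # distinguisher: output the low bit
print(advantage(uniform, biased, D))          # ~0.5: clearly distinguishable
```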

View File

@@ -9,7 +9,7 @@ $$
$$
- If $\mu(n)\geq \frac{1}{p(n)}$ for some polynomial $p(n)$ and infinitely many $n$, then $\{X_n\}$ and $\{Y_n\}$ are distinguishable.
- Otherwise, indistinguishable ($|diff|<\epsilon(n)$)
Property: Closed under efficient procedures.
@@ -58,7 +58,7 @@ $$
### Next bit test (NBT)
We say $\{X_n\}$ on $\{0,1\}^{l(n)}$ passes the next bit test if $\forall i\in\{0,1,...,l(n)-1\}$ and for all adversaries $\mathcal{A}:P[t\gets X_n:\mathcal{A}(t_1,t_2,...,t_i)=t_{i+1}]\leq \frac{1}{2}+\epsilon(n)$ (given the first $i$ bits, the probability of successfully predicting the $(i+1)$-th bit is at most negligibly better than a random guess of $\frac{1}{2}$)
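A small test harness for this definition (a sketch; the patterned generator and the predictor are toy examples of ours):

```python
import random

def predict_rate(sample, A, i: int, trials: int = 50_000) -> float:
    """Estimate P[t <- X_n : A(t_1..t_i) = t_{i+1}] for one prefix length i."""
    return sum(A(t[:i]) == t[i] for t in (sample() for _ in range(trials))) / trials

l = 32
patterned = lambda: [random.getrandbits(1)] * l        # every bit repeats the first
uniform = lambda: [random.getrandbits(1) for _ in range(l)]
A = lambda prefix: prefix[-1] if prefix else 0         # predict: repeat the last bit
print(predict_rate(patterned, A, i=5))                 # ~1.0: fails the NBT
print(predict_rate(uniform, A, i=5))                   # ~0.5: passes against this A
```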
Note that for any $\mathcal{A}$, and any $i$,
@@ -71,7 +71,7 @@ If $\{X_n\}\approx\{U_{l(n)}\}$ (pseudorandom), then $X_n$ must pass NBT for all
Otherwise $\exists \mathcal{A},i$ and a polynomial $p$ such that for infinitely many $n$,
$$
P[t\gets X_n:\mathcal{A}(t_1,t_2,...,t_i)=t_{i+1}]\geq \frac{1}{2}+\frac{1}{p(n)}
$$
We can build a distinguisher $D$ from $\mathcal{A}$.
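The reduction itself is tiny; a sketch (names ours): $D$ outputs 1 exactly when $\mathcal{A}$'s prediction matches the next bit, so $D$ outputs 1 with probability $\geq\frac{1}{2}+\frac{1}{p(n)}$ on $X_n$ but exactly $\frac{1}{2}$ on uniform strings — a non-negligible gap.

```python
def distinguisher_from_predictor(A, i: int):
    """D(t) = 1 iff A(t_1..t_i) = t_{i+1}; D's advantage equals A's
    prediction advantage over 1/2, which is >= 1/p(n) on X_n."""
    return lambda t: int(A(t[:i]) == t[i])
```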
@@ -147,6 +147,6 @@ $f(x)||x$
Not all bits of $x$ are hard to predict.
**Hard-core bit:** One bit of information about $x$ which is hard to determine from $f(x)$. $P[\text{success}]\leq \frac{1}{2}+\epsilon(n)$
Which bit is hard-core depends on $f$.
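One classical construction (Goldreich–Levin, not named in this snippet): for $g(x,r)=(f(x),r)$, the inner product $\langle x,r\rangle \bmod 2$ is a hard-core bit. A sketch of the bit itself:

```python
def goldreich_levin_bit(x: int, r: int) -> int:
    """Hard-core bit <x, r> mod 2: parity of the bitwise AND of x and r."""
    return bin(x & r).count("1") % 2
```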

View File

@@ -50,13 +50,13 @@ $P_k[Dec_k(Enc_k(m))=m]=1$
## Negligible function
$\epsilon:\mathbb{N}\to \mathbb{R}$ is a negligible function if $\forall c>0$, $\exists N\in\mathbb{N}$ such that $\forall n\geq N, \epsilon(n)<\frac{1}{n^c}$
Idea: for any polynomial, even $n^{100}$, in the long run $\epsilon(n)\leq \frac{1}{n^{100}}$
Example: $\epsilon (n)=\frac{1}{2^n}$, $\epsilon (n)=\frac{1}{n^{\log (n)}}$
Non-example: $\epsilon (n)=\frac{1}{n^c}$ for any fixed $c$ (an inverse polynomial is not negligible)
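A quick numeric sanity check of the definition (our choice of $c=100$; comparing exponents avoids float underflow, since $2^{-n}<n^{-100}$ iff $n>100\log_2 n$):

```python
from math import log2

# eps(n) = 2^-n vs 1/n^100:  2^-n < n^-100  iff  n > 100 * log2(n).
for n in (10, 100, 1_000, 10_000):
    print(n, n > 100 * log2(n))   # False, False, True, True
# The non-example 1/n^c already fails at exponent c + 1,
# since 1/n^c >= 1/n^(c+1) for every n >= 1.
```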
## One-way function
@@ -74,10 +74,10 @@ $$
f:\{0,1\}^n\to \{0,1\}^* (n\to \infty)
$$
There is a negligible function $\epsilon (n)$ such that for any adversary $a$ (n.u.p.p.t.)
$$
P[x\gets\{0,1\}^n;y=f(x);x'\gets a(y):f(x')=y]\leq\epsilon(n)
$$
_Probability of guessing a correct preimage is negligible_
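A sketch of estimating this probability (the concrete $f$, the sizes, and the blind-guessing adversary are stand-ins of ours, not fixed by the notes):

```python
import hashlib, random

def f(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()    # stand-in candidate one-way function

def success_rate(a, n_bits: int = 20, trials: int = 10_000) -> float:
    """Estimate P[x <- {0,1}^n; y = f(x); x' <- a(y) : f(x') = y]."""
    wins = 0
    for _ in range(trials):
        x = random.getrandbits(n_bits).to_bytes(4, "big")
        wins += f(a(f(x))) == f(x)
    return wins / trials

blind = lambda y: random.getrandbits(20).to_bytes(4, "big")   # guesses at random
print(success_rate(blind))               # ~2^-20 per trial: essentially 0
```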
@@ -95,7 +95,7 @@ Example: Suppose $f$ is one-to-one, then $a$ must find our $x$, $P[x'=x]=\frac{1
Why do we allow $a$ to get a different $x'$?
> Suppose the definition were $P[x\gets\{0,1\}^n;y=f(x):a(y)=x]\leq\epsilon(n)$; then a trivial information-losing function such as $f(x)=0$ would also satisfy the definition, since no adversary can tell which preimage $x$ was chosen.
To be technically fair, $a(y)=a(y,1^n)$: the input size is $\approx n$, so the adversary may use $poly(n)$ operations.

View File

@@ -2,16 +2,16 @@
## Recap
Negligible function: $\epsilon(n)$ is negligible if $\forall c>0,\exists N$ such that $\forall n>N$, $\epsilon (n)<\frac{1}{n^c}$
Ex: $\epsilon(n)=2^{-n},\epsilon(n)=\frac{1}{n^{\log (\log n)}}$
### Strong One-Way Function
1. $\exists$ a P.P.T. that computes $f(x),\forall x\in\{0,1\}^n$
2. $\forall$ adversaries $a$, $\exists$ negligible $\epsilon(n)$ such that $\forall n$,
$$
P[x\gets \{0,1\}^n;y=f(x):f(a(y,1^n))=y]<\epsilon(n)
$$
_That is, the probability of a successful inversion should decrease as the input length $n$ increases..._
@@ -28,7 +28,7 @@ Negation:
$\exists a$, $P[x\gets \{0,1\}^n;y=f(x):f(a(y,1^n))=y]=\mu_a(n)$ is not a negligible function.
That is, $\exists c>0$ such that $\forall N, \exists n>N$ with $\mu_a(n)>\frac{1}{n^c}$
$\mu_a(n)>\frac{1}{n^c}$ for infinitely many $n$, i.e. infinitely often.
@@ -41,7 +41,7 @@ $\mu_a(n)>\frac{1}{n^c}$ for infinitely many $n$. or infinitely often.
$f:\{0,1\}^n\to \{0,1\}^*$
1. $\exists$ a P.P.T. that computes $f(x),\forall x\in\{0,1\}^n$
2. $\exists$ a polynomial $p(n)$ such that $\forall$ adversaries $a$, for all sufficiently large $n$,
$$
P[x\gets \{0,1\}^n;y=f(x):f(a(y,1^n))=y]<1-\frac{1}{p(n)}
$$
@@ -116,14 +116,14 @@ The only way to efficiently factorizing the product of prime is to iterate all t
In other words, we'll show this is a weak one-way function under the Factoring Assumption:
$\forall a,\exists \epsilon(n)$ such that $\forall n$,
$$
P[p_1\gets \Pi_n;p_2\gets \Pi_n;N=p_1\cdot p_2:a(N)=\{p_1,p_2\}]<\epsilon(n)
$$
where $\Pi_n=\{$ all primes $p<2^n\}$
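A sketch in Python (assuming `sympy.randprime` is available for prime sampling; 24-bit primes keep the naive inverter feasible, which is exactly why real parameters are far larger):

```python
from sympy import randprime   # assumed helper for sampling random primes

def f(p1: int, p2: int) -> int:
    return p1 * p2             # easy forward direction

def trial_division(N: int):
    """Naive inverter: time ~ sqrt(N), i.e. exponential in n = |N| bits."""
    d = 3
    while d * d <= N:
        if N % d == 0:
            return d, N // d
        d += 2
    return None

n = 24
p1, p2 = randprime(2**(n - 1), 2**n), randprime(2**(n - 1), 2**n)
print(trial_division(f(p1, p2)))   # fine at 24 bits, hopeless at 1024
```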

View File

@@ -2,10 +2,10 @@
Proving that there are one-way functions relies on assumptions.
Factoring Assumption: $\forall a, \exists \epsilon (n)$; let $p,q$ be primes with $p,q<2^n$
$$
P[p\gets \Pi_n;q\gets \Pi_n;N=p\cdot q:a(N)\in \{p,q\}]<\epsilon(n)
$$
Evidence: to date, the best known procedure that always factors has run time $O(2^{\sqrt{n}\sqrt{\log(n)}})$

View File

@@ -87,8 +87,8 @@ $F=\{f_i:D_i\to R_i\},i\in I$, $I$ is the index set.
1. We can efficiently choose $i\gets I$ using $Gen$.
2. $\forall i$ we can efficiently sample $x\gets D_i$.
3. $\forall i\forall x\in D_i,f_i(x)$ is efficiently computable
4. For any n.u.p.p.t $a$, $\exists$ negligible function $\epsilon (n)$.
$P[i\gets Gen(1^n);x\gets D_i;y=f_i(x):f_i(a(y,i,1^n))=y]\leq \epsilon(n)$
#### Theorem
@@ -107,7 +107,7 @@ Algorithm for sampling a random prime $p\gets \Pi_n$
- Deterministic poly-time procedure
- In practice, a much faster randomized procedure (Miller-Rabin) is used
$P[x\notin prime\mid test\ said\ x\ prime]<\epsilon(n)$
3. If not, repeat. Do this a polynomial number of times
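A runnable sketch of this sampler (our implementation of Miller–Rabin; with 40 rounds the error bound is below $4^{-40}$):

```python
import random

def miller_rabin(n: int, rounds: int = 40) -> bool:
    """Randomized primality test; P[composite passes] < 4^(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1          # write n - 1 = d * 2^r with d odd
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # witness of compositeness found
    return True

def random_prime(n_bits: int) -> int:
    """Steps 1-3: sample, test, repeat; ~n_bits attempts expected by the PNT."""
    while True:
        cand = random.getrandbits(n_bits) | 1 | (1 << (n_bits - 1))
        if miller_rabin(cand):
            return cand

print(random_prime(256))
```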

View File

@@ -108,7 +108,7 @@ Denote safe prime as $\tilde{\Pi}_n=\{p\in \Pi_n:q=\frac{p-1}{2}\in \Pi_{n-1}\}$
Then
$$
P\left[p\gets \tilde{\Pi}_n;a\gets\mathbb{Z}_p^*;g=a^2\neq 1;x\gets \mathbb{Z}_q;y=g^x\mod p:\mathcal{A}(y)=x\right]\leq \epsilon(n)
$$
$p\gets \tilde{\Pi}_n;a\gets\mathbb{Z}_p^*;g=a^2\neq 1$ sets up the cyclic group we encrypt in: since $p$ is a safe prime, $g$ generates the order-$q$ subgroup of quadratic residues.
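A toy instance of this experiment (sizes are ours for illustration; $p=23=2\cdot 11+1$ is a safe prime):

```python
import random

q, p = 11, 23                        # safe prime p = 2q + 1 with q prime
g = 1
while g == 1:
    a = random.randrange(2, p)       # a <- Z*_p
    g = pow(a, 2, p)                 # g = a^2 != 1 generates the order-q subgroup

x = random.randrange(q)              # secret exponent x <- Z_q
y = pow(g, x, p)                     # forward direction is easy
# A generic adversary is left searching the subgroup for the exponent:
assert next(c for c in range(q) if pow(g, c, p) == y) == x
```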

View File

@@ -82,7 +82,7 @@ $$
#### Definition 27.2 Negligible function
A function $\epsilon(n)$ is negligible if for every $c$, there exists some $n_0$ such that for all $n>n_0$, $\epsilon (n)\leq \frac{1}{n^c}$.
#### Definition 27.3 Strong One-Way Function

View File

@@ -106,19 +106,19 @@ To avoid confusion with sets, we use $(p_n)_{n=1}^\infty$ or $(p_n)$
Let $(X,d)$ be a metric space. Let $(p_n)$ be a sequence in $X$.
Let $p\in X$. We say $(p_n)$ **converges** to $p$ if $\forall \epsilon>0,\exists N\in\mathbb{N}$ such that $\forall n\geq N$, $d(p_n,p)<\epsilon$. ($p_n\in B_\epsilon (p)$)
Notation: $\lim_{n\to \infty} p_n=p$, $p_n\to p$
We say $(p_n)$ converges if $\exists p\in X$ such that $p_n\to p$.
i.e. $\exists p\in X$ such that $\forall\epsilon>0,\exists N\in\mathbb{N}$ such that $\forall n\geq N,d(p_n,p)<\epsilon$
We say $(p_n)$ **diverges** if $(p_n)$ doesn't converge.
i.e. $\forall p\in X$, $p_n\not\to p$
i.e. $\forall p\in X$, $\exists \epsilon>0$ such that $\forall N\in\mathbb{N}$, $\exists n\geq N$ with $d(p_n,p)\geq\epsilon$
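As a quick sanity check of the divergence definition (an added example, not from the original notes): take $X=\mathbb{R}$, $s_n=(-1)^n$, and any candidate limit $p\in\mathbb{R}$. Choosing $\epsilon=1$ works, because for every $N$ one of $s_N,s_{N+1}$ lies at distance $\geq 1$ from $p$:
$$
|1-p|+|-1-p|\geq|(1-p)-(-1-p)|=2
$$
so $(s_n)$ diverges.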
#### Definition 3.2
@@ -128,16 +128,16 @@ Example:
$X=\mathbb{C}$, $s_n=\frac{1}{n}$
Then $s_n\to 0$ i.e. $\forall \epsilon>0, \exists N\in \mathbb{N}$ such that $\forall n\geq N$, $|s_n-0|<\epsilon$.
Proof:
Let $\epsilon >0$ (arbitrary)
Let $N\in \mathbb{N}$ be greater than $\frac{1}{\epsilon}$ (by the Archimedean property), e.g. $N=\lceil\frac{1}{\epsilon}\rceil+1$ (we choose $N$)
Let $n\geq N$ (arbitrary)
Then $|s_n-0|=\frac{1}{n}\leq \frac{1}{N}<\epsilon$
EOP