
CSE442T Introduction to Cryptography (Lecture 12)

Chapter 3: Indistinguishability and Pseudorandomness

\{X_n\} and \{Y_n\} are distinguishable by \mu(n) if there exists a distinguisher \mathcal{D} such that


|P[x\gets X_n:\mathcal{D}(x)=1]-P[y\gets Y_n:\mathcal{D}(y)=1]|\geq \mu(n)
  • If \mu(n)\geq \frac{1}{p(n)} for some polynomial p(n) and for infinitely many n, then \{X_n\} and \{Y_n\} are distinguishable.
  • Otherwise, they are indistinguishable: for every n.u.p.p.t. \mathcal{D}, the difference is at most a negligible \epsilon(n).
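
As a toy illustration (the sampler and distinguisher below are made up for this note, not part of the lecture), the advantage of a concrete distinguisher can be estimated by sampling:

```python
import random

def advantage(D, sample_X, sample_Y, trials=10_000):
    """Estimate |P[x <- X_n : D(x)=1] - P[y <- Y_n : D(y)=1]| by sampling."""
    hits_X = sum(D(sample_X()) for _ in range(trials))
    hits_Y = sum(D(sample_Y()) for _ in range(trials))
    return abs(hits_X - hits_Y) / trials

# Toy distributions: X_n is 8 uniform bits, Y_n is 8 bits biased toward 1.
sample_X = lambda: [random.randint(0, 1) for _ in range(8)]
sample_Y = lambda: [int(random.random() < 0.75) for _ in range(8)]
D = lambda t: int(sum(t) >= 6)              # outputs 1 when the string looks "heavy"

print(advantage(D, sample_X, sample_Y))     # clearly bounded away from 0
```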

Property: Closed under efficient procedures.

If M is any n.u.p.p.t. machine that takes a sample t from X_n or Y_n as input, write M(X_n) for the output distribution of M on a sample from X_n.

If \{X_n\}\approx\{Y_n\}, then \{M(X_n)\}\approx\{M(Y_n)\}.

Proof:

If \mathcal{D} distinguishes M(X_n) and M(Y_n) by \mu(n), then \mathcal{D}(M(\cdot)) is a polynomial-time distinguisher of X_n,Y_n with the same advantage \mu(n).
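
A minimal sketch of this reduction (function names are ours):

```python
def compose_distinguisher(D, M):
    """Closure under efficient procedures: D'(t) = D(M(t)).

    If D distinguishes M(X_n) from M(Y_n) with advantage mu(n), then D'
    distinguishes X_n from Y_n with the same advantage, and D' still runs
    in polynomial time because both D and M do."""
    return lambda t: D(M(t))
```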

Hybrid Lemma

Let X^0_n,X^1_n,\dots,X^m_n be ensembles, indexed from 0 to m.

If \mathcal{D} distinguishes X_n^0 and X_n^m by \mu(n), then \exists i,1\leq i\leq m, where X_{n}^{i-1} and X_n^i are distinguished by \mathcal{D} by at least \frac{\mu(n)}{m}

Proof: (we use the triangle inequality.) Let p_i=P[t\gets X_n^i:\mathcal{D}(t)=1],0\leq i\leq m. We have |p_0-p_m|\geq \mu(n).

Using the telescoping trick:


\begin{aligned}
|p_0-p_m|&=|p_0-p_1+p_1-p_2+\dots +p_{m-1}-p_m|\\
&\leq |p_0-p_1|+|p_1-p_2|+\dots+|p_{m-1}-p_m|\\
\end{aligned}

If all |p_{i-1}-p_i|<\frac{\mu(n)}{m}, then |p_0-p_m|<\mu(n), a contradiction.

In applications, the lemma is only useful if m\leq q(n) for some polynomial q(n).

If X^0_n and X^m_n are distinguishable by \frac{1}{p(n)}, then two adjacent "hybrids" are distinguishable by \frac{1}{p(n)q(n)}=\frac{1}{poly(n)}
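
The pigeonhole step can be seen numerically; the helper below (illustrative only) finds the adjacent pair with the largest gap, which the triangle inequality guarantees is at least |p_0-p_m|/m:

```python
def largest_adjacent_gap(ps):
    """Given acceptance probabilities p_0, ..., p_m of D on the hybrids,
    return (i, |p_{i-1} - p_i|) for the largest adjacent gap."""
    gaps = [abs(ps[i - 1] - ps[i]) for i in range(1, len(ps))]
    i = max(range(len(gaps)), key=lambda j: gaps[j]) + 1
    return i, gaps[i - 1]

# Toy check: m = 5 hybrids with |p_0 - p_5| = 0.5, so some gap is >= 0.1.
probs = [0.10, 0.18, 0.22, 0.40, 0.55, 0.60]
print(largest_adjacent_gap(probs))   # (4, ~0.18), indeed >= 0.5 / 5
```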

Example:

Take pictures of Brian's hair in Week 0 through Week 50, and let a distinguisher \mathcal{D} output 1 if the hair looks "long". Suppose \mathcal{D} distinguishes the Week 0 and Week 50 pictures with advantage 1.

By the hybrid lemma, there is some week i,1\leq i\leq 50, with |p_{i-1}-p_i|\geq \frac{1}{50}=0.02

By the prediction lemma, there is a machine \mathcal{A} that can guess which of the two adjacent weeks a picture came from:


P[b\gets \{0,1\};pic\gets X^{i-1+b}:\mathcal{A}(pic)=b]\geq \frac{1}{2}+\frac{0.02}{2}=0.51
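
A sketch of how the prediction lemma turns the distinguisher into such a guesser (assuming \mathcal{D} accepts the later week more often; otherwise flip the output):

```python
def make_guesser(D):
    """Guess which of the two adjacent weeks a picture came from.

    Answering b = D(pic) is correct with probability
    1/2 + (p_i - p_{i-1})/2, e.g. 1/2 + 0.02/2 = 0.51 in the example above."""
    return lambda pic: D(pic)   # say "the later week" exactly when D accepts
```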

Next bit test (NBT)

We say \{X_n\} on \{0,1\}^{l(n)} passes the next bit test if \forall i\in\{0,1,...,l(n)-1\} and for all n.u.p.p.t. adversaries \mathcal{A}: P[t\gets X_n:\mathcal{A}(t_1,t_2,...,t_i)=t_{i+1}]\leq \frac{1}{2}+\epsilon(n) for some negligible \epsilon(n) (given the first i bits, the probability of successfully predicting the (i+1)-th bit is essentially the random guess \frac{1}{2})

Note that for any \mathcal{A}, and any i,


P[t\gets U_{l(n)}:\mathcal{A}(t_1,...t_i)=t_{i+1}]=\frac{1}{2}

If \{X_n\}\approx\{U_{l(n)}\} (pseudorandom), then X_n must pass NBT for all i.

Otherwise (if \{X_n\} fails the NBT), \exists \mathcal{A},i and a polynomial p(n) such that for infinitely many n,


P[t\gets X_n:\mathcal{A}(t_1,t_2,...,t_i)=t_{i+1}]\geq \frac{1}{2}+\frac{1}{p(n)}

We can build a distinguisher \mathcal{D} from \mathcal{A}.
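
A sketch of that reduction, assuming the predictor \mathcal{A} takes the first i bits and returns a guess for the next one:

```python
def distinguisher_from_predictor(A, i):
    """Turn a next-bit predictor A for position i+1 into a distinguisher D.

    D outputs 1 iff A's guess from the first i bits matches bit i+1.
    On uniform strings this happens with probability exactly 1/2; on X_n it
    happens with probability >= 1/2 + 1/p(n), so D has advantage >= 1/p(n)."""
    def D(t):                       # t is a bit string t_1 ... t_{l(n)}
        guess = A(t[:i])            # the predictor only sees the first i bits
        return int(guess == t[i])   # compare with t_{i+1} (0-indexed as t[i])
    return D
```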

The converse is true!

The NBT (next bit test) is complete.

If \{X_n\} on \{0,1\}^{l(n)} passes NBT, then it's pseudorandom.

Idea of proof

The full proof is in the text.

The idea is to create hybrids H^0_n,\dots,H^{l(n)}_n with H^0_n=\{U_{l(n)}\} and H^{l(n)}_n=\{X_n\}

We construct the hybrid bit strings:


H_n^i=\{x\gets X_n;u\gets U_{l(n)};t=x_1x_2\dots x_i u_{i+1}u_{i+2}\dots u_{l(n)}\}
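
A sketch of sampling from H_n^i (here sample_X is an assumed sampler returning an l(n)-bit sample of X_n as a list of bits):

```python
import random

def sample_hybrid(i, sample_X, length):
    """Sample from H_n^i: the first i bits come from x <- X_n, the remaining
    length - i bits are fresh uniform bits.  H^0 is U_{l(n)}, H^{l(n)} is X_n."""
    x = sample_X()                                      # x <- X_n
    u = [random.randint(0, 1) for _ in range(length)]   # u <- U_{l(n)}
    return x[:i] + u[i:]
```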

If \{X_n\} were not pseudorandom, there would be a distinguisher \mathcal{D} with


|P[x\gets X_n:\mathcal{D}(x)=1]-P[u\gets U_{l(n)}:\mathcal{D}(u)=1]|=\mu(n)\geq \frac{1}{p(n)}

By the hybrid lemma, there is an i,1\leq i\leq l(n), where:


|P[t\gets H^{i-1}:\mathcal{D}(t)=1]-P[t\gets H^i:\mathcal{D}(t)=1]|\geq \frac{1}{p(n)l(n)}=\frac{1}{poly(n)}

l(n) is the number of hybrid steps needed to transform U_{l(n)} into X_n.

Write out the two adjacent hybrids:


H^{i-1}=x_1\dots x_{i-1}u_{i}u_{i+1}\dots u_{l(n)}\\
H^{i}=x_1\dots x_{i-1}x_{i}u_{i+1}\dots u_{l(n)}

Notice that the two hybrids differ in only one position, bit i.

\mathcal{D} can distinguish x_{i} from a truly random bit, knowing that the first i-1 bits x_1\dots x_{i-1} came from x\gets X_n.

So \mathcal{D} can be used to predict x_{i} from x_1\dots x_{i-1} (contradicting that \{X_n\} passes the NBT).
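
A sketch of the next-bit predictor built from \mathcal{D} (assuming \mathcal{D} accepts H^i more often than H^{i-1}; flip the output otherwise):

```python
import random

def predictor_from_distinguisher(D, i, length):
    """Predict bit i of t <- X_n from the first i-1 bits, using a
    distinguisher D of the adjacent hybrids H^{i-1} and H^i."""
    def A(prefix):                                      # prefix = t_1 ... t_{i-1}
        g = random.randint(0, 1)                        # candidate for t_i
        tail = [random.randint(0, 1) for _ in range(length - i)]
        t = list(prefix) + [g] + tail                   # distributed as H^{i-1}; as H^i when g = t_i
        return g if D(t) == 1 else 1 - g                # keep the guess exactly when D accepts
    return A
```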

Pseudorandom Generator

A function G:\{0,1\}^*\to\{0,1\}^* is a pseudorandom generator if the following hold:

  1. G is efficiently computable.
  2. |G(x)|>|x|\ \forall x (expansion)
  3. \{x\gets U_n:G(x)\}_n is pseudorandom

n truly random bits \to n^2 pseudorandom bits
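
To make the interface concrete, here is a heuristic stand-in with stretch 2 (it uses SHA-256 in counter mode; this is only an illustration of the syntax of a PRG, not a construction with a proof of pseudorandomness):

```python
import hashlib

def G(seed: bytes) -> bytes:
    """Expand |seed| bytes to 2*|seed| bytes: efficient and expanding, so it has
    the right shape; pseudorandomness is only a heuristic assumption here."""
    out = b""
    counter = 0
    while len(out) < 2 * len(seed):
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:2 * len(seed)]

seed = b"\x01\x02\x03\x04\x05\x06\x07\x08"
assert len(G(seed)) == 2 * len(seed)        # expansion: |G(x)| > |x|
```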

PRG exists if and only if one-way function exists

The other direction of the proof will be your homework, damn.

If one-way functions exist, then pseudorandom generators exist.

Idea of proof:

Let f:\{0,1\}^n\to \{0,1\}^n be a strong one-way permutation (bijection).

Sample x\gets U_n.

First attempt: output f(x)||x.

This fails: not all bits of x are hard to predict from f(x).

Hard-core bit: one bit of information about x which is hard to determine from f(x); any efficient predictor succeeds with probability P[\text{success}]\leq \frac{1}{2}+\epsilon(n)

Which bit of information is hard-core depends on the particular function f.
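
The construction this is building toward has the shape G(x)=f(x)||b(x), where b is a hard-core bit for f; the sketch below only shows that shape, with placeholder f and b that are neither one-way nor hard-core:

```python
def make_prg(f, b):
    """Shape of the OWP-based construction: G(x) = f(x) || b(x).
    With f a one-way permutation on n bits and b a hard-core bit for f,
    the output has n + 1 bits, so G expands by one bit."""
    def G(x):                                  # x is a list of n bits
        return list(f(x)) + [b(x)]
    return G

# Placeholder instantiations, for shape only -- NOT one-way and NOT hard-core.
toy_f = lambda x: x[::-1]                      # some permutation of bit strings
toy_b = lambda x: sum(x) % 2                   # some single bit of information about x
G = make_prg(toy_f, toy_b)
print(G([0, 1, 1, 1]))                         # 5 bits out from 4 bits in
```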