# Math 401 Paper 1, Side note 3: Levy's concentration theorem

## Basic definitions

### Lipschitz function

#### $\eta$-Lipschitz function

Let $(X,\operatorname{dist}_X)$ and $(Y,\operatorname{dist}_Y)$ be two metric spaces. A function $f:X\to Y$ is said to be $\eta$-Lipschitz if
$$
\operatorname{dist}_Y(f(x),f(y))\leq \eta\operatorname{dist}_X(x,y)
$$
for all $x,y\in X$. The Lipschitz norm of $f$ is the smallest such constant,
$$
\|f\|_{\operatorname{Lip}}\coloneqq\inf\{L\geq 0: \operatorname{dist}_Y(f(x),f(y))\leq L\operatorname{dist}_X(x,y)\ \text{for all }x,y\in X\},
$$
so $f$ is $\eta$-Lipschitz exactly when $\|f\|_{\operatorname{Lip}}\leq\eta$. Intuitively, $f$ may not stretch the distance between any pair of points of $X$ by more than a factor of $\eta$.

## Levy's concentration theorem in _High-dimensional probability_ by Roman Vershynin

### Levy's concentration theorem (Vershynin's version)

> This theorem is exactly Theorem 5.1.4 in _High-dimensional probability_ by Roman Vershynin.

In the form proved below: let $X$ be uniformly distributed on the sphere $\sqrt{n}S^n$ and let $f:\sqrt{n}S^n\to \mathbb{R}$ be $\eta$-Lipschitz. Then for every $t\geq 0$,
$$
\operatorname{Pr}[|f(X)-M|\geq t]\leq 4\exp\left(-\frac{ct^2}{\eta^2}\right),
$$
where $M$ is a median of $f(X)$ and $c>0$ is an absolute constant.

#### Isoperimetric inequality on $\mathbb{R}^n$

Among all subsets $A\subset \mathbb{R}^n$ with a given volume, the Euclidean ball has the smallest surface area. Equivalently, for any $\epsilon>0$, Euclidean balls minimize the volume of the $\epsilon$-neighborhood of $A$, where the $\epsilon$-neighborhood of $A$ is defined as
$$
A_\epsilon\coloneqq \{x\in \mathbb{R}^n: \exists y\in A, \|x-y\|_2\leq \epsilon\}=A+\epsilon B_2^n.
$$
Here $\|\cdot\|_2$ is the Euclidean norm and $B_2^n$ is the unit Euclidean ball. (The analogous statement holds for the geodesic metric on the sphere as well as for the Euclidean metric on $\mathbb{R}^n$.)

#### Isoperimetric inequality on the sphere

Let $\sigma_n(A)$ denote the normalized area of a subset $A$ of the $n$-dimensional sphere $S^n$, that is, $\sigma_n(A)\coloneqq\frac{\operatorname{Area}(A)}{\operatorname{Area}(S^n)}$. Let $\epsilon>0$. Then among all subsets $A\subset S^n$ with a given area $\sigma_n(A)$, spherical caps minimize the area of the $\epsilon$-neighborhood of $A$.

The above two inequalities are not proved in the book _High-dimensional probability_. To continue the proof of the theorem, we use sub-Gaussian concentration *(Chapter 3 of _High-dimensional probability_ by Roman Vershynin)* on the sphere $\sqrt{n}S^n$. This leads to an absolute constant $c>0$ such that the following lemma holds:

#### The "Blow-up" lemma

Let $A$ be a subset of the sphere $\sqrt{n}S^n$, and let $\sigma$ denote the normalized area measure on $\sqrt{n}S^n$. If $\sigma(A)\geq \frac{1}{2}$, then for every $t\geq 0$,
$$
\sigma(A_t)\geq 1-2\exp(-ct^2),
$$
where $A_t=\{x\in \sqrt{n}S^n: \operatorname{dist}(x,A)\leq t\}$ and $c>0$ is an absolute constant.

#### Proof of Levy's concentration theorem

Proof: Without loss of generality, we can assume that $\eta=\|f\|_{\operatorname{Lip}}=1$. Let $M$ denote a median of $f(X)$, so that $\operatorname{Pr}[f(X)\leq M]\geq \frac{1}{2}$ and $\operatorname{Pr}[f(X)\geq M]\geq \frac{1}{2}$. Consider the sub-level set
$$
A\coloneqq \{x\in \sqrt{n}S^n: f(x)\leq M\}.
$$
Since $\operatorname{Pr}[X\in A]\geq \frac{1}{2}$, the blow-up lemma gives
$$
\operatorname{Pr}[X\in A_t]\geq 1-2\exp(-ct^2).
$$
Moreover, if $x\in A_t$ then there is some $y\in A$ with $\operatorname{dist}(x,y)\leq t$, and since $f$ is 1-Lipschitz, $f(x)\leq f(y)+t\leq M+t$. Hence
$$
\operatorname{Pr}[X\in A_t]\leq \operatorname{Pr}[f(X)\leq M+t].
$$
Combining the above two inequalities, we have
$$
\operatorname{Pr}[f(X)\leq M+t]\geq 1-2\exp(-ct^2).
$$
Applying the same argument to $-f$ (a median of $-f(X)$ is $-M$) gives $\operatorname{Pr}[f(X)\geq M-t]\geq 1-2\exp(-ct^2)$, and a union bound yields the two-sided concentration $\operatorname{Pr}[|f(X)-M|\leq t]\geq 1-4\exp(-ct^2)$.
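As a quick empirical illustration of the theorem just proved (a minimal sketch of my own, not taken from Vershynin's book; it assumes `numpy` and uses the coordinate function $f(x)=x_1$, which is 1-Lipschitz, as the test function), one can sample uniform points on $\sqrt{n}S^n$ and check that the tail $\operatorname{Pr}[|f(X)-M|\geq t]$ decays at a rate that does not depend on $n$:

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_on_sphere(num_samples, n, radius):
    """Uniform samples on the n-dimensional sphere of the given radius in R^{n+1},
    obtained by normalizing standard Gaussian vectors and rescaling."""
    g = rng.standard_normal((num_samples, n + 1))
    return radius * g / np.linalg.norm(g, axis=1, keepdims=True)

for n in [20, 200]:
    X = uniform_on_sphere(50_000, n, np.sqrt(n))  # X ~ Unif(sqrt(n) * S^n)
    f = X[:, 0]                                   # f(x) = x_1 is 1-Lipschitz
    M = np.median(f)                              # empirical median of f(X)
    for t in [1.0, 2.0, 3.0]:
        tail = np.mean(np.abs(f - M) >= t)
        # The theorem promises a bound C*exp(-c*t^2) with absolute constants;
        # 2*exp(-t^2/2) is printed only as a reference curve, not the exact bound.
        print(f"n = {n:4d}  t = {t:3.1f}  empirical tail = {tail:.4f}  "
              f"2*exp(-t^2/2) = {2 * np.exp(-t**2 / 2):.4f}")
```

The tails for $n=20$ and $n=200$ should look essentially the same, which is the point of scaling the sphere to radius $\sqrt{n}$.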
## Levy's concentration theorem in _Metric Structures for Riemannian and Non-Riemannian Spaces_ by M. Gromov

### Levy's concentration theorem (Gromov's version)

> The Levy lemma can also be found in _Metric Structures for Riemannian and Non-Riemannian Spaces_ by M. Gromov, as $3\frac{1}{2}.19$, the Levy concentration theorem.

#### Theorem $3\frac{1}{2}.19$ (Levy concentration theorem)

An arbitrary 1-Lipschitz function $f:S^n\to \mathbb{R}$ concentrates near a single value $a_0\in \mathbb{R}$ as strongly as the distance function does. That is,
$$
\mu\{x\in S^n: |f(x)-a_0|\geq\epsilon\} < \kappa_n(\epsilon)\leq 2\exp\left(-\frac{(n-1)\epsilon^2}{2}\right),
$$
where
$$
\kappa_n(\epsilon)=\frac{\int_\epsilon^{\frac{\pi}{2}}\cos^{n-1}(t)\,dt}{\int_0^{\frac{\pi}{2}}\cos^{n-1}(t)\,dt}.
$$
Here $a_0$ is the **Levy mean** of the function $f$: the level set $f^{-1}(a_0)$ divides the sphere into two halves of measure at least $\frac{1}{2}$ each, characterized by the following inequalities:
$$
\mu(f^{-1}((-\infty,a_0]))\geq \frac{1}{2} \quad\text{and}\quad \mu(f^{-1}([a_0,\infty)))\geq \frac{1}{2}.
$$
A direct (if tedious) computation gives the bound $\kappa_n(\epsilon)\leq 2\exp(-\frac{(n-1)\epsilon^2}{2})$, using $\cos t\leq e^{-t^2/2}$ on $[0,\frac{\pi}{2}]$, a Gaussian tail estimate for the numerator, and the Wallis-type lower bound $\int_0^{\frac{\pi}{2}}\cos^{n-1}(t)\,dt\geq\sqrt{\frac{\pi}{2n}}$ for the denominator, but M. Gromov does not spell out the details here.

> A detailed proof is given by Takashi Shioya.
>
> The central idea is to draw the connection between the three topological spaces $S^{2n+1}$, $CP^n$ and $\mathbb{R}$.

First, we need to introduce the following distribution and supporting lemmas/theorems.

**OBSERVATION** Consider the orthogonal projection from $\mathbb{R}^{n+1}$, the space in which $S^n(\sqrt{n})$ is embedded, onto $\mathbb{R}^k$; we denote the restriction of this projection to the sphere by $\pi_{n,k}:S^n(\sqrt{n})\to \mathbb{R}^k$. Note that $\pi_{n,k}$ is a 1-Lipschitz function (an orthogonal projection never increases the distance between two points). We denote the normalized Riemannian volume measure on $S^n(\sqrt{n})$ by $\sigma^n(\cdot)$, so that $\sigma^n(S^n(\sqrt{n}))=1$.

#### Definition of Gaussian measure on $\mathbb{R}^k$

We denote the standard Gaussian measure on $\mathbb{R}^k$ by $\gamma^k$:
$$
d\gamma^k(x)\coloneqq\frac{1}{(2\pi)^{\frac{k}{2}}}\exp\left(-\frac{1}{2}\|x\|^2\right)dx,
$$
where $x\in \mathbb{R}^k$, $\|x\|^2=\sum_{i=1}^k x_i^2$ is the squared Euclidean norm, and $dx$ is the Lebesgue measure on $\mathbb{R}^k$. In other words, $\gamma^k$ is the law of a standard Gaussian random vector in $\mathbb{R}^k$: a probability measure with mean $0$ and identity covariance, so each coordinate has standard deviation $1$.

#### Maxwell-Boltzmann distribution law

> It is a wonderful fact to me that the projection of the $n$-dimensional sphere of radius $\sqrt{n}$ (sitting in $\mathbb{R}^{n+1}$) onto $\mathbb{R}^k$ becomes the Gaussian distribution as $n\to \infty$.

For any natural number $k$,
$$
\frac{d(\pi_{n,k})_*\sigma^n(x)}{dx}\to \frac{d\gamma^k(x)}{dx},
$$
where $(\pi_{n,k})_*\sigma^n$ is the push-forward measure of $\sigma^n$ under $\pi_{n,k}$. In other words,
$$
(\pi_{n,k})_*\sigma^n\to \gamma^k\text{ weakly as }n\to \infty.
$$
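Before turning to the proof, here is a quick numerical sanity check of the Maxwell-Boltzmann law (again a minimal sketch of my own, assuming `numpy`; it is not part of Shioya's argument): project uniform samples on $S^n(\sqrt{n})$ onto the first $k$ coordinates and compare a few moments with those of $\gamma^k$.

```python
import numpy as np

rng = np.random.default_rng(1)

def uniform_on_sphere(num_samples, n, radius):
    """Uniform samples on S^n(radius) inside R^{n+1}."""
    g = rng.standard_normal((num_samples, n + 1))
    return radius * g / np.linalg.norm(g, axis=1, keepdims=True)

k = 3
for n in [10, 100, 1000]:
    X = uniform_on_sphere(10_000, n, np.sqrt(n))
    Y = X[:, :k]  # samples from the push-forward measure (pi_{n,k})_* sigma^n
    # Under gamma^k the coordinates are independent N(0,1):
    # variance 1, fourth moment 3, zero covariance.
    print(f"n = {n:5d}  var(y_1) = {Y[:, 0].var():.3f}  "
          f"E[y_1^4] = {np.mean(Y[:, 0] ** 4):.3f}  "
          f"cov(y_1, y_2) = {np.cov(Y[:, 0], Y[:, 1])[0, 1]:+.3f}")
```

The exact variance of a single coordinate is $\frac{n}{n+1}$ and the exact fourth moment is $\frac{3n^2}{(n+1)(n+3)}$, so both columns drift toward the Gaussian values $1$ and $3$ as $n$ grows.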
Proof: We denote by $\operatorname{vol}_m$ the $m$-dimensional volume measure. Observe that for any $x\in \mathbb{R}^k$ with $\|x\|\leq \sqrt{n}$, the fiber $\pi_{n,k}^{-1}(x)$ is isometric to $S^{n-k}(\sqrt{n-\|x\|^2})$, that is, a sphere of radius $\sqrt{n-\|x\|^2}$ (by the definition of $\pi_{n,k}$). Since $\operatorname{vol}_{n-k}(S^{n-k}(r))$ is proportional to $r^{n-k}$ and the push-forward density integrates to $1$, we get
$$
\begin{aligned}
\frac{d(\pi_{n,k})_*\sigma^n(x)}{dx}&=\frac{\operatorname{vol}_{n-k}(\pi_{n,k}^{-1}(x))}{\operatorname{vol}_n(S^n(\sqrt{n}))}\\
&=\frac{(n-\|x\|^2)^{\frac{n-k}{2}}}{\int_{\|y\|\leq \sqrt{n}}(n-\|y\|^2)^{\frac{n-k}{2}}\,dy}.\\
\end{aligned}
$$
As $n\to \infty$, write
$$
(n-\|x\|^2)^{\frac{n-k}{2}}=n^{\frac{n-k}{2}}\left(1-\frac{\|x\|^2}{n}\right)^{\frac{n-k}{2}},\qquad \left(1-\frac{\|x\|^2}{n}\right)^{\frac{n-k}{2}}\to \exp\left(-\frac{\|x\|^2}{2}\right).
$$
The factor $n^{\frac{n-k}{2}}$ cancels between numerator and denominator, and by dominated convergence the remaining integral in the denominator converges to $\int_{\mathbb{R}^k}\exp(-\frac{\|y\|^2}{2})\,dy=(2\pi)^{\frac{k}{2}}$. Hence the density converges pointwise to $\frac{1}{(2\pi)^{k/2}}\exp(-\frac{\|x\|^2}{2})=\frac{d\gamma^k(x)}{dx}$, which gives the claimed weak convergence.
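As a concrete check of the limit just computed, the case $k=1$ can be done in closed form (my own computation, using the exponent $\frac{n-k}{2}$ from the display above). Substituting $t=\sqrt{n}\,u$ in the normalizing integral gives
$$
\int_{-\sqrt{n}}^{\sqrt{n}}(n-t^2)^{\frac{n-1}{2}}\,dt=n^{\frac{n}{2}}\int_{-1}^{1}(1-u^2)^{\frac{n-1}{2}}\,du=n^{\frac{n}{2}}\,B\!\left(\tfrac{1}{2},\tfrac{n+1}{2}\right),
$$
so the density of a single projected coordinate is
$$
\frac{(n-x^2)^{\frac{n-1}{2}}}{n^{\frac{n}{2}}\,B\!\left(\frac{1}{2},\frac{n+1}{2}\right)}=\frac{\left(1-\frac{x^2}{n}\right)^{\frac{n-1}{2}}}{\sqrt{n}\,B\!\left(\frac{1}{2},\frac{n+1}{2}\right)}\to \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right)=\frac{d\gamma^1(x)}{dx},
$$
since $\sqrt{n}\,B\!\left(\frac{1}{2},\frac{n+1}{2}\right)=\sqrt{\pi n}\,\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{n}{2}+1)}\to \sqrt{2\pi}$ as $n\to \infty$.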
## References - [High-dimensional probability by Roman Vershynin](https://www.math.uci.edu/~rvershyn/papers/HDP-book/HDP-2.pdf) - [Metric Structures for Riemannian and Non-Riemannian Spaces by M. Gromov](https://www.amazon.com/Structures-Riemannian-Non-Riemannian-Progress-Mathematics/dp/0817638989/ref=tmm_hrd_swatch_0?_encoding=UTF8&dib_tag=se&dib=eyJ2IjoiMSJ9.Tp8dXvGbTj_D53OXtGj_qOdqgCgbP8GKwz4XaA1xA5PGjHj071QN20LucGBJIEps.9xhBE0WNB0cpMfODY5Qbc3gzuqHnRmq6WZI_NnIJTvc&qid=1750973893&sr=8-1) - [Metric Measure Geometry by Takashi Shioya](https://arxiv.org/pdf/1410.0428)