# The law of the iterated logarithm

Jordan Bell
June 3, 2014

There are surprisingly few books that have complete proofs of the law of the iterated logarithm. It is like many theorems that are hard to prove: what is often given is a squalid sketch that might serve the author as a memory aid but is unlikely to help someone seeing the result for the first time, especially when the result is not even stated with the meticulous precision a newcomer needs. Indeed, what a serious beginner needs most are as many details as can be given, rather than a presentation that the master imagines gets to the point.

Aside from some books making assumptions that are not needed for the result to be true, such as finite third moments, most books present the Hartman-Wintner law of the iterated logarithm, which concerns the limit superior and limit inferior of a sequence. There is a version due to Strassen that makes a single assertion about the set of limit points of a sequence, rather than merely its limit inferior and limit superior. For a sequence of real numbers $x_{n}$, we denote the set of limit points of the sequence by

 $H(x_{n})$

Then,

 $\liminf_{n\to\infty}x_{n}=\inf H(x_{n}),\qquad\limsup_{n\to\infty}x_{n}=\sup H(x_{n}).$
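For instance (an illustration not in the original), take $x_{n}=(-1)^{n}\left(1+\frac{1}{n}\right)$; the even-indexed terms converge to $1$ and the odd-indexed terms to $-1$, so

```latex
H(x_{n})=\{-1,1\},\qquad
\liminf_{n\to\infty}x_{n}=-1=\inf H(x_{n}),\qquad
\limsup_{n\to\infty}x_{n}=1=\sup H(x_{n}).
```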

The following is the statement due to Strassen of the law of the iterated logarithm, given in Bauer (Heinz Bauer, *Probability Theory*, p. 283, Chapter VII, Theorem 33.1). The proof indeed involves a lot of machinery, but the machinery is laid out cleanly in Bauer's presentation. We write

 $L(x)=\begin{cases}1&\log x\leq e\\ \log\log x&\log x>e.\end{cases}$

###### Theorem 1 (Law of the iterated logarithm).

Suppose that $X_{n}:(\Omega,\mathscr{F},P)\to\mathbb{R}$ is a sequence of independent identically distributed $L^{2}$ random variables with $E(X_{1})=0$ and $\sigma^{2}=\mathrm{Var}(X_{1})=E(X_{1}^{2})$. For $P$-almost all $\omega\in\Omega$,

 $H\left(\frac{S_{n}(\omega)}{\sqrt{2nL(n)}}\right)=[-\sigma,\sigma],$

where $S_{n}=\sum_{k=1}^{n}X_{k}$.
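As a numerical sanity check (not part of Bauer's argument), the following sketch simulates the theorem for Rademacher variables, i.e. assuming $X_{k}=\pm 1$ each with probability $1/2$, so that $E(X_{1})=0$ and $\sigma=1$; the running values of $S_{n}/\sqrt{2nL(n)}$ should then, for large $n$, have their extremes near $\pm 1$.

```python
import math
import random

def L(x):
    # The normalizing function from the text: L(x) = 1 when log x <= e,
    # and L(x) = log log x when log x > e (continuous at x = e^e).
    if x <= 0 or math.log(x) <= math.e:
        return 1.0
    return math.log(math.log(x))

# Rademacher variables: P(X = 1) = P(X = -1) = 1/2, so sigma = 1.
# The theorem predicts the limit points of S_n / sqrt(2 n L(n))
# fill the interval [-sigma, sigma] = [-1, 1] almost surely.
random.seed(0)
n_max = 100_000
s = 0.0
scaled = []
for n in range(1, n_max + 1):
    s += random.choice((-1.0, 1.0))
    scaled.append(s / math.sqrt(2 * n * L(n)))

# Running extremes of the scaled partial sums over the tail of the run.
print(max(scaled[10_000:]), min(scaled[10_000:]))
```

Because $\log\log n$ grows extremely slowly, at any feasible $n$ the observed extremes only roughly approach $\pm\sigma$; the simulation illustrates the normalization rather than exhibiting the limit.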