Kolmogorov's three-series theorem

In probability theory, Kolmogorov's three-series theorem, named after Andrey Kolmogorov, gives a criterion for the almost sure convergence of an infinite series of random variables in terms of the convergence of three different series involving properties of their probability distributions. Kolmogorov's three-series theorem, combined with Kronecker's lemma, can be used to give a relatively easy proof of the strong law of large numbers.[1]

Statement of the theorem

Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables. The random series $\sum_{n=1}^{\infty} X_n$ converges almost surely in $\mathbb{R}$ if the following conditions hold for some $A > 0$, and only if the following conditions hold for every $A > 0$:

  1. $\sum_{n=1}^{\infty} \mathbb{P}(|X_n| > A)$ converges.
  2. Let $Y_n = X_n \mathbf{1}_{\{|X_n| \leq A\}}$. Then $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$, the series of expected values of $Y_n$, converges.
  3. $\sum_{n=1}^{\infty} \operatorname{Var}(Y_n)$ converges, where $Y_n$ is defined as in the second condition.
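
As an informal numerical illustration (not part of the theorem itself), the partial sums of the three series can be estimated by Monte Carlo for a concrete sequence of independent variables. In the following Python sketch, the helper name three_series_partial_sums, the truncation level $A = 1$, and the cutoffs N and trials are all arbitrary choices made for the demonstration:

```python
import random

def three_series_partial_sums(sample_X, A=1.0, N=500, trials=2000, rng=None):
    """Monte Carlo estimates of the partial sums of the three series up to N.

    sample_X(n, rng) draws one realization of the independent variable X_n.
    Returns estimates of sum P(|X_n| > A), sum E[Y_n], and sum Var(Y_n),
    where Y_n = X_n * 1{|X_n| <= A} is the truncated variable.
    """
    rng = rng or random.Random(0)
    s_tail = s_mean = s_var = 0.0
    for n in range(1, N + 1):
        draws = [sample_X(n, rng) for _ in range(trials)]
        kept = [x for x in draws if abs(x) <= A]  # draws where Y_n = X_n; elsewhere Y_n = 0
        p_tail = 1.0 - len(kept) / trials                        # estimate of P(|X_n| > A)
        mean_y = sum(kept) / trials                              # estimate of E[Y_n]
        var_y = sum(x * x for x in kept) / trials - mean_y ** 2  # estimate of Var(Y_n)
        s_tail += p_tail
        s_mean += mean_y
        s_var += var_y
    return s_tail, s_mean, s_var

# Random-sign harmonic terms X_n = ±1/n (see the Example section below):
# all three partial sums stay bounded, consistent with almost sure convergence.
print(three_series_partial_sums(lambda n, rng: rng.choice((-1.0, 1.0)) / n))
```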

Proof

Sufficiency of conditions ("if")

Condition (i) and the Borel–Cantelli lemma give that $X_n = Y_n$ for all sufficiently large $n$, almost surely. Hence $\sum_{n=1}^{\infty} X_n$ converges if and only if $\sum_{n=1}^{\infty} Y_n$ converges. Conditions (ii)–(iii) and Kolmogorov's two-series theorem give the almost sure convergence of $\sum_{n=1}^{\infty} Y_n$.
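
In more detail, since $\{X_n \neq Y_n\} = \{|X_n| > A\}$, condition (i) says exactly that

$$\sum_{n=1}^{\infty} \mathbb{P}(X_n \neq Y_n) = \sum_{n=1}^{\infty} \mathbb{P}(|X_n| > A) < \infty,$$

so by the Borel–Cantelli lemma $\mathbb{P}(X_n \neq Y_n \text{ for infinitely many } n) = 0$; the two series differ in only finitely many terms almost surely.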

Necessity of conditions ("only if")

Suppose that $\sum_{n=1}^{\infty} X_n$ converges almost surely.

Without condition (i), the second Borel–Cantelli lemma (using the independence of the $X_n$) would give some $A > 0$ such that $|X_n| > A$ for infinitely many $n$, almost surely. But then the terms $X_n$ would not tend to zero, and the series would diverge. Therefore, we must have condition (i).

We see that condition (iii) implies condition (ii): Kolmogorov's two-series theorem, along with condition (i) applied to the case $A = 1$, gives the convergence of $\sum_{n=1}^{\infty} (Y_n - \mathbb{E}[Y_n])$. So given the convergence of $\sum_{n=1}^{\infty} Y_n$, we have that $\sum_{n=1}^{\infty} \mathbb{E}[Y_n]$ converges, so condition (ii) is implied.

Thus, it only remains to demonstrate the necessity of condition (iii), and we will have obtained the full result. It is equivalent to check condition (iii) for the symmetrized series $\sum_{n=1}^{\infty} Z_n$ where for each $n$, $Z_n = Y_n - Y_n'$ and $Y_n, Y_n'$ are IID—that is, to employ the assumption that $\mathbb{E}[Z_n] = 0$, since $(Z_n)$ is a sequence of random variables bounded by 2, converging almost surely, and with $\operatorname{Var}(Z_n) = 2\operatorname{Var}(Y_n)$. So we wish to check that if $\sum_{n=1}^{\infty} Z_n$ converges, then $\sum_{n=1}^{\infty} \operatorname{Var}(Z_n)$ converges as well. This is a special case of a more general result from martingale theory with summands equal to the increments of a martingale sequence and the same conditions ($\mathbb{E}[Z_n] = 0$; the series of the variances is converging; and the summands are bounded).[2][3][4]
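
Explicitly, the symmetrization identities used above follow from the independence of $Y_n$ and $Y_n'$ and the fact that they share a distribution:

$$\mathbb{E}[Z_n] = \mathbb{E}[Y_n] - \mathbb{E}[Y_n'] = 0, \qquad \operatorname{Var}(Z_n) = \operatorname{Var}(Y_n) + \operatorname{Var}(Y_n') = 2\operatorname{Var}(Y_n),$$

so condition (iii) for $(Y_n)$ holds if and only if it holds for $(Z_n)$.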

Example

As an illustration of the theorem, consider the example of the harmonic series with random signs:

$$\sum_{n=1}^{\infty} \pm \frac{1}{n}.$$

Here, "$\pm$" means that each term $\frac{1}{n}$ is taken with a random sign that is either $+1$ or $-1$ with respective probabilities $\tfrac{1}{2}, \tfrac{1}{2}$, and all random signs are chosen independently. Let $X_n$ in the theorem denote a random variable that takes the values $\frac{1}{n}$ and $-\frac{1}{n}$ with equal probabilities. With $A = 1$ the summands of the first two series are identically zero and $\operatorname{Var}(Y_n) = \frac{1}{n^2}$. The conditions of the theorem are then satisfied, so it follows that the harmonic series with random signs converges almost surely. On the other hand, the analogous series of (for example) square-root reciprocals with random signs, namely

$$\sum_{n=1}^{\infty} \pm \frac{1}{\sqrt{n}},$$

diverges almost surely, since condition (3) in the theorem is not satisfied for any $A$: here $\operatorname{Var}(Y_n) = \frac{1}{n}$ for every $n$ large enough that $\frac{1}{\sqrt{n}} \leq A$, and $\sum \frac{1}{n}$ diverges. Note that this is different from the behavior of the analogous series with alternating signs, $\sum_{n=1}^{\infty} \frac{(-1)^n}{\sqrt{n}}$, which does converge.
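
A quick single-path simulation makes the contrast concrete (an illustrative sketch only; the seed and checkpoints are arbitrary choices): partial sums of the random-sign harmonic series settle near a limit, while those of the square-root version keep drifting, slowly, on the typical scale $\sqrt{\log n}$ set by the growing variance.

```python
import random

# One shared sample path of fair random signs drives both series.
rng = random.Random(0)
s_harmonic = 0.0  # partial sums of sum ±1/n        (converges a.s.)
s_sqrt = 0.0      # partial sums of sum ±1/sqrt(n)  (diverges a.s.)
for n in range(1, 10**5 + 1):
    sign = rng.choice((-1.0, 1.0))
    s_harmonic += sign / n
    s_sqrt += sign / n ** 0.5
    if n in (10**3, 10**4, 10**5):
        print(f"n = {n:>6}:  ±1/n partial sum = {s_harmonic:+.4f},  "
              f"±1/sqrt(n) partial sum = {s_sqrt:+.4f}")
```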

Notes

  1. ^ Durrett, Rick. Probability: Theory and Examples. Duxbury Advanced Series, 3rd ed., Thomson Brooks/Cole, 2005, Section 1.8, pp. 60–69.
  2. ^ Sun, Rongfeng. Lecture notes. http://www.math.nus.edu.sg/~matsr/ProbI/Lecture4.pdf Archived 2018-04-17 at the Wayback Machine.
  3. ^ Loève, M. Probability Theory. Princeton University Press, 1963, Section 16.3.
  4. ^ Feller, W. An Introduction to Probability Theory and Its Applications, Vol. 2. Wiley, 1971, Section IX.9.