5.11. Let $X_1, X_2, \dots$ be independent random variables with mean zero and finite variance. Put $S_n = X_1 + \cdots + X_n$.

(a) Let $\varepsilon > 0$ and set $A = \{\omega : \max_{1 \le k \le n} |S_k(\omega)| \ge \varepsilon\}$. Define
$$A_k = \{\omega : |S_j(\omega)| < \varepsilon \text{ for } j = 1, \dots, k-1,\ |S_k(\omega)| \ge \varepsilon\}, \qquad k = 1, \dots, n.$$
Show that the $A_k$ are disjoint and $A = \bigcup_{k=1}^n A_k$.

(b) Show that $S_n - S_k$ is independent of $S_k$.

(c) Show that $\int_{A_k} S_n^2 \, dP \ge \int_{A_k} S_k^2 \, dP$. (Hint: $S_n = S_k + (S_n - S_k)$.)

(d) Deduce that $E\{S_n^2\} \ge \sum_{k=1}^n \int_{A_k} S_k^2 \, dP \ge \varepsilon^2 P\{A\}$.

(e) Conclude Kolmogorov's Inequality:
$$P\Big\{\max_{1 \le k \le n} |S_k| \ge \varepsilon\Big\} \le \frac{1}{\varepsilon^2}\, E\{S_n^2\}. \qquad (5.4)$$

[Note that if $\max_{1 \le k \le n} |S_k|$ is replaced by $|S_n|$ itself, this is just Chebyshev's inequality. So Kolmogorov's inequality is a significant extension of Chebyshev's inequality. Inequalities like this will reappear in Chapter 9.]

5.12. (Convergence of Series.) Let $X_1, X_2, \dots$ be independent random variables with mean zero and finite variance. Put $S_n = X_1 + \cdots + X_n$. Show that if $\sum_{n=1}^\infty E\{X_n^2\} < \infty$, then $\sum_{n=1}^\infty X_n$ converges a.s. (Hint: Use Kolmogorov's inequality, Exercise 5.11.)
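As a guide to parts (c) and (d) of 5.11, here is a sketch of the expansion behind the hint, under the setup above. It uses a slight strengthening of (b): since $A_k$ and $S_k$ are determined by $X_1, \dots, X_k$, the difference $S_n - S_k = X_{k+1} + \cdots + X_n$ is independent of the pair $(\mathbf{1}_{A_k}, S_k)$. Expanding the square,
$$\int_{A_k} S_n^2 \, dP = \int_{A_k} S_k^2 \, dP + 2\, E\{\mathbf{1}_{A_k} S_k (S_n - S_k)\} + \int_{A_k} (S_n - S_k)^2 \, dP.$$
The cross term factors by independence as $E\{\mathbf{1}_{A_k} S_k\}\, E\{S_n - S_k\} = 0$, since $S_n - S_k$ has mean zero, and the last term is non-negative; this gives (c). Summing over the disjoint $A_k$ and using $S_k^2 \ge \varepsilon^2$ on $A_k$ then gives (d):
$$E\{S_n^2\} \ge \sum_{k=1}^n \int_{A_k} S_n^2 \, dP \ge \sum_{k=1}^n \int_{A_k} S_k^2 \, dP \ge \varepsilon^2 \sum_{k=1}^n P\{A_k\} = \varepsilon^2 P\{A\}.$$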
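For 5.12, one way the hint can be used (a sketch, not the only route) is to apply (5.4) to tail sums. For fixed $m$ and any $n > m$, the variables $X_{m+1}, X_{m+2}, \dots, X_n$ are independent with mean zero, so
$$P\Big\{\max_{m < k \le n} |S_k - S_m| \ge \varepsilon\Big\} \le \frac{1}{\varepsilon^2} \sum_{j=m+1}^{n} E\{X_j^2\}.$$
Letting $n \to \infty$ and then $m \to \infty$, the right-hand side tends to zero because $\sum_n E\{X_n^2\} < \infty$. Hence $P\{\sup_{k > m} |S_k - S_m| > \varepsilon\} \to 0$ as $m \to \infty$ for every $\varepsilon > 0$, so $(S_n)$ is a.s. a Cauchy sequence and $\sum_n X_n$ converges a.s.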