Friday, August 14, 2015

Kolmogorov's Three Series Theorem


Let $\{X_n\}$ be a sequence of independent random variables.  For fixed $A>0$, define $$Y_n(\omega)=\begin{cases}X_n(\omega),&\mbox{if }|X_n(\omega)|\leq A\\ 0,&\mbox{if }|X_n(\omega)|>A \end{cases}$$ Then $\sum_nX_n$ converges a.e. $\iff$ the following three (numerical) series all converge:
(1) $\sum_n\mathscr{P}\{|X_n|>A\}=\sum_n\mathscr{P}\{X_n\neq Y_n\}$;
(2) $\sum_n\mathscr{E}(Y_n)$;
(3) $\sum_n\sigma^2(Y_n)$.
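As a quick illustration (my own sanity check, not part of the theorem), take the random-signs series $X_n=\varepsilon_n/n$ with independent $\varepsilon_n=\pm1$ equiprobable and truncation level $A=1$: then $|X_n|\leq A$ always, so $Y_n=X_n$, series (1) and (2) vanish termwise, and series (3) is $\sum_n1/n^2<\infty$, so the theorem predicts a.e. convergence of $\sum_nX_n$. A short Python check:

```python
import math
import random

A = 1.0
N = 100_000

# For X_n = eps_n / n with independent eps_n = ±1 equiprobable and A = 1:
# |X_n| = 1/n <= A for every n >= 1, hence Y_n = X_n.
probs = sum(1.0 for n in range(1, N + 1) if 1.0 / n > A)   # series (1): every term is 0
means = sum(0.0 for n in range(1, N + 1))                  # series (2): E[Y_n] = 0
varis = sum(1.0 / n**2 for n in range(1, N + 1))           # series (3): partial sum of 1/n^2

# Series (3) converges (to pi^2/6); (1) and (2) are identically zero,
# so the theorem predicts that sum_n X_n converges a.e.
random.seed(1)
path = sum(random.choice((-1.0, 1.0)) / n for n in range(1, N + 1))
# path is one sample of the partial sum; |path| <= H_N < 13 deterministically.
```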

Application of the Three Series Theorem to Strong Convergence



$\bullet$ Proof.
We first introduce two inequalities that will be used in the proof.

[Theorem 1] A Chebyshev-type inequality for the maximal partial sum I (Kolmogorov's maximal inequality).
Let $\{X_n\}$ be a sequence of independent random variables such that $$\forall\,n: \mathscr{E}(X_n)=0\mbox{ and }\mathscr{E}(X_n^2)=\sigma^2(X_n)<\infty.$$Then we have for every $\varepsilon>0$ $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|>\varepsilon\right\}\leq\frac{\sigma^2(S_n)}{\varepsilon^2}.$$
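Theorem 1 is Kolmogorov's maximal inequality: it strengthens Chebyshev's inequality from $|S_n|$ alone to $\max_j|S_j|$ at no cost in the bound. A Monte Carlo sanity check with $\pm1$ steps (my own illustration; the step distribution and parameters are arbitrary choices):

```python
import random

def estimate_max_exceed(n, eps, trials, rng):
    """Estimate P(max_{1<=j<=n} |S_j| > eps) for a ±1 random walk."""
    hits = 0
    for _ in range(trials):
        s, running_max = 0, 0
        for _ in range(n):
            s += rng.choice((-1, 1))
            running_max = max(running_max, abs(s))
        if running_max > eps:
            hits += 1
    return hits / trials

rng = random.Random(42)
n, eps = 20, 8.0
p_hat = estimate_max_exceed(n, eps, trials=5000, rng=rng)
bound = n / eps**2   # sigma^2(S_n) = n, since each ±1 step has variance 1
# Kolmogorov's inequality guarantees the true probability is <= bound;
# the empirical estimate should respect it up to Monte Carlo noise.
```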

[Theorem 2] A Chebyshev-type inequality for the maximal partial sum II.
Let $\{X_n\}$ be a sequence of independent random variables with finite means and suppose that there exists $A$ such that $$\forall\,n: |X_n-\mathscr{E}(X_n)|\leq A<\infty.$$Then we have for every $\varepsilon>0$ $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|\leq\varepsilon\right\}\leq\frac{(2A+4\varepsilon)^2}{\sigma^2(S_n)}.$$
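Theorem 2 runs in the opposite direction: when $\sigma^2(S_n)$ is large, the partial sums are unlikely to all stay small. The same kind of Monte Carlo check (again my own sketch, with $\pm1$ steps so that the centered variables are bounded by $A=1$):

```python
import random

def estimate_max_confined(n, eps, trials, rng):
    """Estimate P(max_{1<=j<=n} |S_j| <= eps) for a ±1 random walk."""
    confined = 0
    for _ in range(trials):
        s, ok = 0, True
        for _ in range(n):
            s += rng.choice((-1, 1))
            if abs(s) > eps:
                ok = False
                break
        if ok:
            confined += 1
    return confined / trials

rng = random.Random(7)
n, A, eps = 2000, 1.0, 5.0
p_hat = estimate_max_confined(n, eps, trials=2000, rng=rng)
bound = (2 * A + 4 * eps) ** 2 / n   # (2A + 4*eps)^2 / sigma^2(S_n) = 484/2000
# Theorem 2 guarantees the true probability is <= bound; here a walk of
# 2000 steps essentially never stays within ±5, so p_hat is close to 0.
```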

Back to the proof.
$(\Longleftarrow)$ Suppose that the three series (1), (2), and (3) converge.  Our goal is to show that $\sum_nX_n$ converges a.e.  We first use (3) to establish the convergence of $\sum_n\{Y_n-\mathscr{E}(Y_n)\}$, then use (2) to conclude that $\sum_nY_n$ also converges, and finally use (1) to see that the convergence of $\sum_nX_n$ is equivalent to that of $\sum_nY_n$.

Let $m\geq1$. We establish the almost sure convergence of $\sum_n\{Y_n-\mathscr{E}(Y_n)\}$ via $$\begin{array}{l}\mathscr{P}\left\{\bigcup_{n=1}^\infty\bigcap_{k=n}^\infty\left\{\left|\sum_{j=n}^k\{Y_j-\mathscr{E}(Y_j)\}\right|\leq\frac{1}{m}\right\}\right\} \\
\quad=\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\bigcap_{k=n}^\infty\left\{\left|\sum_{j=n}^k\{Y_j-\mathscr{E}(Y_j)\}\right|\leq\frac{1}{m}\right\}\right\}\quad(\because\mbox{ monotone}) \\
\quad=\underset{n\rightarrow\infty}{\lim}\underset{n_0\rightarrow\infty}{\lim}\mathscr{P}\left\{\bigcap_{k=n}^{n_0}\left\{\left|\sum_{j=n}^k\{Y_j-\mathscr{E}(Y_j)\}\right|\leq\frac{1}{m}\right\}\right\} \\
\quad=\underset{n\rightarrow\infty}{\lim}\underset{n_0\rightarrow\infty}{\lim}\mathscr{P}\left\{\underset{n\leq k\leq n_0}{\max}\left|\sum_{j=n}^k\{Y_j-\mathscr{E}(Y_j)\}\right|\leq\frac{1}{m}\right\} \\
\quad\geq\underset{n\rightarrow\infty}{\lim}\underset{n_0\rightarrow\infty}{\lim}\left\{1-m^2\sum_{k=n}^{n_0}\sigma^2(Y_k)\right\} \quad(\because\mbox{ Theorem 1})\\
\quad=1-m^2\underset{n\rightarrow\infty}{\lim}\underset{n_0\rightarrow\infty}{\lim}\sum_{k=n}^{n_0}\sigma^2(Y_k) \\
\quad=1\quad\mbox{by (3),}\end{array}$$since $\sum_n\sigma^2(Y_n)<\infty$ implies $\sum_{k=n}^{n_0}\sigma^2(Y_k)\rightarrow0$ as $n,n_0\rightarrow\infty$.  Letting $m\rightarrow\infty$ and applying the Cauchy criterion, $$\sum_n\{Y_n-\mathscr{E}(Y_n)\}\mbox{  converges a.e.}$$
By (2), $\sum_n\mathscr{E}(Y_n)$ converges, hence $\sum_nY_n$ converges a.e. as well.  Finally, by (1), since $\sum_n\mathscr{P}\{X_n\neq Y_n\}<\infty$, the Borel–Cantelli lemma gives $\mathscr{P}\{X_n\neq Y_n\mbox{ i.o.}\}=0$, so $$\sum_nY_n\mbox{ converges a.e.}\iff\sum_nX_n\mbox{ converges a.e.}$$

$(\Longrightarrow)$ Suppose $\sum_nX_n$ converges a.e.
(1) Since $\sum_nX_n$ converges a.e., its terms tend to $0$ a.e., so for each $A>0$, $\mathscr{P}\{|X_n|>A\mbox{ i.o.}\}=0.$  Suppose $\sum_n\mathscr{P}\{|X_n|>A\}=\infty$.  Since the $X_n$ are independent, the second Borel–Cantelli lemma would give $$\mathscr{P}\{|X_n|>A\mbox{ i.o.}\}=1,$$contradicting the a.e. convergence of $\sum_nX_n$.  Thus $\sum_n\mathscr{P}\{|X_n|>A\}<\infty$, which implies $$\sum_n\mathscr{P}\{X_n\neq Y_n\}<\infty,$$that is, $\{X_n\}$ and $\{Y_n\}$ are equivalent sequences.  Hence $\sum_nY_n$ also converges a.e.
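The second Borel–Cantelli lemma invoked here can itself be illustrated numerically: with independent events $A_n$ of probability $1/n$, the probability sum diverges, so occurrences keep accumulating along a sample path. A small sketch (my own, with arbitrary seed and horizon):

```python
import math
import random

rng = random.Random(3)
N = 10_000

# Independent events A_n with P(A_n) = 1/n; sum_n 1/n diverges, so by the
# second Borel-Cantelli lemma the events occur infinitely often a.s.
occurrences = sum(1 for n in range(1, N + 1) if rng.random() < 1.0 / n)

# The expected count up to N is the harmonic number H_N ~ ln N + gamma,
# which grows without bound, consistent with "infinitely often".
expected = sum(1.0 / n for n in range(1, N + 1))
```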

(3) Note that $|Y_n-\mathscr{E}(Y_n)|\leq2A$, so the summands are uniformly bounded.  By Theorem 2 above (applied with the bound $2A$ in place of $A$), for all $n'>n$ $$\mathscr{P}\left\{\underset{n\leq k\leq n'}{\max}\left|\sum_{j=n}^kY_j\right|\leq\varepsilon\right\}\leq\frac{(4A+4\varepsilon)^2}{\sum_{j=n}^{n'}\sigma^2(Y_j)}.$$If the denominator $\sum_{j=n}^{n'}\sigma^2(Y_j)$ diverged as $n'\rightarrow\infty$, then $$\mathscr{P}\left\{\underset{k\geq n}{\max}\left|\sum_{j=n}^kY_j\right|\leq\varepsilon\right\}=0,$$which would mean that for all $\omega\in\Omega\setminus N$, where $N$ is a null set, and for all $n$, there exists $k>n$ such that $\left|\sum_{j=n}^kY_j(\omega)\right|>\varepsilon$.  This contradicts the a.e. convergence of $\sum_nY_n$ established in (1).  Thus, $$\sum_n\sigma^2(Y_n)\mbox{ converges.}$$

(2) Finally, consider the series $\sum_n\{Y_n-\mathscr{E}(Y_n)\}$ and apply the sufficiency part of this Three Series Theorem.  We already know that $$\mathscr{E}\left(Y_n-\mathscr{E}(Y_n)\right)=0\mbox{ for every }n\mbox{ and }\sum_n\sigma^2(Y_n)<\infty.$$Furthermore, since $|Y_n-\mathscr{E}(Y_n)|\leq2A$, $$\mathscr{P}\{|Y_n-\mathscr{E}(Y_n)|>2A\}=0.$$The conditions (1), (2) and (3) are thus satisfied (with $2A$ in place of $A$), so $$\sum_n\{Y_n-\mathscr{E}(Y_n)\}\mbox{ converges a.e.}$$From (1) we know that $\sum_nY_n$ converges a.e., so the difference $$\sum_n\mathscr{E}(Y_n)=\sum_nY_n-\sum_n\{Y_n-\mathscr{E}(Y_n)\}\mbox{ converges.}$$

$\Box$
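A standard application of the theorem (my own addition, stating the usual random-signs corollary): let $X_n=\varepsilon_na_n$ with independent $\varepsilon_n=\pm1$ equiprobable and real $a_n\rightarrow0$.  Taking $A=1$, we have $Y_n=X_n$ for all large $n$, so $$\sum_n\mathscr{P}\{|X_n|>1\}<\infty,\quad \mathscr{E}(Y_n)=0\ \forall\,n,\quad \sum_n\sigma^2(Y_n)=\sum_na_n^2\mbox{ (up to finitely many terms)},$$and therefore $$\sum_n\varepsilon_na_n\mbox{ converges a.e.}\iff\sum_na_n^2<\infty.$$For example, $\sum_n\varepsilon_n/n^p$ converges a.e. exactly when $p>\frac{1}{2}$.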


