Monday, September 7, 2015

A Variant of Borel-Cantelli Lemma II

Let $\{E_n\}$ be arbitrary events in $\mathscr{F}$.  If  for each $m$, $\sum_{n>m}\mathscr{P}\{E_n\mid E_m^c\cap\cdots\cap E_{n-1}^c\}=\infty$, then $\mathscr{P}\{E_n\mbox{ i.o.}\}=1.$
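
Remark (a quick check of the hypothesis): if the $E_n$ are independent, the conditioning can be dropped, since $E_n$ is independent of $E_m^c\cap\cdots\cap E_{n-1}^c$; the hypothesis then reduces to $\sum_n\mathscr{P}\{E_n\}=\infty$, and the statement recovers the classical Borel-Cantelli Lemma II.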

$\bullet$ Proof.

Sunday, September 6, 2015

Convergence of Moments (3)

Let $\{X_n\}$ and $X$ be random variables.  Let $0<r<\infty$, $X_n\in L^r$, and $X_n\rightarrow X$ in probability.  Then the following three propositions are equivalent.

(1) $\{|X_n|^r\}$ is uniformly integrable;
(2) $X_n\rightarrow X$ in $L^r$;
(3) $\mathscr{E}|X_n|^r\rightarrow\mathscr{E}|X|^r<\infty$.
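
A standard example of what the equivalence rules out: on $(0,1)$ with Lebesgue measure, take $r=1$ and $X_n=n\,I_{(0,1/n)}$.  Then $X_n\rightarrow0$ in probability, yet $\mathscr{E}|X_n|=1\not\rightarrow0=\mathscr{E}|X|$; accordingly $\{|X_n|\}$ is not uniformly integrable, and all three propositions fail together.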

$\bullet$ Proof.

Convergence of Moments (2)

Let $\{X_n\}$ and $X$ be random variables.  If $X_n$ converges in distribution to $X$, and for some $p>0$, $\sup_n\mathscr{E}|X_n|^p=M<\infty$, then for each $0<r<p$, $$\underset{n\rightarrow\infty}{\lim}\mathscr{E}|X_n|^r=\mathscr{E}|X|^r<\infty.$$

$\bullet$ Proof.
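A sketch of one standard route: for $0<r<p$ and $A>0$, on the event $\{|X_n|>A\}$ we have $|X_n|^r\leq|X_n|^p/A^{p-r}$, so $$\underset{n}{\sup}\;\mathscr{E}\left[|X_n|^r\,I\left(|X_n|>A\right)\right]\leq\frac{M}{A^{p-r}}\rightarrow0\mbox{  as }A\rightarrow\infty,$$hence $\{|X_n|^r\}$ is uniformly integrable.  Realizing the convergence in distribution as a.e. convergence on a common probability space (Skorokhod representation), the conclusion follows as in Convergence of Moments (3).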

Convergence of Moments (1)

Let $\{X_n\}$ and $X$ be random variables.  If $X_n\rightarrow X$ a.e., then for every $r>0$, $$\mathscr{E}|X|^r\leq\underset{n\rightarrow\infty}{\underline{\lim}}\mathscr{E}|X_n|^r.$$If $X_n\rightarrow X$ in $L^r$, and $X\in L^r$, then $\mathscr{E}|X_n|^r\rightarrow\mathscr{E}|X|^r$.

$\bullet$ Proof.
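A sketch of one standard argument: since $X_n\rightarrow X$ a.e. implies $|X_n|^r\rightarrow|X|^r$ a.e., the first assertion is Fatou's lemma.  For the second, when $r\geq1$ Minkowski's inequality gives $\left|\|X_n\|_r-\|X\|_r\right|\leq\|X_n-X\|_r\rightarrow0$, and when $0<r<1$ the elementary bound $\left||a|^r-|b|^r\right|\leq|a-b|^r$ gives $\left|\mathscr{E}|X_n|^r-\mathscr{E}|X|^r\right|\leq\mathscr{E}|X_n-X|^r\rightarrow0$.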

Friday, September 4, 2015

Characteristic Functions

For any random variable $X$ with probability measure $\mu$ and distribution function $F$, the characteristic function (ch.f.) is the function $f$ on $\mathbb{R}$ defined by $$f(t)=\mathscr{E}\left(e^{itX}\right)=\int_{-\infty}^\infty e^{itx}\,dF(x)\mbox{  for all }t\in\mathbb{R}.$$Several simple properties follow directly from this definition:
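
For instance, $f(0)=1$, $|f(t)|\leq\mathscr{E}\left|e^{itX}\right|=1$ for all $t$, and $f(-t)=\overline{f(t)}$.  As a concrete example (a standard computation), if $X\sim N(0,1)$, then $$f(t)=\int_{-\infty}^\infty e^{itx}\,\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\,dx=e^{-t^2/2}.$$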

Thursday, September 3, 2015

Cantelli's Law of Large Numbers

If $\{X_n\}$ are independent random variables such that the fourth moments $\mathscr{E}(X_n^4)$ have a common bound, and $S_n=\sum_{j=1}^nX_j$, then $$\frac{S_n-\mathscr{E}(S_n)}{n}\rightarrow0\mbox{  a.e.}$$

$\bullet$ Proof.
WLOG, suppose $\mathscr{E}(X_n)=0$ for all $n$ and let $M_4$ denote the common bound, $$\mathscr{E}(X_n^4)\leq M_4<\infty\mbox{  for all }n.$$Then by Lyapunov's inequality, the second moments satisfy $$\mathscr{E}|X_n|^2\leq\left[\mathscr{E}|X_n|^4\right]^\frac{2}{4}\leq \sqrt{M_4}<\infty.$$Consider the fourth moment of $S_n$, $$\begin{array}{rl}\mathscr{E}(S_n^4)
&=\mathscr{E}\left[\left(\sum_{j=1}^nX_j\right)^4\right]\\ &= \mathscr{E}\left[\sum_{j=1}^nX_j^4+4\sum_{i\neq j}X_iX_j^3+6\sum_{i<j}X_i^2X_j^2\right.\\ &\quad\left.+12\sum_{i<j,\,k\neq i,j}X_iX_jX_k^2+24\sum_{i<j<k<l}X_iX_jX_kX_l\right]\\
&=\sum_{j=1}^n\mathscr{E}(X_j^4)+4\sum_{i\neq j}\mathscr{E}(X_i)\mathscr{E}(X_j^3)+6\sum_{i<j}\mathscr{E}(X_i^2)\mathscr{E}(X_j^2)\quad(\because\mbox{ indep.})\\ &\quad+12\sum_{i<j,\,k\neq i,j}\mathscr{E}(X_i)\mathscr{E}(X_j)\mathscr{E}(X_k^2)+24\sum_{i<j<k<l}\mathscr{E}(X_i)\mathscr{E}(X_j)\mathscr{E}(X_k)\mathscr{E}(X_l)\\ &=\sum_{j=1}^n\mathscr{E}(X_j^4)+6\sum_{i<j}\mathscr{E}(X_i^2)\mathscr{E}(X_j^2)\qquad\qquad(\because\mbox{ }\mathscr{E}(X_n)=0\mbox{ for all }n)\\ &\leq nM_4+6\cdot\frac{n(n-1)}{2}\sqrt{M_4}\sqrt{M_4}=n(3n-2)M_4.\end{array}$$By Markov's inequality applied to $S_n^4$, for $\varepsilon>0$, $$\mathscr{P}\{|S_n|>n\varepsilon\}\leq\frac{\mathscr{E}(S_n^4)}{n^4\varepsilon^4}\leq\frac{n(3n-2)M_4}{n^4\varepsilon^4}\leq\frac{3M_4}{n^2\varepsilon^4}.$$Thus, $$\sum_n\mathscr{P}\{|S_n|>n\varepsilon\}\leq\sum_n\frac{3M_4}{n^2\varepsilon^4}<\infty.$$By Borel-Cantelli Lemma I, $\mathscr{P}\{|S_n|>n\varepsilon\mbox{ i.o.}\}=0$ for every $\varepsilon>0$; intersecting these events of full probability over $\varepsilon=1/k$, $k\in\mathbb{N}$, gives $$\frac{S_n}{n}\rightarrow0\mbox{  a.e.}$$

$\Box$

Wednesday, September 2, 2015

The Converse of the Strong Law of Large Numbers

Let $\{X_n\}$ be a sequence of i.i.d. random variables and let $S_n=\sum_{j=1}^nX_j$.  Then $$\frac{S_n}{n}\mbox{ converges a.e. }\implies\mathscr{E}|X_1|<\infty.$$

$\bullet$ Proof.
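A sketch of one standard argument: if $S_n/n$ converges a.e., then $$\frac{X_n}{n}=\frac{S_n}{n}-\frac{n-1}{n}\cdot\frac{S_{n-1}}{n-1}\rightarrow0\mbox{  a.e.},$$so $\mathscr{P}\{|X_n|>n\mbox{ i.o.}\}=0$.  Since the events $\{|X_n|>n\}$ are independent, Borel-Cantelli Lemma II forces $\sum_n\mathscr{P}\{|X_n|>n\}<\infty$; by identical distribution this is $\sum_n\mathscr{P}\{|X_1|>n\}<\infty$, which is equivalent to $\mathscr{E}|X_1|<\infty$.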

Application of Fubini's Theorem (2)

If $X$ and $Y$ are independent, $\mathscr{E}|X|^p<\infty$ for some $p>1$, and $\mathscr{E}(Y)=0$, then $\mathscr{E}|X+Y|^p\geq\mathscr{E}|X|^p$.

$\bullet$ Proof.
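A sketch of one standard argument, using the convexity of $t\mapsto|t|^p$ for $p>1$: by independence and Fubini's theorem, $$\mathscr{E}|X+Y|^p=\int_{-\infty}^\infty\mathscr{E}|x+Y|^p\,\mu_X(dx),$$and for each fixed $x$, Jensen's inequality gives $\mathscr{E}|x+Y|^p\geq|x+\mathscr{E}(Y)|^p=|x|^p$.  Integrating with respect to $\mu_X$ yields $\mathscr{E}|X+Y|^p\geq\mathscr{E}|X|^p$.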

Application of Fubini's Theorem (1)

If $X$ and $Y$ are independent and for some $p>0$, $\mathscr{E}|X+Y|^p<\infty$, then $\mathscr{E}|X|^p<\infty$ and $\mathscr{E}|Y|^p<\infty$.

$\bullet$ Proof.
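A sketch of one standard argument: by independence and Fubini's theorem, $$\infty>\mathscr{E}|X+Y|^p=\int_{-\infty}^\infty\mathscr{E}|X+y|^p\,\mu_Y(dy),$$so $\mathscr{E}|X+y_0|^p<\infty$ for some $y_0$.  The elementary bound $|X|^p\leq2^p\left(|X+y_0|^p+|y_0|^p\right)$ then gives $\mathscr{E}|X|^p<\infty$, and $\mathscr{E}|Y|^p<\infty$ follows by symmetry.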

Probability Measure

Let $\Omega$ be a sample space and $\mathscr{F}$ a Borel field of subsets of $\Omega$.  A probability measure (p.m.) $\mathscr{P}\{\cdot\}$ on $\mathscr{F}$ is a real-valued function with domain $\mathscr{F}$ satisfying

(1) $\displaystyle\forall\,E\in\mathscr{F}:\,\mathscr{P}\{E\}\geq0$.
(2) If $\{E_j\}$ is a countable collection of (pairwise) disjoint sets in $\mathscr{F}$, then $$\mathscr{P}\left\{\bigcup_j E_j\right\}=\sum_j\mathscr{P}\{E_j\}.$$
(3) $\mathscr{P}\{\Omega\}=1$.

These axioms imply many further properties, valid for all sets in $\mathscr{F}$:
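
For instance: taking every $E_j=\emptyset$ in (2) forces $\mathscr{P}\{\emptyset\}=0$; applying (2) and (3) to $E\cup E^c=\Omega$ gives $\mathscr{P}\{E^c\}=1-\mathscr{P}\{E\}$; and writing $F=E\cup(F\setminus E)$ for $E\subset F$ gives monotonicity, $\mathscr{P}\{E\}\leq\mathscr{P}\{F\}$.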

Independence and Fubini's Theorem

A basic property of the expectation of two independent random variables is the following.

[Theorem] If $X$ and $Y$ are independent and both have finite expectations, then $$\mathscr{E}(XY)=\mathscr{E}(X)\mathscr{E}(Y).$$

To prove this, Fubini's theorem gives a quick solution; otherwise, one can argue directly from the basic definition of the expectation.

$\bullet$ Proof.
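A sketch of the Fubini route: independence means the joint law of $(X,Y)$ is the product measure $\mu_X\times\mu_Y$, so $$\mathscr{E}(XY)=\iint xy\,(\mu_X\times\mu_Y)(dx,dy)=\int x\,\mu_X(dx)\int y\,\mu_Y(dy)=\mathscr{E}(X)\mathscr{E}(Y),$$where the interchange is justified by Fubini's theorem, since the same computation applied to $|x||y|$ (Tonelli) gives $\mathscr{E}|XY|=\mathscr{E}|X|\,\mathscr{E}|Y|<\infty$.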

Tuesday, September 1, 2015

A Variant of Slutsky's Theorem (1): Convergence in Probability

If $X_n\rightarrow X$ and $Y_n\rightarrow Y$ both in probability, then
(1) $X_n\pm Y_n\rightarrow X\pm Y$ in probability;
(2) $X_nY_n\rightarrow XY$ in probability.


$\bullet$ Proof.
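A sketch of one standard route: (1) follows from $$\mathscr{P}\{|X_n\pm Y_n-(X\pm Y)|>\epsilon\}\leq\mathscr{P}\{|X_n-X|>\epsilon/2\}+\mathscr{P}\{|Y_n-Y|>\epsilon/2\}.$$For (2), write $$X_nY_n-XY=(X_n-X)(Y_n-Y)+X(Y_n-Y)+Y(X_n-X);$$each factor $X_n-X$, $Y_n-Y$ tends to $0$ in probability, and the fixed random variables $X$, $Y$ are tight ($\mathscr{P}\{|X|>A\}$ is small for large $A$), so each of the three products tends to $0$ in probability.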

A Variant of Slutsky's Theorem (2): Convergence in $r$-th Mean

(1) If $X_n\rightarrow X$ and $Y_n\rightarrow Y$ both in $L^p$, then $$X_n\pm Y_n\rightarrow X\pm Y\mbox{  in }L^p;$$
(2) If $X_n\rightarrow X$ in $L^p$ and $Y_n\rightarrow Y$ in $L^q$, where $p>1$ and $1/p+1/q=1$, then $$X_nY_n\rightarrow XY\mbox{  in }L^1.$$


$\bullet$ Proof.
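A sketch of one standard route: (1) is Minkowski's inequality for $p\geq1$, $\|X_n\pm Y_n-(X\pm Y)\|_p\leq\|X_n-X\|_p+\|Y_n-Y\|_p\rightarrow0$ (for $0<p<1$, use $\mathscr{E}|a+b|^p\leq\mathscr{E}|a|^p+\mathscr{E}|b|^p$ instead).  For (2), Hölder's inequality gives $$\mathscr{E}|X_nY_n-XY|\leq\|X_n-X\|_p\,\|Y_n\|_q+\|X\|_p\,\|Y_n-Y\|_q\rightarrow0,$$since $\sup_n\|Y_n\|_q<\infty$ when $Y_n\rightarrow Y$ in $L^q$.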

Counterexample for Omitting the UAN Condition in Feller's Theorem

Recall the Lindeberg-Feller Central Limit Theorem.

Let $\{X_{nj}\}$, $n=1,2,\ldots$, $j=1,2,\ldots,k_n$, be a double array of random variables such that, for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent.  Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$.  In order that as $n\rightarrow\infty$ the two conclusions below both hold:

(1) $S_n$ converges in distribution to $\Phi$;
(2) $\{X_{nj}\}$ is uniformly asymptotically negligible (UAN);

it is necessary and sufficient that for each $\eta>0$, we have $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0.$$
It is important that conclusions (1) and (2) hold simultaneously: Lindeberg's condition is equivalent to the two together, not to (1) alone.  Here is a counterexample in which the UAN requirement is omitted.

$\bullet$ Counterexample.
Let $\{X_n\}$ be a sequence of independent random variables with $X_n\sim N\left(0,\sigma^2_n\right)$, where $\sigma^2_1=1$ and $\sigma^2_k=2^{k-2}$ for $k\geq2$.  Then $$B_n^2=\sigma^2(S_n)=\sum_{k=1}^n\sigma^2_k=1+1+2+4+\cdots+2^{n-2}=2^{n-1}.$$First, we find the distribution of $S_n/B_n$.  Since the $X_k$ are independent with $$\frac{X_k}{B_n}\sim N\left(0,\sigma^2_k/2^{n-1}\right),$$ and $$\sum_{k=1}^n\frac{\sigma^2_k}{2^{n-1}}=\frac{1}{2^{n-1}}+\frac{1}{2^{n-1}}\sum_{k=2}^n2^{k-2}=\frac{1}{2^{n-1}}+\frac{2^{n-1}-1}{2^{n-1}}=1,$$we have $$\frac{S_n}{B_n}\sim N\left(0,1\right)\mbox{  for every }n,$$so conclusion (1) holds trivially.
The normalized array $\{X_k/B_n\}$, however, is not UAN.  Since $X_k/B_n\sim N\left(0,\sigma^2_k/2^{n-1}\right)$ and the variances $\sigma^2_k/B_n^2$ increase in $k$ up to $\sigma^2_n/B_n^2=1/2$, for $\epsilon>0$,
$$\begin{array}{rl}\underset{n\rightarrow\infty}{\lim}\underset{1\leq k\leq n}{\max}\,\mathscr{P}\left\{\frac{|X_k|}{B_n}>\epsilon\right\}&=\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\frac{|X_n|}{B_n}>\epsilon\right\}\\
&=\underset{n\rightarrow\infty}{\lim}2\Phi\left(-\frac{\epsilon B_n}{\sigma_n}\right)\\
&=2\Phi\left(-\sqrt{2}\,\epsilon\right)\neq0. \end{array}$$
Note that Lindeberg's condition implies that no single variance dominates the others.  In this case, $$\underset{n\rightarrow\infty}{\lim}\underset{1\leq k\leq n}{\max}\frac{\sigma^2_k}{B_n^2}=\underset{n\rightarrow\infty}{\lim}\frac{2^{n-2}}{2^{n-1}}=\frac{1}{2}\neq0,$$which shows that Lindeberg's condition does not hold here.
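
One can also check Lindeberg's condition directly for this array: for the single term $k=n$, since $X_n/B_n\sim N(0,1/2)$, $$\sum_{k=1}^n\mathscr{E}\left[\frac{X_k^2}{B_n^2}\,I\left(\frac{|X_k|}{B_n}>\eta\right)\right]\geq\mathscr{E}\left[W^2\,I\left(|W|>\eta\right)\right],\quad W\sim N(0,1/2),$$which is a positive constant independent of $n$, so the Lindeberg sum cannot tend to $0$.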


Lindeberg's Condition Implies Each Variance Is Uniformly Small

Recall the Lindeberg-Feller Central Limit Theorem (short version).

Let $\{X_{nj}\}$, $n=1,2,\ldots$, $j=1,2,\ldots,k_n$, be a double array of random variables such that, for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent.  Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$.  Lindeberg's condition for $S_n$ converging in distribution to $\Phi$ is $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0\mbox{  for each }\eta>0.$$This criterion implies that no single $X_{nj}$, $j=1,\ldots,k_n$, has a variance dominating the others', i.e. $$\underset{1\leq j\leq k_n}{\max}\sigma_{nj}\rightarrow0.$$

$\bullet$ Proof.
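A sketch of one standard argument: for any $\eta>0$ and each $j$, $$\sigma^2_{nj}=\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|\leq\eta\right)\right]+\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]\leq\eta^2+\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right],$$so $\underset{1\leq j\leq k_n}{\max}\sigma^2_{nj}\leq\eta^2+o(1)$ by Lindeberg's condition; letting $\eta\downarrow0$ gives the claim.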

Application of the Dominated Convergence Theorem

If $\{X_n\}$ is a sequence of identically distributed random variables with finite mean, then $$\underset{n\rightarrow\infty}{\lim}\frac{1}{n}\mathscr{E}\left(\underset{1\leq j\leq n}{\max}|X_j|\right)=0.$$

$\bullet$ Proof.
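A sketch of one standard argument: for $\epsilon>0$, $$\underset{1\leq j\leq n}{\max}|X_j|\leq\epsilon n+\sum_{j=1}^n|X_j|\,I\left(|X_j|>\epsilon n\right),$$so by identical distribution $$\frac{1}{n}\mathscr{E}\left(\underset{1\leq j\leq n}{\max}|X_j|\right)\leq\epsilon+\mathscr{E}\left[|X_1|\,I\left(|X_1|>\epsilon n\right)\right],$$where the last term tends to $0$ by dominated convergence, since $|X_1|\,I(|X_1|>\epsilon n)\rightarrow0$ a.e. and is dominated by $|X_1|\in L^1$.  Since $\epsilon>0$ is arbitrary, the limit is $0$.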

Proof of Inequality (10)

Let $X$ be a random variable.  If $\mathscr{E}(X^2)=1$ and $\mathscr{E}|X|\geq a>0$, then $$\mathscr{P}\{|X|\geq\lambda a\}\geq(1-\lambda)^2a^2\mbox{  for }0\leq\lambda\leq1.$$

$\bullet$ Proof.
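A sketch of one standard argument: split $\mathscr{E}|X|$ at the level $\lambda a$ and apply the Cauchy-Schwarz inequality to the upper piece: $$a\leq\mathscr{E}|X|\leq\lambda a+\mathscr{E}\left[|X|\,I\left(|X|\geq\lambda a\right)\right]\leq\lambda a+\sqrt{\mathscr{E}(X^2)}\sqrt{\mathscr{P}\{|X|\geq\lambda a\}}=\lambda a+\sqrt{\mathscr{P}\{|X|\geq\lambda a\}},$$hence $\sqrt{\mathscr{P}\{|X|\geq\lambda a\}}\geq(1-\lambda)a$, and squaring gives the result.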