About Posts Tagged 'Probability'
Let $\{E_n\}$ be arbitrary events in $\mathscr{F}$. If for each $m$, $\sum_{n>m}\mathscr{P}\{E_n\mid E_m^c\cap\cdots\cap E_{n-1}^c\}=\infty$, then $\mathscr{P}\{E_n\mbox{ i.o.}\}=1.$
$\bullet$ Proof.
Monday, September 7, 2015
Sunday, September 6, 2015
Convergence of Moments (3)
Let $\{X_n\}$ and $X$ be random variables. Let $0<r<\infty$, $X_n\in L^r$, and $X_n\rightarrow X$ in probability. Then the following three propositions are equivalent.
(1) $\{|X_n|^r\}$ is uniformly integrable;
(2) $X_n\rightarrow X$ in $L^r$;
(3) $\mathscr{E}|X_n|^r\rightarrow\mathscr{E}|X|^r<\infty$.
$\bullet$ Proof.
Convergence of Moments (2)
Let $\{X_n\}$ and $X$ be random variables. If $X_n$ converges in distribution to $X$, and for some $p>0$, $\sup_n\mathscr{E}|X_n|^p=M<\infty$, then for each $r<p$, $$\underset{n\rightarrow\infty}{\lim}\mathscr{E}|X_n|^r=\mathscr{E}|X|^r<\infty.$$
$\bullet$ Proof.
Convergence of Moments (1)
Let $\{X_n\}$ and $X$ be random variables. If $X_n\rightarrow X$ a.e., then for every $r>0$, $$\mathscr{E}|X|^r\leq\underset{n\rightarrow\infty}{\underline{\lim}}\mathscr{E}|X_n|^r.$$If $X_n\rightarrow X$ in $L^r$, and $X\in L^r$, then $\mathscr{E}|X_n|^r\rightarrow\mathscr{E}|X|^r$.
$\bullet$ Proof.
Friday, September 4, 2015
Characteristic Functions
For any random variable $X$ with probability measure $\mu$ and distribution function $F$, the characteristic function (ch.f.) is a function $f$ on $\mathbb{R}$ defined as $$f(t)=\mathscr{E}\left(e^{itX}\right)=\int_{-\infty}^\infty e^{itx}\,dF(x)\mbox{ for all }t\in\mathbb{R}.$$There are some simple properties of ch.f.:
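As a quick numerical illustration of the definition itself (an R sketch, taking for instance a standard normal $X$, whose ch.f. is $e^{-t^2/2}$), the empirical average of $e^{itX_j}$ over simulated draws approximates $f(t)$:
set.seed(1)
x <- rnorm(1e5)                                        # draws from a standard normal X
t <- seq(-3, 3, by = 0.5)
emp <- sapply(t, function(tt) mean(exp(1i * tt * x)))  # empirical ch.f.: average of exp(itX)
round(cbind(t = t, empirical = Re(emp), exact = exp(-t^2 / 2)), 3)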
Thursday, September 3, 2015
Cantelli's Law of Large Numbers
If $\{X_n\}$ are independent random variables whose fourth moments $\mathscr{E}(X_n^4)$ have a common bound, and $S_n=\sum_{j=1}^nX_j$, then $$\frac{S_n-\mathscr{E}(S_n)}{n}\rightarrow0\mbox{ a.e.}$$
$\bullet$ Proof.
WLOG, suppose $\mathscr{E}(X_n)=0$ for all $n$ and denote the common bound of $\mathscr{E}(X_n^4)$ to be $$\mathscr{E}(X_n^4)\leq M_4<\infty\mbox{ for all }n.$$Then by Lyapunov's inequality, we have the second moments $$\mathscr{E}|X_n|^2\leq\left[\mathscr{E}|X_n|^4\right]^\frac{2}{4}\leq \sqrt{M_4}<\infty.$$Consider the fourth moment of $S_n$, $$\begin{array}{rl}\mathscr{E}(S_n^4)
&=\mathscr{E}\left[\left(\sum_{j=1}^nX_j\right)^4\right]\\ &= \mathscr{E}\left[\sum_{j=1}^nX_j^4+{4\choose1}\sum_{i\neq j}X_iX_j^3+{4\choose2}\sum_{i\neq j}X_i^2X_j^2\right.\\ &\quad\left.+{4\choose1}{3\choose1}\sum_{i\neq j\neq k}X_iX_jX_k^2+{4\choose1}{3\choose1}{2\choose1}\sum_{i\neq j\neq k\neq l}X_iX_jX_kX_l\right]\\
&=\sum_{j=1}^n\mathscr{E}(X_j^4)+4\sum_{i\neq j}\mathscr{E}(X_i)\mathscr{E}(X_j^3)+6\sum_{i\neq j}\mathscr{E}(X_i^2)\mathscr{E}(X_j^2)\quad(\because\mbox{ indep.})\\ &\quad+12\sum_{i\neq j\neq k}\mathscr{E}(X_i)\mathscr{E}(X_j)\mathscr{E}(X_k^2)+24\sum_{i\neq j\neq k\neq l}\mathscr{E}(X_i)\mathscr{E}(X_j)\mathscr{E}(X_k)\mathscr{E}(X_l)\\ &=\sum_{j=1}^n\mathscr{E}(X_j^4)+6\sum_{i\neq j}\mathscr{E}(X_i^2)\mathscr{E}(X_j^2)\qquad\qquad(\because\mbox{ assuming }\mathscr{E}(X_n)=0.) \\ &\leq nM_4+3n(n-1)\sqrt{M_4}\sqrt{M_4}=n(3n-2)M_4.\end{array}$$By Markov's inequality, for $\varepsilon>0$, $$\mathscr{P}\{|S_n|>n\varepsilon\}\leq\frac{\mathscr{E}(S_n^4)}{n^4\varepsilon^4}\leq\frac{n(3n-2)M_4}{n^4\varepsilon^4}=\frac{3M_4}{n^2\varepsilon^4}+\frac{2M_4}{n^3\varepsilon^4}.$$Thus, $$\sum_n\mathscr{P}\{|S_n|>n\varepsilon\}\leq\sum_n\frac{3M_4}{n^2\varepsilon^4}+\frac{2M_4}{n^3\varepsilon^4}<\infty.$$By Borel-Cantelli Lemma I, we have $$\mathscr{P}\{|S_n|>n\varepsilon\mbox{ i.o.}\}=0\implies\frac{S_n}{n}\rightarrow0\mbox{ a.e.}$$
$\Box$
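As an illustrative R sketch (the choice $X_n\sim\mbox{Unif}(-1,1)$ is just one example with uniformly bounded fourth moments), one simulated sample path of $S_n/n$ indeed settles down to $0$:
set.seed(1)
n <- 1e5
x <- runif(n, -1, 1)                       # independent, mean 0, fourth moments bounded by 1
plot(cumsum(x) / (1:n), type = 'l', xlab = 'n', ylab = 'S_n / n')
abline(h = 0, col = 2)                     # the running average settles down to 0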
Wednesday, September 2, 2015
The Converse of the Strong Law of Large Numbers
Let $\{X_n\}$ be a sequence of i.i.d. random variables and $S_n=\sum_{j=1}^nX_j$. Then $$\frac{S_n}{n}\mbox{ converges a.e. }\implies\mathscr{E}|X_1|<\infty.$$
$\bullet$ Proof.
Application of Fubini's Theorem (2)
If $X$ and $Y$ are independent, $\mathscr{E}|X|^p<\infty$ for some $p>1$, and $\mathscr{E}(Y)=0$, then $\mathscr{E}|X+Y|^p\geq\mathscr{E}|X|^p$.
$\bullet$ Proof.
Application of Fubini's Theorem (1)
If $X$ and $Y$ are independent and for some $p>0$, $\mathscr{E}|X+Y|^p<\infty$, then $\mathscr{E}|X|^p<\infty$ and $\mathscr{E}|Y|^p<\infty$.
$\bullet$ Proof.
Probability Measure
Let $\Omega$ be a sample space and $\mathscr{F}$ a Borel field of subsets of $\Omega$. A probability measure (p.m.) $\mathscr{P}\{\cdot\}$ on $\mathscr{F}$ is a real-valued function with domain $\mathscr{F}$ satisfying
(1) $\displaystyle\forall\,E\in\mathscr{F}:\,\mathscr{P}\{E\}\geq0$.
(2) If $\{E_j\}$ is a countable collection of (pairwise) disjoint sets in $\mathscr{F}$, then $$\mathscr{P}\left\{\bigcup_j E_j\right\}=\sum_j\mathscr{P}\{E_j\}.$$(3) $\mathscr{P}\{\Omega\}=1$.
These axioms imply the following properties, valid for all sets in $\mathscr{F}$:
Independence and Fubini's Theorem
A basic property of the expectation of two independent random variables is the following.
[Theorem] If $X$ and $Y$ are independent and both have finite expectations, then $$\mathscr{E}(XY)=\mathscr{E}(X)\mathscr{E}(Y).$$
To prove this, Fubini's theorem gives the quickest argument; alternatively, one can prove it directly from the basic definition of the expectation.
$\bullet$ Proof.
Tuesday, September 1, 2015
Variant of Slutsky's Theorem (1): Convergence in Probability
If $X_n\rightarrow X$ and $Y_n\rightarrow Y$ both in probability, then
(1) $X_n\pm Y_n\rightarrow X\pm Y$ in probability;
(2) $X_nY_n\rightarrow XY$ in probability.
$\bullet$ Proof.
Variant of Slutsky's Theorem (2): Convergence in $r$-th Mean
(1) If $X_n\rightarrow X$ and $Y_n\rightarrow Y$ both in $L^p$, then $$X_n\pm Y_n\rightarrow X\pm Y\mbox{ in }L^p;$$
(2) If $X_n\rightarrow X$ in $L^p$ and $Y_n\rightarrow Y$ in $L^q$, where $p>1$ and $1/p+1/q=1$, then $$X_nY_n\rightarrow XY\mbox{ in }L^1.$$
$\bullet$ Proof.
Counterexample for Omitting UAN Condition in Feller's Proof
Recall the Lindeberg-Feller Central Limit Theorem.
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$. In order that as $n\rightarrow\infty$ the two conclusions below both hold:
(1) $S_n$ converges in distribution to $\Phi$.
(2) $\{X_{nj}\}$ is uniformly asymptotically negligible (UAN);
it is necessary and sufficient that for each $\eta>0$, we have $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0$$
It is important that the two conclusions of the Lindeberg-Feller theorem hold simultaneously; the necessity of Lindeberg's criterion may fail otherwise. Here is a counterexample in which the UAN requirement is omitted.
$\bullet$ Counterexample.
Let $\{X_n\}$ be a sequence of independent normal random variables with $X_n\sim N\left(0,\sigma^2_n\right)$, where $\sigma^2_1=1$ and $\sigma^2_k=2^{k-2}$ for $k\geq2$. Then $$B_n^2=\sigma^2(S_n)=\sum_{k=1}^n\sigma^2_k=1+1+2+4+\cdots+2^{n-2}=2^{n-1}.$$First, we find the distribution of $S_n$. Since for each $k$, $$\frac{X_k}{B_n}\sim N\left(0,\sigma^2_k/2^{n-1}\right),$$ these summands are independent, and $$\sum_{k=1}^n\frac{\sigma^2_k}{2^{n-1}}=\frac{1}{2^{n-1}}+\frac{1}{2^{n-1}}\sum_{k=2}^n2^{k-2}=\frac{1}{2^{n-1}}+\frac{2^{n-1}-1}{2^{n-1}}=1,$$we have $$\frac{S_n}{B_n}\sim N\left(0,1\right)\mbox{ exactly, for every }n.$$
The sequence $\{X_n\}$ is not UAN, since for $\epsilon>0$,
$$\begin{array}{rl}\underset{n\rightarrow\infty}{\lim}\underset{1\leq k\leq n}{\max}\mathscr{P}\{|X_k|>\epsilon\}&=\underset{n\rightarrow\infty}{\lim}\underset{2\leq k\leq n}{\max}\mathscr{P}\left\{\frac{|X_k|}{2^{(k-2)/2}}>\frac{\epsilon}{2^{(k-2)/2}}\right\}\\
&=\underset{n\rightarrow\infty}{\lim}\underset{2\leq k\leq n}{\max}2\Phi\left(-\frac{\epsilon}{2^{(k-2)/2}}\right)\\
&=\underset{n\rightarrow\infty}{\lim}2\Phi\left(-\frac{\epsilon}{2^{(n-2)/2}}\right)\\
&=2\Phi(0)=1\neq0. \end{array}$$
Note that Lindeberg's condition implies that no single variance is significantly large among $\{X_n\}$. In this case, we have $$\underset{n\rightarrow\infty}{\lim}\underset{1\leq k\leq n}{\max}\frac{\sigma^2_k}{B_n^2}=\underset{n\rightarrow\infty}{\lim}\frac{2^{n-2}}{2^{n-1}}=\frac{1}{2}\neq0,$$which implies that Lindeberg's condition does not hold.
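Both claims are easy to check numerically; here is an R sketch (the values of $n$, the number of replications, and $\epsilon$ are arbitrary choices):
set.seed(1)
n <- 30; sim <- 5000; eps <- 0.5
sig2 <- c(1, 2^(0:(n - 2)))                  # sigma_1^2 = 1, sigma_k^2 = 2^(k-2) for k >= 2
Bn <- sqrt(sum(sig2))                        # B_n = 2^((n-1)/2)
Sn <- replicate(sim, sum(rnorm(n, 0, sqrt(sig2))))
c(mean = mean(Sn / Bn), var = var(Sn / Bn))  # close to 0 and 1: S_n/B_n is exactly N(0,1)
max(2 * pnorm(-eps / sqrt(sig2)))            # max_k P{|X_k| > eps} is close to 1, so not UAN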
Lindeberg's Condition Implies That Each Variance Is Uniformly Small
Recall the Lindeberg-Feller Central Limit Theorem (short version).
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$. Lindeberg's condition for $S_n$ converging to $\Phi$ is $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0\mbox{ for each }\eta>0.$$This criterion implies that no single $X_{nj}$, $j=1,\ldots,k_n$, has a variance that dominates the others, i.e. $$\underset{1\leq j\leq k_n}{\max}\sigma_{nj}\rightarrow0.$$
$\bullet$ Proof.
Application of the Dominated Convergence Theorem
If $\{X_n\}$ is a sequence of identically distributed random variables with finite mean, then $$\underset{n\rightarrow\infty}{\lim}\frac{1}{n}\mathscr{E}\left(\underset{1\leq j\leq n}{\max}|X_j|\right)=0.$$
$\bullet$ Proof.
Proof of Inequality (10)
Let $X$ be a random variable. If $\mathscr{E}(X^2)=1$ and $\mathscr{E}|X|\geq a>0$, then $$\mathscr{P}\{|X|\geq\lambda a\}\geq(1-\lambda)^2a^2\mbox{ for }0\leq\lambda\leq1.$$
$\bullet$ Proof.
Monday, August 31, 2015
Convergence of the Characteristic Functions
[Theorem 1] Vague convergence implies convergence of ch.f.
Let $\{\mu_n,\,1\leq n\leq\infty\}$ be probability measures on $\mathbb{R}$ with ch.f.'s $\{f_n,\,1\leq n\leq\infty\}$. We have $$\mu_n\overset{v}{\rightarrow}\mu_\infty\implies f_n\rightarrow f_\infty\mbox{ uniformly in every finite interval.}$$
[Theorem 2] Convergence of ch.f. implies vague convergence.
Let $\{\mu_n,\,1\leq n<\infty\}$ be probability measures on $\mathbb{R}$ with ch.f.'s $\{f_n,\,1\leq n<\infty\}$. Suppose that
(a1) $f_n$ converges everywhere in $\mathbb{R}$, say $f_n\rightarrow f_\infty$.
(a2) $f_\infty$ is continuous at $t=0$.
Then we have
(b1) $\mu_n\overset{v}{\rightarrow}\mu_\infty$, where $\mu_\infty$ is a probability measure.
(b2) $f_\infty$ is the ch.f. of $\mu_\infty$.
General Conditions for a Product to Converge to an Exponential Limit
Let $\{\theta_{nj},\,1\leq j \leq k_n,\,1\leq n\}$ be a double array of complex numbers satisfying the following conditions as $n\rightarrow\infty$:
(1) $\displaystyle\underset{1\leq j \leq k_n}{\max}|\theta_{nj}|\rightarrow0;$
(2) $\displaystyle\sum_{j=1}^{k_n}|\theta_{nj}|\leq M<\infty$, where $M$ does not depend on $n$;
(3) $\displaystyle\sum_{j=1}^{k_n}\theta_{nj}\rightarrow\theta$, where $\theta$ is a (finite) complex number.
Then we have $$\prod_{j=1}^{k_n}(1+\theta_{nj})\rightarrow e^\theta.$$
$\bullet$ Proof.
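A numerical sanity check of the limit in R (a sketch with the hypothetical choice $\theta_{nj}=\theta/k_n$, which satisfies conditions (1)-(3)):
theta <- 1.5 + 0.8i                         # an arbitrary (finite) complex theta
for (kn in c(10, 100, 1000, 10000)) {
  prod_kn <- prod(rep(1 + theta / kn, kn))  # prod over j of (1 + theta_{nj})
  cat(kn, abs(prod_kn - exp(theta)), '\n')  # distance to exp(theta) shrinks as kn grows
}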
Friday, August 28, 2015
Variant of Borel-Cantelli Lemma I
Let $\{E_n\}$ be arbitrary events satisfying
(1) $\underset{n}{\lim}\mathscr{P}(E_n)=0$;
(2) $\underset{n}{\sum}\mathscr{P}(E_nE_{n+1}^c)<\infty$,
then $\mathscr{P}\{\limsup_n E_n\}=0$.
$\bullet$ Proof.
Thursday, August 27, 2015
Application of the Characteristic Function (2)
Let $X_n$ have the binomial distribution with parameter $(n,p_n)$, and suppose that $n\,p_n\rightarrow\lambda\geq0$. Prove that $X_n$ converges in dist. to the Poisson d.f. with parameter $\lambda$. (In the old days this was called the law of small numbers.)
$\bullet$ Proof.
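Independently of the (omitted) proof, the approximation is easy to see numerically; an R sketch with the hypothetical choice $p_n=\lambda/n$:
lambda <- 3; k <- 0:10
for (n in c(10, 100, 1000)) {
  gap <- max(abs(dbinom(k, n, lambda / n) - dpois(k, lambda)))  # sup distance over k = 0,...,10
  cat('n =', n, ' max difference =', gap, '\n')
}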
Application of The Classical Central Limit Theorem (2)
Let $\{X_j,\,j\geq1\}$ be independent, identically distributed r.v.'s with mean $0$ and variance $1$. Prove that both $$\frac{\displaystyle\sum_{j=1}^nX_j}{\sqrt{\displaystyle\sum_{j=1}^nX^2_j}}\quad
\mbox{ and }\quad\frac{\displaystyle{\sqrt{n}\sum_{j=1}^nX_j}}{\displaystyle\sum_{j=1}^nX^2_j}$$converge in distribution to $\Phi$.
$\bullet$ Proof.
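A simulation sketch in R (not a proof), taking for instance $X_j\sim\mbox{Unif}(-\sqrt{3},\sqrt{3})$, which has mean $0$ and variance $1$; the first ratio is plotted against standard normal quantiles, and the second behaves similarly.
set.seed(1)
n <- 1000; sim <- 3000
z <- replicate(sim, {
  x <- runif(n, -sqrt(3), sqrt(3))   # mean 0, variance 1
  sum(x) / sqrt(sum(x^2))            # the first self-normalized ratio
})
qqnorm(z); abline(0, 1, col = 2)     # points lie close to the standard normal line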
Application of The Classical Central Limit Theorem (1)
Let $X_\lambda$ have the Poisson distribution with parameter $\lambda$. Consider the limit distribution of $(X_\lambda-\lambda)/\lambda^{1/2}$ as $\lambda\rightarrow\infty$. Since $X_\lambda\sim\textit{Poi}\,(\lambda)$, we have $$\mathscr{E}(X_\lambda)=\lambda\mbox{ and }\sigma^2(X_\lambda)=\lambda.$$ For integer $\lambda=n$, $X_n$ has the same distribution as a sum of $n$ i.i.d. $\textit{Poi}\,(1)$ random variables, each with mean $1$ and variance $1$. Thus by the classical Central Limit Theorem, we have $$\frac{X_\lambda-\mathscr{E}(X_\lambda)}{\sigma(X_\lambda)} = \frac{X_\lambda-\lambda}{\lambda^{1/2}}\overset{\mathscr{L}}{\longrightarrow}\boldsymbol{\Phi},$$where $\boldsymbol{\Phi}$ is the standard normal distribution, with mean 0 and variance 1.
$\Box$
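An R sketch of this normal approximation (with the arbitrary choice $\lambda=200$):
set.seed(1)
lambda <- 200
z <- (rpois(5000, lambda) - lambda) / sqrt(lambda)   # standardized Poisson draws
hist(z, freq = FALSE, breaks = 40, col = 'gray', border = 'white', main = '')
curve(dnorm(x), col = 2, lwd = 2, add = TRUE)        # standard normal density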
Lindeberg-Feller's Central Limit Theorem (completed)
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$. In order that as $n\rightarrow\infty$ the two conclusions below both hold:
(1) $S_n$ converges in distribution to $\Phi$.
(2) $\{X_{nj}\}$ is uniformly asymptotically negligible (UAN);
it is necessary and sufficient that for each $\eta>0$, we have $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0$$
$\bullet$ Proof.
Wednesday, August 26, 2015
Uniformly Asymptotically Negligible (2): Connect to the Characteristic Function
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and $\{f_{nj}(t)\}$ be their ch.f.'s. $$\forall\,\varepsilon>0,\,\underset{n\rightarrow\infty}{\lim}\max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0\iff\forall\,t\in\mathbb{R},\,\underset{n\rightarrow\infty}{\lim}\max_{1\leq j\leq k_n}\left|f_{nj}(t)-1\right|=0.$$
$\bullet$ Proof.
Counterexample for Converse of Borel-Cantelli Lemma I
Let $\mathscr{F}$ be a Borel field and let $\{E_n\}_{n\geq1}$ be events in $\mathscr{F}$. We have the first Borel-Cantelli Lemma $$\sum_{n=1}^\infty \mathscr{P}\{E_n\} < \infty \implies \mathscr{P}\{E_n\mbox{ i.o.}\}=0,$$but the converse is NOT true.
$\bullet$ Counterexample.
Equivalence of Convergence of Sum of Random Variables
Let $\{X_n\}$ be a sequence of independent random variables, then $$\sum_nX_n\mbox{ converges a.e.}\iff\sum_nX_n\mbox{ converges in probability}.$$
$\bullet$ Proof.
Proof of Chebyshev Type for Maximal Sum of Random Variables II
Chebyshev type for maximal sum of random variables II. Let $\{X_n\}$ be independent random variables with finite means, and suppose that there exists an $A$ such that $$\forall\,n,\,|X_n-\mathscr{E}(X_n)|\leq A<\infty.$$Then, with $S_n=\sum_{j=1}^nX_j$, we have for every $\varepsilon>0$, $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|\leq\varepsilon\right\}\leq\frac{(2A+4\varepsilon)^2}{\sigma^2(S_n)}.$$
See List of Inequalities.
$\bullet$ Proof.
Proof of Chebyshev Type for Maximal Sum of Random Variables I
Chebyshev type for maximal sum of random variables I. Let $\{X_n\}$ be independent random variables such that $\mathscr{E}(X_n)=0$ and $\mathscr{E}(X_n^2)=\sigma^2(X_n)<\infty$ for all $n$. Then, with $S_n=\sum_{j=1}^nX_j$, we have for every $\varepsilon>0$, $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|>\varepsilon\right\}\leq\frac{\sigma^2(S_n)}{\varepsilon^2}.$$
See List of Inequalities.
$\bullet$ Proof.
Tuesday, August 25, 2015
Extension of Weak Law of Large Number (2)
Let $\{X_n\}$ be a sequence of pairwise independent random variables with common distribution function $F$. Define $S_n=\sum_{j=1}^n X_j$. Suppose that we have
(1) $\displaystyle\int_{|x|\leq n}x\,dF(x)=o(1)$,
(2) $\displaystyle n\int_{|x|>n}\,dF(x)=o(1)$;
then $$\frac{S_n}{n}\rightarrow0\mbox{ in probability.}$$
$\bullet$ Proof.
Uniformly Asymptotically Negligible
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables. We have the following criteria of "negligibility". For $\varepsilon>0$,
(a) $\displaystyle\forall\,j:\quad\underset{n\rightarrow\infty}{\lim}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$;
(b) $\displaystyle\underset{n\rightarrow\infty}{\lim}\max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$;
(c) $\displaystyle\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}=0$;
(d) $\displaystyle\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$.
[Definition] $\{X_{nj}\}$ is called Uniformly Asymptotically Negligible (UAN, holospoudic) if (b) holds.
The implications $(d)\Rightarrow(c)\Rightarrow(b)\Rightarrow(a)$ are all strict. On the other hand, if for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent, then $(d)\equiv(c)$.
$\bullet$ Proof.
Convergence in Distribution and Vague Convergence (2): Equivalence for p.m.'s
[Notations] Sets of Continuous functions.
$C_K\,$: the set of continuous functions $f$ each vanishing outside a compact set $K(f)$.
$C_0\;\,$: the set of continuous functions $f$ such that $\lim_{|x|\rightarrow\infty}f(x)=0$.
$C_B\,$: the set of bounded continuous functions.
$C\;\;\,$: the set of continuous functions.
It is clear that $f\in C_K\implies f\in C_0\implies f\in C_B\implies f\in C$.
[Theorem] Let $\{\mu_n\}_{n\geq1}$ and $\mu$ be a sequence of p.m.'s, then $$\mu_n\overset{v}{\longrightarrow}\mu\iff\forall\,f\in C_B,\;\int f\,d\mu_n\rightarrow\int f\,d\mu.$$
$\bullet$ Proof.
Monday, August 24, 2015
Application of the Characteristic Function (1)
Suppose that $X$ and $Y$ are i.i.d. random variables such that $\mathscr{E}(X)=\mathscr{E}(Y)=0$ and $\mathscr{E}(X^2)=\mathscr{E}(Y^2)=1$. If $\frac{X+Y}{\sqrt{2}}$ has the same distribution as $X$, then $X$ and $Y$ have the standard normal distribution.
$\bullet$ Proof.
Application of Lyapunov's Central Limit Theorem (4)
Let $\{X_n,\,n\geq1\}$ be a sequence of uniformly bounded independent random variables, and $S_n=\sum_{i=1}^nX_i$. Suppose $\sigma^2_n=\mbox{Var}(S_n)\rightarrow\infty$ as $n\rightarrow\infty$, then $$\frac{S_n-\mathscr{E}(S_n)}{\sigma_n}\overset{d}{\rightarrow}\Phi.$$
$\bullet$ Proof.
Proof of Cantelli's Inequality
Cantelli's inequality. Suppose $\sigma^2=\mbox{Var}(X)<\infty$. Then for $a>0$, we have $$\mathscr{P}\{|X-\mathscr{E}(X)|>a\}\leq\frac{2\sigma^2}{a^2+\sigma^2}.$$
See List of Inequalities.
$\bullet$ Proof.
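A Monte Carlo check of the bound in R (a sketch using, say, the exponential distribution, for which $\sigma^2=1$):
set.seed(1)
x <- rexp(1e5)                                               # Exp(1): mean 1, variance 1
a <- seq(0.5, 3, by = 0.5)
lhs <- sapply(a, function(aa) mean(abs(x - mean(x)) > aa))   # empirical P{|X - E(X)| > a}
rhs <- 2 * var(x) / (a^2 + var(x))                           # Cantelli's bound
round(cbind(a = a, empirical = lhs, bound = rhs), 3)         # empirical <= bound throughout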
Proof of Inequality (6)
Let $X$ and $Y$ be random variables. If $X\geq0$ and $Y\geq0$, $p\geq0$, then $$\mathscr{E}\{(X+Y)^p\}\leq2^p\{\mathscr{E}(X^p)+\mathscr{E}(Y^p)\}.$$If $p>1$, the factor $2^p$ may be replaced by $2^{p-1}$. If $0\leq p\leq1$, it may be replaced by $1$.
See List of Inequalities.
$\bullet$ Proof.
Application of Lindeberg's Central Limit Theorem (3): NOT converge to Normal
Let $\{X_n,\,n\geq1\}$ be independent random variables with $$\mathscr{P}\{X_k=\pm1\}=\frac{1}{2}\left(1-\frac{1}{k}\right)\mbox{ and }\mathscr{P}\{X_k=\pm\sqrt{k}\}=\frac{1}{2k}.$$Let $S_n=\sum_{k=1}^nX_k$; we show that $S_n/B_n$ does NOT converge in distribution to the normal, by showing that Lindeberg's condition fails.
First, we evaluate the mean and variance of the $X_k$'s and of $S_n$: $$\begin{array}{rl}&\mathscr{E}(X_k)=\frac{1}{2}\left(1-\frac{1}{k}\right)-\frac{1}{2}\left(1-\frac{1}{k}\right)+\sqrt{k}\frac{1}{2k}-\sqrt{k}\frac{1}{2k}=0\\ \implies & \mathscr{E}(S_n)=\sum_{k=1}^n\mathscr{E}(X_k)=0.\end{array}$$ And, $$\sigma^2(X_k)=\mathscr{E}(X_k^2)=\frac{1}{2}\left(1-\frac{1}{k}\right)+\frac{1}{2}\left(1-\frac{1}{k}\right)+k\frac{1}{2k}+k\frac{1}{2k}=2-\frac{1}{k},$$then $$B_n^2=\sigma^2(S_n)=\sum_{k=1}^n\sigma^2(X_k)=2n-\sum_{k=1}^n\frac{1}{k}.$$Now fix $0<\eta<\frac{1}{\sqrt{2}}$. For $n$ large we have $1<\eta B_n<\sqrt{n}$, and $|X_k|>\eta B_n$ can only occur when $X_k=\pm\sqrt{k}$ with $k>\eta^2B_n^2$, so the Lindeberg sum satisfies $$\begin{array}{rl}\frac{1}{B_n^2}\sum_{k=1}^n\mathscr{E}\left(X_k^2\,I\{|X_k|>\eta B_n\}\right) & \geq \frac{1}{B_n^2}\sum_{\eta^2B_n^2<k\leq n}\mathscr{E}\left(X_k^2\,I\{|X_k|=\sqrt{k}\}\right) \\ & = \frac{1}{B_n^2}\sum_{\eta^2B_n^2<k\leq n}k\cdot\frac{1}{k} \\ & \geq \frac{n-\eta^2B_n^2}{B_n^2}\\ & \rightarrow \frac{1}{2}-\eta^2>0\mbox{ as }n\rightarrow\infty.\end{array}$$That is, Lindeberg's condition fails (and, since Lyapunov's condition implies Lindeberg's, Lyapunov's condition fails as well). Moreover, the array $\{X_k/B_n\}$ is UAN, since for $\varepsilon>0$ and $n$ large, $$\underset{1\leq k\leq n}{\max}\mathscr{P}\{|X_k|>\varepsilon B_n\}\leq\frac{1}{\varepsilon^2B_n^2}\rightarrow0.$$By the necessity part of the Lindeberg-Feller theorem, $S_n/B_n$ therefore does NOT converge in distribution to $\Phi$.
$\Box$
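The limit $\frac{1}{2}-\eta^2$ of the Lindeberg sum can be checked numerically; a short R sketch evaluating the exact sum for a fixed $\eta$ (here $\eta=0.3$):
eta <- 0.3
for (n in c(10^3, 10^5, 10^6)) {
  Bn2 <- 2 * n - sum(1 / (1:n))                 # B_n^2 = 2n - sum of 1/k
  lindeberg <- sum((1:n) > eta^2 * Bn2) / Bn2   # exact Lindeberg sum once eta*B_n > 1
  cat(n, lindeberg, '\n')
}
0.5 - eta^2                                     # the limiting value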
Application of Lyapunov's Central Limit Theorem (3)
Let $\{X_n,\,n\geq1\}$ be independent random variables with $$\mathscr{P}\{X_k=\pm1\}=\frac{1}{2}\left(1-\frac{1}{k^2}\right),\quad\mathscr{P}\{X_k=\pm\sqrt{k}\}=\frac{1}{4k^2},\quad\mathscr{P}\{X_k=0\}=\frac{1}{2k^2},$$so that the probabilities sum to one. Let $S_n=\sum_{k=1}^nX_k$; we show that $S_n/B_n$ converges in distribution to the normal for suitable normalizing constants $B_n$.
First, we evaluate the mean and variance of $X_k$'s and $S_n$, $$\begin{array}{rl}&\mathscr{E}(X_k)=\frac{1}{2}\left(1-\frac{1}{k^2}\right)-\frac{1}{2}\left(1-\frac{1}{k^2}\right)+\sqrt{k}\frac{1}{4k^2}-\sqrt{k}\frac{1}{4k^2}=0\\ \implies & \mathscr{E}(S_n)=\sum_{k=1}^n\mathscr{E}(X_k)=0.\end{array}$$ And, $$\sigma^2(X_k)=\mathscr{E}(X_k^2)=\frac{1}{2}\left(1-\frac{1}{k^2}\right)+\frac{1}{2}\left(1-\frac{1}{k^2}\right)+k\frac{1}{4k^2}+k\frac{1}{4k^2}=1+\frac{1}{2k}-\frac{1}{k^2},$$then $$B_n^2=\sigma^2(S_n)=\sum_{k=1}^n\sigma^2(X_k)=n+\sum_{k=1}^n\frac{1}{2k}-\sum_{k=1}^n\frac{1}{k^2}.$$Let $\delta=2$ in the Lyapunov's condition, we have $$\mathscr{E}(X_k^4)=\frac{1}{2}\left(1-\frac{1}{k^2}\right)+\frac{1}{2}\left(1-\frac{1}{k^2}\right)+k^2\frac{1}{4k^2}+k^2\frac{1}{4k^2}=\frac{3}{2}-\frac{1}{k^2},$$thus the Lyapunov's condition $$\frac{1}{B_n^4}\sum_{k=1}^n\mathscr{E}(X_k^4)=\frac{\frac{3n}{2}-\sum_{k=1}^n\frac{1}{k^2}}{\left[n+\sum_{k=1}^n\frac{1}{2k}-\sum_{k=1}^n\frac{1}{k^2}\right]^2}\rightarrow0\mbox{ as }n\rightarrow\infty,$$holds by comparing the order of $n$. Hence we have $$\frac{S_n}{B_n}\overset{d}{\rightarrow}\Phi.$$
$\Box$
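A quick numerical check in R that the Lyapunov ratio above indeed tends to $0$ (a sketch using the exact moment formulas derived above):
for (n in c(10^2, 10^4, 10^6)) {
  k <- 1:n
  Bn2 <- n + sum(1 / (2 * k)) - sum(1 / k^2)  # B_n^2
  lyap <- sum(3 / 2 - 1 / k^2) / Bn2^2        # (1/B_n^4) * sum of E(X_k^4)
  cat(n, lyap, '\n')                          # decreases toward 0
}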
Related Topic on Uniform Integrability
If $\{|X_n|^\beta,\,n\geq1\}$ is uniformly integrable for some $\beta\geq1$ and $S_n=\sum_{i=1}^nX_i$, then $$\left|\frac{S_n}{n}\right|^\beta\mbox{ is uniformly integrable.}$$
$\bullet$ Proof.
Sunday, August 23, 2015
Proof of Lyapunov's Inequality
Lyapunov's inequality. Let $X$ be a random variable. For $0<s<t$, $$\left(\mathscr{E}|X|^s\right)^\frac{1}{s}\leq \left(\mathscr{E}|X|^t\right)^\frac{1}{t}.$$
See List of Inequalities.
$\bullet$ Proof.
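A Monte Carlo illustration in R (a sketch with exponential draws and, say, $s=1$, $t=3$):
set.seed(1)
x <- rexp(1e5)
s <- 1; t <- 3
c(lhs = mean(abs(x)^s)^(1 / s), rhs = mean(abs(x)^t)^(1 / t))   # lhs <= rhs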
Proof of Chebyshev's inequality
Chebyshev's inequality. Let $X$ be a random variable. Let $\phi$ be a strictly increasing function on $(0,\infty)$ and $\phi(u)=\phi(-u)$. Suppose $\mathscr{E}[\phi(X)]<\infty$. Then $\forall\,u>0$, $$\mathscr{P}\{|X|\geq u\}\leq\frac{\mathscr{E}[\phi(X)]}{\phi(u)}.$$
See List of Inequalities.
$\bullet$ Proof.
Proof of Jensen's inequality
Jensen's inequality. Let $X$ be a random variable. Let $\phi$ be a convex function. Suppose $X$ and $\phi(X)$ are integrable. $$\phi(\mathscr{E}X)\leq \mathscr{E}[\phi(X)].$$
See List of Inequalities.
$\bullet$ Proof.
Proof of Minkowski's inequality
Minkowski's inequality. Let $X$ and $Y$ be random variables and let $1<p<\infty$. Then $$\left(\mathscr{E}|X+Y|^p\right)^{\frac{1}{p}}\leq \left(\mathscr{E}|X|^p\right)^{\frac{1}{p}}+\left(\mathscr{E}|Y|^p\right)^{\frac{1}{p}}.$$
See List of Inequalities.
$\bullet$ Proof.
Proof of Hölder's inequality
Hölder's inequality. Let $X$ and $Y$ be random variables, and let $1<p<\infty$ with $\frac{1}{p}+\frac{1}{q}=1$. Then $$|\mathscr{E}(XY)|\leq \mathscr{E}|XY|\leq \left(\mathscr{E}|X|^p\right)^{\frac{1}{p}}\left(\mathscr{E}|Y|^q\right)^{\frac{1}{q}}.$$
See List of Inequalities.
$\bullet$ Proof.
Extension of Borel-Cantelli Lemma II
Let $\mathscr{F}$ be a Borel field and let $\{E_n\}_{n\geq1}$ be events in $\mathscr{F}$. If $\{E_n\}$ are pairwise independent, then the conclusion
$$\sum_{n=1}^\infty \mathscr{P}\{E_n\} = \infty\implies\mathscr{P}\{E_n\mbox{ i.o.}\}=1$$remains true.
[See Borel-Cantelli Lemma]
$\bullet$ Proof.
Saturday, August 22, 2015
Application of Lyapunov's Central Limit Theorem (2): Coupon Collector's Problem
Coupon Collector's Problem. Coupons are drawn at random with replacement from among $N$ distinct coupons until exactly $n$ distinct coupons are observed. Let $S_n$ denote the total number of coupons drawn. Then $S_n=Y_1+\cdots+Y_n$, where $Y_j$ is the number of coupons drawn after observing $j-1$ distinct coupons until the $j$th distinct coupon is drawn. Then $Y_1$, ..., $Y_n$ are independent Geometric random variables with means and variances $$\begin{array}{rl}\mathscr{E}(Y_j)&=\frac{N}{N-j+1};\\ \sigma^2(Y_j)&=\frac{N(j-1)}{(N-j+1)^2}.\end{array}$$Let $n=\lceil Nr\rceil$ for some fixed $r\in(0,1)$. Then $$\alpha_n=\mathscr{E}(S_n)=\sum_{j=1}^n\mathscr{E}(Y_j)=\sum_{j=1}^n\frac{N}{N-j+1}=\sum_{j=1}^n\frac{1}{1-\frac{j-1}{N}},$$and we have $$N\int_{-\frac{1}{N}}^{r-\frac{1}{N}}\frac{1}{1-x}\,dx\leq\mathscr{E}(S_n)\leq N\int_{0}^{r}\frac{1}{1-x}\,dx.$$Thus, as $N\rightarrow\infty$, $$\mathscr{E}(S_n)\sim\frac{n}{r}\log{\left(\frac{1}{1-r}\right)}.$$Similarly, we have $$\sigma^2_n=\sigma^2(S_n)=\sum_{j=1}^n\sigma^2(Y_j)=\sum_{j=1}^n\frac{N(j-1)}{(N-j+1)^2}=\sum_{j=1}^n\frac{\frac{j-1}{N}}{\left(1-\frac{j-1}{N}\right)^2},$$then, as $N\rightarrow\infty$, $$\sigma^2(S_n)\sim N\int_0^r\frac{x}{(1-x)^2}\,dx=\frac{n}{r}\left(\frac{r}{1-r}+\log{(1-r)}\right).$$Consider Lyapunov's condition with $\delta=2$. Since the $Y_j$ are independent with $Y_j\sim\mbox{Geo}\left(p_j=\frac{N-j+1}{N}\right)$, we have $$\mathscr{E}|Y_j|^4=\frac{1}{p_j^4}(2-p_j)(12-12p_j+p_j^2)<\infty.$$Lyapunov's condition $$\begin{array}{rl}\frac{1}{\sigma^4_n}\sum_{j=1}^n\mathscr{E}\left|Y_j-\mathscr{E}(Y_j)\right|^4
&\leq\frac{1}{\sigma^4_n}\sum_{j=1}^n\mathscr{E}|Y_j|^4 \\
&=\frac{1}{\sigma^4_n}\sum_{j=1}^n\left[\frac{1}{p_j^4}(2-p_j)(12-12p_j+p_j^2)\right] \\
&\leq\frac{1}{\sigma^4_n}\sum_{j=1}^n\frac{26}{p_j^4}\quad(\because\,0\leq p_j\leq1)\\
&\leq\frac{26}{\sigma^4_n}N\int_0^r\frac{1}{(1-x)^4}\,dx \\
&=\frac{N}{N^2}\frac{26\cdot c(r)}{\left[\frac{r}{1-r}+\log{(1-r)}\right]^2}\quad (c(r)\mbox{ is a constant})\\
&\rightarrow0\mbox{ as }N\rightarrow\infty\end{array}$$holds. Thus, $$\sqrt{n}\left(\frac{S_n}{n}-m\right)\overset{d}{\rightarrow}N(0,\sigma^2),$$where $$\begin{array}{rl}nm=\mathscr{E}(S_n) &\implies m=-\frac{\log{(1-r)}}{r}\\
n\sigma^2=\sigma^2_n &\implies\sigma^2=\frac{1}{r}\left(\frac{r}{1-r}+\log{(1-r)}\right).\end{array}$$
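A simulation sketch in R of this normal approximation (with the hypothetical values $N=5000$ and $r=0.5$; recall that R's rgeom counts failures, so each waiting time is rgeom plus 1):
set.seed(1)
N <- 5000; r <- 0.5; n <- ceiling(N * r); sim <- 2000
p <- (N - (1:n) + 1) / N                            # success probability of the j-th waiting time
Sn <- replicate(sim, sum(rgeom(n, prob = p) + 1))   # S_n = Y_1 + ... + Y_n
m  <- -log(1 - r) / r
s2 <- (r / (1 - r) + log(1 - r)) / r
z  <- sqrt(n) * (Sn / n - m)
hist(z, freq = FALSE, breaks = 30, col = 'gray', border = 'white', main = '')
curve(dnorm(x, 0, sqrt(s2)), col = 2, lwd = 2, add = TRUE)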
Friday, August 21, 2015
Application of Lyapunov's Central Limit Theorem (1)
Let $S_n=\sum_{k=1}^nX_{nk}$ and $\{X_{nk}\}$ be independent random variables with $$\mathscr{P}\{X_{nk}=1\}=\frac{1}{n-k+1}=1-\mathscr{P}\{X_{nk}=0\}.$$We have $$\begin{array}{rl}
\alpha_{nk} & = \mathscr{E}(X_{nk}) = \frac{1}{n-k+1}; \\
\sigma^2_{nk} & = \mathscr{E}(X_{nk}-\alpha_{nk})^2 = \frac{1}{n-k+1}\left(1-\frac{1}{n-k+1}\right); \\
\gamma_{nk} & = \mathscr{E}|X_{nk}-\alpha_{nk}|^3 \\
& = \left(1-\alpha_{nk}\right)^3\alpha_{nk}+\alpha_{nk}^3\left(1-\alpha_{nk}\right)\quad(\because\,X_{nk}\mbox{ is Bernoulli})\\
&=\frac{1}{n-k+1}\left(1-\frac{1}{n-k+1}\right)\left[\left(1-\frac{1}{n-k+1}\right)^2+\frac{1}{(n-k+1)^2}\right]\\
&=\frac{(n-k)\left[(n-k+1)^2-2(n-k+1)+2\right]}{(n-k+1)^4}.
\end{array}$$Let $$\begin{array}{rl}
\sigma^2_n &=\sigma^2(S_n)=\sum_{k=1}^n\sigma^2_{nk}=\sum_{k=1}^n\frac{n-k}{(n-k+1)^2}=\sum_{k=1}^n\frac{k-1}{k^2}; \\
\Gamma_n &=\sum_{k=1}^n\gamma_{nk}=\sum_{k=1}^n\frac{(n-k)\left[(n-k+1)^2-2(n-k+1)+2\right]}{(n-k+1)^4}=\sum_{k=1}^n\frac{(k-1)(k^2-2k+2)}{k^4}.
\end{array}$$By Lyapunov's condition, $$
\frac{1}{\sigma_n^3}\Gamma_n=\frac{\sum_{k=1}^n\frac{(k-1)(k^2-2k+2)}{k^4}}{\left(\sum_{k=1}^n\frac{k-1}{k^2}\right)^{3/2}}=\frac{\sum_{k=1}^n\frac{1}{k}-3\sum_{k=1}^n\frac{1}{k^2}+4\sum_{k=1}^n\frac{1}{k^3}-2\sum_{k=1}^n\frac{1}{k^4}}{\left(\sum_{k=1}^n\frac{1}{k}-\sum_{k=1}^n\frac{1}{k^2}\right)^{3/2}}\rightarrow0 $$as $n\rightarrow\infty$, since this ratio is of the same order as $(\sum_{k=1}^n\frac{1}{k})^{-1/2}$. We have $(S_n-A_n)/s_n\overset{d}{\rightarrow}N(0,1)$ where $$\begin{array}{rl}A_n&=\sum_{k=1}^n\alpha_{nk}=\sum_{k=1}^n\frac{1}{n-k+1}=\sum_{k=1}^n\frac{1}{k}; \\
s_n&=\sigma_n=\left(\sum_{k=1}^n\frac{1}{k}-\sum_{k=1}^n\frac{1}{k^2}\right)^{1/2}.\end{array}$$
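A simulation sketch of this limit in R:
set.seed(1)
n <- 2000; sim <- 3000
p <- 1 / (n:1)                               # P{X_nk = 1} = 1/(n-k+1), k = 1, ..., n
Sn <- replicate(sim, sum(rbinom(n, 1, p)))
An <- sum(1 / (1:n))
sn <- sqrt(sum(1 / (1:n)) - sum(1 / (1:n)^2))
qqnorm((Sn - An) / sn); abline(0, 1, col = 2)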
Application of Lindeberg's Central Limit Theorem (2)
Let $X_j$ be defined as follows for some $\alpha>1$: $$X_j=\begin{cases}\pm j^\alpha, &\mbox{ with probability }\frac{1}{6\,j^{2(\alpha-1)}}\mbox{ each;}\\0,&\mbox{ with probability }1-\frac{1}{3\,j^{2(\alpha-1)}}.\end{cases}$$ We have $$\begin{array}{rl}
\mathscr{E}(X_j) &=0. \\
\sigma^2(X_j) & =\mathscr{E}(X_j^2)=j^{2\alpha}\frac{2}{6\,j^{2(\alpha-1)}}=\frac{j^2}{3}\implies \\
\sigma^2_n &=\sigma^2(S_n)=\sum_{j=1}^n\sigma^2(X_j)=\sum_{j=1}^n\frac{j^2}{3}=\frac{n(n+1)(2n+1)}{18}. \\
\end{array}$$For $\eta>0$, the Lindeberg sum satisfies $$\begin{array}{rl}
\frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{|X_j|>\eta\sigma_n\}\right)
& = \frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{j^\alpha>\eta\sigma_n\}\right) \\
(\because\,j\leq n)& \leq \frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{n^\alpha>\eta\sigma_n\}\right) \\
& = \frac{1}{\sigma^2_n}\left[\sum_{j=1}^n\mathscr{E}(X_j^2)\right]\,I\{n^\alpha>\eta\sigma_n\} \\
& = I\{n^\alpha>\eta\sigma_n\}. \\
\end{array}$$Hence the Lindeberg sum vanishes once $I\{n^\alpha>\eta\sigma_n\}=0$, i.e. once $n^\alpha<\eta\sigma_n$. Since $\sigma_n$ is of exact order $n^{\frac{3}{2}}$, this happens for every $\eta>0$ and all large $n$ precisely when $\alpha<\frac{3}{2}$. Thus Lindeberg's condition is satisfied for $\alpha<3/2$; keeping only the indices $j$ with $j^\alpha>\eta\sigma_n$ gives a matching lower bound showing that it fails for $\alpha\geq3/2$, so the condition holds if and only if $\alpha<3/2$.
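A quick numerical look in R at the ratio $n^\alpha/(\eta\sigma_n)$ that drives the indicator above (a sketch with $\eta=1$); it decreases when $\alpha<3/2$ and grows when $\alpha>3/2$, though slowly near the boundary:
eta <- 1
for (alpha in c(1.4, 1.6)) {
  for (n in c(10^2, 10^4, 10^6)) {
    sigma_n <- sqrt(n * (n + 1) * (2 * n + 1) / 18)
    cat('alpha =', alpha, ' n =', n, ' ratio =', n^alpha / (eta * sigma_n), '\n')
  }
}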
Application of Lindeberg's Central Limit Theorem (1)
About Posts which Tagged by 'Probability'
For each $j$ let $X_j$ have the uniform distribution in $[-j,j]$. We have $$\begin{array}{rl} \mathscr{E}(X_j) &=\frac{-j+j}{2}=0; \\
\sigma^2(X_j) &=\frac{[j-(-j)]^2}{12}=\frac{j^2}{3}\;\Rightarrow\; \\
\sigma^2_n &=\sigma^2(S_n)=\sum_{j=1}^n\sigma^2(X_j)=\sum_{j=1}^n\frac{j^2}{3}=\frac{n(n+1)(2n+1)}{18}.
\end{array}$$We check Lindeberg's condition: for all $\eta>0$, $$\begin{array}{rl}
\frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{|X_j|>\eta\sigma_n\}\right)
& = \frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{X_j^2>\eta^2\sigma_n^2\}\right) \\
(\because\,|X_j|\leq n) & \leq \frac{1}{\sigma^2_n}\sum_{j=1}^n\mathscr{E}\left(X_j^2\,I\{n^2>\eta^2\sigma_n^2\}\right) \\
& = \frac{1}{\sigma^2_n}\left[\sum_{j=1}^n\mathscr{E}(X_j^2)\right]\,I\left\{1>\frac{\eta^2\sigma_n^2}{n^2}\right\} \\
& = I\left\{1>\eta^2\frac{(n+1)(2n+1)}{18n}\right\}\rightarrow0\mbox{ as }n\rightarrow\infty, \\ \end{array}$$since $(n+1)(2n+1)/18n\rightarrow\infty$ as $n\rightarrow\infty$. Hence Lindeberg's condition holds, and since $\mathscr{E}(S_n)=0$ and $\sigma_n\sim n^{3/2}/3$, $$\frac{S_n-\mathscr{E}(S_n)}{\sigma_n}\sim\frac{3}{\sqrt{n}}\left(\frac{S_n}{n}\right)\overset{d}{\rightarrow}\mathscr{N}(0,1).$$
The R code for simulating this result is as follows.
set.seed(100)
# number of simulation replicates and number of summands n
sim <- 1000; n <- 100
# simulate S_n = X_1 + ... + X_n with X_i ~ Unif(-i, i), for each replicate
s <- sapply(1:sim, function(k) sum(sapply(1:n, function(i) runif(1,-i,i))))
# sample means S_n/n
m <- s/n
# histogram of the simulated means
hist(m, freq = FALSE, xlab = 'mean', main = '', border = 'white', col = 'gray')
title(paste0('Histogram of the Mean of Unif(-i,i), i=1,2,..., n'), line = 2)
title(paste0('n = ', n, '; Simulation times = ', sim), line = 0.6)
# overlay the limiting normal density of S_n/n, i.e. sd = sqrt(n)/3
curve(dnorm(x, 0, sqrt(n)/3), col = 2, lwd = 2, add = TRUE)
legend('topleft', expression(N(0,sqrt(n)/3)), col = 2, lty = 1, lwd = 2, bty = 'n')
Application of Borel-Cantelli Lemma
About Posts which Tagged by 'Probability'
Let $\{X_n\}_{n\geq1}$ be i.i.d. exponential random variables with parameter $\lambda$, then $$\mathscr{P}\left\{\underset{n\rightarrow\infty}{\limsup}\frac{X_n}{\log{n}}=\frac{1}{\lambda}\right\}=1.$$
$\bullet$ Proof.
2015年8月17日 星期一
Proof of Fatou's Lemma
About Posts which Tagged by 'Probability'
[Fatou's Lemma.] If $X_n\geq0$ a.e. on $\Lambda$, then $$\int_\Lambda\underset{n\rightarrow\infty}{\liminf}X_n\,d\mathscr{P}\leq\underset{n\rightarrow\infty}{\liminf}\int_\Lambda X_n\,d\mathscr{P}.$$Furthermore, if for all $n$, $|X_n|\leq Y$ a.e. on $\Lambda$ with $\mathscr{E}(Y)<\infty$, the above remains true as well as $$\int_\Lambda\underset{n\rightarrow\infty}{\limsup}X_n\,d\mathscr{P}\geq\underset{n\rightarrow\infty}{\limsup}\int_\Lambda X_n\,d\mathscr{P}.$$The second statement may fail if the condition involving $Y$ is omitted.
Expectation and Tail Probability (3)
About Posts which Tagged by 'Probability'
Let $X$ be a random variable and $r>0$. Then $\mathscr{E}|X|^r<\infty$ if and only if $\sum_{n=1}^\infty n^{r-1}\mathscr{P}\{|X|\geq n\}$ converges.
Expectation and Tail Probability (2)
About Posts which Tagged by 'Probability'
Let $X$ be a random variable and $c$ be a fixed constant, $c>0$. Then $\mathscr{E}|X|<\infty$ if and only if $\sum_{n=1}^\infty \mathscr{P}\{|X|\geq cn\}$ converges.
Application of Fatou's Lemma
About Posts which Tagged by 'Probability'
Let $\{E_n\}$ be events in a Borel field $\mathscr{F}$. Then we have $$\mathscr{P}\{\underset{n}{\limsup}E_n\}\geq\underset{n}{\overline{\lim}}\mathscr{P}\{E_n\},$$ $$\mathscr{P}\{\underset{n}{\liminf}E_n\}\leq\underset{n}{\underline{\lim}}\mathscr{P}\{E_n\}.$$
2015年8月15日 星期六
Representation of the Characteristic Function
About Posts which Tagged by 'Probability'
We introduce some criteria that a function $f$ is a characteristic function (ch.f.).
1. Bochner's Theorem
$f$ is a ch.f. $\iff$
(1) $f(0)=1$;
(2) $f$ is continuous at $t=0$;
(3) $f$ is positive definite (p.d., see Supp).
The p.d. property is usually hard to verify directly, so checking the conditions of Bochner's Theorem is rarely practical. The following results are more useful in practice for verifying that a function is a ch.f.
2. Pólya's Theorem
If $f:\mathbb{R}\rightarrow\mathbb{R}$ satisfies
(1) $f(0)=1$;
(2) $f(t)\geq0$;
(3) $f(t)=f(-t)$ symmetric;
(4) $f$ is decreasing on $[0,\infty)$;
(5) $f$ is continuous on $[0,\infty)$;
(6) $f$ is convex on $[0,\infty)$,
then $f$ is a ch.f.
3. If $f_\alpha(t)=\exp{\{-|t|^\alpha\}}$, $0<\alpha\leq2$, then $f_\alpha(t)$ is a ch.f.
4. If $f$ is a ch.f., then so is $e^{\lambda(f-1)}$ for each $\lambda\geq0$.
[Supp] A function $f$ is positive definite (p.d.) iff for any finite set of real numbers $t_j$ and complex numbers $z_j$ (with complex conjugate $\bar{z}_j$), $1\leq j\leq n$, we have $$\sum_{j=1}^n\sum_{k=1}^n f(t_j-t_k)z_j\bar{z}_k\geq0.$$
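As a quick numerical sanity check of the p.d. condition (an added sketch, not a proof and not part of the original post), one can verify in R that the matrix $[f(t_j-t_k)]$ has no negative eigenvalues for $f_1(t)=e^{-|t|}$ from item 3:
set.seed(100)
f <- function(t) exp(-abs(t))               # f_1(t) = exp(-|t|), a ch.f. by item 3
t <- sort(runif(50, -5, 5))                 # an arbitrary finite set of real numbers t_j
M <- outer(t, t, function(a, b) f(a - b))   # M[j, k] = f(t_j - t_k)
# all eigenvalues should be nonnegative (up to rounding error)
min(eigen(M, symmetric = TRUE, only.values = TRUE)$values)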
Strong LLN v.s. Weak LLN
About Posts which Tagged by 'Probability'
It is clear that the SLLN implies the WLLN, since almost sure convergence implies convergence in probability. Here we give a counterexample that satisfies the WLLN but not the SLLN.
$\bullet$ Counterexample. (WLLN $\not\Rightarrow$ SLLN)
Lindeberg's CLT v.s. Lyapunov's CLT
About Posts which Tagged by 'Probability'
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and $$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n; \\
\mathscr{E}\left(|X_{nj}-\alpha_{nj}|^{2+\delta}\right)=\gamma^{2+\delta}_{nj},\;\delta>0.&
\end{array}$$
$\diamondsuit$ Lyapunov's condition:
$\exists\,\delta>0$ such that $\gamma^{2+\delta}_{nj}$ exists for each $n$ and $j$, and $$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^{2+\delta}_n}\sum_{j=1}^{k_n}\gamma^{2+\delta}_{nj}=0.$$
$\diamondsuit$ Lindeberg's condition:
$$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^2_n}\sum_{j=1}^{k_n}\mathscr{E}\left[(X_{nj}-\alpha_{nj})^2\,I\left(|X_{nj}-\alpha_{nj}|>\eta s_n\right)\right]=0,\;\forall\,\eta>0.$$
[Theorem] If Lyapunov's condition holds, then so does Lindeberg's condition. The converse is NOT true.
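The implication is a one-line truncation bound: on the event $\{|X_{nj}-\alpha_{nj}|>\eta s_n\}$ we have $1\leq\left(\frac{|X_{nj}-\alpha_{nj}|}{\eta s_n}\right)^\delta$, hence $$\frac{1}{s^2_n}\sum_{j=1}^{k_n}\mathscr{E}\left[(X_{nj}-\alpha_{nj})^2\,I\left(|X_{nj}-\alpha_{nj}|>\eta s_n\right)\right]\leq\frac{1}{\eta^\delta s^{2+\delta}_n}\sum_{j=1}^{k_n}\mathscr{E}|X_{nj}-\alpha_{nj}|^{2+\delta}=\frac{1}{\eta^\delta}\cdot\frac{1}{s^{2+\delta}_n}\sum_{j=1}^{k_n}\gamma^{2+\delta}_{nj}\rightarrow0$$ for every $\eta>0$ whenever Lyapunov's condition holds.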
Converge in Distribution and Vague Convergence (1): Equivalence for s.p.m.'s
About Posts which Tagged by 'Probability'
[Notations] Sets of Continuous functions.
$C_K\,$: the set of continuous functions $f$ each vanishing outside a compact set $K(f)$.
$C_0\;\,$: the set of continuous functions $f$ such that $\lim_{|x|\rightarrow\infty}f(x)=0$.
$C_B\,$: the set of bounded continuous functions.
$C\;\;\,$: the set of continuous functions.
It is clear that $f\in C_K\implies f\in C_0\implies f\in C_B\implies f\in C$.
[Theorem] Let $\{\mu_n\}_{n\geq1}$ be a sequence of s.p.m.'s and let $\mu$ be an s.p.m. Then $$\mu_n\overset{v}{\longrightarrow}\mu\iff\forall\,f\in C_K\,(\mbox{or }C_0),\;\int f\,d\mu_n\rightarrow\int f\,d\mu.$$
$\bullet$ Proof.
Slutsky's Theorem
About Posts which Tagged by 'Probability'
If $X_n\rightarrow X$ in distribution, and $Y_n\rightarrow0$ in distribution, then
(1) $X_n+Y_n\rightarrow X$ in distribution;
(2) $X_nY_n\rightarrow 0$ in distribution.
$\bullet$ Proof.
Uniformly Integrable
About Posts which Tagged by 'Probability'
Let $\{X_t\}$, $t\in T$, be a family of random variables, where $T$ is an arbitrary index set.
[Definition] $\{X_t\}$ is said to be uniformly integrable iff $$\underset{A\rightarrow\infty}{\lim}\int_{|X_t|>A}|X_t|\,d\mathscr{P}=0$$ uniformly in $t\in T$.
[Theorem] The family $\{X_t\}$ is uniformly integrable if and only if the following two conditions are satisfied:
(1) $\mathscr{E}|X_t|$ is bounded in $t\in T$.
(2) For every $\varepsilon>0$, there exists $\delta(\varepsilon)>0$ such that for any $E\in\mathscr{F}$, $$\mathscr{P}(E)<\delta(\varepsilon)\implies\int_E|X_t|d\mathscr{P}<\varepsilon\mbox{ for every }t\in T.$$
$\bullet$ Proof.
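As an illustration of why condition (1) alone is not enough (a standard example, not from the original post): on $[0,1]$ with Lebesgue measure, let $X_t=t\,I_{[0,1/t]}$ for $t\geq1$. Then $\mathscr{E}|X_t|=1$ for every $t$, so (1) holds, yet $\int_{|X_t|>A}|X_t|\,d\mathscr{P}=1$ whenever $t>A$, so the family is not uniformly integrable; equivalently, (2) fails because $\mathscr{P}([0,1/t])=1/t$ can be arbitrarily small while $\int_{[0,1/t]}|X_t|\,d\mathscr{P}=1$.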
Convergence Theorems
About Posts which Tagged by 'Probability'
Let $X$ and $\{X_n\}$ be random variables.
1. Monotone Convergence Theorem
If $X_n\geq0$ and $X_n\uparrow X$ a.e. on $\Lambda$, then $$\underset{n\rightarrow\infty}{\lim}\int_\Lambda X_n\,d\mathscr{P}=\int_\Lambda X\,d\mathscr{P}=\int_\Lambda\underset{n\rightarrow\infty}{\lim}X_n\,d\mathscr{P}.$$
2. Dominated Convergence Theorem
If $\underset{n\rightarrow\infty}{\lim}X_n=X$ a.e. and, for all $n$, $|X_n|\leq Y$ a.e. on $\Lambda$ with $\mathscr{E}(Y)<\infty$, then $$\underset{n\rightarrow\infty}{\lim}\int_\Lambda X_n\,d\mathscr{P}=\int_\Lambda X\,d\mathscr{P}=\int_\Lambda\underset{n\rightarrow\infty}{\lim}X_n\,d\mathscr{P}.$$
3. Bounded Convergence Theorem
If $\underset{n\rightarrow\infty}{\lim}X_n=X$ a.e. and there exists a constant $M$ such that, for all $n$, $|X_n|\leq M$ a.e. on $\Lambda$, then $$\underset{n\rightarrow\infty}{\lim}\int_\Lambda X_n\,d\mathscr{P}=\int_\Lambda X\,d\mathscr{P}=\int_\Lambda\underset{n\rightarrow\infty}{\lim}X_n\,d\mathscr{P}.$$
4. Fatou's Lemma
If $X_n\geq0$ a.e. on $\Lambda$, then $$\int_\Lambda\underset{n\rightarrow\infty}{\liminf}X_n\,d\mathscr{P}\leq\underset{n\rightarrow\infty}{\liminf}\int_\Lambda X_n\,d\mathscr{P}.$$Furthermore, if for all $n$, $|X_n|\leq Y$ a.e. on $\Lambda$ with $\mathscr{E}(Y)<\infty$, the above remains true as well as $$\int_\Lambda\underset{n\rightarrow\infty}{\limsup}X_n\,d\mathscr{P}\geq\underset{n\rightarrow\infty}{\limsup}\int_\Lambda X_n\,d\mathscr{P}.$$See Proof of Fatou's Lemma
See Application of Fatou's Lemma
2015年8月14日 星期五
Expectation and Tail Probability (1)
About Posts which Tagged by 'Probability'
Let $X$ be a random variable. We have $$\sum_{n=1}^\infty \mathscr{P}\{|X|\geq n\}\leq \mathscr{E}|X|\leq1+\sum_{n=1}^\infty \mathscr{P}\{|X|\geq n\}$$ so that $\mathscr{E}|X|<\infty$ if and only if $\sum_{n=1}^\infty \mathscr{P}\{|X|\geq n\}$ converges.
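As a quick numerical illustration (an added sketch using $X\sim\mbox{Exp}(1)$, not from the original post): here $\mathscr{E}|X|=1$ and $\mathscr{P}\{|X|\geq n\}=e^{-n}$, so the bounds read $\frac{1}{e-1}\approx0.582\leq1\leq1.582$. In R:
# tail-sum bounds for X ~ Exp(1): E|X| = 1 and P(|X| >= n) = exp(-n)
tail_sum <- sum(exp(-(1:1000)))              # approximately 1/(e - 1) = 0.582...
c(lower = tail_sum, mean = 1, upper = 1 + tail_sum)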
Application of Three Series Theorem on Strong Convergence
About Posts which Tagged by 'Probability'
Let $\phi$ be a positive, even and continuous function on $(-\infty,\infty)$ such that as $|x|$ increases, $$\frac{\phi(x)}{|x|}\uparrow,\;\frac{\phi(x)}{x^2}\downarrow.$$ Let $\{X_n\}$ be a sequence of independent random variables with d.f.'s $F_n$ and $\mathscr{E}(X_n)=0$ and $0<a_n\uparrow\infty$. If, additionally, $\phi$ satisfies $$\sum_n\frac{\mathscr{E}\left(\phi(X_n)\right)}{\phi(a_n)}<\infty,$$ then $$\sum_n\frac{X_n}{a_n}\mbox{ converges a.e.}$$
Extension of Strong Law of Large Number
About Posts which Tagged by 'Probability'
Let $\{X_n\}$ be a sequence of independent and identically distributed random variables with $\mathscr{E}|X_1|=\infty$. Let $\{a_n\}$ be a sequence of positive numbers satisfying the condition $a_n/n \uparrow$. Define $S_n=\sum_j X_j$. Then we have $$\underset{n\rightarrow\infty}{\limsup}\frac{|S_n|}{a_n}=0\;\mbox{ a.s.,}\mbox{ or }=+\infty\;\mbox{ a.s.}$$ according as $$\sum_n\mathscr{P}\{|X_n|\geq a_n\}=\sum_n\int_{|x|\geq a_n}\,dF(x)<\infty,\mbox{ or }=+\infty.$$
$\bullet$ Proof. (The Convergence Part)
Extension of Weak Law of Large Number (1)
About Posts which Tagged by 'Probability'
Let $\{X_n\}$ be a sequence of independent random variables with distribution functions $\{F_n\}$. Define $S_n=\sum_j X_j$. Let $\{b_n\}$ be a given sequence of real numbers increasing to $+\infty$. Suppose that we have
(1) $\displaystyle\sum_{j=1}^n\int_{|x|>b_n}\,dF_j(x)=o(1)$,
(2) $\displaystyle\frac{1}{b_n^2}\sum_{j=1}^n\int_{|x|\leq b_n}x^2\,dF_j(x)=o(1)$;
then if we put $$a_n=\sum_{j=1}^n\int_{|x|\leq b_n}x\,dF_j(x),$$ we have $$\frac{1}{b_n}(S_n-a_n)\rightarrow0\mbox{ in probability.}$$
$\bullet$ Proof.
Strong Law of Large Number
About Posts which Tagged by 'Probability'
Let $\{X_n\}$ be a sequence of independent and identically distributed random variables. Define $S_n=\sum_j X_j$. Then we have $$\mathscr{E}|X_1|<\infty\implies\frac{S_n}{n}\rightarrow \mathscr{E}(X_1)\;\mbox{ a.s.}$$ $$\mathscr{E}|X_1|=\infty\implies\underset{n\rightarrow\infty}{\limsup}\frac{|S_n|}{n}=+\infty\;\mbox{ a.s.}$$
Kolmogorov's Three Series Theorem
About Posts which Tagged by 'Probability'
Let $\{X_n\}$ be a sequence of independent random variables. For fixed $A>0$, define $$Y_n(\omega)=\begin{cases}X_n(\omega),&\mbox{if }|X_n(\omega)|\leq A\\ 0,&\mbox{if }|X_n(\omega)|>A \end{cases}$$ Then $\sum_nX_n$ converges a.e. $\iff$ the following three series all converge:
(1) $\sum_n\mathscr{P}\{|X_n|>A\}=\sum_n\mathscr{P}\{X_n\neq Y_n\}$;
(2) $\sum_n\mathscr{E}(Y_n)$;
(3) $\sum_n\sigma^2(Y_n)$.
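For example (a standard illustration, not from the original post), let $X_n=\varepsilon_n/n$ where the $\varepsilon_n$ are independent with $\mathscr{P}\{\varepsilon_n=\pm1\}=\frac{1}{2}$. Taking $A=1$ gives $Y_n=X_n$ for all $n$, so the three series are $\sum_n\mathscr{P}\{|X_n|>1\}=0$, $\sum_n\mathscr{E}(Y_n)=0$ and $\sum_n\sigma^2(Y_n)=\sum_n\frac{1}{n^2}<\infty$; hence the harmonic series with random signs, $\sum_n\varepsilon_n/n$, converges a.e.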
Application of Three Series Theorem on Strong Convergence
Weak Law of Large Number
About Posts which Tagged by 'Probability'
Let $\{X_n\}$ be a sequence of pairwise independent and identically distributed random variables with finite mean $m$. Define $S_n=\sum_j X_j$. Then $$\frac{S_n}{n}\rightarrow m\mbox{ in probability.}$$
Simple Limit Theorems
About Posts which Tagged by 'Probability'
Let $\{X_n\}_{n\geq1}$ be a sequence of random variables, and $S_n=\sum_{j=1}^n X_j$. To verify $$\frac{S_n-\mathscr{E}(S_n)}{n}\overset{p}{\rightarrow}0,$$ it suffices to show $\sigma^2(S_n)=o(n^2)$, which gives convergence in $L^2$ and hence in probability. However, this may not be easy to check directly.
[Theorem] If the $X_j$'s are uncorrelated and their second moments have a common bound, then $$\frac{S_n-\mathscr{E}(S_n)}{n}\rightarrow0$$ is true
(1) in $L^2$;
(2) in probability;
(3) almost surely.
$\bullet$ Proof.
Converge in r-th Mean v.s. Converge in Probability
About Posts which Tagged by 'Probability'
Let $X$ and $\{X_n\}_{n\geq1}$ be random variables. Convergence of $X_n$ to $X$ in $r$-th mean implies convergence of $X_n$ to $X$ in probability. The converse is NOT true unless $X_n$ is dominated by some random variable with a finite $r$-th moment.
Converge in Probability v.s. Converge in Distribution
About Posts which Tagged by 'Probability'
Let $X$ and $\{X_n\}_{n\geq1}$ be random variables with distribution functions $F$ and $\{F_n\}_{n\geq1}$. Convergence of $X_n$ to $X$ in probability implies convergence of $X_n$ to $X$ in distribution. The converse is NOT true unless $F$ degenerates to a constant.
2015年8月13日 星期四
Converge Almost Surely v.s. Converge in r-th Mean
About Posts which Tagged by 'Probability'
Let $X$ and $\{X_n\}_{n\geq1}$ be random variables. Convergence of $X_n$ to $X$ almost surely does NOT imply convergence of $X_n$ to $X$ in $r$-th mean, and vice versa.
Converge Almost Surely v.s. Converge in Probability
About Posts which Tagged by 'Probability'
Let $X$ and $\{X_n\}_{n\geq1}$ be random variables. Convergence of $X_n$ to $X$ almost surely implies convergence of $X_n$ to $X$ in probability. The converse is NOT true, except that convergence in probability implies almost sure convergence along a subsequence.
Inequalities for Random Variable
About Posts which Tagged by 'Probability'
Let $X$ and $Y$ be random variables.
(1) [See Proof] H$\ddot{o}$lder's inequality. Let $1<p<\infty$ and $\frac{1}{p}+\frac{1}{q}=1$. $$|\mathscr{E}(XY)|\leq \mathscr{E}|XY|\leq \left(\mathscr{E}|X|^p\right)^{\frac{1}{p}}\left(\mathscr{E}|Y|^q\right)^{\frac{1}{q}}.$$
(2) [See Proof] Minkowski's inequality. Let $1<p<\infty$. $$\left(\mathscr{E}|X+Y|^p\right)^{\frac{1}{p}}\leq \left(\mathscr{E}|X|^p\right)^{\frac{1}{p}}+\left(\mathscr{E}|Y|^p\right)^{\frac{1}{p}}.$$
(3) [See Proof] Lyapunov's inequality. For $0<s<t$, $$\left(\mathscr{E}|X|^s\right)^\frac{1}{s}\leq \left(\mathscr{E}|X|^t\right)^\frac{1}{t}.$$
(4) [See Proof] Jensen's inequality. Let $\phi$ be a convex function. Suppose $X$ and $\phi(X)$ are integrable. $$\phi(\mathscr{E}X)\leq \mathscr{E}[\phi(X)].$$
(5) [See Proof] Chebyshev's inequality. Let $\phi$ be a strictly increasing function on $(0,\infty)$ and $\phi(u)=\phi(-u)$. Suppose $\mathscr{E}[\phi(X)]<\infty$. Then $\forall\,u>0$, $$\mathscr{P}\{|X|\geq u\}\leq\frac{\mathscr{E}[\phi(X)]}{\phi(u)}.$$
(6) [See Proof] If $X\geq0$ and $Y\geq0$, $p\geq0$, then $$\mathscr{E}\{(X+Y)^p\}\leq2^p\{\mathscr{E}(X^p)+\mathscr{E}(Y^p)\}.$$If $p>1$, the factor $2^p$ may be replaced by $2^{p-1}$. If $0\leq p\leq1$, it may be replaced by $1$.
(7) [See Proof] Cantelli's inequality. Suppose $\sigma^2=\mbox{Var}(X)<\infty$. Then for $a>0$, we have $$\mathscr{P}\{|X-\mathscr{E}(X)|>a\}\leq\frac{2\sigma^2}{a^2+\sigma^2}.$$
(8) [See Proof] Chebyshev type for maximal sum of random variables I. Let $\{X_n\}$ be independent random variables such that $\mathscr{E}(X_n)=0$ and $\mathscr{E}(X_n^2)=\sigma^2(X_n)<\infty$ for all $n$, then let $S_n=\sum_{j=1}^nX_j$, we have for every $\varepsilon>0$, $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|>\varepsilon\right\}\leq\frac{\sigma^2(S_n)}{\varepsilon^2}.$$
(9) [See Proof] Chebyshev type for maximal sum of random variables II. Let $\{X_n\}$ be independent random variables with finite means and suppose that there exists an $A$ such that $$\forall\,n,\,|X_n-\mathscr{E}(X_n)|\leq A<\infty,$$Then let $S_n=\sum_{j=1}^nX_j$, we have for every $\varepsilon>0$, $$\mathscr{P}\left\{\underset{1\leq j\leq n}{\max}|S_j|\leq\varepsilon\right\}\leq\frac{(2A+4\varepsilon)^2}{\sigma^2(S_n)}.$$
(10) [See Proof] If $\mathscr{E}(X^2)=1$ and $\mathscr{E}|X|\geq a>0$, then $$\mathscr{P}\{|X|\geq\lambda a\}\geq(1-\lambda)^2a^2\mbox{ for }0\leq\lambda\leq1.$$
Lindeberg-Feller's Central Limit Theorem (short version)
About Posts which Tagged by 'Probability'
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$If for all $\eta>0$ $$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^2_n}\sum_{j=1}^{k_n}\mathscr{E}\left[(X_{nj}-\alpha_{nj})^2\,I\left(|X_{nj}-\alpha_{nj}|>\eta s_n\right)\right]=0,$$ then $$ \frac{S_n-\alpha_n}{s_n}\overset{d}{\longrightarrow} \Phi. $$
$\bullet$ Proof.
Lyapunov's Central Limit Theorem
Declaration for Posts which Tagged by 'Probability'
Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and $$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n; \\
\mathscr{E}\left(|X_{nj}-\alpha_{nj}|^{2+\delta}\right)=\gamma^{2+\delta}_{nj}, &
\end{array}$$ where $\delta>0$. If $\gamma^{2+\delta}_{nj}$ exists for each $n$ and $j$, and $$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^{2+\delta}_n}\sum_{j=1}^{k_n}\gamma^{2+\delta}_{nj}=0,$$ then $$ \frac{S_n-\alpha_n}{s_n}\overset{d}{\longrightarrow} \Phi. $$
$\bullet$ Proof.
The Classical Central Limit Theorem
About Posts which Tagged by 'Probability'
Let $\{X_n\}_{n=1}^\infty$ be a sequence of i.i.d. random variables with mean $m$ and finite variance $\sigma^2>0$ and define $S_n=\sum_{j=1}^nX_j$. Then $$ \frac{S_n-mn}{\sigma\sqrt{n}}\overset{L}{\longrightarrow} \Phi. $$
$\bullet$ Proof.
Borel-Cantelli Lemma
About Posts which Tagged by 'Probability'
Let $\mathscr{F}$ be a Borel field and let $\{E_n\}_{n\geq1}\subset\mathscr{F}$ be events. We have
(1) $\sum_{n=1}^\infty \mathscr{P}\{E_n\} < \infty \implies \mathscr{P}\{E_n\mbox{ i.o.}\}=0;$
(2) if $\sum_{n=1}^\infty \mathscr{P}\{E_n\} = \infty$ and the $E_n$'s are independent, then $\mathscr{P}\{E_n\mbox{ i.o.}\}=1.$
$\bullet$ Proof.
Almost Surely Convergence
About Posts which Tagged by 'Probability'
Let $\Omega$ be the sample space, and let $X$ and $\{X_n\}_{n\geq1}$ be random variables. The following are equivalent characterizations of $X_n$ converging almost surely to $X$.
$$\begin{array}{ccl}X_n\overset{a.s.}{\longrightarrow}X &\Leftrightarrow&\mbox{(1) }\exists\mbox{ null set }N\mbox{ such that }\forall\,\omega\in\Omega\setminus N,\,\underset{n\rightarrow\infty}{\lim}X_n(\omega)=X(\omega)\mbox{ finite} \\
& & \\
&\Leftrightarrow&\mbox{(2) }\forall\,\varepsilon>0,\,\underset{m\rightarrow\infty}{\lim}\mathscr{P}\left\{|X_n-X|\leq\varepsilon,\,\forall\,n\geq m\right\}=1 \\
& &\qquad\mbox{or, }\underset{m\rightarrow\infty}{\lim}\mathscr{P}\left\{|X_n-X|>\varepsilon,\,\mbox{for some}\,n\geq m\right\}=0 \\
& & \\
&\Leftrightarrow&\mbox{(3) }\forall\,\varepsilon>0,\,\mathscr{P}\left\{|X_n-X|>\varepsilon,\,\mbox{i.o.}\right\}=0. \\ \end{array}$$
The first equivalence is the basic definition of almost sure convergence, but it can be hard to use directly, since one must check the convergence for every element of $\Omega$ outside a null set. The other characterizations, derived from the definition, are the usual working tools; in particular, the Borel-Cantelli Lemma is the key tool for verifying the last one.
$\bullet$ Proof.
Convergence Modes and Their Relationship
About Posts which Tagged by 'Probability'
Let $\Omega$ be the sample space, and let $X$ and $\{X_n\}_{n\geq1}$ be random variables with distribution functions $F$ and $F_n$. The following modes of convergence are commonly used.
1. Converge almost surely. [See More]$$\begin{array}{ccl}X_n\overset{a.s.}{\longrightarrow}X &\Leftrightarrow&\exists\mbox{ null set }N\mbox{ such that }\forall\,\omega\in\Omega\setminus N,\,\underset{n\rightarrow\infty}{\lim}X_n(\omega)=X(\omega)\mbox{ finite} \\
& & \\
&\Leftrightarrow&\forall\,\varepsilon>0,\,\underset{m\rightarrow\infty}{\lim}\mathscr{P}\left\{|X_n-X|\leq\varepsilon,\,\forall\,n\geq m\right\}=1 \\
& &\qquad\mbox{or, }\underset{m\rightarrow\infty}{\lim}\mathscr{P}\left\{|X_n-X|>\varepsilon,\,\mbox{for some}\,n\geq m\right\}=0 \\
& & \\
&\Leftrightarrow&\forall\,\varepsilon>0,\,\mathscr{P}\left\{|X_n-X|>\varepsilon,\,\mbox{i.o.}\right\}=\mathscr{P}\left\{\bigcap_{m=1}^\infty\bigcup_{n\geq m}\{|X_n-X|>\varepsilon\} \right\}=0 \\ \end{array}$$
2. Converge in $r$-th mean. $$X_n\overset{L^r}{\longrightarrow}X\iff X_n\in L^r,\,X\in L^r\mbox{ and }\underset{n\rightarrow\infty}{\lim}\mathscr{E}\left(|X_n-X|^r\right)=0.$$
3. Converge in probability. $$X_n\overset{p}{\longrightarrow}X\iff\forall\,\varepsilon>0,\,\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{|X_n-X|>\varepsilon\right\}=0.$$
4. Converge vaguely. Let $\{\mu_n\}_{n\geq1}$ and $\mu$ be subprobability measures (s.p.m.'s, $\mu(\mathbb{R})\leq1$) on $(\mathbb{R}, \mathscr{B})$, where $\mathscr{B}$ is a Borel field. $$\begin{array}{rcl}\mu_n\overset{v}{\longrightarrow}\mu&\iff&\exists\,\mbox{a dense set }D\subset\mathbb{R}\mbox{ such that }\\
& &\forall\,a,b\in D,\,a<b,\;\mu_n((a,b])\rightarrow\mu((a,b])\mbox{ as }n\rightarrow\infty.\end{array}$$
5. Converge in distribution. $$\begin{array}{rcl}X_n\overset{d}{\longrightarrow}X&\iff&\forall\,x\in C(F)=\{\mbox{points at which }F\mbox{ is continuous}\},\\ & &F_n(x)\rightarrow F(x)\mbox{ as }n\rightarrow\infty.\end{array}$$
The relationships among these modes are as follows.
$$\begin{array}{ccccccc}
X_n\overset{a.s.}{\longrightarrow}X & \Rightarrow
& X_n\overset{p}{\longrightarrow}X & \Rightarrow
& X_n\overset{d}{\longrightarrow}X & \equiv
& \mu_n\overset{v}{\longrightarrow}\mu \\
& & \Uparrow& & & &\\
& & X_n\overset{L^r}{\longrightarrow}X & & & &\\ \end{array}$$The converses are false except in some special cases.
$\star$ Converge Almost Surely v.s. Converge in r-th Mean
$\star$ Converge Almost Surely v.s. Converge in Probability
$\star$ Converge in r-th Mean v.s. Converge in Probability
$\star$ Converge in Probability v.s. Converge in Distribution
$\star$ Converge in Distribution and Vague Convergence (1): Equivalence for s.p.m.'s
$\star$ Converge in Distribution and Vague Convergence (2): Equivalence for p.m.'s