Thursday, August 13, 2015

Lyapunov's Central Limit Theorem

Declaration for Posts Tagged 'Probability'

Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent.  Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and $$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n; \\
\mathscr{E}\left(|X_{nj}-\alpha_{nj}|^{2+\delta}\right)=\gamma^{2+\delta}_{nj}, &
\end{array}$$ where $\delta>0$.  If $\gamma^{2+\delta}_{nj}$ exists for each $n$ and $j$, and $$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^{2+\delta}_n}\sum_{j=1}^{k_n}\gamma^{2+\delta}_{nj}=0,$$ then $$ \frac{S_n-\alpha_n}{s_n}\overset{d}{\longrightarrow} \Phi. $$
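Before the proof, the theorem can be checked numerically.  The sketch below is a minimal simulation (my own illustrative choice, not part of the theorem): row $n$ of the double array is taken as $X_{nj}\sim\mathrm{Uniform}(-c_j,c_j)$ with $c_j=1+j/n$, a uniformly bounded family with growing $s^2_n$, so Lyapunov's condition holds, and the empirical distribution of $(S_n-\alpha_n)/s_n$ is compared with $\Phi$ at a few points.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)

# Row n of the double array: k_n = n independent, non-identically
# distributed X_{nj} ~ Uniform(-c_j, c_j) with c_j = 1 + j/n (my choice,
# made so the variables are uniformly bounded while s_n^2 grows, which
# guarantees Lyapunov's condition).
n = 500
c = 1.0 + np.arange(1, n + 1) / n       # half-widths c_j
s_n = np.sqrt((c**2 / 3.0).sum())       # Var Uniform(-c, c) = c^2/3

reps = 10_000
X = rng.uniform(-c, c, size=(reps, n))  # reps independent copies of row n
Z = X.sum(axis=1) / s_n                 # (S_n - alpha_n)/s_n, alpha_n = 0

# Compare the empirical CDF of Z with Phi at a few points.
phi = lambda x: 0.5 * (1.0 + erf(x / 2**0.5))
for x in (-1.0, 0.0, 1.0):
    print(x, (Z <= x).mean(), phi(x))
```

The empirical CDF values should land close to $\Phi(-1)\approx0.159$, $\Phi(0)=0.5$, and $\Phi(1)\approx0.841$.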

$\bullet$ Proof.
To simplify the proof, we set $\delta=1$ in Lyapunov's condition and, without loss of generality, let $\alpha_{nj}=0$ and $s^2_n=1$.  Denote $\Gamma^3_n=\sum_{j=1}^{k_n}\gamma^3_{nj}$.  First, we need the following results.

[Theorem] Let $F$ be a distribution function which has finite absolute moment of order $k$, where $k\geq1$ is an integer, and let $f$ be its characteristic function.  Then $f$ has the expansion in the neighborhood of $t=0$ $$f(t)=\sum_{j=0}^k\frac{i^j}{j!}m^{(j)}t^j+o\left(|t|^k\right),$$ or $$f(t)=\sum_{j=0}^{k-1}\frac{i^j}{j!}m^{(j)}t^j+\frac{\theta_k}{k!}\mu^{(k)}|t|^k,$$ where $m^{(j)}$ is the moment of order $j$, $\mu^{(k)}$ is the absolute moment of order $k$, and $|\theta_k|\leq1$.
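As a quick sanity check of the expansion (my example, not from the original post): for $X\sim\mathrm{Uniform}(-1,1)$ the ch.f. is $f(t)=\sin t/t$, with $m^{(0)}=1$, $m^{(1)}=0$, $m^{(2)}=1/3$, $m^{(3)}=0$, so $f(t)-(1-t^2/6)$ should be $o(|t|^3)$.

```python
import numpy as np

# For X ~ Uniform(-1, 1): f(t) = sin(t)/t, with m^(0)=1, m^(1)=0,
# m^(2)=1/3, m^(3)=0, so the k=3 expansion gives
# f(t) = 1 - t^2/6 + o(|t|^3).
# The ratio |f(t) - (1 - t^2/6)| / |t|^3 should therefore shrink to 0.
for t in (0.5, 0.1, 0.02):
    ratio = abs(np.sin(t) / t - (1.0 - t**2 / 6.0)) / t**3
    print(t, ratio)
```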

[Lemma] Let $\{\theta_{nj},\,n\geq1,\,1\leq j\leq k_n\}$ be a double array of complex numbers satisfying the following conditions as $n\rightarrow\infty$:
(1) $\underset{1\leq j\leq k_n}{\max}|\theta_{nj}|\rightarrow0$;
(2) $\sum_{j=1}^{k_n}|\theta_{nj}|\leq M<\infty$, where $M$ does not depend on $n$;
(3) $\sum_{j=1}^{k_n}\theta_{nj}\rightarrow\theta$, where $\theta$ is a finite complex number.
Then we have $$\prod_{j=1}^{k_n}(1+\theta_{nj})\rightarrow e^\theta.$$ See General Conditions for A Series Converging as An Exponential Term.
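A minimal numeric illustration of the Lemma (my example, using the simplest admissible choice $\theta_{nj}=-t^2/(2k_n)$ for all $j$, so that $\theta=-t^2/2$):

```python
import numpy as np

# Simplest instance of the Lemma (my choice): theta_{nj} = -t^2/(2 k_n)
# for j = 1..k_n.  Then max_j |theta_{nj}| -> 0,
# sum_j |theta_{nj}| = t^2/2 (bounded), and sum_j theta_{nj} = -t^2/2,
# so the product should tend to e^{-t^2/2}.
t = 1.5
for k_n in (10, 100, 10_000):
    print(k_n, (1.0 - t**2 / (2 * k_n)) ** k_n, np.exp(-t**2 / 2))
```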

Back to the proof.  We need to show that the characteristic function of $S_n$ converges to the ch.f. of the standard normal distribution.  Let $f_{nj}(t)$ be the ch.f. of $X_{nj}$.  By the assumption that the third absolute moment of $X_{nj}$ is finite and the Theorem above, we have
$$\begin{array}{ccl} f_{nj}(t)
& = & \sum_{r=0}^{2}\frac{i^r}{r!}m^{(r)}t^r+\frac{\Lambda_{nj}}{3!}\mu^{(3)}|t|^3,\,|\Lambda_{nj}|\leq1 \\
& = & 1-\frac{t^2}{2}\sigma^2_{nj}+\frac{\Lambda_{nj}}{6}\gamma_{nj}^3|t|^3\;\equiv\; 1+\theta_{nj} \end{array}$$
By Lyapunov's inequality, $\left(\mathscr{E}|X|^r\right)^{1/r}\leq\left(\mathscr{E}|X|^s\right)^{1/s}$ for $0<r\leq s$, we have $$\sigma^3_{nj}\leq\gamma^3_{nj}\Rightarrow\underset{1\leq j\leq k_n}{\max}\sigma^3_{nj}\leq\underset{1\leq j\leq k_n}{\max}\gamma^3_{nj}\leq\sum_{j=1}^{k_n}\gamma^3_{nj}=\Gamma^3_n\rightarrow0,$$ where the convergence is Lyapunov's condition with $\delta=1$ and $s_n=1$.
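A quick Monte Carlo check of the inequality with $r=2\leq s=3$ (my example, using $X\sim\mathrm{Exp}(1)$, for which the exact values are $(\mathscr{E}X^2)^{1/2}=\sqrt{2}\approx1.414$ and $(\mathscr{E}X^3)^{1/3}=6^{1/3}\approx1.817$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo check of Lyapunov's inequality with r = 2 <= s = 3,
# using X ~ Exponential(1) as an example
# (exact: (E X^2)^{1/2} = 2^{1/2}, (E X^3)^{1/3} = 6^{1/3}).
x = rng.exponential(1.0, size=200_000)
lhs = (x**2).mean() ** (1.0 / 2.0)
rhs = (x**3).mean() ** (1.0 / 3.0)
print(lhs, rhs)
```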

Next, we check the three conditions of the Lemma:
(1) $$\underset{1\leq j\leq k_n}{\max}|\theta_{nj}|\leq \frac{t^2}{2}\underset{1\leq j\leq k_n}{\max}\sigma^2_{nj}+\frac{|t|^3}{6}\Gamma^3_n\rightarrow0,$$ since $|\Lambda_{nj}|\leq1$, $\underset{1\leq j\leq k_n}{\max}\sigma^2_{nj}\rightarrow0$, and $\Gamma^3_n\rightarrow0$.

(2) $$\sum_{j=1}^{k_n}|\theta_{nj}|\leq\frac{t^2}{2}\sum_{j=1}^{k_n}\sigma^2_{nj}+\frac{|t|^3}{6}\sum_{j=1}^{k_n}|\Lambda_{nj}|\gamma^3_{nj}\leq \frac{t^2}{2}+\frac{|t|^3}{6}\Gamma^3_n,$$ since $\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n=1$ by assumption; as $\Gamma^3_n\rightarrow0$, the right-hand side is bounded by some $M<\infty$ not depending on $n$.

(3) $$\sum_{j=1}^{k_n}\theta_{nj}=-\frac{t^2}{2}\sum_{j=1}^{k_n}\sigma^2_{nj}+\frac{|t|^3}{6}\sum_{j=1}^{k_n}\Lambda_{nj}\gamma^3_{nj}=-\frac{t^2}{2}+R_n,\qquad |R_n|\leq\frac{|t|^3}{6}\Gamma^3_n\rightarrow0,$$ so $\sum_{j=1}^{k_n}\theta_{nj}\rightarrow -\frac{t^2}{2}$.

Conditions (1), (2), and (3) of the Lemma are satisfied, so the ch.f. of $S_n$ satisfies
$$\prod_{j=1}^{k_n}f_{nj}(t)=\prod_{j=1}^{k_n}(1+\theta_{nj})\overset{n\rightarrow\infty}{\longrightarrow}e^{-\frac{t^2}{2}},$$ which is the ch.f. of the standard normal distribution.  By the Lévy continuity theorem, $S_n\overset{d}{\longrightarrow}\Phi$.
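The convergence of the product can also be seen numerically with exact ch.f.'s (my illustrative row, not the general case): take $X_{nj}=U_j/s_n$ with $U_j\sim\mathrm{Uniform}(-c_j,c_j)$ and $c_j=1+j/n$, so that $f_{nj}(t)=\sin(c_jt/s_n)/(c_jt/s_n)$, and compare the product against $e^{-t^2/2}$.

```python
import numpy as np

# Exact-ch.f. check of the product convergence for an illustrative row
# (my choice, not the general case): X_{nj} = U_j / s_n with
# U_j ~ Uniform(-c_j, c_j), c_j = 1 + j/n, so that
# f_{nj}(t) = sin(c_j t / s_n) / (c_j t / s_n).
t = 2.0
for n in (10, 100, 10_000):
    c = 1.0 + np.arange(1, n + 1) / n
    s_n = np.sqrt((c**2 / 3.0).sum())
    u = c * t / s_n
    print(n, np.prod(np.sin(u) / u), np.exp(-t**2 / 2))
```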
$\Box$
