Let $\{X_{nj}\}$, $n=1,2,\ldots$, $j=1,2,\ldots,k_n$, be a double array of random variables such that, for each $n$, the variables $X_{n1},\ldots,X_{nk_n}$ are independent. Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$ If, for every $\eta>0$, $$\underset{n\rightarrow\infty}{\lim}\frac{1}{s^2_n}\sum_{j=1}^{k_n}\mathscr{E}\left[(X_{nj}-\alpha_{nj})^2\,I\left(|X_{nj}-\alpha_{nj}|>\eta s_n\right)\right]=0, $$ then $$ \frac{S_n-\alpha_n}{s_n}\overset{d}{\longrightarrow} \Phi, $$ where $\Phi$ denotes the standard normal distribution.
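As a quick numerical sanity check (not part of the proof), the following Python sketch simulates one concrete array: each row consists of $k$ i.i.d. Uniform$(-1,1)$ variables, which satisfies the Lindeberg condition since the summands are bounded while $s_n\rightarrow\infty$. All names here are illustrative.

```python
import math
import random

random.seed(0)

def standardized_sum(k):
    """One draw of (S_n - alpha_n)/s_n for a row of k iid Uniform(-1, 1)."""
    # alpha_nj = 0 and sigma_nj^2 = 1/3, so s_n^2 = k/3; the summands are
    # bounded by 1 while s_n grows, so the Lindeberg tail integrals vanish
    # for large k.
    s_n = math.sqrt(k / 3)
    return sum(random.uniform(-1, 1) for _ in range(k)) / s_n

samples = [standardized_sum(200) for _ in range(5000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
within = sum(abs(x) <= 1.96 for x in samples) / len(samples)
print(round(mean, 2), round(var, 2), round(within, 2))  # near 0, 1, 0.95
```

The empirical mean, variance, and central 95% coverage of the standardized sums match the standard normal values.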
To prove the theorem under the Lindeberg condition, we may assume $\alpha_{nj}=0$ and $s^2_n=1$ without loss of generality, which simplifies the proof. First, we need the following results.
[Corollary] This is a corollary of Lyapunov's CLT. Suppose that $s_n>0$ and that, for each $n$ and $j$, there is a finite constant $M_{nj}$ such that $|X_{nj}|\leq M_{nj}$ a.e., and that $$\underset{1\leq j\leq k_n}{\max}\frac{M_{nj}}{s_n}\rightarrow0.$$ Then $\left(S_n-\mathscr{E}(S_n)\right)/s_n$ converges in distribution to $\Phi$.
[Lemma] Let $u(m,n)$ be a function of positive integers $m$ and $n$ such that $$\forall\,m:\, \underset{n\rightarrow\infty}{\lim}u(m,n)=0.$$ Then there exists a sequence $\{m_n\}$ increasing to $\infty$ such that $$\underset{n\rightarrow\infty}{\lim}u(m_n,n)=0.$$
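The Lemma is a standard diagonal argument. A tiny sketch with the hypothetical choice $u(m,n)=m/n$ (which vanishes in $n$ for each fixed $m$) exhibits one admissible diagonal sequence, $m_n=\lfloor\sqrt{n}\rfloor$:

```python
import math

def u(m, n):
    # A concrete example satisfying lim_n u(m, n) = 0 for each fixed m.
    return m / n

# m_n = floor(sqrt(n)) increases to infinity, yet slowly enough that
# u(m_n, n) = floor(sqrt(n))/n -> 0 along the diagonal.
m_seq = [max(1, math.isqrt(n)) for n in range(1, 10001)]
diag = [u(m, n) for m, n in zip(m_seq, range(1, 10001))]
print(m_seq[-1], diag[-1])  # 100 0.01
```

The point is that $m_n$ must grow slowly enough relative to how fast each $u(m,\cdot)$ decays; the Lemma guarantees such a sequence exists in general.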
We now return to the proof. For $0<\eta_n\leq1$ with $\eta_n\rightarrow0$ (the sequence is chosen below), define $$X'_{nj}=\begin{cases}X_{nj}, &\mbox{if }|X_{nj}|\leq\eta_n,\\0, &\mbox{otherwise,}\end{cases}$$ and $$\begin{array}{ll} \mathscr{E}(X'_{nj})=\alpha'_{nj}, & S'_n=\sum_{j=1}^{k_n}X'_{nj}; \\ \mathscr{E}(S'_n)=\sum_{j=1}^{k_n}\alpha'_{nj}=\alpha'_n, & \sigma^2(S'_n)=s'^2_n.\\ \end{array}$$ We first find the mean and variance of $S'_n$ and show that the limiting distribution of $\left(S'_n-\mathscr{E}(S'_n)\right)/s'_n$ is $\Phi$. We then apply Slutsky's theorem, together with the asymptotic equivalence of $S_n$ and $S'_n$, to conclude $S_n\overset{d}{\longrightarrow}\Phi$.
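The truncation step can be sketched directly (illustrative Python; the Gaussian row and the fixed level `eta` are stand-ins for one row of the array and one value of $\eta_n$):

```python
import random

random.seed(1)
eta = 0.5  # stand-in for one value of eta_n (in the proof, eta_n -> 0)

def truncate(x, eta):
    # X'_{nj} = X_{nj} if |X_{nj}| <= eta_n, and 0 otherwise.
    return x if abs(x) <= eta else 0.0

row = [random.gauss(0, 0.3) for _ in range(1000)]
row_trunc = [truncate(x, eta) for x in row]

# S_n and S'_n can differ only when some |X_{nj}| exceeds eta.
changed = sum(x != y for x, y in zip(row, row_trunc))
print(changed, all(abs(y) <= eta for y in row_trunc))
```

Every truncated variable is bounded by `eta`, which is exactly what lets the Corollary apply to $S'_n$ later.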
The mean of $S'_n$:
Since we assume $\mathscr{E}(X_{nj})=0$, we have $$\mathscr{E}(S'_n)=\sum_{j=1}^{k_n}\mathscr{E}(X'_{nj})=\sum_{j=1}^{k_n}\int_{|x|\leq\eta_n}x\,dF_{nj}(x)=-\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x\,dF_{nj}(x).$$ Hence, $$\begin{array}{ccl}|\mathscr{E}(S'_n)|
& = & \left|\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x\,dF_{nj}(x)\right| \\
&\leq&\sum_{j=1}^{k_n}\int_{|x|>\eta_n}|x|\,dF_{nj}(x) = \sum_{j=1}^{k_n}\int_{|x|>\eta_n}\frac{x^2}{|x|}\,dF_{nj}(x) \\
&\leq&\frac{1}{\eta_n}\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x). \end{array}$$ Let $u(m,n)=m^2\sum_{j=1}^{k_n}\int_{|x|>\frac{1}{m}}x^2\,dF_{nj}(x)$. For each fixed $m$, $\underset{n\rightarrow\infty}{\lim}u(m,n)=0$ by Lindeberg's condition (take $\eta=\frac{1}{m}$). Then, according to the Lemma, there exists $\{m_n\}$ increasing to $\infty$ such that $\underset{n\rightarrow\infty}{\lim}u(m_n,n)=0.$ Let $\eta_n=\frac{1}{m_n}$; then $\eta_n\rightarrow0$ and $$\frac{1}{\eta_n^2}\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x)\overset{n}{\longrightarrow}0.$$ Since $0<\eta_n\leq1$, we have $\frac{1}{\eta_n}\leq\frac{1}{\eta_n^2}$, so the displayed bound also holds with $\frac{1}{\eta_n}$; therefore $|\mathscr{E}(S'_n)|\overset{n}{\longrightarrow}0.$ (The $\frac{1}{\eta_n^2}$ rate itself will be needed at the end of the proof.)
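The bound $|\mathscr{E}(S'_n)|\leq\frac{1}{\eta_n}\sum_j\int_{|x|>\eta_n}x^2\,dF_{nj}(x)$ can be verified exactly on a toy mean-zero law (a hypothetical two-point example, computed with rationals):

```python
from fractions import Fraction as F

# Toy mean-zero law: X = -1/5 with prob. 5/6, X = 1 with prob. 1/6.
support = [(F(-1, 5), F(5, 6)), (F(1, 1), F(1, 6))]
eta = F(1, 2)  # stand-in for one value of eta_n

mean = sum(x * p for x, p in support)                         # E(X) = 0
mean_trunc = sum(x * p for x, p in support if abs(x) <= eta)  # E(X')
tail_2nd = sum(x * x * p for x, p in support if abs(x) > eta)
bound = tail_2nd / eta  # (1/eta) * integral of x^2 dF over |x| > eta

print(mean, mean_trunc, bound)  # 0 -1/6 1/3
```

Here the truncated mean is $-\frac{1}{6}$, safely inside the bound $\frac{1}{3}$, illustrating how the zero-mean assumption turns the truncated mean into a tail quantity.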
The variance of $S'_n$: (split into two parts)
$$\begin{array}{ccl}s'^2_n=\mbox{Var}(S'_n)
& = & \sum_{j=1}^{k_n}\mbox{Var}(X'_{nj}) \\
&\leq& \sum_{j=1}^{k_n}\mathscr{E}(X'^2_{nj}) \\
& = & \sum_{j=1}^{k_n}\int_{|x|\leq\eta_n}x^2\,dF_{nj}(x) \\
&\leq& \sum_{j=1}^{k_n}\int_{-\infty}^\infty x^2\,dF_{nj}(x)=s^2_n=1. \mbox{ (by assumption)}\\ \end{array}$$ Thus, $s'^2_n\leq1$.
On the other hand, first note that $$\begin{array}{ccl}\mathscr{E}^2(X'_{nj})
& = & \left[\int_{|x|\leq\eta_n}x\,dF_{nj}(x)\right]^2 = \left[-\int_{|x|>\eta_n}x\,dF_{nj}(x)\right]^2 \;(\because \mathscr{E}(X_{nj})=0) \\
&\leq& \left[\int_{|x|>\eta_n}x^2\,dF_{nj}(x)\right]\left[\int_{|x|>\eta_n}1^2\,dF_{nj}(x)\right] \; (\mbox{Cauchy-Schwarz}) \\
&\leq& \int_{|x|>\eta_n}x^2\,dF_{nj}(x).\\ \end{array}$$ Hence we have $$\begin{array}{ccl}s'^2_n=\mbox{Var}(S'_n)
& = & \sum_{j=1}^{k_n}\mbox{Var}(X'_{nj}) \\
& = & \sum_{j=1}^{k_n}\left[\mathscr{E}(X'^2_{nj})-\mathscr{E}^2(X'_{nj})\right] \\
&\geq& \sum_{j=1}^{k_n}\left[ \int_{|x|\leq\eta_n}x^2\,dF_{nj}(x) - \int_{|x|>\eta_n}x^2\,dF_{nj}(x) \right] \\
& = & 1-2\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x) \mbox{ (using } s^2_n=1\mbox{)}, \\ \end{array}$$ and the last expression tends to $1$ because $\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x)\rightarrow0$ by the choice of $\eta_n$. Hence $\underset{n\rightarrow\infty}{\lim\inf}\;s'^2_n\geq1$. Combining the two bounds, we get $s'^2_n\overset{n}{\longrightarrow}1$.
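To see the two bounds at work, here is a hypothetical symmetric array in which the truncation genuinely discards mass, yet $\mbox{Var}(S'_n)\rightarrow1$; the computation is exact in rationals:

```python
from fractions import Fraction as F

# Row n: k variables, each +-a w.p. (1-p)/2 and +-1 w.p. p/2, with
# p = k^-3 and a^2 chosen so sigma_nj^2 = 1/k exactly, hence s_n^2 = 1.
# Truncating at eta_n = k^(-1/4) removes exactly the +-1 values, since
# a ~ k^(-1/2) < k^(-1/4) < 1 for k > 1.
def truncated_row_variance(k):
    p = F(1, k ** 3)
    a2 = (F(1, k) - p) / (1 - p)       # a^2, from (1-p)*a^2 + p*1 = 1/k
    # By symmetry E(X'_{nj}) = 0, so Var(S'_n) = k * (1 - p) * a^2.
    return k * (1 - p) * a2

for k in (10, 100, 1000):
    print(k, float(truncated_row_variance(k)))  # approaches 1
```

Algebraically $\mbox{Var}(S'_n)=1-k^{-2}$ here, so the loss from truncation is exactly the tail second moment $2\sum_j\int_{|x|>\eta_n}x^2\,dF_{nj}$, which the Lindeberg condition forces to vanish.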
Write $$S'_n=s'_n\left(\frac{S'_n-\mathscr{E}(S'_n)}{s'_n}+\frac{\mathscr{E}(S'_n)}{s'_n}\right).$$
By definition of $X'_{nj}$, we know $|X'_{nj}|\leq\eta_n$. Since $\eta_n\rightarrow0$ and $s'_n\rightarrow1$, $$\underset{1\leq j\leq k_n}{\max}\frac{\eta_n}{s'_n}\rightarrow0.$$ Thus, by the Corollary above (applied with $M_{nj}=\eta_n$), we have $$\frac{S'_n-\mathscr{E}(S'_n)}{s'_n} \overset{d}{\longrightarrow}\Phi.$$ For the remaining parts, since $|\mathscr{E}(S'_n)|\overset{n}{\longrightarrow}0$ and $s'^2_n\overset{n}{\longrightarrow}1$, by Slutsky's theorem we have $S'_n\overset{d}{\longrightarrow}\Phi.$
Finally, for $\eta>0$,
$$\begin{array}{ccl}\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{|S_n-S'_n|>\eta\right\}
&\leq& \underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{S_n\neq S'_n\right\} \\
& = & \underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\bigcup_{j=1}^{k_n}\{X_{nj}\neq X'_{nj}\}\right\} \\
&\leq& \underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{P}\left\{X_{nj}\neq X'_{nj}\right\} \\
& = & \underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{P}\left\{|X_{nj}|>\eta_n\right\} \\
& = & \underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\int_{|x|>\eta_n}\frac{x^2}{x^2}\,dF_{nj}(x) \\
&\leq& \underset{n\rightarrow\infty}{\lim}\frac{1}{\eta_n^2}\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x) = 0,\\
\end{array}$$ by the choice of $\eta_n$, which guarantees $\frac{1}{\eta_n^2}\sum_{j=1}^{k_n}\int_{|x|>\eta_n}x^2\,dF_{nj}(x)\rightarrow0$. Hence, $S_n-S'_n\overset{p}{\longrightarrow}0.$
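The union-bound step $\mathscr{P}\{S_n\neq S'_n\}\leq\sum_j\mathscr{P}\{|X_{nj}|>\eta_n\}$ can be checked by simulation (an illustrative Gaussian row; `eta` is a stand-in for $\eta_n$):

```python
import math
import random

random.seed(2)
k, eta, trials = 50, 0.5, 20000
sigma = 1 / math.sqrt(k)  # so the row variances sum to s_n^2 = 1

# Union bound: sum_j P{|X_nj| > eta} = k * P{|N(0, sigma^2)| > eta}.
union_bound = k * math.erfc(eta / (sigma * math.sqrt(2)))

differ = 0
for _ in range(trials):
    if any(abs(random.gauss(0, sigma)) > eta for _ in range(k)):
        differ += 1  # the event {S_n != S'_n} occurred
p_hat = differ / trials
print(round(p_hat, 3), round(union_bound, 3))
```

With only fifty summands the bound is already nearly tight, since the exceedance events are rare and almost disjoint; in the proof the right-hand side is then driven to $0$ by the Chebyshev-type estimate above.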
Write $S_n=S'_n+(S_n-S'_n)$. By Slutsky's theorem, since $S'_n\overset{d}{\longrightarrow}\Phi$ and $S_n-S'_n\overset{p}{\longrightarrow}0$, we have $S_n\overset{d}{\longrightarrow}\Phi.$
$\Box$