Tuesday, September 1, 2015

Lindeberg's Condition Implies Each Variance Is Uniformly Small

Posts tagged 'Probability'

Recall the Lindeberg-Feller Central Limit Theorem (short version).

Let $\{X_{nj}\}$, $n=1,2,...$, $j=1,2,...,k_n$, be a double array of random variables and for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent.  Define $S_n=\sum_{j=1}^{k_n}X_{nj}$ and
$$\begin{array}{ll}
\mathscr{E}(X_{nj})=\alpha_{nj}, & \mathscr{E}(S_n)=\sum_{j=1}^{k_n}\alpha_{nj}=\alpha_n; \\
\sigma^2(X_{nj})=\sigma^2_{nj}, & \sigma^2(S_n)=\sum_{j=1}^{k_n}\sigma^2_{nj}=s^2_n. \\
\end{array}$$ Suppose $\alpha_{nj}=0$ for all $n$ and $j$, and $s^2_n=1$.  Lindeberg's condition for $S_n$ to converge in distribution to $\Phi$ is $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{E}\left[X_{nj}^2\,I\left(|X_{nj}|>\eta\right)\right]=0\mbox{  for each }\eta>0.$$ This criterion implies that no single $X_{nj}$, $j=1,\ldots,k_n$, has a variance that dominates the others', i.e. $$\underset{1\leq j\leq k_n}{\max}\sigma_{nj}\rightarrow0.$$
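As a quick numerical sanity check (my own toy example, not from the post), take the row $X_{nj}=Y_j/\sqrt{n}$, $j=1,\ldots,n$, with $Y_j$ i.i.d. Uniform$(-\sqrt{3},\sqrt{3})$, so each $\sigma^2_{nj}=1/n$ and $s^2_n=1$. For this bounded case the Lindeberg sum has a closed form, and both it and $\max_j\sigma_{nj}$ visibly shrink as $n$ grows:

```python
import math

def lindeberg_sum(n, eta):
    """sum_j E[X_{nj}^2 I(|X_{nj}| > eta)] for X_{nj} = Y_j / sqrt(n),
    Y_j ~ Uniform(-b, b) with b = sqrt(3) (mean 0, variance 1).

    E[Y^2 I(|Y| > c)] = (b^3 - c^3) / (3b) for 0 <= c <= b, else 0.
    With X_{nj} = Y_j/sqrt(n), the truncation level for Y is c = eta*sqrt(n),
    and the n identical terms sum to exactly that tail expectation.
    """
    b = math.sqrt(3.0)
    c = eta * math.sqrt(n)
    if c >= b:
        return 0.0            # Y is bounded by b, so the tail vanishes
    return (b**3 - c**3) / (3.0 * b)

def max_sigma(n):
    """max_j sigma_{nj}; here every sigma_{nj}^2 = 1/n."""
    return 1.0 / math.sqrt(n)

for n in (1, 10, 100, 1000):
    print(n, lindeberg_sum(n, eta=0.5), max_sigma(n))
```

Since the $Y_j$ are bounded, the Lindeberg sum is exactly zero once $\eta\sqrt{n}>\sqrt{3}$, and $\max_j\sigma_{nj}=1/\sqrt{n}\rightarrow0$, in line with the claim above.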

$\bullet$ Proof.

Suppose Lindeberg's condition holds, i.e. for each $\eta>0$, $$\sum_{j=1}^{k_n}\int_{|x|>\eta}x^2\,dF_{nj}(x)\rightarrow0.$$ Then for each $j$, $$\int_{|x|>\eta}x^2\,dF_{nj}(x)\leq\sum_{j=1}^{k_n}\int_{|x|>\eta}x^2\,dF_{nj}(x)\rightarrow0,$$ and the variance satisfies $$\begin{array}{rl}\sigma^2_{nj}&=\int_\mathbb{R}x^2\,dF_{nj}(x)=\int_{|x|\leq\eta}x^2\,dF_{nj}(x)+\int_{|x|>\eta}x^2\,dF_{nj}(x)\\ &\leq\eta^2+\sum_{j=1}^{k_n}\int_{|x|>\eta}x^2\,dF_{nj}(x).\end{array}$$ The bound on the right does not depend on $j$, so it also bounds $\underset{1\leq j\leq k_n}{\max}\sigma^2_{nj}$, and letting $n\rightarrow\infty$ gives $$\underset{n\rightarrow\infty}{\lim\sup}\;\underset{1\leq j\leq k_n}{\max}\sigma^2_{nj}\leq\eta^2.$$ Since $\eta>0$ is arbitrary, letting $\eta\rightarrow0$ yields $$\underset{1\leq j\leq k_n}{\max}\sigma_{nj}\rightarrow0.$$

$\Box$
