Let $\{X_{nj}\}$, $n=1,2,\ldots$, $j=1,2,\ldots,k_n$, be a double array of random variables. We have the following criteria for "negligibility": for every $\varepsilon>0$,
(a) $\displaystyle\forall\,j:\quad\underset{n\rightarrow\infty}{\lim}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$;
(b) $\displaystyle\underset{n\rightarrow\infty}{\lim}\max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$;
(c) $\displaystyle\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}=0$;
(d) $\displaystyle\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$.
[Definition] $\{X_{nj}\}$ is called Uniformly Asymptotically Negligible (UAN, holospoudic) if (b) holds.
The implications $(d)\Rightarrow(c)\Rightarrow(b)\Rightarrow(a)$ are all strict. On the other hand, if for each $n$, $X_{n1},\ldots,X_{nk_n}$ are independent, then $(d)\equiv(c)$.
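None of these implications can be reversed; here is a quick sketch of standard counterexamples (for any fixed $0<\varepsilon<1$). Take $k_n=n$ and let $X_{n1},\ldots,X_{nn}$ be i.i.d. with $\mathscr{P}\{X_{nj}=1\}=1/n$ and $\mathscr{P}\{X_{nj}=0\}=1-1/n$; then $$\max_{1\leq j\leq n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=\frac{1}{n}\rightarrow0,\qquad\mbox{while}\qquad\mathscr{P}\left\{\max_{1\leq j\leq n}|X_{nj}|>\varepsilon\right\}=1-\left(1-\frac{1}{n}\right)^n\rightarrow1-e^{-1}\neq0,$$so (b) holds but (c) (and hence (d)) fails, showing $(c)\Rightarrow(b)$ is strict. Strictness of $(d)\Rightarrow(c)$ requires dependence: if instead $X_{n1}=\cdots=X_{nn}=Y_n$ with $\mathscr{P}\{Y_n=1\}=1/n$, then $\mathscr{P}\{\max_{1\leq j\leq n}|X_{nj}|>\varepsilon\}=1/n\rightarrow0$ yet $\sum_{j=1}^n\mathscr{P}\{|X_{nj}|>\varepsilon\}=1$ for every $n$. Finally, $(b)\Rightarrow(a)$ is strict: with $X_{nj}=1$ when $j=n$ and $X_{nj}=0$ otherwise, each fixed $j$ satisfies (a), while the maximum in (b) is identically $1$.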
$\bullet$ Proof.
$(d)\Rightarrow(c):\,$ Suppose (d) holds. Then by Boole's inequality,
$$\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\} = \mathscr{P}\left\{\bigcup_{j=1}^{k_n}\{|X_{nj}|>\varepsilon\}\right\} \leq \sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\} \rightarrow 0\,\mbox{ as }n\rightarrow\infty.$$Thus, we have (c).
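As a quick sanity check on this step (a minimal numerical sketch, not part of the proof), the following Python snippet estimates $\mathscr{P}\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\}$ by simulation for an assumed illustrative array $X_{nj}=Z_j/\sqrt{n}$, with $Z_j$ i.i.d. standard normal and $k_n=n$, and compares it with the union-bound quantity $\sum_{j}\mathscr{P}\{|X_{nj}|>\varepsilon\}$ appearing in (d).

```python
import math
import numpy as np

# Minimal sketch: an assumed array X_{nj} = Z_j / sqrt(n), Z_j i.i.d. N(0,1),
# with k_n = n and a fixed eps; an illustrative example, not the array in the text.
eps, trials = 0.5, 200_000
rng = np.random.default_rng(0)

for n in (5, 20, 50):
    X = rng.standard_normal((trials, n)) / math.sqrt(n)
    # Monte Carlo estimate of P{ max_j |X_{nj}| > eps }   -- criterion (c)
    p_max = np.mean(np.abs(X).max(axis=1) > eps)
    # Exact tail: P{|X_{nj}| > eps} = P{|Z| > eps*sqrt(n)} = erfc(eps*sqrt(n/2))
    sum_tails = n * math.erfc(eps * math.sqrt(n / 2))   # criterion (d)
    print(f"n={n:3d}   P(max > eps) ~ {p_max:.4f}   sum of tails = {sum_tails:.4f}")
```

For small $n$ the sum of tails exceeds $1$ and the bound is vacuous; the point is only that once the quantity in (d) tends to $0$, the quantity in (c) is squeezed to $0$ with it.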
$(c)\Rightarrow(b):\,$ Suppose (c) holds. Then for each fixed $i$ with $1\leq i\leq k_n$, $$\{|X_{ni}|>\varepsilon\}\subseteq\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}=\bigcup_{j=1}^{k_n}\{|X_{nj}|>\varepsilon\},$$which implies $$\mathscr{P}\{|X_{ni}|>\varepsilon\}\leq\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}.$$Since the right-hand side does not depend on $i$, taking the maximum over $i$ gives $$\max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\} \leq \mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\} \rightarrow 0\,\mbox{ as }n\rightarrow\infty.$$Thus, we have (b).
$(b)\Rightarrow(a):\,$ Suppose (b) holds. Then for every $j$, $$\mathscr{P}\{|X_{nj}|>\varepsilon\} \leq \max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\} \rightarrow 0\,\mbox{ as }n\rightarrow\infty.$$Thus, we have (a).
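Both of the last two steps are instances of the elementary sandwich $$\max_{1\leq j\leq k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}\;\leq\;\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}\;\leq\;\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}.$$A tiny exact computation in Python, for a hypothetical independent row with hand-picked tail probabilities $p_j$, makes the ordering of the three quantities in (b), (c), (d) visible:

```python
import math

# Hypothetical independent row: assumed tail probabilities p_j = P{|X_{nj}| > eps}.
# Under independence, P{ max_j |X_{nj}| > eps } = 1 - prod_j (1 - p_j).
p = [0.02, 0.05, 0.01, 0.03]

b = max(p)                               # quantity in criterion (b)
c = 1 - math.prod(1 - pj for pj in p)    # quantity in criterion (c)
d = sum(p)                               # quantity in criterion (d)
print(b, c, d)                           # 0.05 <= 0.10596... <= 0.11
assert b <= c <= d
```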
Furthermore, if the $X_{nj}$'s are independent within each row, then (c) and (d) are equivalent; since $(d)\Rightarrow(c)$ has already been shown, it remains to prove $(c)\Rightarrow(d)$ under row independence.
First, consider $n$ independent events $\{E_j,\,1\leq j\leq n\}$. We claim that $$1-\mathscr{P}\left(\bigcup_{j=1}^nE_j\right)\leq\exp\left\{-\sum_{j=1}^n\mathscr{P}(E_j)\right\}.$$Indeed, since $e^{-x}$ is convex and its tangent line at $x=0$ is $1-x$, we have $$1-x\leq e^{-x}\quad\mbox{for all real }x.$$Then
$$\begin{array}{rl} 1-\mathscr{P}\left(\bigcup_{j=1}^nE_j\right)
& = \mathscr{P}\left(\bigcap_{j=1}^nE_j^c\right) \qquad\mbox{(De Morgan's law)}\\
& = \prod_{j=1}^n\mathscr{P}\left(E_j^c\right) \qquad\mbox{(independence)} \\
& = \prod_{j=1}^n\left[1-\mathscr{P}\left(E_j\right)\right] \\
&\leq \prod_{j=1}^n\exp\left\{-\mathscr{P}\left(E_j\right)\right\} = \exp\left\{-\sum_{j=1}^n\mathscr{P}(E_j)\right\}. \end{array}$$Hence, replacing $E_j$ by $\{|X_{nj}|>\varepsilon\}$, we obtain $$\begin{array}{rl}1-\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\} &= 1-\mathscr{P}\left\{\bigcup_{j=1}^{k_n}\{|X_{nj}|>\varepsilon\}\right\}\\ & \leq\exp\left\{-\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}\right\}.\end{array}$$Since (c) states that $$\underset{n\rightarrow\infty}{\lim}\mathscr{P}\left\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\right\}=0,$$ the left-hand side of the last inequality tends to $1$, so $$\underset{n\rightarrow\infty}{\liminf}\,\exp\left\{-\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}\right\}\geq1\,\Rightarrow\,\underset{n\rightarrow\infty}{\limsup}\,\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}\leq0,$$which forces $$\underset{n\rightarrow\infty}{\lim}\sum_{j=1}^{k_n}\mathscr{P}\{|X_{nj}|>\varepsilon\}=0$$since each term of the sum is nonnegative.
$\Box$
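As a numerical illustration of the key inequality $\prod_{j}(1-p_j)\leq\exp\{-\sum_{j}p_j\}$ used above (a minimal sketch with assumed tail probabilities $p_j=\mathscr{P}(E_j)$, not tied to any particular array), one can compare the two sides directly:

```python
import math

# Assumed probabilities p_j = P(E_j) for a row of independent events E_j.
for p_row in ([0.3, 0.1, 0.2], [1 / 10] * 10, [1 / 100] * 100):
    prod_side = math.prod(1 - p for p in p_row)   # equals 1 - P(union of E_j)
    exp_side = math.exp(-sum(p_row))              # upper bound from 1 - x <= e^{-x}
    print(f"prod(1-p_j) = {prod_side:.6f}  <=  exp(-sum p_j) = {exp_side:.6f}")
```

When $\mathscr{P}\{\max_{1\leq j\leq k_n}|X_{nj}|>\varepsilon\}$ is close to $0$, the product side is close to $1$, and the bound then pins $\sum_{j}\mathscr{P}\{|X_{nj}|>\varepsilon\}$ near $0$; this is exactly how the limit argument concludes.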