Wednesday, September 2, 2015

Independence and Fubini's Theorem


A basic property of the expectation of two independent random variables is the following.

[Theorem] If $X$ and $Y$ are independent and both have finite expectations, then $$\mathscr{E}(XY)=\mathscr{E}(X)\mathscr{E}(Y).$$

There are two ways to prove this: Fubini's theorem gives a quick solution; alternatively, we can prove it from the basic definition of expectation.

$\bullet$ Proof.
(1) Proof by Fubini's theorem.

[Theorem] Fubini's Theorem
Let $(\Omega_1,\mathscr{F}_1,\mu_1)$ and $(\Omega_2,\mathscr{F}_2,\mu_2)$ be two $\sigma$-finite measure spaces, and let $(\Omega_1\times\Omega_2,\mathscr{F}_1\times\mathscr{F}_2,\mu_1\times\mu_2)$ be their product measure space.  Suppose $f:\Omega_1\times\Omega_2\rightarrow\mathbb{R}$ is $\mu_1\times\mu_2$ measurable.  Then
(a) $f(\omega_1,\cdot)$ is $\mu_2$ measurable for all $\omega_1\in\Omega_1$, and
     $f(\cdot,\omega_2)$ is $\mu_1$ measurable for all $\omega_2\in\Omega_2$.
(b) $\omega_1\mapsto\int_{\Omega_2}f(\omega_1,\omega_2)\,d\mu_2(\omega_2)$ is $\mu_1$ measurable, and
     $\omega_2\mapsto\int_{\Omega_1}f(\omega_1,\omega_2)\,d\mu_1(\omega_1)$ is $\mu_2$ measurable.
(c) If $\int_{\Omega_1\times\Omega_2}|f(\omega_1,\omega_2)|\,d\mu_1\times\mu_2(\omega_1,\omega_2)<\infty$, then $$\begin{array}{rl}\int_{\Omega_1}\left(\int_{\Omega_2}f(\omega_1,\omega_2)\,d\mu_2(\omega_2)\right)d\mu_1(\omega_1) &=\int_{\Omega_2}\left(\int_{\Omega_1}f(\omega_1,\omega_2)\,d\mu_1(\omega_1)\right)d\mu_2(\omega_2)\\ &=\int_{\Omega_1\times\Omega_2}f(\omega_1,\omega_2)\,d\mu_1\times\mu_2(\omega_1,\omega_2).\end{array}$$In short, Fubini's theorem lets us treat an integral over a product space as an iterated integral, one coordinate at a time.
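Before returning to the proof, a quick numerical sketch of part (c) may help: the Python snippet below integrates $f(x,y)=xy$ over $[0,1]\times[0,2]$ in both iterated orders by the midpoint rule (the function, the rectangle, and the grid size $n=2000$ are illustrative assumptions, not part of the theorem).

```python
# Iterated integration of f(x, y) = x * y over [0, 1] x [0, 2] by the
# midpoint rule; both orders agree, matching Fubini's theorem.
import numpy as np

n = 2000
x = (np.arange(n) + 0.5) / n           # midpoints partitioning [0, 1]
y = 2.0 * (np.arange(n) + 0.5) / n     # midpoints partitioning [0, 2]
dx, dy = 1.0 / n, 2.0 / n

f = np.outer(x, y)                     # f[i, j] = x[i] * y[j]

inner_y = f.sum(axis=1) * dy           # g(x_i) ~ integral of f(x_i, y) dy
iter_xy = inner_y.sum() * dx           # then integrate g over x

inner_x = f.sum(axis=0) * dx           # h(y_j) ~ integral of f(x, y_j) dx
iter_yx = inner_x.sum() * dy           # then integrate h over y

print(iter_xy, iter_yx)                # both ~ 1.0 = (1/2) * 2
```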

Back to the proof.  Let $\mu_1$ and $\mu_2$ be the distributions (laws) of $X$ and $Y$.  Since $X$ and $Y$ are independent, their joint distribution is the product measure $\mu_1\times\mu_2$, so $$\begin{array}{rl}\mathscr{E}(XY)&=\int_\Omega X(\omega)Y(\omega)\,d\mathscr{P}(\omega)=\int\int_{\mathbb{R}^2}xy\,d\mu_1\times\mu_2(x,y)\quad\left(\because\,\mbox{independent.}\right)\\ &=\int_\mathbb{R}\int_\mathbb{R}xy\,d\mu_1(x)\,d\mu_2(y)\quad\left(\because\,\mbox{Fubini.}\right)\\ &=\int_\mathbb{R}x\,d\mu_1(x)\int_\mathbb{R}y\,d\mu_2(y)\\ &=\mathscr{E}(X)\mathscr{E}(Y).\end{array}$$
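The conclusion is also easy to see by simulation.  Here is a minimal Monte Carlo sketch; the distributions $X\sim\mathrm{Exp}(1)$, $Y\sim\mathrm{Uniform}(0,3)$ and the sample size are assumptions chosen only for illustration.

```python
# Monte Carlo estimate of E(XY) versus E(X) E(Y) for independent X and Y.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

x = rng.exponential(1.0, size=n)   # E(X) = 1 (illustrative choice)
y = rng.uniform(0.0, 3.0, size=n)  # E(Y) = 1.5 (illustrative choice)

print(np.mean(x * y))              # ~ 1.5
print(np.mean(x) * np.mean(y))     # ~ 1.5

# Independence matters: with Y replaced by X itself, E(X * X) = 2
# for Exp(1), while E(X) * E(X) = 1.
print(np.mean(x * x), np.mean(x) ** 2)
```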

(2) Proof from the definition of expectation.
This proof proceeds in three progressive parts; each part shows how the concept of expectation is built up.  We start with discrete random variables.

(2A) 
Suppose $X$ and $Y$ are discrete random variables belonging respectively to the weighted partitions $\{\Lambda_j,c_j\}$ and $\{M_k,d_k\}$, where $\Lambda_j=\{X=c_j\}$ and $M_k=\{Y=d_k\}$.  Thus, $$\mathscr{E}(X)=\sum_j c_j\mathscr{P}\{\Lambda_j\},\;\mathscr{E}(Y)=\sum_k d_k\mathscr{P}\{M_k\}.$$Now, since $\bigcup_j\Lambda_j=\bigcup_kM_k=\Omega$, we have $\Omega=\bigcup_{j,k}(\Lambda_jM_k)$, and $XY(\omega)=X(\omega)Y(\omega)=c_jd_k$ if $\omega\in\Lambda_jM_k$.  Hence the r.v. $XY$ is discrete and belongs to $\{\Lambda_jM_k,c_jd_k\}$, where $(j,k)$ ranges over all pairs of indices.  Since $X$ and $Y$ are independent, we have for every $j$ and $k$, $$\mathscr{P}\{\Lambda_jM_k\}=\mathscr{P}\{X=c_j,Y=d_k\}=\mathscr{P}\{X=c_j\}\mathscr{P}\{Y=d_k\}=\mathscr{P}\{\Lambda_j\}\mathscr{P}\{M_k\}.$$Thus, $$\begin{array}{rl}\mathscr{E}(XY)&=\sum_{j,k}c_jd_k\mathscr{P}\{\Lambda_jM_k\}\\ &=\sum_{j,k}c_jd_k\mathscr{P}\{\Lambda_j\}\mathscr{P}\{M_k\}\\ &=\Big(\sum_{j}c_j\mathscr{P}\{\Lambda_j\}\Big)\Big(\sum_kd_k\mathscr{P}\{M_k\}\Big)=\mathscr{E}(X)\mathscr{E}(Y). \end{array}$$
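A finite instance of (2A) can be checked directly; in the sketch below the supports $\{c_j\}$, $\{d_k\}$ and the weights are made-up illustrative values.

```python
# E(XY) computed from the joint weights P{Lambda_j M_k} = P{Lambda_j} P{M_k},
# compared with the product of the marginal expectations.
c = [1.0, 2.0, 5.0]   # values c_j of X
p = [0.2, 0.5, 0.3]   # P{Lambda_j} = P{X = c_j}
d = [-1.0, 4.0]       # values d_k of Y
q = [0.6, 0.4]        # P{M_k} = P{Y = d_k}

e_xy = sum(cj * dk * pj * qk
           for cj, pj in zip(c, p)
           for dk, qk in zip(d, q))

e_x = sum(cj * pj for cj, pj in zip(c, p))   # = 2.7
e_y = sum(dk * qk for dk, qk in zip(d, q))   # = 1.0

print(e_xy, e_x * e_y)  # equal up to floating-point rounding
```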
(2B)
Then, let $X$ and $Y$ be arbitrary positive r.v.'s with finite expectations.  Define the dyadic approximations $$X_m=\frac{\lfloor 2^mX\rfloor}{2^m},\quad Y_m=\frac{\lfloor 2^mY\rfloor}{2^m},$$so that $X_m$ and $Y_m$ are discrete r.v.'s with $X_m\uparrow X$ and $Y_m\uparrow Y$, hence $\mathscr{E}(X_m)\uparrow\mathscr{E}(X)$ and $\mathscr{E}(Y_m)\uparrow\mathscr{E}(Y)$; moreover, for each $m$, $X_m$ and $Y_m$ are independent, being functions of $X$ and $Y$ respectively.  Indeed, for nonnegative integers $n$ and $n'$, we have $$\begin{array}{rl}\mathscr{P}\left\{X_m=\frac{n}{2^m},Y_m=\frac{n'}{2^m}\right\}&=\mathscr{P}\left\{\frac{n}{2^m}\leq X<\frac{n+1}{2^m},\frac{n'}{2^m}\leq Y<\frac{n'+1}{2^m}\right\}\\ & =\mathscr{P}\left\{\frac{n}{2^m}\leq X<\frac{n+1}{2^m}\right\}\mathscr{P}\left\{\frac{n'}{2^m}\leq Y<\frac{n'+1}{2^m}\right\}\\ &=\mathscr{P}\left\{X_m=\frac{n}{2^m}\right\}\mathscr{P}\left\{Y_m=\frac{n'}{2^m}\right\}.\end{array}$$The product $X_mY_m$ is also increasing with $m$, and $$\begin{array}{rl}0\leq XY-X_mY_m &=XY-XY_m+XY_m-X_mY_m\\ &=X(Y-Y_m)+Y_m(X-X_m)\rightarrow0.\end{array}$$Hence, by (2A) and the Monotone Convergence Theorem, $$\begin{array}{rl}\mathscr{E}(XY)&=\underset{m\rightarrow\infty}{\lim}\mathscr{E}(X_mY_m)=\underset{m\rightarrow\infty}{\lim}\mathscr{E}(X_m)\mathscr{E}(Y_m)\\ &=\underset{m\rightarrow\infty}{\lim}\mathscr{E}(X_m)\underset{m\rightarrow\infty}{\lim}\mathscr{E}(Y_m)=\mathscr{E}(X)\mathscr{E}(Y).\end{array}$$
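The dyadic discretization used in (2B) can be watched converging numerically; in this sketch, $X\sim\mathrm{Exp}(1)$ is an assumed illustrative positive r.v. with $\mathscr{E}(X)=1$.

```python
# X_m = floor(2^m X) / 2^m increases to X, and E(X_m) increases to E(X).
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(1.0, size=10**6)  # positive r.v. with E(X) = 1

for m in range(7):
    x_m = np.floor(2**m * x) / 2**m   # discrete r.v. taking values n / 2^m
    print(m, np.mean(x_m))            # increases toward E(X) = 1
```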
(2C) The most general case is that of arbitrary independent r.v.'s $X$ and $Y$ with finite expectations.  So far we have proved the conclusion for independent discrete r.v.'s, and then extended it to positive r.v.'s by approximating them with sequences of discrete r.v.'s.  Finally, for two arbitrary independent r.v.'s $X$ and $Y$, we define $$\begin{array}{cc}X^+=X\vee0,&X^-=(-X)\vee0;\\Y^+=Y\vee0,&Y^-=(-Y)\vee0. \end{array}$$Then $X^+$, $X^-$, $Y^+$ and $Y^-$ are all positive r.v.'s, and each of $X^+$, $X^-$ is independent of each of $Y^+$, $Y^-$, since they are functions of $X$ and $Y$ respectively.  Thus, by (2B), $$\begin{array}{rl}\mathscr{E}(XY)&=\mathscr{E}\left[(X^+-X^-)(Y^+-Y^-)\right]\\ &=\mathscr{E}\left[X^+Y^+-X^+Y^--X^-Y^++X^-Y^-\right]\\ &=\mathscr{E}(X^+Y^+)-\mathscr{E}(X^+Y^-)-\mathscr{E}(X^-Y^+)+\mathscr{E}(X^-Y^-)\\ &=\mathscr{E}(X^+)\mathscr{E}(Y^+)-\mathscr{E}(X^+)\mathscr{E}(Y^-)-\mathscr{E}(X^-)\mathscr{E}(Y^+)+\mathscr{E}(X^-)\mathscr{E}(Y^-)\quad(\because\,\mbox{2B})\\ &=\left[\mathscr{E}(X^+)-\mathscr{E}(X^-)\right]\left[\mathscr{E}(Y^+)-\mathscr{E}(Y^-)\right]\\ &=\mathscr{E}(X)\mathscr{E}(Y).\end{array}$$Thus, by (2A), (2B) and (2C) step by step, we conclude that for any two independent r.v.'s $X$ and $Y$ with finite expectations, $$\mathscr{E}(XY)=\mathscr{E}(X)\mathscr{E}(Y).$$

$\Box$
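As a final sanity check of the decomposition in (2C), the sketch below estimates $\mathscr{E}(XY)$ and $\left[\mathscr{E}(X^+)-\mathscr{E}(X^-)\right]\left[\mathscr{E}(Y^+)-\mathscr{E}(Y^-)\right]$ for an assumed illustrative pair $X\sim N(1,1)$, $Y\sim N(-2,1)$.

```python
# E(XY) versus [E(X^+) - E(X^-)][E(Y^+) - E(Y^-)] for independent X, Y.
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
x = rng.normal(1.0, 1.0, size=n)    # E(X) = 1 (illustrative choice)
y = rng.normal(-2.0, 1.0, size=n)   # E(Y) = -2 (illustrative choice)

xp, xm = np.maximum(x, 0.0), np.maximum(-x, 0.0)  # X^+, X^-
yp, ym = np.maximum(y, 0.0), np.maximum(-y, 0.0)  # Y^+, Y^-

lhs = np.mean(x * y)
rhs = (np.mean(xp) - np.mean(xm)) * (np.mean(yp) - np.mean(ym))
print(lhs, rhs)  # both ~ E(X) E(Y) = -2
```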
