  • Gaussian Distribution
    Uncategorized · 2020. 8. 21. 16:25

    Definition. A set of $n$ r.v.'s (random variables) $Z_1, Z_2, \cdots , Z_n$ is zero-mean jointly Gaussian if there is a set of iid (independent and identically distributed) normal r.v.'s $W_1, \cdots , W_l$, each $\mathcal{N}(0,1)$, such that each $Z_k, 1 \leq k \leq n,$ can be expressed as 

    $$ Z_k = \sum_{m=1}^l a_{km}W_m, \hspace{3cm} 1 \leq k \leq n, \tag{1}$$

    where $\{a_{km};\ 1 \leq k \leq n,\ 1 \leq m \leq l\}$ is an array of real numbers.

     

    Then we refer to the set of $n$ random variables $Z_1, \cdots , Z_n$ as a random vector $\pmb{Z} = (Z_1, \cdots , Z_n)^T.$ Letting $A$ be the $n$ by $l$ matrix with elements $\{a_{km};\ 1 \leq k \leq n,\ 1 \leq m \leq l\},$ Eq. (1) can be represented in matrix form as

    $$ \underset{(n \times 1)}{\pmb{Z}} = \underset{(n \times l)}{A} \underset{(l \times 1)}{\pmb{W}} $$

    $$ \begin{pmatrix} Z_1 \\ Z_2 \\ \vdots \\ Z_n \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1l} \\ a_{21} & a_{22} & \dots & a_{2l} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots  & a_{nl} \end{pmatrix} \begin{pmatrix} W_1 \\ W_2 \\ \vdots \\ W_l \end{pmatrix} $$

    $$ \text{e.g.} \hspace{1cm} Z_1 = a_{11}W_1 + a_{12}W_2 + \dots + a_{1l}W_l \hspace{3cm} \therefore Z_k = \sum_{m=1}^l a_{km}W_m $$
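    This matrix form is easy to check numerically. Below is a minimal NumPy sketch (the dimensions, seed, and coefficient matrix $A$ are arbitrary choices for illustration): since $\pmb{Z} = A\pmb{W}$ with iid unit-variance $W_m$, the covariance of $\pmb{Z}$ is $AA^T$, which the sample covariance should reproduce.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)           # fixed seed for reproducibility
    n, l = 3, 4                              # arbitrary example dimensions
    A = rng.standard_normal((n, l))          # an arbitrary real n-by-l array {a_km}
    W = rng.standard_normal((l, 1_000_000))  # each column is one draw of (W_1, ..., W_l)

    Z = A @ W                                # columns are samples of Z = A W
    sample_cov = np.cov(Z)                   # empirical covariance of Z

    # Cov(Z) = A E[W W^T] A^T = A A^T, because the W_m are iid with unit variance
    print(np.allclose(sample_cov, A @ A.T, atol=0.1))  # → True
    ```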

     

    Let $X \sim \mathcal{N}(0, \sigma_X^2)$ and $Y \sim \mathcal{N}(0,\sigma_Y^2)$ be independent zero-mean Gaussian r.v.'s.

    Density of $X+Y$

    Let $Z=X+Y$. Since $X$ and $Y$ are independent, the density of $Z$ is given by the convolution of $f_X$ and $f_Y$:

    $$ \begin{align*} f_Z(z) &= f_X(z) * f_Y(z) = \int_{-\infty}^\infty f_X(x)f_Y(z-x)dx \\ &= \int_{-\infty}^\infty \frac{1}{ \sqrt{2 \pi \sigma_X^2} } \exp{ \left[ - \frac{x^2}{2 \sigma_X^2} \right] } \frac{1}{ \sqrt{2 \pi \sigma_Y^2}} \exp{ \left[ - \frac{ (z-x)^2}{2 \sigma_Y^2} \right] } dx \\ &= \frac{1}{2\pi \sigma_X \sigma_Y} \int_{-\infty}^\infty \exp{ \left[ - \left( \frac{x^2}{2 \sigma_X^2} + \frac{z^2 - 2xz + x^2}{2\sigma_Y^2}\right) \right] } dx \tag{2} \end{align*} $$ 

    "Completing the square" method

    The quadratic polynomial in the exponent can be rearranged such that 

    $$ \begin{align*} \frac{x^2}{2\sigma_X^2} + \frac{z^2-2xz+x^2}{2\sigma_Y^2} &= \left( \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} \right) x^2 - \frac{2xz}{2\sigma_Y^2} + \frac{z^2}{2\sigma_Y^2} \\ &= \left( x \sqrt{ \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} } - \frac{z}{2\sigma_Y^2 \sqrt{ \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} } } \right)^2 + \frac{z^2}{2\sigma_Y^2} - \frac{z^2}{4\sigma_Y^4 \left( \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} \right) } \end{align*} \tag{3} $$

    For the square, we can write

    $$ \begin{align*} \left( x \sqrt{ \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} } - \frac{z}{2\sigma_Y^2 \sqrt{ \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} } } \right)^2 &= \left( \frac{1}{\sqrt{2}} x \sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} } - \frac{z}{ \sqrt{2} \sigma_Y^2 \sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} } } \right)^2 \\ &= \left( \frac{ x\sigma_Y^2 \left( \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} \right) - z } {\sqrt{2} \sigma_Y^2 \sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} } } \right)^2. \end{align*} $$

    Substituting $ u = \frac{ x\sigma_Y^2 \left( \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} \right) - z }{ \sqrt{2} \sigma_Y^2 \sqrt{\frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} } } $ results in

    $$ \frac{du}{dx} = \frac{ \sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2}}}{\sqrt{2}} \Longrightarrow dx = \frac{\sqrt{2}}{\sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2}}} du $$

    The last two terms of Eq. (3) can be simplified to

    $$ \begin{align*} \frac{z^2}{2\sigma_Y^2} - \frac{z^2}{4\sigma_Y^4 \left( \frac{1}{2\sigma_X^2} + \frac{1}{2\sigma_Y^2} \right) } &= z^2 \left( \frac{1}{2\sigma_Y^2} - \frac{1}{2 \sigma_Y^2 \left( \frac{\sigma_X^2 + \sigma_Y^2}{\sigma_X^2} \right) } \right) \\ &= z^2 \left( \frac{1 - \frac{\sigma_X^2}{\sigma_X^2 + \sigma_Y^2} }{2\sigma_Y^2} \right) = z^2 \left( \frac{ \frac{\sigma_Y^2}{\sigma_X^2 + \sigma_Y^2} }{2\sigma_Y^2} \right) \\ &= \frac{z^2}{2 (\sigma_X^2+\sigma_Y^2)} \tag{4} \end{align*} $$

    Substituting $u$ and Eq. (4) into Eq. (2), we obtain

    $$ \begin{align*} f_Z(z) &= \frac{1}{2\pi \sigma_X \sigma_Y} \int_{-\infty}^\infty \exp{ \left[ -\left(u^2 + \frac{z^2}{2 (\sigma_X^2+\sigma_Y^2)} \right) \right]} \left( \frac{\sqrt{2}}{\sqrt{ \frac{1}{\sigma_X^2} + \frac{1}{\sigma_Y^2} }}du \right) \\ &= \left( \frac{1}{2\pi \sigma_X \sigma_Y} \right) \left( \frac{\sqrt{2}}{\sqrt{\frac{\sigma_X^2 + \sigma_Y^2}{\sigma_X^2 \sigma_Y^2}}} \right) \exp{ \left[ -\frac{z^2}{2(\sigma_X^2 + \sigma_Y^2)} \right] } \int_{-\infty}^\infty e^{-u^2}du \\ &= \frac{1}{\sqrt{2 \pi (\sigma_X^2+\sigma_Y^2)}} \exp{ \left[- \frac{z^2}{2(\sigma_X^2+\sigma_Y^2)} \right] } \int_{-\infty}^\infty \frac{1}{\sqrt{\pi}}e^{-u^2}du. \end{align*} $$

    The last Gaussian integral evaluates to 1. Hence, $Z$ is Gaussian with zero mean and variance $\sigma_X^2 + \sigma_Y^2$:

    $$ f_Z(z) = \frac{1}{\sqrt{2\pi( \sigma_X^2 + \sigma_Y^2)}} \exp{ \left[ - \frac{z^2}{2 (\sigma_X^2 + \sigma_Y^2) } \right] } $$

    $$ \therefore Z \sim \mathcal{N}(0, \sigma_X^2 + \sigma_Y^2) $$
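    This closed-form result is easy to sanity-check by simulation. The NumPy sketch below uses arbitrary example standard deviations and sample size:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)                # fixed seed for reproducibility
    sigma_x, sigma_y = 1.0, 2.0                   # arbitrary example values
    x = rng.normal(0.0, sigma_x, 1_000_000)
    y = rng.normal(0.0, sigma_y, 1_000_000)
    z = x + y

    # Z should be zero-mean with variance sigma_x^2 + sigma_y^2 = 5
    print(round(z.mean(), 1), round(z.var(), 1))  # → 0.0 5.0
    ```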


    Fourier transform of the pdf $f_X(x)$ of a Gaussian r.v. $X \sim \mathcal{N}(0, \sigma_X^2)$

    We first compute the Fourier transform of the Gaussian function

    $$ g(x) = e^{-\pi x^2}. $$

    We have

    $$ \mathcal{F}g(s) = \int_{-\infty}^\infty e^{-2\pi isx}e^{-\pi x^2}dx. $$

    Differentiate both sides with respect to $s$:

    $$ \frac{d}{ds} \mathcal{F} g(s) = \int_{-\infty}^\infty e^{-2\pi isx}(-2\pi ix)e^{-\pi x^2}dx.$$

    This can be evaluated by integration by parts: setting 

    $$ \begin{cases} u=e^{-2\pi isx}, & dv = -2\pi ixe^{-\pi x^2}dx \\ du = (-2\pi is)e^{-2\pi isx}dx, & v = ie^{-\pi x^2} \end{cases} $$

    Thus

    $$ \begin{align*} \frac{d}{ds}\mathcal{F}g(s) &= \left[ ie^{-2\pi isx}e^{-\pi x^2} \right]_{-\infty}^\infty - \int_{-\infty}^\infty ie^{-\pi x^2}(-2\pi is)e^{-2\pi isx}dx \\ &= \left[ ie^{-\pi (x^2+2isx)} \right]_{-\infty}^\infty - 2\pi s \int_{-\infty}^\infty e^{-2\pi isx}e^{-\pi x^2}dx \\ &= 0 - 2\pi s \mathcal{F} g(s) \end{align*} $$

    $$ \frac{d}{ds}\mathcal{F}g(s) = -2\pi s \mathcal{F}g(s) \Longrightarrow \mathcal{F}g(s) = \mathcal{F}g(0)e^{-\pi s^2} $$

    But

    $$ \mathcal{F}g(0) = \int_{-\infty}^\infty e^{-\pi x^2}dx = 1 $$

    Hence

    $$ \mathcal{F} g(s) = e^{-\pi s^2}. $$
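    This transform pair can be checked by direct numerical integration. The sketch below approximates the Fourier integral with a Riemann sum on a truncated grid (the grid limits and test frequencies are arbitrary choices):

    ```python
    import numpy as np

    x = np.linspace(-10, 10, 20_001)   # e^{-pi x^2} is negligible outside this range
    dx = x[1] - x[0]
    g = np.exp(-np.pi * x**2)

    for s in (0.0, 0.5, 1.0):          # arbitrary test frequencies
        # Riemann-sum approximation of the Fourier integral at frequency s
        Fg = np.sum(g * np.exp(-2j * np.pi * s * x)) * dx
        assert np.isclose(Fg.real, np.exp(-np.pi * s**2), atol=1e-8)
        assert abs(Fg.imag) < 1e-9     # transform of an even real function is real
    print("F g(s) = exp(-pi s^2) verified numerically")
    ```

    Note that the $s = 0$ case is exactly the normalization $\int_{-\infty}^\infty e^{-\pi x^2}dx = 1$ used above.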

    Using this Fourier transform pair, we now find the Fourier transform of $f_X(x),$ denoted by $\hat{f_X}(\theta)$:

    $$ \begin{align*} \hat{f_X}(\theta) &= (\mathcal{F}f_X)(\theta) = \frac{1}{\sqrt{2\pi \sigma_X^2}} \left( \mathcal{F} e^{-\frac{x^2}{2\sigma_X^2}} \right) (\theta) \\ &= \frac{1}{\sqrt{2\pi \sigma_X^2}} \left( \mathcal{F} g \left( \frac{x}{\sqrt{2\pi \sigma_X^2}} \right) \right) (\theta) \\ &= \frac{1}{\sqrt{2\pi \sigma_X^2}} \sqrt{2\pi \sigma_X^2}\, \mathcal{F}g \left( \sqrt{2\pi \sigma_X^2}\, \theta \right) \\ &= e^{-\pi \left( \sqrt{2\pi \sigma_X^2}\, \theta \right)^2} = e^{-2\pi^2 \sigma_X^2 \theta^2}, \end{align*} $$

    where the third equality uses the scaling property $\mathcal{F}\left[ g(x/a) \right](\theta) = a\, \mathcal{F}g(a\theta)$ with $a = \sqrt{2\pi \sigma_X^2}.$

    $$ \therefore f_X(x) = \frac{1}{\sqrt{2\pi \sigma_X^2}}e^{-\frac{x^2}{2\sigma_X^2}} \hspace{1cm} \longleftrightarrow \hspace{1cm} \hat{f_X}(\theta) = e^{-2\pi^2 \sigma_X^2 \theta^2} \tag{5} $$

     

    Density of $X+Y$ using Fourier transform

    From the convolution theorem, we know 

    $$ f_Z(z) = \left( f_X * f_Y \right) (z) \hspace{0.5cm} \Longrightarrow \hspace{0.5cm} \hat{f_Z}(\theta) = \hat{f_X}(\theta) \hat{f_Y}(\theta) = \exp{ \left[ -2\pi^2 \theta^2 (\sigma_X^2 + \sigma_Y^2) \right] }. $$

    Setting $\sigma^2 = \sigma_X^2 + \sigma_Y^2,$ we see that $\hat{f_Z}(\theta)$ has the form of Eq. (5), the transform of a zero-mean Gaussian density with variance $\sigma^2.$

    Thus

    $$ f_Z(z) = \frac{1}{\sqrt{2\pi(\sigma_X^2 + \sigma_Y^2)}} \exp{ \left[ - \frac{z^2}{2(\sigma_X^2 + \sigma_Y^2)} \right] }. $$
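    The convolution route can also be verified on a grid: a discrete convolution of the two sampled densities should reproduce the $\mathcal{N}(0, \sigma_X^2 + \sigma_Y^2)$ density. A NumPy sketch (variances and grid are arbitrary choices):

    ```python
    import numpy as np

    def gauss(t, var):
        """Zero-mean Gaussian density with the given variance."""
        return np.exp(-t**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    var_x, var_y = 1.0, 2.0              # arbitrary example variances
    t = np.linspace(-20, 20, 8001)
    dt = t[1] - t[0]

    # Discrete convolution approximates (f_X * f_Y)(z) on the same grid
    fz_numeric = np.convolve(gauss(t, var_x), gauss(t, var_y), mode="same") * dt
    fz_closed = gauss(t, var_x + var_y)  # N(0, var_x + var_y) density

    print(np.max(np.abs(fz_numeric - fz_closed)) < 1e-6)  # → True
    ```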

     

    Density of the sum $V=\sum_{k=1}^n \alpha_{k}W_k$ of iid standard normal r.v.'s $W_k$

    Note that $\alpha_k W_k$ is a zero-mean Gaussian r.v. with variance $\alpha_k^2.$ Using the same Fourier transform technique as in the previous calculations, we have

    $$ f_V(v) = \frac{1}{\sqrt{2 \pi ( \sum_{k=1}^n \alpha_k^2 )}} \exp{ \left[ - \frac{v^2}{2 ( \sum_{k=1}^n \alpha_k^2 ) } \right] }. $$
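    A quick Monte Carlo check of this formula (a sketch; the coefficients $\alpha_k$, seed, and sample count are arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    alpha = np.array([0.5, -1.0, 2.0])                # arbitrary coefficients alpha_k
    W = rng.standard_normal((alpha.size, 1_000_000))  # iid standard normal W_k
    V = alpha @ W                                     # samples of V = sum_k alpha_k W_k

    # Var(V) should equal sum_k alpha_k^2 = 5.25
    print(np.isclose(V.var(), np.sum(alpha**2), atol=0.05))  # → True
    ```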

     

    The final result shows that any linear combination of iid normal r.v.'s is also Gaussian, so that each $Z_k$ in Eq. (1) is Gaussian. Moreover, jointly Gaussian r.v.'s $Z_1, \dots , Z_n$ must be related as linear combinations of the same underlying set of iid normal r.v.'s.
