Concept#
Intuition through Convolutions#
[Chan, 2021] section 5.5.1.
Convolutions of Random Variables#
Theorem 44 (Convolutions of Random Variables)
Let \(X\) and \(Y\) be two independent random variables with PDFs \(f_X\) and \(f_Y\), respectively. Define \(Z = X + Y\), which is itself a random variable. Then the PDF of \(Z\) is given by

$$
f_Z(z) = (f_X \ast f_Y)(z) = \int_{-\infty}^{\infty} f_X(t) \, f_Y(z - t) \, \mathrm{d}t,
$$

where \(\ast\) denotes convolution.
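The theorem can be sanity-checked numerically. The sketch below (not from the text) assumes two \(\operatorname{Exponential}(1)\) inputs, for which the sum is known to be \(\operatorname{Erlang}(2, 1)\) with PDF \(z e^{-z}\), and approximates the convolution integral with a Riemann sum via `np.convolve`:

```python
import numpy as np

# Hypothetical check of Theorem 44: convolving the PDFs of two independent
# Exponential(1) variables should reproduce the Erlang(2, 1) PDF, z * exp(-z).
dx = 0.001
x = np.arange(0.0, 20.0, dx)
f_X = np.exp(-x)  # Exponential(1) PDF on [0, 20)
f_Y = np.exp(-x)

# Discrete approximation of (f_X * f_Y)(z) = integral of f_X(t) f_Y(z - t) dt;
# keeping the first len(x) terms evaluates the result on the same grid.
f_Z = np.convolve(f_X, f_Y)[: len(x)] * dx

erlang = x * np.exp(-x)  # Erlang(2, 1) PDF for comparison
print(np.max(np.abs(f_Z - erlang)))  # small discretization error
```

The approximation error is of order `dx`, and the numerical PDF integrates to approximately 1, as a density should.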
Sum of Common Distributions#
The following proofs are from [Chan, 2021], section 5.5.3 (Sum of common distributions).
Theorem 45 (Sum of Poisson Random Variables)
Let \(X_1 \sim \operatorname{Poisson}\left(\lambda_1\right)\) and \(X_2 \sim \operatorname{Poisson}\left(\lambda_2\right)\) be independent. Then

$$
X_1 + X_2 \sim \operatorname{Poisson}\left(\lambda_1 + \lambda_2\right).
$$
Proof. Let us apply the convolution principle. With \(Z = X_1 + X_2\) and any integer \(k \geq 0\),

$$
\begin{aligned}
\mathbb{P}(Z = k) &= \sum_{\ell=0}^{k} \mathbb{P}\left(X_1 = \ell\right) \mathbb{P}\left(X_2 = k - \ell\right) \\
&= \sum_{\ell=0}^{k} \frac{e^{-\lambda_1} \lambda_1^{\ell}}{\ell!} \cdot \frac{e^{-\lambda_2} \lambda_2^{k-\ell}}{(k-\ell)!} \\
&= \frac{e^{-\left(\lambda_1+\lambda_2\right)}}{k!} \sum_{\ell=0}^{k}\left(\begin{array}{c}k \\ \ell\end{array}\right) \lambda_1^{\ell} \lambda_2^{k-\ell} \\
&= \frac{e^{-\left(\lambda_1+\lambda_2\right)}\left(\lambda_1+\lambda_2\right)^{k}}{k!},
\end{aligned}
$$

where the last step is based on the binomial identity \(\sum_{\ell=0}^k\left(\begin{array}{c}k \\ \ell\end{array}\right) a^{\ell} b^{k-\ell}=(a+b)^k\).
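The identity proved above can be checked numerically. The snippet below (not from the text) uses the hypothetical rates \(\lambda_1 = 2\) and \(\lambda_2 = 3\) and compares the discrete convolution of the two Poisson PMFs with the \(\operatorname{Poisson}(\lambda_1 + \lambda_2)\) PMF term by term:

```python
import math

# Hypothetical check of Theorem 45 with lam1 = 2, lam2 = 3: the convolution
# of the two Poisson PMFs should equal the Poisson(lam1 + lam2) PMF.
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 2.0, 3.0
for k in range(20):
    conv = sum(poisson_pmf(l, lam1) * poisson_pmf(k - l, lam2)
               for l in range(k + 1))
    assert abs(conv - poisson_pmf(k, lam1 + lam2)) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```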
Theorem 46 (Sum of Gaussian Random Variables)
Let \(X_1\) and \(X_2\) be two independent Gaussian random variables such that

$$
X_1 \sim \operatorname{Gaussian}\left(\mu_1, \sigma^2\right) \quad \text{and} \quad X_2 \sim \operatorname{Gaussian}\left(\mu_2, \sigma^2\right).
$$

Then

$$
X_1 + X_2 \sim \operatorname{Gaussian}\left(\mu_1 + \mu_2, 2\sigma^2\right).
$$
Proof. Let us apply the convolution principle. With \(Z = X_1 + X_2\),

$$
f_Z(z) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(t-\mu_1)^2}{2\sigma^2}} \cdot \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(z-t-\mu_2)^2}{2\sigma^2}} \, \mathrm{d}t.
$$

We now complete the square in the exponent:

$$
(t-\mu_1)^2 + (z-t-\mu_2)^2 = 2\left(t - \frac{\mu_1 + z - \mu_2}{2}\right)^2 + \frac{\left(z - \mu_1 - \mu_2\right)^2}{2}.
$$

The last term does not depend on \(t\), so the factor \(e^{-\frac{(z-\mu_1-\mu_2)^2}{4\sigma^2}}\) can be pulled out of the integral, and the remaining Gaussian integral can be simplified to

$$
\int_{-\infty}^{\infty} e^{-\frac{1}{\sigma^2}\left(t - \frac{\mu_1 + z - \mu_2}{2}\right)^2} \, \mathrm{d}t = \sqrt{\pi \sigma^2}.
$$

Substituting these into the integral, we can show that

$$
f_Z(z) = \frac{\sqrt{\pi\sigma^2}}{2\pi\sigma^2} \, e^{-\frac{(z-\mu_1-\mu_2)^2}{4\sigma^2}} = \frac{1}{\sqrt{2\pi \left(2\sigma^2\right)}} \exp\left(-\frac{\left(z - \mu_1 - \mu_2\right)^2}{2\left(2\sigma^2\right)}\right).
$$

Therefore, we have shown that the resulting distribution is a Gaussian with mean \(\mu_1+\mu_2\) and variance \(2 \sigma^2\).
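The same conclusion can be reached numerically. The sketch below (not from the text) uses the hypothetical parameters \(\mu_1 = 1\), \(\mu_2 = -2\), \(\sigma = 1\) and convolves the two Gaussian PDFs on a grid, comparing the result against the \(\operatorname{Gaussian}(\mu_1 + \mu_2, 2\sigma^2)\) PDF:

```python
import numpy as np

# Hypothetical check of Theorem 46 with mu1 = 1, mu2 = -2, sigma = 1:
# the numerical convolution of the two Gaussian PDFs should reproduce a
# Gaussian with mean mu1 + mu2 = -1 and variance 2 * sigma**2 = 2.
mu1, mu2, sigma = 1.0, -2.0, 1.0
dx = 0.001
t = np.arange(-15.0, 15.0, dx)  # grid wide enough that the tails are negligible

def gaussian_pdf(x, mu, var):
    return np.exp(-((x - mu) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

f1 = gaussian_pdf(t, mu1, sigma**2)
f2 = gaussian_pdf(t, mu2, sigma**2)

# Index k of the full discrete convolution corresponds to z = 2*t[0] + k*dx,
# so shifting by -t[0]/dx realigns the output with the grid t.
shift = int(round(-t[0] / dx))
f_Z = np.convolve(f1, f2)[shift : shift + len(t)] * dx

target = gaussian_pdf(t, mu1 + mu2, 2 * sigma**2)
print(np.max(np.abs(f_Z - target)))  # small discretization error
```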
Theorem 47 (Sum of Common Distributions)
Let \(X_1\) and \(X_2\) be two independent random variables that come from the same family of distributions.
Then the distribution of \(X_1 + X_2\) is given by the following table.
| \(X_1\) | \(X_2\) | \(X_1 + X_2\) | 
|---|---|---|
| \(\bern(p)\) | \(\bern(p)\) | \(\binomial(2, p)\) | 
| \(\binomial(n, p)\) | \(\binomial(m, p)\) | \(\binomial(m+n, p)\) | 
| \(\poisson(\lambda_1)\) | \(\poisson(\lambda_2)\) | \(\poisson(\lambda_1 + \lambda_2)\) | 
| \(\exponential(\lambda)\) | \(\exponential(\lambda)\) | \(\operatorname{Erlang}(2, \lambda)\) | 
| \(\gaussian(\mu_1, \sigma_1^2)\) | \(\gaussian(\mu_2, \sigma_2^2)\) | \(\gaussian(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)\) | 
This holds for \(N\) random variables as well.
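Two of the table rows can be sanity-checked by Monte Carlo simulation. The sketch below (not from the text) uses arbitrary hypothetical parameters and compares the empirical mean and variance of the simulated sums against the theoretical values:

```python
import numpy as np

# Monte Carlo sanity check of two table rows; the parameter choices are
# arbitrary and used only for illustration.
rng = np.random.default_rng(0)
n = 200_000

# Binomial(3, 0.4) + Binomial(5, 0.4) should behave like Binomial(8, 0.4),
# whose mean is 8 * 0.4 = 3.2 and variance is 8 * 0.4 * 0.6 = 1.92.
s = rng.binomial(3, 0.4, n) + rng.binomial(5, 0.4, n)
print(s.mean(), s.var())

# Exponential(2) + Exponential(2) should behave like Erlang(2, 2),
# whose mean is 2 / lam = 1 and variance is 2 / lam**2 = 0.5.
lam = 2.0
e = rng.exponential(1 / lam, n) + rng.exponential(1 / lam, n)
print(e.mean(), e.var())
```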
