Advanced Content for Inferential Statistics

Proof of $ E[m'_r]=\mu'_r $

We have a random sample $ X_1,X_2,\ldots,X_n $ from a population with density $ f(x) $. We begin by applying the definition of the $r$th sample moment, $ m'_r = \frac{1}{n}\sum_{i=1}^n x_i^r $, and taking its expected value:

$$E[m'_r] = E\left[\frac{1}{n} \sum\limits_{i=1}^n x_i^r \right] $$

By the linearity of expected values, we get:

$$ E\left[\frac{1}{n} \sum\limits_{i=1}^n x_i^r \right] = \frac{1}{n} E\left[\sum\limits_{i=1}^n x_i^r\right] = \frac{1}{n} \sum\limits_{i=1}^n E[x_i^r] $$

Because each $x_i$ is drawn from the same density $f(x)$, the $r$th population moment $ \mu'_r = E[x_i^r] $ is the same for every $i$, so we can substitute:

$$ \frac{1}{n} \sum\limits_{i=1}^n E[x_i^r] = \frac{1}{n} \sum\limits_{i=1}^n \mu'_r = \frac{n \mu'_r}{n} = \mu'_r $$
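
As a quick numerical sanity check, this result can be verified by simulation. Below is a minimal Monte Carlo sketch (assuming Python with NumPy; the Exponential(1) population, whose $r$th raw moment is $r!$, is an illustrative choice, not one from the text). Averaging $m'_r$ over many replicate samples should approximate $\mu'_r$:

```python
import numpy as np

# Monte Carlo check that E[m'_r] = mu'_r.
# Illustrative population: Exponential(1), whose r-th raw moment is r!,
# so with r = 2 we expect mu'_2 = 2.
rng = np.random.default_rng(0)
r, n, n_reps = 2, 50, 100_000

samples = rng.exponential(scale=1.0, size=(n_reps, n))
sample_moments = (samples ** r).mean(axis=1)  # m'_r for each replicate sample

print("average of m'_r over replicates:", sample_moments.mean())  # close to 2.0
print("population moment mu'_2:        ", 2.0)
```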

Proof of the variance of the sample mean

We begin with the definition of variance applied to the sample mean; using $E[\bar{X}] = \mu$ and the definition of $\bar{X}$, we obtain:

$$ \sigma_{\bar{X}}^2 = E\left[ \left(\bar{X} - E[\bar{X}]\right)^2 \right] = E\left[\left(\bar{X}-\mu\right)^2\right] = E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n x_i - \mu \right)^2\right] $$

Rewriting $\mu$ as $\frac{n}{n}\mu$ and factoring out $\frac{1}{n}$, we find that:

$$ E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n x_i - \mu \right)^2\right] = E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n x_i - \frac{n}{n}\mu \right)^2\right] = E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n x_i - \frac{1}{n} (n\mu) \right)^2\right] = E\left[ \left(\frac{1}{n} \left[\sum\limits_{i=1}^n x_i - \sum\limits_{i=1}^n \mu \right]\right)^2 \right] = E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n (x_i - \mu) \right)^2\right] $$

Pulling the constant $\frac{1}{n^2}$ outside the expectation and expanding the squared sum shows that:

$$ E\left[\left(\frac{1}{n}\sum\limits_{i=1}^n (x_i - \mu) \right)^2\right] = \frac{1}{n^2} E\left[\left(\sum\limits_{i=1}^n (x_i - \mu) \right)^2\right]=\frac{1}{n^2}E\left[(x_1-\mu)^2+(x_2-\mu)^2+\cdots+(x_n-\mu)^2+2(x_1-\mu)(x_2-\mu)+\cdots+2(x_{n-1}-\mu)(x_n-\mu)\right] $$
$$ = \frac{1}{n^2}E\left[\sum\limits_{i=1}^n (x_i-\mu)^2+2\sum\limits_{i=1}^{n-1}\sum\limits_{j=i+1}^n (x_i-\mu)(x_j-\mu)\right] $$

By the linearity of expected values:

$$ \frac{1}{n^2}E\left[\sum\limits_{i=1}^n (x_i-\mu)^2+2\sum\limits_{i=1}^{n-1}\sum\limits_{j=i+1}^n (x_i-\mu)(x_j-\mu)\right] = \frac{1}{n^2}\left[E\left[\sum\limits_{i=1}^n (x_i-\mu)^2\right]+E\left[2\sum\limits_{i=1}^{n-1} \sum\limits_{j=i+1}^n (x_i-\mu)(x_j-\mu)\right]\right] $$

Let us focus on the second part of this equation. Consider a single cross term, where $i \neq j$:

$$ E\left[(x_i-\mu)(x_j-\mu)\right] = E\left[x_ix_j-\mu x_i-\mu x_j+\mu^2\right] $$
$$ = E\left[x_ix_j\right]-E\left[\mu x_i\right]-E\left[\mu x_j\right]+E\left[\mu^2\right] $$
$$ = E\left[x_i\right]E\left[x_j\right]-\mu E\left[x_i\right]-\mu E\left[x_j\right]+\mu^2 $$
$$ = \mu^2 - \mu^2 - \mu^2 + \mu^2 = 0 $$

Here $E[x_ix_j] = E[x_i]E[x_j]$ because the observations in a random sample are independent. Since every cross term has expected value zero, so does the entire double sum.
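
As an aside, this claim is easy to check numerically. A minimal sketch (assuming NumPy; the Uniform(0, 1) population with $\mu = 0.5$ is an illustrative choice):

```python
import numpy as np

# Numerical check that E[(x_i - mu)(x_j - mu)] = 0 for independent draws, i != j.
# Illustrative population: Uniform(0, 1), so mu = 0.5.
rng = np.random.default_rng(1)
mu = 0.5

xi = rng.uniform(size=1_000_000)
xj = rng.uniform(size=1_000_000)  # drawn independently of xi

print("average cross term:", ((xi - mu) * (xj - mu)).mean())  # close to 0
```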

So we have:

$$ \frac{1}{n^2}\left[E\left[\sum\limits_{i=1}^n (x_i-\mu)^2\right]+E\left[2\sum\limits_{i=1}^{n-1} \sum\limits_{j=i+1}^n (x_i-\mu)(x_j-\mu)\right]\right] = \frac{1}{n^2} E\left[\sum\limits_{i=1}^n (x_i-\mu)^2 \right] = \frac{1}{n^2} \sum\limits_{i=1}^n E\left[(x_i-\mu)^2\right] = \frac{1}{n^2} \sum\limits_{i=1}^n \sigma^2 $$
$$ = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n} $$
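
The full result can also be checked by simulation. Below is a minimal Monte Carlo sketch (assuming NumPy; the standard normal population, with $\sigma^2 = 1$, is an illustrative choice): the empirical variance of $\bar{X}$ across many replicate samples should be close to $\sigma^2/n$.

```python
import numpy as np

# Monte Carlo check that Var(X_bar) = sigma^2 / n.
# Illustrative population: standard normal, so sigma^2 = 1 and sigma^2/n = 1/25.
rng = np.random.default_rng(2)
n, n_reps = 25, 200_000

means = rng.normal(size=(n_reps, n)).mean(axis=1)  # X_bar for each replicate sample

print("empirical Var(X_bar): ", means.var())  # close to 0.04
print("theoretical sigma^2/n:", 1.0 / n)
```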

The Weak Law of Large Numbers

Let $f(x)$ be a density with mean $\mu$ and finite variance $\sigma^2$. Let $\bar{X}_n $ be the mean of a random sample of size $n$ from $f(x)$. Let $\epsilon$ and $\delta$ be two specified small numbers such that $\epsilon>0$ and $0< \delta <1$. If $n$ is any integer greater than $\sigma^2/(\epsilon^2 \delta)$, then:
$$ P(-\epsilon<\bar{X}_n-\mu < \epsilon) \geq 1 - \delta $$

This follows from Chebyshev's inequality applied to $\bar{X}_n$: by the result above, $\sigma_{\bar{X}_n}^2 = \sigma^2/n$, so $P(|\bar{X}_n-\mu| \geq \epsilon) \leq \sigma^2/(n\epsilon^2)$, which is less than $\delta$ whenever $n > \sigma^2/(\epsilon^2 \delta)$.
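
A minimal simulation sketch of this bound (assuming NumPy; the Exponential(1) population, with $\mu = \sigma^2 = 1$, is an illustrative choice): for an integer $n$ just above $\sigma^2/(\epsilon^2 \delta)$, the empirical probability that $\bar{X}_n$ falls within $\epsilon$ of $\mu$ should be at least $1 - \delta$.

```python
import numpy as np

# Numerical check of the weak-law bound:
# for n > sigma^2 / (eps^2 * delta), P(|X_bar_n - mu| < eps) >= 1 - delta.
# Illustrative population: Exponential(1), so mu = 1 and sigma^2 = 1.
rng = np.random.default_rng(3)
eps, delta = 0.2, 0.1
mu, sigma2 = 1.0, 1.0
n = int(np.ceil(sigma2 / (eps**2 * delta))) + 1  # an integer strictly greater than the bound

means = rng.exponential(scale=1.0, size=(100_000, n)).mean(axis=1)
coverage = np.mean(np.abs(means - mu) < eps)

print("n:", n)                          # ~251
print("empirical coverage:", coverage)  # comfortably above 1 - delta = 0.9
```

Note that Chebyshev's inequality is conservative, so the empirical coverage typically exceeds $1-\delta$ by a wide margin.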
 
