# Posts by Tags

## Unbiased Estimators

less than 1 minute read

Published:

Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$. Given iid samples $X_1, \ldots, X_n$, one knows that the sample mean $\bar{X} := \frac{1}{n}(X_1+\cdots+X_n)$ and sample variance $S^2 := \frac{1}{n-1}\sum (X_i - \bar{X})^2$ are unbiased estimators for $\mu$ and $\sigma^2$ respectively, meaning that $\mathbb{E}[\bar{X}] = \mu$ and $\mathbb{E}[S^2] = \sigma^2$.
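The unbiasedness claims can be checked numerically: averaging the sample mean and the sample variance over many independent datasets should recover $\mu$ and $\sigma^2$. The sketch below is a minimal Monte Carlo check using only the standard library; the distribution, parameter values, and trial counts are illustrative choices, not part of the original statement. Note that `statistics.variance` already divides by $n-1$ (Bessel's correction), matching the definition of $S^2$ above.

```python
import random
import statistics

def estimator_averages(n, mu=2.0, sigma=1.5, trials=20000, seed=0):
    """Average the sample mean and the unbiased sample variance over many
    iid Gaussian datasets of size n, approximating E[X-bar] and E[S^2]."""
    rng = random.Random(seed)
    mean_sum = 0.0
    var_sum = 0.0
    for _ in range(trials):
        xs = [rng.gauss(mu, sigma) for _ in range(n)]
        mean_sum += statistics.fmean(xs)          # sample mean X-bar
        var_sum += statistics.variance(xs)        # S^2, divides by n - 1
    return mean_sum / trials, var_sum / trials

avg_mean, avg_var = estimator_averages(n=5)
# avg_mean should be close to mu = 2.0, and avg_var close to sigma^2 = 2.25
```

Even with a small sample size like $n = 5$, the averages converge to $\mu$ and $\sigma^2$; replacing the $n-1$ divisor with $n$ (as in `statistics.pvariance`) would instead converge to $\frac{n-1}{n}\sigma^2$, illustrating why the correction is needed.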
