You want your estimator to have small variance. Among unbiased estimators, smaller variance is always better.
Efficiency
The efficiency of a statistical estimator refers to how well it uses the data.
The criterion used to determine efficiency is the variance of the estimator. An efficient estimator has the smallest variance among all unbiased estimators.
We may need to examine asymptotic or limiting behaviour of the estimator.
If we only have unbiased estimators:
Compare the variances of the estimators, either directly or through their ratio. The one with the smallest variance is the most efficient.
$\frac{\operatorname{Var}(\hat{\theta}_1)}{\operatorname{Var}(\hat{\theta}_2)} < 1$ means that $\hat{\theta}_1$ is more efficient than $\hat{\theta}_2$.
$\frac{\operatorname{Var}(\hat{\theta}_1)}{\operatorname{Var}(\hat{\theta}_2)} = 1$ means that $\hat{\theta}_1$ and $\hat{\theta}_2$ are equally efficient.
$\frac{\operatorname{Var}(\hat{\theta}_1)}{\operatorname{Var}(\hat{\theta}_2)} > 1$ means that $\hat{\theta}_2$ is more efficient than $\hat{\theta}_1$.
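As a quick illustration of comparing variances through a ratio, here is a small Monte Carlo sketch. The two estimators (a single observation vs. the sample mean, both unbiased for an exponential mean) and the parameter values are invented for this example, not taken from the notes:

```python
import random

random.seed(0)

theta = 2.0   # true mean of the exponential (assumed for illustration)
n = 10        # sample size
reps = 20000  # Monte Carlo replications

# Two unbiased estimators of theta: a single observation X1,
# and the sample mean of all n observations.
est_single, est_mean = [], []
for _ in range(reps):
    sample = [random.expovariate(1 / theta) for _ in range(n)]
    est_single.append(sample[0])
    est_mean.append(sum(sample) / n)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

ratio = var(est_mean) / var(est_single)
print(ratio)  # well below 1: the sample mean is the more efficient estimator
```

The ratio comes out near $1/n$, matching $\operatorname{Var}(\bar{X}) = \operatorname{Var}(X_1)/n$.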
If we have biased estimators:
We need to take the bias into account as well as the variance.
With finite data there is a trade-off between bias and variance.
MSE: Mean Squared Error
$\mathrm{MSE}(\hat{\theta}) = E[(\hat{\theta} - \theta)^2] = \operatorname{Var}(\hat{\theta}) + [\operatorname{Bias}(\hat{\theta})]^2$
We want to minimize the MSE.
You can have a tight (low-variance) estimator that is not centered on the true value, or an estimator that is centered on the true value but has a large variance. You want to find a balance between the two; the MSE captures both.
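A numeric sanity check of the decomposition $\mathrm{MSE} = \operatorname{Var} + \mathrm{Bias}^2$. The shrunk-mean estimator and the 0.8 factor are made up purely to create some bias:

```python
import random

random.seed(1)

theta = 2.0   # true mean (assumed for illustration)
n = 10
reps = 20000

# A deliberately biased estimator: shrink the sample mean toward zero.
# (The 0.8 factor is arbitrary, chosen just to introduce bias.)
ests = []
for _ in range(reps):
    s = [random.expovariate(1 / theta) for _ in range(n)]
    ests.append(0.8 * sum(s) / n)

m = sum(ests) / reps
variance = sum((e - m) ** 2 for e in ests) / reps
bias = m - theta
mse_direct = sum((e - theta) ** 2 for e in ests) / reps   # E[(est - theta)^2]
mse_decomp = variance + bias ** 2                          # Var + Bias^2
print(mse_direct, mse_decomp)  # the two quantities agree
```

The two numbers agree to floating-point precision, since the decomposition is an algebraic identity.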
Example:
We have $X_1, \dots, X_n$ from $\mathrm{Exp}(\theta)$ (parameterized by its mean), with density
$f(x) = \frac{1}{\theta} e^{-x/\theta}$ for $x > 0$.
Which are unbiased?
We know $E[X_i] = \theta$ and $\operatorname{Var}(X_i) = \theta^2$.
Without loss of generality, say $X_{(1)} = \min(X_1, \dots, X_n)$ is the minimum.
#tk ask in office hours if this is valid. Why or why not. Why do we use the CDF instead of MGF? Why was I wrong?
Use the CDF technique to find the distribution of the minimum.
First, the CDF of $\mathrm{Exp}(\theta)$:
$F(x) = P(X \le x) = 1 - e^{-x/\theta}$ for $x \ge 0$.
The CDF of the minimum:
Take the complement: $F_{X_{(1)}}(x) = P(X_{(1)} \le x) = 1 - P(X_{(1)} > x)$.
If $X_{(1)}$ is larger than a value $x$, then all data points are larger than $x$.
By independence (random sample): $P(X_{(1)} > x) = \prod_{i=1}^{n} P(X_i > x)$.
We know $P(X_i > x) = e^{-x/\theta}$.
So the CDF of the minimum is $F_{X_{(1)}}(x) = 1 - \left(e^{-x/\theta}\right)^n = 1 - e^{-nx/\theta}$.
The PDF of the minimum is the derivative of the CDF: $f_{X_{(1)}}(x) = \frac{n}{\theta} e^{-nx/\theta}$ for $x > 0$.
Biased.
We are finding how likely the smallest observation in the sample is to fall below a given value, i.e. the distribution of the sample minimum rather than of any single data point.
We already know this is $\mathrm{Exp}(\theta/n)$, which has mean $\theta/n$, so $E[X_{(1)}] = \theta/n \ne \theta$.
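The bias is easy to see by simulation. A short sketch (the values of $\theta$ and $n$ are assumed for illustration):

```python
import random

random.seed(2)

theta = 2.0   # true mean (assumed value for illustration)
n = 5         # sample size
reps = 50000  # Monte Carlo replications

# Simulate the sample minimum many times; if X_(1) ~ Exp(theta/n),
# its average should be close to theta/n, far from theta itself.
mins = [min(random.expovariate(1 / theta) for _ in range(n)) for _ in range(reps)]
avg_min = sum(mins) / reps
print(avg_min)  # close to theta/n = 0.4, not theta = 2.0
```

The average lands near $\theta/n$, confirming that $X_{(1)}$ systematically underestimates $\theta$.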
Of the 4 unbiased estimators, which is the best? The one with the least variance.
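To make the comparison concrete, here is a sketch comparing two unbiased estimators of $\theta$: the sample mean $\bar{X}$, and $nX_{(1)}$ (unbiased because $E[X_{(1)}] = \theta/n$). These are illustrative choices, not necessarily the four estimators referred to in the notes:

```python
import random

random.seed(3)

theta = 2.0   # true mean (assumed for illustration)
n = 5
reps = 40000

# Two unbiased estimators of theta: the sample mean, and
# n times the sample minimum (which rescales away the bias of X_(1)).
xbar, nmin = [], []
for _ in range(reps):
    s = [random.expovariate(1 / theta) for _ in range(n)]
    xbar.append(sum(s) / n)
    nmin.append(n * min(s))

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

print(var(xbar), var(nmin))  # roughly theta^2/n = 0.8 vs theta^2 = 4.0
```

Both are unbiased, but $\operatorname{Var}(\bar{X}) = \theta^2/n$ while $\operatorname{Var}(nX_{(1)}) = \theta^2$, so the sample mean is the more efficient of the two.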