MLE of a Normal Distribution: Sample Mean and Sample Variance
Before we begin our section on interval estimation, we will consider the MLE for the Normal parameters $\mu$ and $\sigma^2$. Recall the PDF for the Normal distribution, as given in Table 1:

$$f(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right)$$
From this we can derive the likelihood function for a sample dataset $x_1, \ldots, x_n$:

$$L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x_i - \mu)^2}{2\sigma^2} \right)$$
Taking the natural logarithm, we get the corresponding log-likelihood:

$$\ell(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln(\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$$
Note that in this case we need to solve for $\mu$ and $\sigma^2$ simultaneously; this requires setting the partial derivatives of the log-likelihood function with respect to each of these variables equal to zero:

$$\frac{\partial \ell}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \mu) = 0, \qquad \frac{\partial \ell}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i - \mu)^2 = 0$$
We can easily solve the first equation for $\mu$:

$$\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}$$
Once again (and perhaps not surprisingly), we find that the MLE for $\mu$ is the sample mean. We can then substitute this maximum likelihood estimate for $\mu$ into the equation for $\sigma^2$:

$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2$$
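These two estimates are easy to compute directly. As a minimal numerical sketch (the dataset below is hypothetical, chosen only for illustration), the MLE for $\mu$ is the plain average, and the MLE for $\sigma^2$ divides the sum of squared deviations by $n$:

```python
import numpy as np

# A small hypothetical dataset for illustration.
x = np.array([4.2, 5.1, 3.8, 5.6, 4.9, 4.4])
n = len(x)

# MLE for mu: the sample mean.
mu_hat = x.sum() / n

# MLE for sigma^2: mean squared deviation from the sample mean
# (note the divisor n, not n - 1).
sigma2_hat = ((x - mu_hat) ** 2).sum() / n

# NumPy's np.var uses the same 1/n divisor by default (ddof=0).
assert np.isclose(mu_hat, x.mean())
assert np.isclose(sigma2_hat, np.var(x, ddof=0))
```

The `ddof=0` default in `np.var` corresponds exactly to this maximum likelihood divisor of $n$.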
This is a useful property of the Normal distribution, whose two parameters correspond directly to the mean and variance of the population. In other words, the sample mean and sample variance are the maximum likelihood estimates of these two parameters for the Normal distribution. Before stating this formally, we shall first correct for the bias in the expression for $\hat{\sigma}^2$.
Correcting the Bias in the MLE of $\sigma^2$
Recall that in section 5.2.1 we defined a term known as the bias of an estimator $\hat{\theta}$ of a parameter $\theta$:

$$\text{bias}(\hat{\theta}) = E[\hat{\theta}] - \theta$$
Thus, the bias of an estimator is exactly zero when $E[\hat{\theta}] = \theta$; let us analyse this for our maximum likelihood estimator of $\sigma^2$:

$$E[\hat{\sigma}^2] = E\left[ \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2 \right] = \frac{1}{n}\sum_{i=1}^{n} E[X_i^2] - E[\bar{X}^2]$$
At this point we need to make use of the definition $\text{Var}(X) = E[X^2] - E[X]^2$, and rearrange it so as to substitute for $E[X_i^2] = \sigma^2 + \mu^2$ (noting that the random variable $X_i$ is drawn from the same distribution for all $i$) and $E[\bar{X}^2] = \frac{\sigma^2}{n} + \mu^2$ (noting our earlier results for the sample mean):

$$E[\hat{\sigma}^2] = \left( \sigma^2 + \mu^2 \right) - \left( \frac{\sigma^2}{n} + \mu^2 \right) = \frac{n-1}{n}\sigma^2$$
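This result can be checked empirically: if we repeatedly draw samples of size $n$ and average the maximum likelihood variance estimates, the average settles near $\frac{n-1}{n}\sigma^2$ rather than $\sigma^2$. A minimal simulation sketch (the parameter values are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, n = 0.0, 4.0, 5   # arbitrary true parameters; small n makes the bias visible
trials = 200_000

# Draw many samples of size n and compute the MLE variance for each one.
samples = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
sigma2_mle = samples.var(axis=1, ddof=0)   # 1/n divisor, as in the MLE

# The average MLE estimate approaches (n-1)/n * sigma^2 = 3.2, not 4.0.
expected = (n - 1) / n * sigma2
print(sigma2_mle.mean())   # close to 3.2, well below the true variance of 4.0
```

With $n = 5$ the estimator underestimates the true variance by a factor of $\frac{4}{5}$ on average, which is exactly the bias the next step removes.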
Thus we see that $E[\hat{\sigma}^2] \neq \sigma^2$! To correct for this 'bias' in the maximum likelihood estimator, we need to multiply by $\frac{n}{n-1}$:

$$s^2 = \frac{n}{n-1}\hat{\sigma}^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2$$
We refer to $s^2$ as our sample variance. Finally, we define the sample mean (as before) and the sample standard deviation as:

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2}$$
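In code, the bias correction amounts to dividing by $n - 1$ instead of $n$; NumPy exposes this through the `ddof` (delta degrees of freedom) argument. A short sketch, reusing the same hypothetical dataset style as above:

```python
import numpy as np

x = np.array([4.2, 5.1, 3.8, 5.6, 4.9, 4.4])  # hypothetical data for illustration
n = len(x)

xbar = x.mean()                                 # sample mean
s2 = ((x - xbar) ** 2).sum() / (n - 1)          # sample variance (n - 1 divisor)
s = np.sqrt(s2)                                 # sample standard deviation

# NumPy applies the same correction when ddof=1 is passed.
assert np.isclose(s2, np.var(x, ddof=1))
assert np.isclose(s, np.std(x, ddof=1))
```

Passing `ddof=1` is the conventional way to request the bias-corrected divisor in NumPy (and the default behaviour of many other statistics libraries).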
This bias-corrected estimator, and the sample standard deviation derived from it, are quite useful for some statistical tests, which we will see in the next chapter.