Illustration




In statistical inference, estimation of population parameters is one of the central tasks. There are two main types of estimates for a parameter: the point estimate and the interval estimate.

Maximum likelihood estimation is a statistical technique for finding a point estimate of a parameter; the resulting estimate is called the maximum likelihood estimate, or simply the MLE. An unbiased estimator, on the other hand, is a point estimator whose expected value equals the parameter itself.

Let us understand how to find the MLE and an unbiased estimator of the population variance σ² with the help of the following example.


Example

Consider a random sample X1, X2, …, X30 of size 30 drawn from a normal distribution with μ = 30. Find the maximum likelihood estimator of the population variance σ².

  1. Is the maximum likelihood estimator obtained above biased? Justify your answer.
  2. If the observed sample is 3, 6, 2, 0, 4, 3; compute the MLE of σ².
  3. How would your answer in part (1) be affected if the number of observations remains 30 and μ is reduced to zero?


To find the maximum likelihood estimator

Consider a random sample X1, X2, …, X30 of size 30 drawn from a normal distribution with mean μ = 30 and unknown variance σ². The probability density function of the normal random variable X is

f(x; \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x - 30)^2}{2\sigma^2} \right), \qquad -\infty < x < \infty

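As a quick sanity check, this density can be evaluated numerically. Below is a minimal sketch, assuming NumPy and SciPy are available; the values x = 28 and σ² = 4 are arbitrary choices for illustration.

    import numpy as np
    from scipy.stats import norm

    mu, sigma2 = 30.0, 4.0   # known mean; sigma^2 = 4 is an arbitrary value for illustration
    x = 28.0                 # an arbitrary observation

    # Density written directly from the formula above.
    manual = np.exp(-(x - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

    # The same density via scipy.stats.norm (its scale parameter is the standard deviation).
    library = norm.pdf(x, loc=mu, scale=np.sqrt(sigma2))

    print(manual, library)   # both print 0.120985...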
Step 1: Find the likelihood function

We first write down the likelihood of the given random sample of size 30:

L(\sigma^2) = \prod_{i=1}^{30} f(x_i; \sigma^2) = \left( \frac{1}{2\pi\sigma^2} \right)^{15} \exp\left( -\frac{1}{2\sigma^2} \sum_{i=1}^{30} (x_i - 30)^2 \right)

To get the maximum likelihood estimator (MLE) of the population variance σ², we maximize the likelihood function with respect to the parameter σ². Since the logarithm is a strictly increasing function, maximizing the log-likelihood is equivalent to maximizing the likelihood itself.
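
This equivalence is easy to see numerically. The following sketch (assuming NumPy; the data are simulated, with a true variance of 9 chosen arbitrarily) evaluates the likelihood and the log-likelihood on a grid of candidate values of σ² and shows that both peak at the same point.

    import numpy as np

    rng = np.random.default_rng(0)
    mu, n = 30.0, 30
    x = rng.normal(mu, 3.0, size=n)        # simulated sample; true sigma^2 = 9 assumed

    grid = np.linspace(1.0, 30.0, 2000)    # candidate values of sigma^2
    s = ((x - mu) ** 2).sum()

    # Log-likelihood and likelihood of the sample at each candidate value.
    loglik = -0.5 * n * np.log(2 * np.pi * grid) - s / (2 * grid)
    lik = np.exp(loglik)

    # Both curves are maximized at the same grid point, which is close to s / 30.
    print(grid[np.argmax(lik)], grid[np.argmax(loglik)], s / n)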


Step 2: Find log-likelihood function

The log-likelihood function for the random sample of size 30 is

\ln L(\sigma^2) = -15 \ln(2\pi) - 15 \ln(\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{30} (x_i - 30)^2

For convenience, we denote σ² by γ (read as gamma). The log-likelihood function in terms of γ is then

\ln L(\gamma) = -15 \ln(2\pi) - 15 \ln\gamma - \frac{1}{2\gamma} \sum_{i=1}^{30} (x_i - 30)^2

Step 3: Maximize log-likelihood function

To maximize the log-likelihood function with respect to γ, we use the derivative method. Differentiating with respect to γ gives

\frac{d}{d\gamma} \ln L(\gamma) = -\frac{15}{\gamma} + \frac{1}{2\gamma^2} \sum_{i=1}^{30} (x_i - 30)^2

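As a quick check on the algebra, the analytic derivative can be compared against a central finite-difference approximation; a sketch assuming NumPy, where the evaluation point γ = 7 and the simulated data are arbitrary:

    import numpy as np

    rng = np.random.default_rng(1)
    mu, n = 30.0, 30
    x = rng.normal(mu, 3.0, size=n)        # simulated sample
    s = ((x - mu) ** 2).sum()

    def loglik(g):                         # log-likelihood as a function of gamma
        return -0.5 * n * np.log(2 * np.pi * g) - s / (2 * g)

    g0, h = 7.0, 1e-6
    analytic = -n / (2 * g0) + s / (2 * g0 ** 2)             # -15/gamma + s/(2 gamma^2)
    numeric = (loglik(g0 + h) - loglik(g0 - h)) / (2 * h)    # central difference
    print(analytic, numeric)               # the two agree to several decimal places
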
To get the MLE of σ² (that is, of γ), we set the first-order derivative equal to 0:

-\frac{15}{\hat{\gamma}} + \frac{1}{2\hat{\gamma}^2} \sum_{i=1}^{30} (x_i - 30)^2 = 0

Solving for γ̂ gives the MLE of the population variance σ²:

\hat{\sigma}^2 = \hat{\gamma} = \frac{1}{30} \sum_{i=1}^{30} (X_i - 30)^2

(The second derivative is negative at γ̂, which confirms that this critical point is a maximum.)
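The closed form can also be checked against a general-purpose optimizer. A minimal sketch, assuming NumPy and SciPy and a simulated sample, that minimizes the negative log-likelihood over γ:

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(2)
    mu, n = 30.0, 30
    x = rng.normal(mu, 5.0, size=n)        # simulated sample; true sigma^2 = 25 assumed

    def neg_loglik(g):
        return 0.5 * n * np.log(2 * np.pi * g) + ((x - mu) ** 2).sum() / (2 * g)

    res = minimize_scalar(neg_loglik, bounds=(1e-6, 1e3), method="bounded")
    print(res.x)                           # numerical maximizer of the likelihood
    print(((x - mu) ** 2).mean())          # analytic MLE: (1/30) * sum (x_i - 30)^2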
 

To find an unbiased estimator

Since the random sample X1, X2, …, X30 is drawn from a normal distribution with mean μ = 30, each (Xi − 30)/σ is a standard normal variable, and the quantity ∑(Xi − 30)²/σ² therefore follows a chi-square distribution with 30 degrees of freedom:

\frac{1}{\sigma^2} \sum_{i=1}^{30} (X_i - 30)^2 \sim \chi^2_{30}

Since a chi-square random variable has expected value equal to its degrees of freedom, this gives

E\left[ \frac{1}{\sigma^2} \sum_{i=1}^{30} (X_i - 30)^2 \right] = 30

Rearranging, we get

E\left[ \frac{1}{30} \sum_{i=1}^{30} (X_i - 30)^2 \right] = E[\hat{\sigma}^2] = \sigma^2

An estimator T of a parameter θ is unbiased when its expected value equals the parameter, that is, when E(T) = θ. Here the expected value of the MLE is σ², so the MLE is an unbiased estimator of σ².
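
The same conclusion can be illustrated by simulation: draw many samples of size 30 from N(30, σ²), compute the MLE for each, and average the estimates. A sketch assuming NumPy, with σ² = 16 as an arbitrary true value:

    import numpy as np

    rng = np.random.default_rng(3)
    mu, sigma2, n, reps = 30.0, 16.0, 30, 100_000

    samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
    mles = ((samples - mu) ** 2).mean(axis=1)   # MLE = (1/30) * sum (X_i - 30)^2

    print(mles.mean())   # close to 16, consistent with E[MLE] = sigma^2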


Finding the MLE for the observed sample

If the observed sample is 3, 6, 2, 0, 4, 3, then n = 6, so the formula derived above applies with the divisor 30 replaced by 6 (the known mean is still μ = 30), and the MLE of σ² is given by

\hat{\sigma}^2 = \frac{1}{6} \sum_{i=1}^{6} (x_i - 30)^2 = \frac{(3-30)^2 + (6-30)^2 + (2-30)^2 + (0-30)^2 + (4-30)^2 + (3-30)^2}{6} = \frac{4394}{6} \approx 732.33

The MLE of σ² is 732.33.
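
The same arithmetic in code, assuming NumPy:

    import numpy as np

    x = np.array([3, 6, 2, 0, 4, 3], dtype=float)
    mle = ((x - 30.0) ** 2).mean()   # (1/6) * sum (x_i - 30)^2, with the known mean 30
    print(round(mle, 2))             # 732.33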


Change in the estimator with the change in mean

Consider a random sample X1, X2, …, X30 of size 30 drawn from a normal distribution with mean μ = 0 and unknown variance σ². Following the same procedure as in Steps 1 through 3, the MLE of σ² for this sample is

\hat{\sigma}^2 = \frac{1}{30} \sum_{i=1}^{30} X_i^2

Since ∑ Xi²/σ² still follows a chi-square distribution with 30 degrees of freedom, the same argument as before shows that E[σ̂²] = σ². The answer in part (1) is therefore unaffected: the MLE remains an unbiased estimator of σ².
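
As before, a short simulation (assuming NumPy) confirms that the estimator with μ = 0 is still unbiased:

    import numpy as np

    rng = np.random.default_rng(4)
    sigma2, n, reps = 16.0, 30, 100_000

    samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
    mles = (samples ** 2).mean(axis=1)   # MLE = (1/30) * sum X_i^2 when mu = 0

    print(mles.mean())                   # close to 16: the estimator remains unbiased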