6. Maximum Likelihood
In inferential statistics, the problem we often face is this: we have collected some data, and we have a statistical model for how the data were generated. However, we do not know the values of the model's parameters, so we need a way to estimate them. In the previous session, we were introduced to the likelihood function, which measures how consistent different parameter values are with the data we have observed. Extending this concept, we used calculus to obtain the maximum likelihood estimator of the parameter.
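As a small numerical illustration of this idea (a hypothetical example, not taken from the previous session): suppose we observe a single Binomial outcome of \(x = 7\) successes in \(n = 10\) trials, with unknown success probability \(p\). The sketch below evaluates the likelihood on a grid and compares the grid maximiser with the calculus answer \(\hat{p} = x/n\).

```python
import numpy as np
from math import comb

# Hypothetical data: one observation of x = 7 successes out of n = 10 trials.
n, x = 10, 7

def likelihood(p):
    # Binomial likelihood: L(p) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Evaluate the likelihood over a grid of candidate values of p.
grid = np.linspace(0.001, 0.999, 999)
p_hat_grid = grid[np.argmax(likelihood(grid))]

# Setting the derivative of the log-likelihood to zero gives p_hat = x / n.
p_hat_calculus = x / n

print(p_hat_grid)      # close to 0.7
print(p_hat_calculus)  # 0.7
```

Both approaches agree: the value of \(p\) most consistent with the single observation is \(0.7\).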
So far, we have only looked at examples where our data consists of one observation - surely, this is not a sufficient sample size!
We will now consider the more realistic scenario where we have a random sample of observations from a particular distribution - in this case, we say that the sample is independent and identically distributed (i.i.d.).
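For an i.i.d. sample, the joint likelihood is the product of the individual densities, so the log-likelihood is a sum over observations. The sketch below illustrates this with a hypothetical Exponential sample (the distribution, sample size, and true rate are all assumptions for illustration), comparing a grid maximiser of the log-likelihood with the closed-form MLE, which for the Exponential rate is \(1/\bar{x}\).

```python
import numpy as np

# Hypothetical i.i.d. sample from an Exponential distribution with rate 2.
rng = np.random.default_rng(0)
sample = rng.exponential(scale=1 / 2.0, size=1000)

def log_likelihood(rate, data):
    # For i.i.d. data the log-likelihood is a sum:
    # log L(rate) = sum_i log f(x_i; rate) = n*log(rate) - rate*sum(x_i)
    return len(data) * np.log(rate) - rate * np.sum(data)

# Maximise over a grid of candidate rates.
rates = np.linspace(0.1, 5.0, 491)
mle_grid = rates[np.argmax([log_likelihood(r, sample) for r in rates])]

# Calculus gives the closed-form MLE: rate_hat = 1 / sample mean.
mle_closed_form = 1 / np.mean(sample)

print(mle_grid, mle_closed_form)  # both near the true rate of 2
```

With 1000 observations both estimates land close to the true rate, previewing the consistency property discussed later in the session.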
By the end of this session you will be able to:
Derive the likelihood and log-likelihood functions given an i.i.d. sample
Derive maximum likelihood estimators for single- and multi-parameter distributions given an i.i.d. sample
Describe the main properties of MLEs
The following subsections define the likelihood function for \(n\) i.i.d. observations and describe the process of obtaining the maximum likelihood estimator in this setting. The session ends with a demonstration of some important properties of maximum likelihood estimators.