3.1 Continuous random variables¶
We have previously seen several discrete probability distributions (including the binomial and the Poisson). We now extend random variables to those that are continuous. A continuous random variable is one that can take any value in a continuous range; this range may run from \(-\infty\) to \(+\infty\) (as for the normal distribution), or it may have a lower bound (e.g. the log-normal) and/or an upper bound (e.g. the uniform).
3.1.1. The probability density function¶
Previously we characterised the distribution of a variable by assigning a probability to each specific value. However, because there are infinitely many values that a continuous variable could take, the probability of it taking any one specific value is, perhaps paradoxically, zero. We therefore cannot use a probability distribution function, which assigns a probability to each value, to characterise the distribution of a continuous variable.
Instead, we turn to something called a probability density function. Rather than attaching a probability to each value the variable could take, the probability density function tells us the probability that the continuous variable lies within any given interval (range of values). Specifically, the area under the curve of the probability density function between two limits gives the probability that the continuous variable takes a value between those two limits.
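Writing the variable as \(X\) and its density as \(f_X\) (defined more formally below), this statement can be written, for any limits \(a < b\), as

\(P(a \leq X \leq b) = \int_{a}^{b} f_X(x) \hspace{0.2cm} dx\)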
Generally, a random variable \(X\) has density \(f_X\) where
\(f_X(x) \geq 0\) for all \(x\)
\(\int_{-\infty}^{\infty} f_X(x) \hspace{0.2cm} dx = 1\)
which states that the total area under the density curve (the “sum” of probability across all possible values of \(X\)) is equal to 1.
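As an informal numerical check of this second property (a minimal Python sketch, not part of the original notes; it assumes the numpy and scipy packages are available and uses the standard normal density purely as an illustration):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Integrate the standard normal density over the whole real line.
# By the property above, the total area under the curve should equal 1.
total_area, _ = quad(norm.pdf, -np.inf, np.inf)
print(total_area)  # approximately 1.0
```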
We can obtain various useful probabilities from this density function: for example, the probability that the variable takes a value within a given interval, or the probability that it lies below or above a given value.
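As a concrete numerical illustration (a minimal Python sketch, not from the original notes; the standard normal distribution and the limits \(a = -1\), \(b = 1\) are chosen purely for illustration, and scipy is assumed to be available):

```python
from scipy.integrate import quad
from scipy.stats import norm

a, b = -1.0, 1.0  # illustrative limits

# P(a <= X <= b): the area under the density between the two limits
p_interval, _ = quad(norm.pdf, a, b)

# The same probability via the cumulative distribution function
p_interval_cdf = norm.cdf(b) - norm.cdf(a)

# P(X <= a) and P(X > a): probabilities below and above a given value
p_below = norm.cdf(a)
p_above = 1 - norm.cdf(a)

print(p_interval, p_interval_cdf)  # both approximately 0.683
print(p_below, p_above)            # approximately 0.159 and 0.841
```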
Further information about continuous probability distributions is given in the Refresher.