Discrete Random Variables
A random variable is discrete if its range is countable, that is, if we can enumerate the values it can take. Broadly speaking, discrete random variables represent quantities that are counted.
We define the distribution of a random variable to assign a probability to each value the random variable can take. The distribution is given by the probability mass function.
The probability mass function (PMF) of a discrete random variable $X$ is the function
$$p_X(x) = P(X = x).$$
The function $p_X$ evaluated at a point $x$ is the probability that the random variable $X$ takes the exact value $x$.
Elementary properties of the PMF can be derived from the properties of the general probability function we derived in Chapter 1. In particular, a function $p_X$ is a probability mass function for a discrete random variable $X$ if and only if:
1. $p_X(x) \geq 0$ for all $x$, and
2. $\sum_x p_X(x) = 1$.
These results follow as the events $\{X = x_1\}$, $\{X = x_2\}$, etc. are equivalent to events that partition the sample space $\Omega$.
Illustrative Example of Discrete Random Variable: Two Dice
Recall our example of the random variable $X$ equal to the sum of rolling two dice (we will use this example throughout the Chapter to link key concepts); its range is $x \in \{2, 3, \ldots, 12\}$.
For each of these values of $x$ we can assign the following probability mass function (PMF):

| $x$ | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| $p_X(x)$ | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36 |
Note that this PMF inherently satisfies properties 1 and 2 above ($p_X(x) \geq 0$ and $\sum_x p_X(x) = 1$). Often we will graphically represent the PMF using what is known as a histogram, as shown in Figure 3.
Figure 3: Histogram representation of the PMF for the sum of two dice. The height of the bar at each value $x$ denotes the probability of observing this outcome.
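To make this concrete, the short Python sketch below (an illustrative aside, not part of the original notes) builds this PMF by enumerating the 36 equally likely outcomes and checks properties 1 and 2.

```python
from fractions import Fraction
from itertools import product

# Build the PMF of X = sum of two fair dice by enumerating all 36 outcomes.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

assert all(p >= 0 for p in pmf.values())   # property 1: non-negativity
assert sum(pmf.values()) == 1              # property 2: probabilities sum to 1

for x in sorted(pmf):
    print(f"p_X({x}) = {pmf[x]}")          # e.g. p_X(7) = 1/6
```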
Moments of PMFs
As Chemical Engineers, we frequently utilise integral transforms (e.g. Fourier transforms, which you will have come to love from the other parts of this course) and similar tools to extract characteristic features from complex functions. In probability theory, the moments of probability distributions provide particularly insightful attributes. For a discrete random variable $X$, we define the $n$-th moment of a PMF about a value $c$ as:
$$\sum_x (x - c)^n \, p_X(x).$$
Let's illustrate this by analysing a few moments of the PMF for the sum of two dice, as given in Figure 3. First, let us consider the zeroth moment ($n = 0$) about $c = 0$, given by:
$$\sum_x (x - 0)^0 \, p_X(x) = \sum_x p_X(x).$$
Figure 4: Computing the zeroth moment of the PMF of the sum of two dice (see Figure 3) about $c = 0$ (i.e. $n = 0$). The yellow bars correspond to the individual values of $p_X(x)$ at a given $x$ (left axis; note this is the same as the histogram in Figure 3) and the blue step-function corresponds to the running sum $\sum_{k \leq x} p_X(k)$ for a given $x$ (right axis); this step-function is known as the cumulative distribution function (CDF), $F_X(x)$, of the PMF.
Notice that the zeroth moment recovers property 2 of the PMF (i.e. $\sum_x p_X(x) = 1$), as we see on the far right of Figure 4 that the summation of $p_X(x)$ over all values of $x$ is 1. The blue step-function observed in Figure 4 is known as the cumulative distribution function:

The cumulative distribution function (CDF) of a random variable $X$ is the function
$$F_X(x) = P(X \leq x).$$
The function $F_X$ evaluated at a point $x$ is the probability that the random variable $X$ takes a value less than or equal to $x$.
CDFs for discrete random variables have the following properties:
1. $F_X(x) \to 0$ as $x \to -\infty$, and $F_X(x) \to 1$ as $x \to +\infty$
2. $F_X(x) = \sum_{k \leq x} p_X(k)$, i.e. the CDF is the running sum of the PMF
3. If $a < b$, then $P(a < X \leq b) = F_X(b) - F_X(a)$
Property 3 also implies that $F_X$ is non-decreasing, since $F_X(b) - F_X(a) = P(a < X \leq b) \geq 0$. The key idea here is that either of the functions $p_X$ or $F_X$ can be used to describe the probability distribution of the random variable $X$.
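As a small illustration (a sketch, not part of the original notes), the CDF of the two-dice example can be built as the running sum of the PMF, and one can check that it rises towards 1 and is non-decreasing:

```python
from fractions import Fraction
from itertools import product

# PMF of X = sum of two fair dice (as in the earlier sketch).
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

# CDF as the running (cumulative) sum of the PMF over ordered values of x.
cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

print(cdf[2], cdf[7], cdf[12])                          # 1/36, 7/12, 1
assert all(cdf[b] >= cdf[b - 1] for b in range(3, 13))  # F_X is non-decreasing
```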
CDF Computations using Sum of Two Dice
Recall the PMF for the sum of two dice, as illustrated in Figure 3 and the table above it. Let's use the properties of the cumulative distribution function (CDF) to answer the following queries:
- $P(X \leq 4)$:
Using the definition of the CDF and property 2 we have:
$$P(X \leq 4) = F_X(4) = p_X(2) + p_X(3) + p_X(4) = \tfrac{1}{36} + \tfrac{2}{36} + \tfrac{3}{36} = \tfrac{6}{36} = \tfrac{1}{6}.$$
- $P(4 < X \leq 8)$:
Using property 3 for a CDF:
$$P(4 < X \leq 8) = F_X(8) - F_X(4) = \tfrac{26}{36} - \tfrac{6}{36} = \tfrac{20}{36} \approx 0.56.$$
Note that this informs us that the outcome for the sum of two randomly rolled dice is 5, 6, 7, or 8 almost 56% of the time.
- $P(X > 6)$:
To answer this, we need to make use of the fact that $P(X > 6) = 1 - P(X \leq 6)$ and rearrange our expression to be:
$$P(X > 6) = 1 - F_X(6) = 1 - \tfrac{15}{36} = \tfrac{21}{36} = \tfrac{7}{12}.$$
Lastly, it is worth noting that $P(X \leq 6) + P(X > 6) = \tfrac{15}{36} + \tfrac{21}{36}$ does indeed equal 1.
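The three queries above can also be verified numerically; the sketch below (illustrative only, and assuming the query values stated above) evaluates the CDF directly from the PMF.

```python
from fractions import Fraction
from itertools import product

# PMF of the sum of two fair dice, as before.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

def F(x):
    """CDF: F_X(x) = P(X <= x)."""
    return sum(p for k, p in pmf.items() if k <= x)

print(F(4))            # P(X <= 4)     = 6/36  = 1/6
print(F(8) - F(4))     # P(4 < X <= 8) = 20/36 = 5/9
print(1 - F(6))        # P(X > 6)      = 21/36 = 7/12
print(F(12))           # check: F_X(12) = 1
```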
Next let us consider the first moment ($n = 1$) of a PMF centred about zero ($c = 0$), given by:
$$\sum_x x \, p_X(x).$$
Figure 5 shows the first moment for our PMF for the sum of two dice.
Due to the underlying characteristics of a PMF, the first moment gives rise to another important property, which we define as the 'expectation' of a random variable:
The expectation, $E[X]$, of a discrete random variable $X$, also called the expected value or mean of $X$, is defined as:
$$E[X] = \sum_x x \, p_X(x).$$
The expectation is a weighted average of the values $X$ can take.
Pause and Reflect 1: Why is the expectation of a PMF the weighted average? Recall the formula for computing a weighted average:
$$\bar{x} = \frac{\sum_i w_i x_i}{\sum_i w_i},$$
where $w_i$ are the weighting factors.
Pause and Reflect 2: What is the expectation of a constant value (i.e. $E[c]$)?
Hint: We can think of this as a random variable with $P(X = c) = 1$ for some constant $c$.
Pause and Reflect 3: What happens to the expectation when we multiply our random variable by some constant (i.e. $E[aX]$)?
Figure 5: Computing the first moment of the PMF of the sum of two dice (Figure 3) about $c = 0$ (i.e. $n = 1$). The yellow bars correspond to the individual values of $x \, p_X(x)$ at a given $x$ (left axis) and the blue step-function corresponds to the running sum $\sum_{k \leq x} k \, p_X(k)$ for a given $x$ (right axis). The total summation over all $x$ (the value of which is indicated by the final step on the far right) results in what is known as the expectation $E[X]$, or the weighted average. We can see that $E[X] = 7$ for the sum of two dice example considered here.

Continuing on in this fashion, we can compute the second moment ($n = 2$) of our PMF centred around our expected value ($c = E[X] = 7$):
$$\sum_x (x - E[X])^2 \, p_X(x).$$
Applying this to our PMF for the sum of two dice example results in Figure 6.
Figure 6: Taking the second moment of the sum of two dice PMF centred about the mean (i.e. $n = 2$, $c = E[X] = 7$). Notice that the individual contributions (yellow bars; $(x - 7)^2 \, p_X(x)$) are symmetric about the mean, as the PMF is a symmetric function (see Figure 3). The blue step-function again represents the running sum up to a given $x$, and on the far right axis of this plot it can be seen that $\sum_x (x - 7)^2 \, p_X(x) = \tfrac{35}{6} \approx 5.83$; this value is known as the Variance, $\mathrm{Var}(X)$, and it provides a measure of the "spread" of the distribution around the mean.
Let us take a minute to consider the information conveyed in Figure 6. The second moment about $E[X]$ provides us with a measure of how the PMF (see histogram in Figure 3) "spreads" around the expected value $E[X] = 7$; that is, it provides us with a scalar quantity that conveys information regarding the overall spread or variation of the distribution. This second moment of the PMF about the expected value is known as the Variance, $\mathrm{Var}(X)$, of the random variable $X$.
Pause and Reflect: Notice the units of $\mathrm{Var}(X)$ would be in terms of $X^2$ (e.g. if $X$ were measured in terms of distance in metres, this would be $\mathrm{m}^2$). Thus, for convenience the statistics community has defined the standard deviation as:
$$\sigma = \sqrt{\mathrm{Var}(X)}.$$
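As a numerical check (again an illustrative sketch rather than part of the notes), the expectation, variance and standard deviation of the two-dice PMF follow directly from the definitions above:

```python
import math
from fractions import Fraction
from itertools import product

# PMF of the sum of two fair dice.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

mean = sum(x * p for x, p in pmf.items())                # E[X] = sum_x x p_X(x)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())   # Var(X) = E[(X - E[X])^2]
sd = math.sqrt(var)                                      # standard deviation

print(mean, var, round(sd, 3))   # 7, 35/6, 2.415
```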
Expectation and Variance for a Function of a Discrete Random Variable
We can derive expressions for the expectation and variance of functions of PMFs in an analogous fashion by first defining the expectation in general terms for any function of a discrete random variable $X$:
If $X$ is a discrete random variable and $g$ is some real-valued function, then
$$E[g(X)] = \sum_x g(x) \, p_X(x).$$
To evaluate the expectation of a function of a random variable $X$, we apply the function $g$ to every value $x$ in the range of $X$, then take a weighted average of the results.
From this definition we can derive some very basic properties. For instance, consider the case where $g(X) = aX + b$ for constants $a$ and $b$:
$$E[aX + b] = \sum_x (ax + b) \, p_X(x) = a \sum_x x \, p_X(x) + b \sum_x p_X(x) = a\,E[X] + b.$$
Using the definition above for the expectation of a function, we can reformulate the variance of a discrete random variable as the expectation of the squared difference between $X$ and its mean:
$$\mathrm{Var}(X) = E\big[(X - E[X])^2\big] = \sum_x (x - E[X])^2 \, p_X(x).$$
The variance of a random variable can also be conveniently represented as
$$\mathrm{Var}(X) = E[X^2] - (E[X])^2.$$
The variance is the average squared distance between a random variable and its mean. It is a measure of dispersion, i.e. spread.
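As a quick sanity check (illustrative only), both expressions for the variance give the same value for the two-dice PMF when evaluated with the general $E[g(X)]$ formula above:

```python
from fractions import Fraction
from itertools import product

# PMF of the sum of two fair dice.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

def expect(g):
    """E[g(X)] = sum_x g(x) p_X(x)."""
    return sum(g(x) * p for x, p in pmf.items())

mu = expect(lambda x: x)
var_definition = expect(lambda x: (x - mu) ** 2)     # E[(X - mu)^2]
var_shortcut = expect(lambda x: x ** 2) - mu ** 2    # E[X^2] - (E[X])^2

assert var_definition == var_shortcut == Fraction(35, 6)
print(var_definition)   # 35/6
```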
Pause and Reflect 1: Where did the latter expression for the variance come from?
Hint: Note that $E[X]$ is a constant (say $\mu$) and expand $(X - \mu)^2$ inside the expectation.
Pause and Reflect 2: What happens to the variance if we multiply a random variable by some constant (i.e. $\mathrm{Var}(aX)$)?
Pause and Reflect 3: What happens to the variance if we translate a random variable by some constant (i.e. $\mathrm{Var}(X + b)$)?