Common Discrete Distributions
In this section we will review some discrete probability mass functions frequently encountered in science and engineering.
Discrete Uniform
A random variable $X$ follows a discrete uniform distribution if it takes the values $x = 1, 2, \ldots, n$, each with equal probability. We denote this by:
$$X \sim \mathrm{Uniform}(n).$$
The symbol $\sim$ means 'is distributed as'.
Since the sum of the probabilities must equal 1, the PMF of $X$ is
$$p_X(x \mid n) = \frac{1}{n}, \qquad x = 1, 2, \ldots, n.$$
NOTE: Notice the convention where we say that the model parameter $n$ is given: $p_X(x \mid n)$. This will be adopted throughout the chapter to distinguish random variables from model parameters.
Using the formula for the sum of the first $n$ natural numbers, the expectation of $X$ is:
$$\mathbb{E}[X] = \sum_{x=1}^{n} \frac{x}{n} = \frac{1}{n} \cdot \frac{n(n+1)}{2} = \frac{n+1}{2},$$
which is halfway between the minimum and maximum values $X$ can take. Using the formula for the sum of the squares of the first $n$ natural numbers, we have
$$\mathbb{E}[X^2] = \sum_{x=1}^{n} \frac{x^2}{n} = \frac{1}{n} \cdot \frac{n(n+1)(2n+1)}{6} = \frac{(n+1)(2n+1)}{6},$$
so the variance of $X$ is:
$$\mathrm{Var}(X) = \mathbb{E}[X^2] - \mathbb{E}[X]^2 = \frac{(n+1)(2n+1)}{6} - \left(\frac{n+1}{2}\right)^2 = \frac{n^2 - 1}{12}.$$
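As a quick numerical sanity check of these formulas, here is a minimal sketch in Python (the choice $n = 6$, a fair die, is an arbitrary assumption):
import numpy as np

n = 6                                # e.g. a fair six-sided die
x = np.arange(1, n + 1)              # support {1, ..., n}
pmf = np.full(n, 1 / n)              # equal probability 1/n for each value
mean = np.sum(x * pmf)               # E[X] = (n + 1)/2 = 3.5
var = np.sum(x**2 * pmf) - mean**2   # Var(X) = (n^2 - 1)/12 ~ 2.917
print(mean, var)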
Bernoulli
A random variable $X$ follows a Bernoulli distribution if it takes the value 1 with probability $p$, and the value 0 with probability $1 - p$. We denote this by
$$X \sim \mathrm{Bernoulli}(p).$$
We will generally assign the random variable values as $X = 1$ to denote "success" and $X = 0$ to denote "failure".
The PMF of $X$ is
$$p_X(x \mid p) = \begin{cases} p & \text{if } x = 1, \\ 1 - p & \text{if } x = 0. \end{cases}$$
The Bernoulli distribution arises in any experiment where the sample space consists of only two outcomes; an example is flipping a biased coin once, with sample space {heads, tails}. We call such an experiment a Bernoulli trial, and arbitrarily assign a random variable value to one of the outcomes as a success (e.g. $X = 1$ for heads) and the other as a failure (e.g. $X = 0$ for tails).
The PMF can also be conveniently summarised as:
$$p_X(x \mid p) = p^x (1 - p)^{1 - x}, \qquad x \in \{0, 1\}.$$
The expectation and variance of $X$ are computed in a straightforward manner as:
$$\mathbb{E}[X] = 1 \cdot p + 0 \cdot (1 - p) = p, \qquad \mathrm{Var}(X) = \mathbb{E}[X^2] - \mathbb{E}[X]^2 = p - p^2 = p(1 - p).$$
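These values can be verified numerically with scipy.stats.bernoulli; a minimal sketch (the value $p = 0.3$ is an arbitrary choice):
from scipy.stats import bernoulli

p = 0.3
mean, var = bernoulli.stats(p, moments='mv')
print(mean, var)   # 0.3 and 0.21, i.e. p and p*(1 - p)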
Binomial
Binomial probabilities apply to situations involving a series of independent and identical trials, where each trial can have only one of two possible outcomes (i.e. Bernoulli trials). If we were flipping a coin, and the probability of it landing heads was $p$ (and therefore the probability of tails was $1 - p$), then the probability of obtaining the sequence HTH would be $p(1-p)p = p^2(1-p)$. In other words, since the coin tosses are independent events, we would multiply the probability of obtaining a head with the probability of the coin flipping tails, and with the probability of the coin flipping heads again.
We can generate a plot of the probability mass function (PMF) for a binomial distribution using the following code:
import numpy as np
from scipy.stats import binom
import matplotlib.pyplot as plt

n = 10    # number of trials
p = 0.5   # probability of success
x = np.arange(0, n + 1)   # the PMF is only defined at integer values 0, ..., n
plt.plot(x, binom.pmf(x, n, p), 'o')
plt.show()
which results in a plot of the binomial PMF for $n = 10$ and $p = 0.5$. The same code can be used to visualise the PMF for different values of $n$ and $p$.
Suppose we were interested in the probability of obtaining two (or any other number of) heads from these throws, in any order. For a particular number of coin throws $n$, the number of sequences containing $x$ heads can be computed using the binomial coefficient,
$$\binom{n}{x} = \frac{n!}{x!(n-x)!}$$
(read "$n$ choose $x$"; the $nCr$ button on your calculator).
In particular, for three coin flips ($n = 3$), there are three sequences resulting in two heads ($x = 2$): HHT, HTH and THH.
Since each sequence occurs with a probability of $p^2(1-p)$, and there are three sequences, the total probability of obtaining two heads is therefore $3p^2(1-p)$. Here we have simply counted the total number of sequences containing 2 heads, but we could have equivalently used the binomial coefficient:
$$\binom{3}{2} = \frac{3!}{2!\,1!} = 3.$$
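We can verify both the count and the coefficient with a short sketch using Python's built-in math.comb and itertools:
from math import comb
from itertools import product

print(comb(3, 2))   # 3, the number of 3-flip sequences with exactly 2 heads

# enumerate all 2**3 sequences and keep those containing two heads
seqs = [s for s in product('HT', repeat=3) if s.count('H') == 2]
print(seqs)         # [('H', 'H', 'T'), ('H', 'T', 'H'), ('T', 'H', 'H')]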
This leads us to the definition of the binomial distribution.
Suppose that we have a sequence of $n$ independent Bernoulli trials, each with probability $p$ of "success". The total number of successes, $X$, follows a binomial distribution, which we denote by
$$X \sim \mathrm{Bin}(n, p).$$
A binomial random variable with parameters $(n, p)$ is equivalent to the sum of $n$ independent Bernoulli random variables with parameter $p$.
The PMF of a binomial random variable is
$$p_X(x \mid n, p) = \binom{n}{x} p^x (1 - p)^{n - x}, \qquad x = 0, 1, \ldots, n,$$
where $\binom{n}{x}$ is the binomial coefficient defined above.
The expectation and variance of a binomial random variable are:
$$\mathbb{E}[X] = np, \qquad \mathrm{Var}(X) = np(1 - p).$$
We will prove these later, when we consider the properties of sums of random variables.
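In the meantime, we can at least check the values numerically; a minimal sketch using scipy.stats.binom (the parameters $n = 10$, $p = 0.5$ are arbitrary):
from scipy.stats import binom

n, p = 10, 0.5
mean, var = binom.stats(n, p, moments='mv')
print(mean, var)   # 5.0 and 2.5, i.e. n*p and n*p*(1 - p)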
An important point to consider is how values for $p$ are estimated. Traditionally, historical data is used for this purpose, but this assumes that the process under investigation is the same as when the data was collected (i.e. that $p$ is indeed a constant).
Example: Binomial Distribution - Ebola Virus
You may recall the 2013-14 Ebola virus epidemic that resulted in 8,914 cases and 4,447 deaths, according to the World Health Organisation (data collected on 15 October 2014); this corresponds to a mortality rate of roughly $50\%$ (although the WHO estimates that the death rate is likely to be closer to $70\%$). From the given data, if 15 people are known to have contracted this virus, what is the probability that:
(a) Exactly 5 people survive?
Solution:
We can model the fate of each individual that has contracted the virus as a Bernoulli random variable with parameter $p = 0.5$ (survival). The number of surviving individuals, $X$, can then be modelled using a binomial distribution, $X \sim \mathrm{Bin}(15, 0.5)$:
$$P(X = 5) = \binom{15}{5} (0.5)^5 (0.5)^{10} = 3003 \times (0.5)^{15} \approx 0.092.$$
(b) At least 10 people survive?
Solution:
$$P(X \geq 10) = \sum_{x=10}^{15} \binom{15}{x} (0.5)^x (0.5)^{15 - x} = \frac{4944}{32768} \approx 0.151.$$
We can also visually inspect the corresponding PMF by plotting it over all $x = 0, 1, \ldots, 15$, which is shown for mortality rates of $50\%$ (blue) and $70\%$ (red) in Figure 13. From the figure we immediately observe a dramatic decrease in the probability of survival if the mortality rate is estimated to be $70\%$.
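Both queries can also be verified with scipy.stats.binom; a minimal sketch, assuming the survival probability $p = 0.5$ implied by the reported mortality rate:
from scipy.stats import binom

n, p = 15, 0.5               # 15 infected individuals, ~50% survival
print(binom.pmf(5, n, p))    # (a) P(X = 5)   ~ 0.0916
print(binom.sf(9, n, p))     # (b) P(X >= 10) ~ 0.1509, same as 1 - binom.cdf(9, n, p)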
Poisson
Next we consider the Poisson probability mass function, which has a very specific application. It looks a lot like a normal distribution, but it's a little bit different.
The idea here is that if you have some information about the average number of things that happen in a given time period, this probability mass function gives you a way to predict the odds of observing some other value instead on a given future day.
As an example, let's say I have a website, and on average I get 500 visitors per day. I can use the Poisson probability mass function to estimate the probability of seeing some other value on a specific day. For example, with my average of 500 visitors per day, what are the odds of seeing 550 visitors on a given day? That's what the Poisson probability mass function can give you; take a look at the following code:
import numpy as np
from scipy.stats import poisson
import matplotlib.pyplot as plt

mu = 500                  # average number of visitors per day
x = np.arange(400, 601)   # the PMF is only defined at integer values
plt.plot(x, poisson.pmf(x, mu))
plt.show()
which results in a plot of the Poisson PMF centred around $\mu = 500$.
I can use that graph to look up the odds of getting any specific value that's not the mean of 500. The odds of seeing 550 visitors on a given day, it turns out, come out to about $0.0015$, or roughly a $0.15\%$ probability. Very interesting.
An obvious inconvenience of the binomial distribution is that for large or even moderate $n$ the calculations of the binomial coefficient can become laborious. In 1837, the Poisson distribution was published as a limiting approximation of the binomial distribution in cases where $n \to \infty$ and $p \to 0$, while $\mu = np$ remains constant.
Figure 13: PMF of the number of survivors of the Ebola virus taking the probability of mortality to be $50\%$ (blue bars; $p = 0.5$) and $70\%$ (red bars; $p = 0.3$) for $n = 15$.
Indeed, the approximation is quite good for $n \geq 20$ and $p \leq 0.05$.
Example: Poisson Approximation of London Bombings During WWII
In his seminal book An Introduction to Probability Theory and Its Applications, William Feller discusses the statistics of bombings in the south of London during 'The Blitz' of World War II. If you lived in a district consisting of 10 by 10 blocks (i.e. 100 squares), how likely is it that your square would not be hit, given that 400 bombs were dropped?
Solution:
Can this be modelled as a series of random Bernoulli trials?
Yes!
We need to identify values for the model parameters from the given data.
We can estimate the probability of a particular bomb hitting your square to be $p = 1/100 = 0.01$. The dropping of the 400 bombs can then be approximated as a series of $n = 400$ random Bernoulli trials, where the random variable $X$ corresponds to the number of bombs dropped on your square. Thus, we could model this using a binomial distribution, but the calculation of the binomial coefficient becomes prohibitive. Fortunately, we can approximate this as a Poisson distribution using
$$\mu = np = 400 \times 0.01 = 4.$$
Given the model parameters, we can formulate our query as:
$$P(X = 0) = \frac{\mu^0 e^{-\mu}}{0!} = e^{-\mu}.$$
If we compute this for $\mu = 4$, we get $P(X = 0) = e^{-4} \approx 0.018$.
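A minimal sketch comparing the exact binomial answer with the Poisson approximation:
from scipy.stats import binom, poisson

n, p = 400, 1 / 100
print(binom.pmf(0, n, p))      # exact:  (0.99)**400 ~ 0.0180
print(poisson.pmf(0, n * p))   # approx: e**(-4)     ~ 0.0183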
By the end of the nineteenth century, this distribution had found further applications even when no binomial random variable is present and there are no available values for $n$ or $p$. These correspond to experiments where the random variable is the number of outcomes occurring during a given time interval or in a specified region, based on a constant rate of occurrence, as described below.
Suppose that we are interested in events that occur independently over time, and at a known average rate. The number of events, $X$, that occur in a fixed time interval follows a Poisson distribution, which we denote by
$$X \sim \mathrm{Poisson}(\mu),$$
where $\mu$ is the average number of outcomes over the interval (e.g. expressed in time, distance, area or volume).
The PMF of $X$ is
$$p_X(x \mid \mu) = \frac{\mu^x e^{-\mu}}{x!}, \qquad x = 0, 1, 2, \ldots$$
Interestingly, the expectation and variance of $X$ are equal:
$$\mathbb{E}[X] = \mu, \qquad \mathrm{Var}(X) = \mu.$$
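This equality is easy to confirm numerically; a one-line check with scipy.stats.poisson ($\mu = 4.2$ is an arbitrary choice):
from scipy.stats import poisson

mean, var = poisson.stats(4.2, moments='mv')
print(mean, var)   # both equal mu = 4.2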
The Poisson Process:
It turns out that the expression of the Poisson distribution, $\mu^x e^{-\mu}/x!$, captures an extremely wide variety of phenomena that possess similar underlying features. Indeed, we can explicitly state said features, and denote any process that satisfies these conditions a 'Poisson Process'.
To illustrate this, let us consider events occurring over a time interval of length $t$, where the events correspond to the arrival of a bus, as shown in Figure 14. We can dissect the total time interval into $n$ non-overlapping subintervals, each of length $t/n$, where $n$ is large. A Poisson Process corresponds to any experiment that obeys the following assumptions:
- The probability that two or more events (e.g. bus arrivals) occur in any given subinterval is negligible.
- The number of events occurring in one subinterval is independent of the number that occur in any other subinterval (we shall later refer to this as having 'no memory').
- The probability that an event occurs during a given subinterval is constant over the entire interval (e.g. from 0 to $t$).
Figure 14: Illustration of a Poisson Process by considering the arrival of buses over a time interval $t$, which is dissected into $n$ non-overlapping subintervals, each of length $t/n$.
Here the subintervals are analogous to the independent Bernoulli trials that form the basis of the binomial distribution. In each subinterval, there will either be 0 or 1 events taking place, and the probability of such events is constant. Then $X$ corresponds to the total number of events occurring during time $t$.
We make a slight but notable notation adjustment here, and define $\lambda$ to denote the rate at which events occur over the interval (e.g. 2.5 events per minute). Thus, it is equivalent to say that $\mu = \lambda t$, and re-express the PMF as:
$$p_X(x \mid \lambda, t) = \frac{(\lambda t)^x e^{-\lambda t}}{x!}, \qquad x = 0, 1, 2, \ldots$$
So to recap, in very brief terms: in a Poisson Process, events occur independently and at a constant rate $\lambda$, and the Poisson distribution is used to compute the probability of a specific number of "events" occurring during a particular period $t$.
Example: Poisson Model of Bug Consumption
Entomologists estimate that an average person consumes approximately one pound of "bug parts" per year. Apparently, the foods and liquids we eat and drink contain insect eggs and various body parts. To regulate this, the Food and Drug Administration (FDA) has set legal limits on the amount of insect parts 'allowed' in food; for instance, peanut butter can only contain 30 insect fragments per 100 grams.
Figure 15: PMF for the number of bug fragments in 20 grams of peanut butter assuming an average occurrence of 6 fragments.
Now let's consider a standard package of peanut butter crackers, containing approximately 20 grams of peanut butter. What is the probability that the snack includes at least 5 insect fragments?
Solution:
We need to identify values for the model parameters from the given data.
Using a Poisson model, we can define our interval to be over 20 grams of peanut butter (i.e. $t = 20$ grams). Assuming the worst-case scenario of 30 fragments per 100 grams (i.e. $\lambda = 0.3$ fragments per gram), we can compute the average number of bug fragments expected as:
$$\mu = \lambda t = 0.3 \times 20 = 6 \text{ fragments}.$$
Given the model parameters, we can express the PMF for the number of bug parts, $X$, present in the snack as:
$$p_X(x \mid \mu = 6) = \frac{6^x e^{-6}}{x!}, \qquad x = 0, 1, 2, \ldots$$
From this PMF we can formulate the query $P(X \geq 5)$:
$$P(X \geq 5) = 1 - P(X \leq 4) = 1 - \sum_{x=0}^{4} \frac{6^x e^{-6}}{x!} \approx 0.715.$$
The full PMF corresponding to the distribution of bug fragments is presented in Figure 15.
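The query can also be evaluated directly with scipy.stats.poisson; a minimal sketch:
from scipy.stats import poisson

mu = 6                     # expected fragments in 20 g of peanut butter
print(poisson.sf(4, mu))   # P(X >= 5) = 1 - P(X <= 4) ~ 0.7149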
Let's highlight some further properties of Poisson Processes using the example of bus arrivals.
Example: Poisson Process for Bus Arrivals
The '49 Bus' arrives every ten minutes (at least according to the chart at the bus stop). Let us model the arrival of buses using a Poisson Process.
(a) What is the probability that you wait for ten minutes and the bus does not arrive?
Solution:
The model parameter corresponding to the average rate of the process can be estimated as $\lambda = 1/10 = 0.1$ arrivals per minute,
which will be used in the following PMF:
$$p_X(x \mid \lambda, t) = \frac{(\lambda t)^x e^{-\lambda t}}{x!}.$$
We now need to specify $x$ and $t$ to compute the query.
Using the definition above, in the problem statement we are therefore asked to compute $P(X = 0)$ for $t = 10$ minutes:
$$P(X = 0) = \frac{(0.1 \times 10)^0 e^{-0.1 \times 10}}{0!} = e^{-1} \approx 0.368.$$
(b) What is the probability that three buses arrive in twenty minutes?
Solution:
We evaluate the Poisson PMF in an analogous manner, with $x = 3$ and $t = 20$ minutes:
$$P(X = 3) = \frac{(0.1 \times 20)^3 e^{-0.1 \times 20}}{3!} = \frac{2^3 e^{-2}}{6} \approx 0.180.$$
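Both bus queries can be checked with scipy.stats.poisson, using $\mu = \lambda t$; a minimal sketch:
from scipy.stats import poisson

lam = 0.1                        # arrivals per minute
print(poisson.pmf(0, lam * 10))  # (a) no bus in 10 minutes:  e**-1 ~ 0.368
print(poisson.pmf(3, lam * 20))  # (b) three buses in 20 min: ~ 0.180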
CLOSING REMARKS: As with the binomial distribution, we make the assumption that the 'rate' of the Poisson Process is constant, which it might not be. For instance, the average rate of bus arrivals over the period of a day could fluctuate due to rush hour, accidents, etc.