2 Probability Generating Functions
For a random variable with a given mean and variance, what else is necessary to determine the complete probability mass function? It turns out that knowledge of all of the moments of a random variable determines the probability mass function completely. This is in fact how probability generating functions came about. They are important here in determining an analytical expression for the probability that a surname will become extinct after $n$ generations.
Probability generating functions, defined as $G(s) = E[s^X]$, are an important tool in probability theory used to reduce the amount of work required to analyze a random variable with a particular distribution. For example, information such as the expected value can often be extracted easily for well-defined sequences of probabilities $P(X = k)$. The expected value is simply the average value anticipated over many samplings of a particular situation. For example, the expected value of a fair 6-sided die roll is 3.5. This means that if you roll that die billions of times, sum all the values, and then divide by the number of rolls, you would get approximately 3.5.
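As a quick numerical illustration of the die example (a Python sketch, not part of the original text), the average of a large number of simulated fair-die rolls approaches 3.5:

```python
import random

# Simulate many rolls of a fair six-sided die and compare the sample
# mean to the expected value of 3.5.
random.seed(0)
n_rolls = 1_000_000
total = sum(random.randint(1, 6) for _ in range(n_rolls))
sample_mean = total / n_rolls
print(sample_mean)  # close to 3.5
```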
The probability generating function of a discrete random variable is given by a power series whose coefficients are determined by the probability function of that random variable. These coefficients are the sequence of probabilities that the random variable $X$ is equal to $k$. Namely, the probability generating function is

$$G(s) = \sum_{k=0}^{\infty} p_k s^k, \qquad p_k = P(X = k),$$
where $s$ is simply a parameter chosen so that the series converges. For most cases this is when $|s| \le 1$, so $s$ is often restricted to the range from zero to one. Additionally, you can think of $s$ as an indeterminate variable whose powers give the function particular properties, which makes it a useful building block for solving more interesting problems. For example, it can easily be shown that $E[X] = G'(1)$.
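The relation $E[X] = G'(1)$ can be checked numerically. The sketch below (illustrative Python, with a hypothetical helper `pgf` not taken from the text) builds the generating function from a probability sequence and approximates the derivative at $s = 1$ with a central difference:

```python
def pgf(probs, s):
    # G(s) = sum_k p_k * s**k, where probs[k] = P(X = k)
    return sum(p * s**k for k, p in enumerate(probs))

# Example: a fair coin flip counted as 0 or 1, so p_0 = p_1 = 0.5.
coin = [0.5, 0.5]

# Central-difference approximation of G'(1), which should equal E[X] = 0.5.
h = 1e-6
mean_estimate = (pgf(coin, 1 + h) - pgf(coin, 1 - h)) / (2 * h)
print(mean_estimate)  # approximately 0.5
```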
To make this more concrete, consider rolling a fair six-sided die. The die has a $\frac{1}{6}$ chance that any of the 6 numbers will be selected. In this case $X$ is the random variable determined by rolling the die. Thus we know that $P(X = k) = \frac{1}{6}$ for $k = 1, 2, \ldots, 6$. In a similar fashion we also know that the probability $P(X = 0)$ is 0, since the die does not contain a side with the value zero. Thus the probability generating function is:

$$G(s) = \frac{1}{6}\left(s + s^2 + s^3 + s^4 + s^5 + s^6\right).$$
From here we can easily show that $G(1) = 1$ by plugging in $s = 1$, yielding

$$G(1) = \sum_{k=0}^{\infty} p_k = 1,$$
which simply states that the probability that $X$ takes some value from 0 to $\infty$ is 100%. For the case of the fair die,

$$G(1) = \frac{1}{6}\left(1 + 1 + 1 + 1 + 1 + 1\right) = 1.$$
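The normalization $G(1) = 1$ for the die is easy to verify numerically. This sketch (illustrative Python, not part of the original text) evaluates the die's generating function at $s = 1$:

```python
def die_pgf(s):
    # G(s) = (1/6) * (s + s**2 + ... + s**6) for a fair six-sided die
    return sum(s**k for k in range(1, 7)) / 6

# At s = 1 the series reduces to the sum of all probabilities, which is 1.
print(die_pgf(1.0))  # 1.0
```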
It can also be shown that $G(0)$ is equal to the probability that $X = 0$ (i.e. $p_0$). This is derived by letting $s$ approach zero. By plugging in a very small value of $s$ we know that $s^0$ is still equal to one, whereas $s^1$, $s^2$, $s^3, \ldots$ are all approaching zero quite rapidly, leaving only the term $p_0$.
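The limit behavior at $s = 0$ can be seen with the same die example (illustrative Python, not part of the original text): $G(0)$ recovers $p_0$, which is zero for a die with faces 1 through 6, and even modestly small values of $s$ are already close to that limit because the higher powers of $s$ vanish rapidly.

```python
def die_pgf(s):
    # G(s) = (1/6) * (s + s**2 + ... + s**6) for a fair six-sided die
    return sum(s**k for k in range(1, 7)) / 6

print(die_pgf(0.0))   # 0.0, which is p_0 for this die
print(die_pgf(0.01))  # already very small: higher powers of s die off fast
```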
Probability generating functions are particularly useful when the probabilities (i.e. the coefficients $p_k$ in the power series) lead to a closed form. This is true of the Poisson distribution, which will be used here.
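For the Poisson distribution with rate $\lambda$, where $p_k = e^{-\lambda}\lambda^k/k!$, the power series sums to the closed form $G(s) = e^{\lambda(s-1)}$ (a standard result). The sketch below (illustrative Python, with example values of $\lambda$ and $s$ chosen here) compares a truncated version of the series against the closed form:

```python
import math

lam, s = 2.0, 0.7

# Closed-form Poisson PGF: G(s) = exp(lam * (s - 1))
closed_form = math.exp(lam * (s - 1))

# Truncated power series: sum of p_k * s**k with p_k = exp(-lam) * lam**k / k!
series = sum(math.exp(-lam) * lam**k / math.factorial(k) * s**k
             for k in range(50))

print(closed_form, series)  # the two values agree to machine precision
```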