The Galton–Watson Process – Part I

2 Probability Generating Functions

For a random variable with a known mean and variance, what else is needed to determine the complete probability mass function? It turns out that knowledge of all of the moments of a random variable allows us to determine the probability mass function completely. This is in fact how the definition of probability generating functions came about. They are important here for determining an analytical expression for the probability that a surname will become extinct after n generations.


Probability generating functions, written G(s), are an important tool in probability theory used to reduce the amount of work required to analyze a random variable with a particular distribution. For example, information such as the expected value can often be extracted easily once G(s) is known in closed form. The expected value is simply the long-run average over many samplings of a particular situation. For example, the expected value for a fair six-sided die is 3.5: if you rolled that die billions of times, summed all the values, and divided by the number of rolls, you would get very close to 3.5.
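
To see this numerically, here is a minimal sketch in Python (the number of rolls is an arbitrary choice, not something from the text) that estimates the expected value of a fair die by simulation:

import random

n_rolls = 1_000_000  # arbitrary sample size, large enough for the average to settle down
total = sum(random.randint(1, 6) for _ in range(n_rolls))
print(total / n_rolls)  # prints a value close to 3.5

With a million rolls the sample mean typically lands within a few thousandths of 3.5.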


The probability generating function of a discrete random variable is a power series whose coefficients are determined by the probability function of that random variable. These coefficients are the probabilities P(X = i) that the random variable X is equal to i. Namely, the probability generating function is

G(s) = \sum_{i=0}^{\infty} P(X=i)s^i = P(X=0)s^0 + P(X=1)s^1 + P(X=2)s^2 + \dots          (2)


where s is simply a parameter chosen so that the series converges. In most cases this holds when the absolute value of s is at most one, so s is typically taken in the range from zero to one. Additionally, you can think of s as an indeterminate variable which gives G(s) particular properties that serve as building blocks for solving more interesting problems. For example, it can easily be shown that G(1) = 1.
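
As a quick illustration of how equation (2) behaves for a finite distribution, here is a minimal sketch in Python (the function name pgf and the example probabilities are hypothetical, chosen only for illustration):

def pgf(probs, s):
    # Evaluate G(s) = sum_i P(X = i) * s**i, where probs[i] is P(X = i).
    return sum(p * s**i for i, p in enumerate(probs))

example = [0.1, 0.3, 0.4, 0.2]  # hypothetical values of P(X = 0), ..., P(X = 3)
print(pgf(example, 1.0))  # 1.0 (up to floating-point rounding), since the probabilities sum to one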


To make this more concrete, consider rolling a fair six-sided die. Each of the six numbers has a 1/6 chance of being rolled. In this case X is the random variable given by the outcome of the roll, so we know that P(X = 1) = 1/6, P(X = 2) = 1/6, \dots, P(X=6) = 1/6. In a similar fashion, we also know that the probability that X = 0 is 0, since the die does not have a side with the value zero. Thus the probability generating function is:

G(s) = 0s^0 + \frac{1}{6}s^1 + \frac{1}{6}s^2 + \frac{1}{6}s^3 + \frac{1}{6}s^4 + \frac{1}{6}s^5 + \frac{1}{6}s^6          (3)


From here we can easily show that G(1) = 1 by plugging in s = 1, yielding

G(1) = P(X=0) + P(X=1) + P(X=2) + \dots = \sum_{i=0}^{\infty} P(X=i) = 1          (4)


which simply states that the probability that X takes some value from 0 to \infty is 100%. For the case of the fair die,

G(1) = 0 + \frac{1}{6} + \frac{1}{6} + \frac{1}{6} + \frac{1}{6} + \frac{1}{6} + \frac{1}{6} = 1          (5)


It can also be shown that G(0) is equal to the probability that X = 0 (i.e. G(0) = P(X=0)). This follows by letting s approach zero: for a very small value of s, the term s^0 is still equal to one, whereas s^1, s^2, \dots all approach zero quite rapidly, leaving only the P(X=0) term.
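
The die example can serve as a sanity check for both of these properties. Reusing the pgf sketch from above (written as an inline lambda here so the snippet stands on its own), plugging in s = 0 and s = 1 gives back P(X = 0) and the total probability:

die = [0, 1/6, 1/6, 1/6, 1/6, 1/6, 1/6]  # P(X = 0), P(X = 1), ..., P(X = 6) for a fair die
g = lambda s: sum(p * s**i for i, p in enumerate(die))
print(g(0.0))  # 0.0, i.e. G(0) = P(X = 0)
print(g(1.0))  # 1.0 (up to floating-point rounding), matching equation (5)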


Probability generating functions are particularly useful when the probabilities (i.e. the coefficients P(X=i) in the power series) lead to a closed form. This is true of the Poisson distribution, which will be used here.
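
As a preview of why the Poisson case is convenient, here is a sketch of the standard closed form (writing \lambda for the Poisson mean, which is conventional notation rather than anything introduced in the post so far). Substituting the Poisson probabilities P(X=i) = e^{-\lambda}\lambda^i / i! into equation (2) and recognizing the power series of the exponential gives

G(s) = \sum_{i=0}^{\infty} \frac{e^{-\lambda}\lambda^i}{i!} s^i = e^{-\lambda} \sum_{i=0}^{\infty} \frac{(\lambda s)^i}{i!} = e^{-\lambda} e^{\lambda s} = e^{\lambda (s-1)}

so the entire distribution is summarized by a single exponential in s.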
