Sources: These lecture notes draw heavily on material from Taylor and Karlin, An Introduction to Stochastic Modeling, 3rd Edition (1998), and Kimmel and Axelrod, Branching Processes in Biology (2000). In places, definitions and statements of theorems may be taken directly from those sources.
Generating functions
One of the most powerful (if somewhat unintuitive) tools in dealing with stochastic processes is the concept of generating functions. A generating function is a compact way to represent and manipulate the probability distribution of a non-negative discrete random variable, as a continuous function on the domain $[0, 1]$. As above, consider the discrete non-negative random variable $\xi$ given by $\Pr[\xi = k] = p_k$. This distribution is represented by the generating function $\phi(s)$, which is the expectation of $s$ raised to the $\xi$-th power:
$$\phi(s) = E[s^\xi].$$
Here we require $0 \le s \le 1$ so that $\phi(s)$ is finite. We can write this out as
$$\phi(s) = \sum_{k=0}^{\infty} p_k s^k.$$
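The definition above can be checked numerically. A minimal Python sketch (the `pgf` helper and the truncated Poisson pmf are illustrative choices, not from the notes):

```python
from math import exp, factorial

def pgf(pmf, s):
    """Evaluate phi(s) = sum_k p_k * s**k for a pmf given as
    a dict {k: p_k} with finite support."""
    return sum(p * s**k for k, p in pmf.items())

# Example: a Poisson(2) pmf truncated at k = 40 (the tail mass is negligible).
lam = 2.0
pmf = {k: exp(-lam) * lam**k / factorial(k) for k in range(41)}

# Sanity checks: phi(1) is the total probability mass, phi(0) is p_0.
print(pgf(pmf, 1.0))   # ~1.0
print(pgf(pmf, 0.0))   # ~exp(-2), i.e. Pr[xi = 0]
```

Evaluating at $s = 1$ always returns the total probability mass, which is a quick way to catch a mis-specified pmf.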
In this form, we can now see the following properties of the generating function $\phi$.

1. Each probability mass function for a non-negative discrete random variable corresponds one-to-one with a unique generating function $\phi$. We've already seen how to go from the mass function to the generating function. How do we go in the other direction? To do so, we need to find a way to isolate each of the $p_i$ terms out of the generating function $\phi$. Notice that
$$\phi(s) = p_0 + p_1 s + p_2 s^2 + \ldots$$
Thus $\phi(0) = p_0$, and we've isolated the first $p_i$ term. Now notice that the derivative
$$\phi'(s) = p_1 + 2 p_2 s + 3 p_3 s^2 + \ldots$$
and so $\phi'(0) = p_1$. Similarly, the second derivative
$$\phi''(s) = 2 p_2 + 6 p_3 s + 12 p_4 s^2 + \ldots$$
and thus $\phi''(0)/2 = p_2$. In general,
$$p_k = \frac{1}{k!} \frac{d^k \phi}{ds^k}(0).$$
In other words, the successive derivatives of the generating function $\phi(s)$ produce the probability mass function when evaluated at $s = 0$.
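This recovery can be verified for a distribution whose generating function has a closed form. For a Poisson($\lambda$) variable, $\phi(s) = e^{\lambda(s-1)}$, so every derivative is available explicitly. A short sketch (the function names are mine, and the closed-form PGF is a standard fact, not stated in the notes above):

```python
from math import exp, factorial

lam = 1.5

# For Poisson(lam), phi(s) = exp(lam * (s - 1)), so the k-th derivative
# is phi^(k)(s) = lam**k * exp(lam * (s - 1)).
def phi_deriv_at_0(k):
    return lam**k * exp(-lam)

for k in range(6):
    recovered = phi_deriv_at_0(k) / factorial(k)   # p_k = phi^(k)(0) / k!
    direct = exp(-lam) * lam**k / factorial(k)     # Poisson pmf directly
    assert abs(recovered - direct) < 1e-15
```

Each $p_k = \phi^{(k)}(0)/k!$ matches the Poisson mass function $e^{-\lambda}\lambda^k/k!$ term by term.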
2. Let $\xi_1, \xi_2, \ldots, \xi_n$ be independent (but not necessarily identically distributed) random variables. Then the generating function of their sum $X = \xi_1 + \xi_2 + \ldots + \xi_n$ is the product of their generating functions:
$$\phi_X(s) = \phi_1(s)\,\phi_2(s) \cdots \phi_n(s).$$
This is going to be really useful for working with random sums of even non-identically distributed random variables.

3. The generating function $\phi(s)$ associated with a random variable $\xi$ produces the moments of the random variable in a simple fashion. Since
$$\phi'(s) = p_1 + 2 p_2 s + 3 p_3 s^2 + \ldots$$
it follows that
$$\phi'(1) = p_1 + 2 p_2 + 3 p_3 + \ldots = E[\xi].$$
Similarly, the second derivative
$$\phi''(s) = 2 p_2 + 6 p_3 s + 12 p_4 s^2 + \ldots$$
so that
$$\phi''(1) = 2 p_2 + 6 p_3 + 12 p_4 + \ldots = \sum_k k(k-1)\, p_k = E[\xi(\xi - 1)].$$
In other words, the successive derivatives of the generating function $\phi(s)$, evaluated at $s = 1$, produce the successive factorial moments of the random variable.
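Both properties can be checked numerically on small finite-support distributions. In the sketch below, the two pmfs are arbitrary examples of my own choosing; the product property is tested against an explicit convolution, and the derivatives at $s = 1$ are approximated by finite differences (safe here because a finite-support PGF is a polynomial, so evaluating slightly above 1 is fine):

```python
def pgf(pmf, s):
    """phi(s) = sum_k p_k * s**k for a finite-support pmf {k: p_k}."""
    return sum(p * s**k for k, p in pmf.items())

def convolve(p, q):
    """pmf of the sum of two independent variables with pmfs p and q."""
    out = {}
    for i, pi in p.items():
        for j, qj in q.items():
            out[i + j] = out.get(i + j, 0.0) + pi * qj
    return out

# Two hypothetical, non-identical pmfs.
p1 = {0: 0.5, 1: 0.3, 2: 0.2}
p2 = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}

# Property 2: the PGF of the sum is the product of the PGFs.
s = 0.7
assert abs(pgf(convolve(p1, p2), s) - pgf(p1, s) * pgf(p2, s)) < 1e-12

# Property 3: phi'(1) = E[xi], via a central difference.
h = 1e-6
mean = sum(k * p for k, p in p1.items())                 # E[xi] = 0.7
deriv_at_1 = (pgf(p1, 1 + h) - pgf(p1, 1 - h)) / (2 * h)
assert abs(deriv_at_1 - mean) < 1e-6

# phi''(1) = E[xi * (xi - 1)], the second factorial moment.
h2 = 1e-4
fact2 = sum(k * (k - 1) * p for k, p in p1.items())      # = 0.4
second = (pgf(p1, 1 + h2) - 2 * pgf(p1, 1) + pgf(p1, 1 - h2)) / h2**2
assert abs(second - fact2) < 1e-6
```

The convolution in `convolve` is exactly the polynomial multiplication hiding behind the product $\phi_1(s)\phi_2(s)$, which is why the two evaluations agree.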
(Unless $m = 1$, in which case $V(t) = \sigma^2 t$.) Variance, like the mean, grows geometrically unless $m = 1$. What is the chance that a branching process dies out by the $n$-th generation? We can solve this using first-step analysis. Let $u_n$ be the probability of extinction by the $n$-th generation, given that $X_0 = 1$. Then the probability of extinction by time $n$, looking one step later, is equal to the probability that the lineages of all of the offspring produced in the first step go extinct in $n - 1$ generations or fewer:
$$u_n = \sum_{k=0}^{\infty} p_k \,(u_{n-1})^k$$
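This recursion says that $u_n$ is the offspring generating function evaluated at $u_{n-1}$, so it can be iterated numerically starting from $u_0 = 0$. A minimal sketch, with a hypothetical offspring distribution of my own choosing (mean $1.25 > 1$, so extinction is not certain):

```python
def pgf(pmf, s):
    """phi(s) = sum_k p_k * s**k for a finite-support pmf {k: p_k}."""
    return sum(p * s**k for k, p in pmf.items())

# Hypothetical offspring distribution: p0 = 1/4, p1 = 1/4, p2 = 1/2.
offspring = {0: 0.25, 1: 0.25, 2: 0.5}

u = 0.0   # u_0 = 0: with X0 = 1, extinction "by generation 0" is impossible
for n in range(100):
    u = pgf(offspring, u)   # u_n = phi(u_{n-1})

# The iterates increase to the smallest root of phi(s) = s on [0, 1];
# here, solving 0.5 s^2 - 0.75 s + 0.25 = 0 gives roots 0.5 and 1.
print(round(u, 6))   # 0.5
```

The iterates climb monotonically toward the smallest fixed point of $\phi$ on $[0, 1]$, which is the eventual extinction probability.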