
15. Poisson Processes

In Lecture 4, we introduced Poisson arrivals as the limiting behavior of Binomial random variables (refer to the Poisson approximation of Binomial random variables). From the discussion there (see (4-6)-(4-8), Lecture 4),
" k arrivals occur in an  − λ λk
P =e , k = 0, 1, 2," (15-1)
 interval of duration ∆" k!
where
∆ (15-2)
λ = np = µT ⋅ = µ∆
T
Fig. 15.1 (k arrivals in an interval of duration Δ, and k arrivals in an interval of duration 2Δ, within (0, T))
PILLAI
It follows that (refer to Fig. 15.1)
" k arrivals occur in an  −2 λ (2λ )
k
P =e , k = 0, 1, 2,", (15-3)
 interval of duration 2∆"  k!
since in that case
2∆
np1 = µT ⋅ = 2 µ∆ = 2λ . (15-4)
T
From (15-1)-(15-4), Poisson arrivals over an interval form a Poisson
random variable whose parameter depends on the duration
of that interval. Moreover because of the Bernoulli nature of the
underlying basic random arrivals, events over nonoverlapping
intervals are independent. We shall use these two key observations
to define a Poisson process formally. (Refer to Example 9-5, Text)
Definition: X(t) = n(0, t) represents a Poisson process if
(i) the number of arrivals n(t1, t2) in an interval (t1, t2) of length
t = t2 – t1 is a Poisson random variable with parameter λt.
Thus
P{n(t1, t2) = k} = e^(−λt) (λt)^k / k!,  k = 0, 1, 2, …,  t = t2 − t1,   (15-5)
and
(ii) If the intervals (t1, t2) and (t3, t4) are nonoverlapping, then the
random variables n(t1, t2) and n(t3, t4) are independent.
Since n(0, t) ~ P(λt), we have
E[X(t)] = E[n(0, t)] = λt   (15-6)
and
E[X²(t)] = E[n²(0, t)] = λt + λ²t².   (15-7)
To determine the autocorrelation function R_XX(t1, t2), let t2 > t1; then from (ii) above, n(0, t1) and n(t1, t2) are independent Poisson random variables with parameters λt1 and λ(t2 − t1), respectively. Thus
E[n(0, t1) n(t1, t2)] = E[n(0, t1)] E[n(t1, t2)] = λ²t1(t2 − t1).   (15-8)
But
n(t1, t2) = n(0, t2) − n(0, t1) = X(t2) − X(t1),
and hence the left side of (15-8) can be rewritten as
E[X(t1){X(t2) − X(t1)}] = R_XX(t1, t2) − E[X²(t1)].   (15-9)
Using (15-7) in (15-9) together with (15-8), we obtain
R_XX(t1, t2) = λ²t1(t2 − t1) + E[X²(t1)]
             = λt1 + λ²t1t2,  t2 ≥ t1.   (15-10)
Similarly
R_XX(t1, t2) = λt2 + λ²t1t2,  t2 < t1.   (15-11)
Thus
R_XX(t1, t2) = λ²t1t2 + λ min(t1, t2).   (15-12)
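As a quick numerical sanity check on (15-12), the independent-increments decomposition X(t2) = X(t1) + n(t1, t2) can be evaluated with truncated pmf sums. The sketch below is ours, not from the text; the helper names and the particular λ, t1, t2 values are illustrative assumptions.

```python
import math

def poisson_pmf(k, mu):
    # P{N = k} for N ~ Poisson(mu)
    return math.exp(-mu) * mu**k / math.factorial(k)

def autocorr_via_increments(lam, t1, t2, kmax=80):
    # For t1 < t2, write X(t2) = A + B with A = X(t1) ~ P(lam t1) and the
    # increment B = n(t1, t2) ~ P(lam (t2 - t1)) independent of A, so that
    # E[X(t1) X(t2)] = E[A^2] + E[A] E[B]; kmax truncates the pmf sums.
    mu1, mu2 = lam * t1, lam * (t2 - t1)
    EA  = sum(k * poisson_pmf(k, mu1) for k in range(kmax))
    EA2 = sum(k * k * poisson_pmf(k, mu1) for k in range(kmax))
    EB  = sum(k * poisson_pmf(k, mu2) for k in range(kmax))
    return EA2 + EA * EB

lam, t1, t2 = 2.0, 1.5, 4.0               # illustrative values
numeric = autocorr_via_increments(lam, t1, t2)
closed  = lam * t1 + lam**2 * t1 * t2     # (15-10), i.e. (15-12) with t1 <= t2
```

The two numbers agree to within truncation error, which is how (15-10) was derived in the first place.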
From (15-12), notice that the Poisson process X(t) does not represent a wide sense stationary process.

Define a binary level process
Y(t) = (−1)^X(t)   (15-13)
that represents a telegraph signal (Fig. 15.2). Notice that the transition instants {ti} are random (see Example 9-6, Text, for the mean and autocorrelation function of a telegraph signal).

Fig. 15.2 (Poisson arrival instants ti, the staircase process X(t), and the telegraph signal Y(t) switching between +1 and −1)

Although X(t) does not represent a wide sense stationary process, its derivative X′(t) does represent a wide sense stationary process.
Fig. 15.3 (X(t) → d(·)/dt → X′(t): the derivative as an LTI system)
To see this, we can make use of Fig. 14.7 and (14-34)-(14-37). From there,
μ_X′(t) = dμ_X(t)/dt = d(λt)/dt = λ, a constant,   (15-14)
and
R_XX′(t1, t2) = ∂R_XX(t1, t2)/∂t2 = { λ²t1,  t1 ≤ t2 ;  λ²t1 + λ,  t1 > t2 }
              = λ²t1 + λ U(t1 − t2),   (15-15)
and
R_X′X′(t1, t2) = ∂R_XX′(t1, t2)/∂t1 = λ² + λ δ(t1 − t2).   (15-16)
From (15-14) and (15-16) it follows that X ′(t ) is a wide sense
stationary process. Thus nonstationary inputs to linear systems can
lead to wide sense stationary outputs, an interesting observation.
• Sum of Poisson Processes:
If X1(t) and X2(t) represent two independent Poisson processes, then their sum X1(t) + X2(t) is also a Poisson process with parameter (λ1 + λ2)t. (This follows from (6-86), Text, and the definition of the Poisson process in (i) and (ii).)
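The closure of the Poisson family under addition can be checked directly: convolving the pmfs of two independent Poisson counts reproduces a Poisson pmf with the summed parameter. A minimal sketch (the particular parameter values are our illustrative assumptions):

```python
import math

def poisson_pmf(k, mu):
    # P{N = k} for N ~ Poisson(mu)
    return math.exp(-mu) * mu**k / math.factorial(k)

lam1_t, lam2_t = 1.3, 2.2    # illustrative values of lambda1*t and lambda2*t
# P{X1(t)+X2(t) = n} is the convolution of the two independent Poisson pmfs,
# and it should equal the Poisson pmf with parameter (lambda1 + lambda2) t.
max_err = max(
    abs(sum(poisson_pmf(k, lam1_t) * poisson_pmf(n - k, lam2_t) for k in range(n + 1))
        - poisson_pmf(n, lam1_t + lam2_t))
    for n in range(15)
)
```

The agreement is exact up to floating-point error (it is the binomial theorem applied to (λ1t + λ2t)^n).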
• Random Selection of Poisson Points:
Let t1, t2, …, ti, … represent the random arrival points associated with a Poisson process X(t) with parameter λt, and with each arrival point associate an independent Bernoulli random variable Ni, where
P(Ni = 1) = p,  P(Ni = 0) = q = 1 − p.   (15-17)

Fig. 15.4 (the arrival instants t1, t2, …, ti, … on the time axis)
Define the processes
Y(t) = ∑_{i=1}^{X(t)} Ni ;  Z(t) = ∑_{i=1}^{X(t)} (1 − Ni) = X(t) − Y(t).   (15-18)
We claim that both Y(t) and Z(t) are independent Poisson processes with parameters λpt and λqt, respectively.
Proof:
P{Y(t) = k} = ∑_{n=k}^{∞} P{Y(t) = k | X(t) = n} P{X(t) = n}.   (15-19)
But given X(t) = n, we have Y(t) = ∑_{i=1}^{n} Ni ~ B(n, p), so that
P{Y(t) = k | X(t) = n} = C(n, k) p^k q^(n−k),  0 ≤ k ≤ n,   (15-20)
and
P{X(t) = n} = e^(−λt) (λt)^n / n!.   (15-21)
Substituting (15-20)-(15-21) into (15-19), we get

P{Y(t) = k} = ∑_{n=k}^{∞} [n! / (k!(n−k)!)] p^k q^(n−k) e^(−λt) (λt)^n / n!
            = e^(−λt) [(λpt)^k / k!] ∑_{n=k}^{∞} (qλt)^(n−k) / (n−k)!
            = e^(−λt) e^(qλt) (λpt)^k / k! = e^(−(1−q)λt) (λpt)^k / k!
            = e^(−λpt) (λpt)^k / k!,  k = 0, 1, 2, …,
i.e., Y(t) ~ P(λpt).   (15-22)

More generally,
P{Y(t) = k, Z(t) = m} = P{Y(t) = k, X(t) − Y(t) = m}
                      = P{Y(t) = k, X(t) = k + m}
                      = P{Y(t) = k | X(t) = k + m} P{X(t) = k + m}
                      = C(k + m, k) p^k q^m · e^(−λt) (λt)^(k+m) / (k + m)!
                      = [e^(−λpt) (λpt)^k / k!] [e^(−λqt) (λqt)^m / m!]
                      = P{Y(t) = k} P{Z(t) = m},   (15-23)
which completes the proof.
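The summation in (15-19)-(15-22) can be verified numerically: weighting the Binomial selection probabilities by the Poisson pmf collapses to the thinned Poisson pmf. A small sketch with illustrative parameter choices:

```python
import math

def poisson_pmf(k, mu):
    # P{N = k} for N ~ Poisson(mu)
    return math.exp(-mu) * mu**k / math.factorial(k)

lam_t, p_sel = 2.5, 0.3               # lambda*t and Bernoulli probability p (illustrative)
q = 1 - p_sel
max_err = 0.0
for k in range(10):
    # sum over n of B(n, p) selection odds weighted by the Poisson pmf, as in (15-19)
    total = sum(math.comb(n, k) * p_sel**k * q**(n - k) * poisson_pmf(n, lam_t)
                for n in range(k, k + 120))          # truncated sum over n
    max_err = max(max_err, abs(total - poisson_pmf(k, lam_t * p_sel)))
```

The residual error comes only from truncating the sum over n, confirming Y(t) ~ P(λpt).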

Notice that Y(t) and Z(t) are generated as a result of random Bernoulli selections from the original Poisson process X(t) (Fig. 15.5), where each arrival gets tossed over to either Y(t) with probability p or to Z(t) with probability q. Each such sub-arrival stream is also a Poisson process. Thus random selection of Poisson points preserves the Poisson nature of the resulting processes. However, as we shall see, deterministic selection from a Poisson process destroys the Poisson property of the resulting processes.

Fig. 15.5 (X(t) ~ P(λt) split by Bernoulli selections into Y(t) ~ P(λpt) and Z(t) ~ P(λqt))
Inter-arrival Distribution for Poisson Processes
Let τ1 denote the time interval (delay) to the first arrival from any fixed point t0. To determine the probability distribution of the random variable τ1, we argue as follows: observe that the event "τ1 > t" is the same as "n(t0, t0 + t) = 0", or the complement event "τ1 ≤ t" is the same as the event "n(t0, t0 + t) > 0".

Fig. 15.6 (the 1st, 2nd, …, nth arrival instants t1, t2, …, tn after the fixed point t0, with τ1 the delay to the first arrival)

Hence the distribution function of τ1 is given by
F_τ1(t) ≜ P{τ1 ≤ t} = P{X(t) > 0} = P{n(t0, t0 + t) > 0}
        = 1 − P{n(t0, t0 + t) = 0} = 1 − e^(−λt)   (15-24)
(use (15-5)), and hence its derivative gives the probability density function for τ1 to be
f_τ1(t) = dF_τ1(t)/dt = λ e^(−λt),  t ≥ 0,   (15-25)
i.e., τ1 is an exponential random variable with parameter λ, so that E(τ1) = 1/λ.
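The exponential law (15-24) can also be seen from the Bernoulli picture of Lecture 4: chop (0, t) into t/Δ slots, each containing an arrival with probability λΔ; the no-arrival probability (1 − λΔ)^(t/Δ) tends to e^(−λt). A numeric sketch of that limit (parameter values are illustrative):

```python
import math

# Poisson arrivals as a Bernoulli limit: the probability of no arrival in
# (0, t) is (1 - lam*delta)^(t/delta), which converges to e^(-lam t) (15-24).
lam, t = 1.5, 2.0                      # illustrative values
exact = math.exp(-lam * t)             # P{tau_1 > t} from (15-24)
steps = 100_000                        # number of Bernoulli slots of width delta
delta = t / steps
approx = (1 - lam * delta) ** steps    # no success in all t/delta trials
err = abs(approx - exact)
```

Refining the grid (larger `steps`) shrinks `err` further, as the limit predicts.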
Similarly, let tn represent the nth random arrival point for a Poisson process. Then
F_tn(t) = P{tn ≤ t} = P{X(t) ≥ n}
        = 1 − P{X(t) < n} = 1 − ∑_{k=0}^{n−1} e^(−λt) (λt)^k / k!   (15-26)
and hence
f_tn(x) = dF_tn(x)/dx = −∑_{k=1}^{n−1} [λ(λx)^(k−1)/(k−1)!] e^(−λx) + ∑_{k=0}^{n−1} [λ(λx)^k/k!] e^(−λx)
        = λ^n x^(n−1) e^(−λx) / (n−1)!,  x ≥ 0,   (15-27)
which represents a gamma density function, i.e., the waiting time to the nth Poisson arrival instant has a gamma distribution. Moreover,
tn = ∑_{i=1}^{n} τi,
where τi is the random inter-arrival duration between the (i − 1)th and ith events. Notice that the τi are independent, identically distributed random variables. Hence, using their characteristic functions, it follows that all inter-arrival durations of a Poisson process are independent exponential random variables with common parameter λ, i.e.,
f_τi(t) = λ e^(−λt),  t ≥ 0.   (15-28)
Alternatively, from (15-24)-(15-25), τ1 is an exponential random variable. By repeating that argument after shifting t0 to the new point t1 in Fig. 15.6, we conclude that τ2 is an exponential random variable. Thus the sequence τ1, τ2, …, τn, … are independent exponential random variables with common p.d.f. as in (15-25).

Thus if we systematically tag every mth outcome of a Poisson process X(t) with parameter λt to generate a new process e(t), then the inter-arrival time between any two events of e(t) is a gamma random variable.
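The duality behind (15-26)-(15-27), P{tn ≤ t} = P{X(t) ≥ n}, can be checked by numerically integrating the gamma density and comparing with the Poisson tail. The sketch below uses illustrative parameter values and a simple trapezoidal rule:

```python
import math

def gamma_pdf(x, n, lam):
    # f_{t_n}(x) = lam^n x^(n-1) e^(-lam x) / (n-1)!, from (15-27)
    return lam**n * x**(n - 1) * math.exp(-lam * x) / math.factorial(n - 1)

def poisson_tail(n, mu):
    # P{X(t) >= n} = 1 - sum_{k=0}^{n-1} e^(-mu) mu^k / k!, from (15-26)
    return 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))

lam, n, t = 2.0, 4, 3.0                # illustrative parameter choices
m = 20_000                             # trapezoidal rule on (0, t)
h = t / m
cdf = h * (0.5 * (gamma_pdf(0.0, n, lam) + gamma_pdf(t, n, lam))
           + sum(gamma_pdf(i * h, n, lam) for i in range(1, m)))
err = abs(cdf - poisson_tail(n, lam * t))
```

Both sides count the same event: the 4th arrival falls inside (0, t) exactly when at least 4 arrivals occur by time t.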
Notice that E[e(t)] = m/λ, and if λ = mμ, then E[e(t)] = 1/μ. The inter-arrival time of e(t) in that case represents an Erlang-m random variable, and e(t) an Erlang-m process (see (10-90), Text).
In summary, if Poisson arrivals are randomly redirected to form new queues, then each such queue generates a new Poisson process (Fig. 15.5). However, if the arrivals are systematically redirected (1st arrival to 1st counter, 2nd arrival to 2nd counter, …, mth to mth, (m+1)st arrival to 1st counter, …), then the new subqueues form Erlang-m processes.
Interestingly, we can also derive the key Poisson properties (15-5) and (15-25) by starting from a simple axiomatic approach, as shown below.
Axiomatic Development of Poisson Processes:
The defining properties of a Poisson process are that in any "small" interval Δt, one event can occur with probability that is proportional to Δt; the probability that two or more events occur in that interval is o(Δt) (of higher order than Δt); and events over nonoverlapping intervals are independent of each other. This gives rise to the following axioms.
Axioms:
(i) P{n(t, t + Δt) = 1} = λΔt + o(Δt)
(ii) P{n(t, t + Δt) = 0} = 1 − λΔt + o(Δt)
(iii) P{n(t, t + Δt) ≥ 2} = o(Δt)
(iv) n(t, t + Δt) is independent of n(0, t)   (15-29)
Notice that axiom (iii) specifies that the events occur singly, and axiom (iv) specifies the randomness of the entire series. Axiom (ii) follows from (i) and (iii) together with the axiom of total probability.
We shall use these axioms to rederive (15-25) first. Let t0 be any fixed point (see Fig. 15.6) and let t0 + τ1 represent the time of the first arrival after t0. Notice that the random variable τ1 is independent of the occurrences prior to the instant t0 (axiom (iv)). With F_τ1(t) = P{τ1 ≤ t} representing the distribution function of τ1, as in (15-24) define Q(t) ≜ 1 − F_τ1(t) = P{τ1 > t}. Then for Δt > 0,
Q(t + Δt) = P{τ1 > t + Δt}
          = P{τ1 > t, and no event occurs in (t0 + t, t0 + t + Δt)}
          = P{τ1 > t, n(t0 + t, t0 + t + Δt) = 0}
          = P{n(t0 + t, t0 + t + Δt) = 0 | τ1 > t} P{τ1 > t}.
From axiom (iv), the conditional probability in the above expression is not affected by the event {τ1 > t}, which refers to {n(t0, t0 + t) = 0}, i.e., to events before t0 + t, and hence the unconditional probability in axiom (ii) can be used there. Thus
Q(t + Δt) = [1 − λΔt + o(Δt)] Q(t)
or
lim_{Δt→0} [Q(t + Δt) − Q(t)]/Δt = Q′(t) = −λQ(t)  ⇒  Q(t) = c e^(−λt).
But c = Q(0) = P{τ1 > 0} = 1, so that
Q(t) = 1 − F_τ1(t) = e^(−λt)
or
F_τ1(t) = 1 − e^(−λt),  t ≥ 0,
which gives
f_τ1(t) = dF_τ1(t)/dt = λ e^(−λt),  t ≥ 0,   (15-30)
to be the p.d.f. of τ1 as in (15-25).

Similarly, (15-5) can be derived from axioms (i)-(iv) in (15-29) as well. To see this, let
pk(t) = P{n(0, t) = k},  k = 0, 1, 2, …,
represent the probability that the total number of arrivals in the interval (0, t) equals k. Then
pk(t + Δt) = P{n(0, t + Δt) = k} = P{X1 ∪ X2 ∪ X3},
where the events
X1 ≜ "n(0, t) = k, and n(t, t + Δt) = 0"
X2 ≜ "n(0, t) = k − 1, and n(t, t + Δt) = 1"
X3 ≜ "n(0, t) = k − i, and n(t, t + Δt) = i ≥ 2"
are mutually exclusive. Thus
pk(t + Δt) = P(X1) + P(X2) + P(X3).
But as before,
P(X1) = P{n(t, t + Δt) = 0 | n(0, t) = k} P{n(0, t) = k}
      = P{n(t, t + Δt) = 0} P{n(0, t) = k}
      = (1 − λΔt) pk(t),
P(X2) = P{n(t, t + Δt) = 1 | n(0, t) = k − 1} P{n(0, t) = k − 1}
      = λΔt pk−1(t),
and
P(X3) = o(Δt),
where once again we have made use of axioms (i)-(iv) in (15-29). This gives
pk(t + Δt) = (1 − λΔt) pk(t) + λΔt pk−1(t),
or, with
lim_{Δt→0} [pk(t + Δt) − pk(t)]/Δt = pk′(t),
we get the differential equation
pk′(t) = −λ pk(t) + λ pk−1(t),  k = 0, 1, 2, …,
whose solution gives (15-5). Here p−1(t) ≡ 0. [The solution to the above differential equation is worked out in (16-36)-(16-41), Text.]
This completes the axiomatic development for Poisson processes.
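The differential-equation route above can be verified numerically without solving it in closed form: a forward-Euler march of pk′ = −λpk + λpk−1 from pk(0) = δ_{k0} should land on the Poisson pmf (15-5). A small sketch (the step size, horizon, and λ are illustrative choices of ours):

```python
import math

# Forward-Euler integration of p_k'(t) = -lam p_k(t) + lam p_{k-1}(t),
# starting from p_0(0) = 1, p_k(0) = 0 for k >= 1; the result should
# match the Poisson pmf e^(-lam T)(lam T)^k / k! from (15-5).
lam, T = 1.0, 2.0
steps, kmax = 50_000, 12
h = T / steps
p = [1.0] + [0.0] * kmax
for _ in range(steps):
    prev = p[:]
    for k in range(kmax + 1):
        inflow = prev[k - 1] if k > 0 else 0.0   # the lam*p_{k-1} "birth" term
        p[k] = prev[k] + h * (-lam * prev[k] + lam * inflow)

max_err = max(abs(p[k] - math.exp(-lam * T) * (lam * T)**k / math.factorial(k))
              for k in range(kmax + 1))
```

The O(h) Euler error is the only discrepancy, confirming that the ODE family generates (15-5).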
Poisson Departures between Exponential Inter-arrivals
Let X(t) ~ P(λt) and Y(t) ~ P(μt) represent two independent Poisson processes, called the arrival and departure processes.

Fig. 15.7 (arrivals of X(t) separated by the intervals τ1, τ2, …, τi, …, with the departures t1, t2, …, ti, ti+1, … of Y(t) occurring in between)

Let Z represent the random interval between any two successive arrivals of X(t). From (15-28), Z has an exponential distribution with parameter λ. Let N represent the number of "departures" of Y(t) between any two successive arrivals of X(t). Then, from the Poisson nature of the departures, we have
P{N = k | Z = t} = e^(−μt) (μt)^k / k!.
Thus

P{N = k} = ∫₀^∞ P{N = k | Z = t} f_Z(t) dt
         = ∫₀^∞ e^(−μt) [(μt)^k / k!] λ e^(−λt) dt
         = (λ μ^k / k!) ∫₀^∞ t^k e^(−(λ+μ)t) dt
         = [λ/(λ + μ)] [μ/(λ + μ)]^k (1/k!) ∫₀^∞ x^k e^(−x) dx
         = [λ/(λ + μ)] [μ/(λ + μ)]^k,  k = 0, 1, 2, …,   (15-31)
i.e., the random variable N has a geometric distribution. Thus if customers come in and get out according to two independent Poisson processes at a counter, then the number of arrivals between any two departures has a geometric distribution. Similarly, the number of departures between any two arrivals also represents another geometric distribution.
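The mixture integral in (15-31) can be evaluated numerically and compared against the geometric law it is claimed to equal. A sketch with illustrative rates λ and μ:

```python
import math

lam, mu = 1.0, 2.0                     # illustrative arrival/departure rates

def integrand(t, k):
    # e^(-mu t) (mu t)^k / k! * lam e^(-lam t): the integrand of (15-31)
    return math.exp(-mu * t) * (mu * t)**k / math.factorial(k) * lam * math.exp(-lam * t)

m, upper = 100_000, 40.0               # the integrand is negligible beyond t = 40
h = upper / m
max_err = 0.0
for k in range(6):
    # trapezoidal rule approximation of P{N = k}
    val = h * (0.5 * (integrand(0.0, k) + integrand(upper, k))
               + sum(integrand(i * h, k) for i in range(1, m)))
    geom = (lam / (lam + mu)) * (mu / (lam + mu))**k   # geometric law from (15-31)
    max_err = max(max_err, abs(val - geom))
```

With λ = 1 and μ = 2, P{N = 0} = 1/3 and each extra departure multiplies the probability by μ/(λ+μ) = 2/3.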
Stopping Times, Coupon Collecting, and Birthday Problems
Suppose a cereal manufacturer inserts a sample of one type of coupon randomly into each cereal box, and suppose there are n such distinct types of coupons. One interesting question is: how many boxes of cereal should one buy, on the average, in order to collect at least one coupon of each kind?
We shall reformulate the above problem in terms of Poisson processes. Let X1(t), X2(t), …, Xn(t) represent n independent, identically distributed Poisson processes with common parameter λt. Let ti1, ti2, … represent the first, second, … random arrival instants of the process Xi(t), i = 1, 2, …, n. They will correspond to the first, second, … appearance of the ith type coupon in the above problem.
Let
X(t) = ∑_{i=1}^{n} Xi(t),   (15-32)
so that the sum X(t) is also a Poisson process with parameter μt, where
μ = nλ.   (15-33)
From Fig. 15.8, 1/λ represents the average inter-arrival duration between any two arrivals of Xi(t), i = 1, 2, …, n, whereas 1/μ represents the average inter-arrival time for the combined sum process X(t) in (15-32).

Fig. 15.8 (first arrivals t11, t21, …, tn1 of the component processes, the inter-arrival intervals Y1, Y2, …, Yk of the sum process, and the Nth stopping time T)

Define the stopping time T to be that random time instant by which at least one arrival of each of X1(t), X2(t), …, Xn(t) has occurred. Clearly, we have
T = max(t11, t21, …, ti1, …, tn1).   (15-34)
But from (15-25), ti1, i = 1, 2, …, n, are independent exponential random variables with common parameter λ. This gives
F_T(t) = P{T ≤ t} = P{max(t11, t21, …, tn1) ≤ t}
       = P{t11 ≤ t, t21 ≤ t, …, tn1 ≤ t}
       = P{t11 ≤ t} P{t21 ≤ t} ⋯ P{tn1 ≤ t} = [F_ti1(t)]^n.
Thus
F_T(t) = (1 − e^(−λt))^n,  t ≥ 0,   (15-35)
represents the probability distribution function of the stopping time random variable T in (15-34). To compute its mean, we can make use of Eqs. (5-52)-(5-53), Text, valid for nonnegative random variables. From (15-35) we get
P(T > t) = 1 − F_T(t) = 1 − (1 − e^(−λt))^n,  t ≥ 0,
so that
E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ {1 − (1 − e^(−λt))^n} dt.   (15-36)
Let 1 − e^(−λt) = x, so that λ e^(−λt) dt = dx, or dt = dx/(λ(1 − x)), and
E{T} = (1/λ) ∫₀¹ (1 − x^n) dx/(1 − x)
     = (1/λ) ∫₀¹ (1 + x + x² + ⋯ + x^(n−1)) dx = (1/λ) ∑_{k=1}^{n} 1/k.
Thus
E{T} = (1/λ)(1 + 1/2 + 1/3 + ⋯ + 1/n) = (n/μ)(1 + 1/2 + 1/3 + ⋯ + 1/n)
     ≈ (1/μ) n(ln n + γ),   (15-37)
where γ ≈ 0.5772157… is Euler's constant¹.
Let the random variable N denote the total number of all arrivals up to the stopping time T, and Yk, k = 1, 2, …, N, the inter-arrival random variables associated with the sum process X(t) (see Fig. 15.8).

¹ Euler's constant: The sequence {1 + 1/2 + 1/3 + ⋯ + 1/n − ln n} converges, since
u_n ≜ ∫₀¹ x/(n(n + x)) dx = ∫₀¹ (1/n − 1/(n + x)) dx = 1/n − ln((n + 1)/n) > 0   (1)
and
u_n ≤ ∫₀¹ (x/n²) dx ≤ ∫₀¹ (1/n²) dx = 1/n²,
so that ∑_{n=1}^{∞} u_n < ∑_{n=1}^{∞} 1/n² = π²/6 < ∞. Thus the series {u_n} sums to some number γ > 0. From (1) we also obtain ∑_{k=1}^{n} u_k = ∑_{k=1}^{n} 1/k − ln(n + 1), so that
lim_{n→∞} {1 + 1/2 + 1/3 + ⋯ + 1/n − ln n} = lim_{n→∞} {∑_{k=1}^{n} u_k + ln((n + 1)/n)} = ∑_{k=1}^{∞} u_k = γ = 0.5772157….
Then we obtain the key relation
T = ∑_{i=1}^{N} Yi,   (15-38)
so that
E{T | N = n} = E{∑_{i=1}^{n} Yi | N = n} = E{∑_{i=1}^{n} Yi} = n E{Yi},   (15-39)
since {Yi} and N are independent random variables, and hence
E{T} = E[E{T | N}] = E{N} E{Yi}.   (15-40)
But Yi ~ Exponential(μ), so that E{Yi} = 1/μ, and substituting this into (15-37), we obtain
E{N} ≈ n(ln n + γ).   (15-41)
Thus on the average a customer should buy about n ln n, or slightly more, boxes to guarantee that at least one coupon of each type has been collected.
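The exact answer behind (15-41) is E{N} = n(1 + 1/2 + ⋯ + 1/n), and the asymptotic form n(ln n + γ) is off by only about 1/2 a box. A quick numeric comparison (n = 365 chosen to match the birthday discussion later; the variable names are ours):

```python
import math

n = 365
gamma_e = 0.5772156649015329               # Euler's constant
harmonic = sum(1.0 / k for k in range(1, n + 1))
exact  = n * harmonic                      # E{N} = n * (1 + 1/2 + ... + 1/n)
approx = n * (math.log(n) + gamma_e)       # asymptotic form (15-41)
```

For n = 365 the asymptotic form gives about 2,364 boxes, which reappears in the "all days are birthdays" problem below as (15-82).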

Next, consider a slight generalization of the above problem: what if two kinds of coupons (each of n types) are mixed up, and the objective is to collect one complete set of coupons of either kind?
Let Xi(t) and Yi(t), i = 1, 2, …, n, represent the two kinds of coupons (independent Poisson processes) that have been mixed up to form a single Poisson process Z(t) with normalized parameter unity, i.e.,
Z(t) = ∑_{i=1}^{n} [Xi(t) + Yi(t)] ~ P(t).   (15-42)
As before, let ti1, ti2, … represent the first, second, … arrivals of the process Xi(t), and τi1, τi2, … represent the first, second, … arrivals of the process Yi(t), i = 1, 2, …, n.
The stopping time T1 in this case represents that random instant at which either all X-type or all Y-type coupons have occurred at least once. Thus
T1 = min{X, Y}   (15-43)
where
X ≜ max(t11, t21, …, tn1)   (15-44)
and
Y ≜ max(τ11, τ21, …, τn1).   (15-45)
Notice that the random variables X and Y have the same distribution as in (15-35) with λ replaced by 1/2n (since μ = 1 and there are 2n independent and identical processes in (15-42)), and hence
F_X(t) = F_Y(t) = (1 − e^(−t/2n))^n,  t ≥ 0.   (15-46)
Using (15-43), we get
F_T1(t) = P(T1 ≤ t) = P(min{X, Y} ≤ t)
        = 1 − P(min{X, Y} > t) = 1 − P(X > t, Y > t)
        = 1 − P(X > t) P(Y > t)
        = 1 − (1 − F_X(t))(1 − F_Y(t))
        = 1 − {1 − (1 − e^(−t/2n))^n}²,  t ≥ 0,   (15-47)
to be the probability distribution function of the new stopping time T1. Also, as in (15-36),
E{T1} = ∫₀^∞ P(T1 > t) dt = ∫₀^∞ {1 − F_T1(t)} dt = ∫₀^∞ {1 − (1 − e^(−t/2n))^n}² dt.
Let 1 − e^(−t/2n) = x, or (1/2n) e^(−t/2n) dt = dx, so that dt = 2n dx/(1 − x). Then
E{T1} = 2n ∫₀¹ (1 − x^n)² dx/(1 − x) = 2n ∫₀¹ [(1 − x^n)/(1 − x)] (1 − x^n) dx
      = 2n ∫₀¹ (1 + x + x² + ⋯ + x^(n−1))(1 − x^n) dx
E{T1} = 2n ∫₀¹ (∑_{k=0}^{n−1} x^k − ∑_{k=0}^{n−1} x^(n+k)) dx
      = 2n {(1 + 1/2 + 1/3 + ⋯ + 1/n) − (1/(n+1) + 1/(n+2) + ⋯ + 1/2n)}
      ≈ 2n(ln(n/2) + γ).   (15-48)
Once again the total number of random arrivals N up to T1 is related to T1 as in (15-38), where Yi ~ Exponential(1), and hence using (15-40) we get the average number of total arrivals up to the stopping time to be
E{N} = E{T1} ≈ 2n(ln(n/2) + γ).   (15-49)
We can generalize the stopping times in yet another way:

Poisson Quotas
Let
X(t) = ∑_{i=1}^{n} Xi(t) ~ P(μt),   (15-50)
where the Xi(t) are independent Poisson processes with parameters λi t, so that μ = λ1 + λ2 + ⋯ + λn. Suppose the integers m1, m2, …, mn represent the preassigned numbers of arrivals (quotas) required for the processes X1(t), X2(t), …, Xn(t), in the sense that when mi arrivals of the process Xi(t) have occurred, the process Xi(t) satisfies its "quota" requirement.
The stopping time T in this case is that random time instant at which any r processes have met their quota requirement, where r ≤ n is given. The problem is to determine the probability density function of the stopping time random variable T, and to determine the mean and variance of the total number of random arrivals N up to the stopping time T.
Solution: As before, let ti1, ti2, … represent the first, second, … arrivals of the ith process Xi(t), and define
Yij = tij − ti,j−1.   (15-51)
Notice that the inter-arrival times Yij are independent, exponential random variables with parameter λi, and hence
ti,mi = ∑_{j=1}^{mi} Yij ~ Gamma(mi, λi).
Define Ti to be the stopping time for the ith process, i.e., the occurrence of the mi-th arrival equals Ti. Thus
Ti = ti,mi ~ Gamma(mi, λi),  i = 1, 2, …, n,   (15-52)
or
f_Ti(t) = λi^mi t^(mi−1) e^(−λi t) / (mi − 1)!,  t ≥ 0.   (15-53)
Since the n processes in (15-50) are independent, the associated stopping times Ti, i = 1, 2, …, n, defined in (15-52) are also independent random variables, a key observation.
Given the independent gamma random variables T1, T2, …, Tn in (15-52)-(15-53), we form their order statistics. This gives
T(1) < T(2) < ⋯ < T(r) < ⋯ < T(n).   (15-54)
Note that the two extremes in (15-54) represent
T(1) = min(T1, T2, …, Tn)   (15-55)
and
T(n) = max(T1, T2, …, Tn).   (15-56)
The desired stopping time T, when r processes have satisfied their quota requirement, is given by the rth order statistic T(r). Thus
T = T(r),   (15-57)
where T(r) is as in (15-54). We can use (7-14), Text, to compute the probability density function of T. From there, the probability density function of the stopping time random variable T in (15-57) is given by
f_T(t) = [n! / ((r − 1)!(n − r)!)] F_Ti^(r−1)(t) [1 − F_Ti(t)]^(n−r) f_Ti(t),   (15-58)
where F_Ti(t) is the distribution of the i.i.d. random variables Ti and f_Ti(t) their density function given in (15-53). Integrating (15-53) by parts as in (4-37)-(4-38), Text, we obtain
F_Ti(t) = 1 − ∑_{k=0}^{mi−1} e^(−λi t) (λi t)^k / k!,  t ≥ 0.   (15-59)
Together with (15-53) and (15-59), Eq. (15-58) completely specifies the density function of the stopping time random variable T, where r types of arrival quota requirements have been satisfied.
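As a consistency check on the order-statistics density (15-58), it should integrate to one. The sketch below does this numerically for a small i.i.d. case of our choosing (n = 3, r = 2, mi = 2, λi = 1):

```python
import math

n, r = 3, 2                      # illustrative quota setup: 3 processes, stop at the 2nd
def F(t):
    # F_{T_i}(t) for Gamma(2, 1): 1 - e^(-t)(1 + t), from (15-59) with m_i = 2
    return 1.0 - math.exp(-t) * (1.0 + t)
def f(t):
    # f_{T_i}(t) = t e^(-t), from (15-53) with m_i = 2, lam_i = 1
    return t * math.exp(-t)
def f_T(t):
    # order-statistics density (15-58) for T = T_(r)
    c = math.factorial(n) / (math.factorial(r - 1) * math.factorial(n - r))
    return c * F(t)**(r - 1) * (1.0 - F(t))**(n - r) * f(t)

m, upper = 100_000, 50.0         # trapezoidal rule; the density is negligible past t = 50
h = upper / m
total = h * (0.5 * (f_T(0.0) + f_T(upper)) + sum(f_T(i * h) for i in range(1, m)))
```

The substitution u = F(t) reduces the integral to 6∫u(1 − u)du = 1, which the numeric value reproduces.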
If N represents the total number of all random arrivals up to T, then, arguing as in (15-38)-(15-40), we get
T = ∑_{i=1}^{N} Yi,   (15-60)
where the Yi are the inter-arrival intervals for the process X(t), and hence
E{T} = E{N} E{Yi} = (1/μ) E{N}.   (15-61)
With normalized mean value μ = 1 for the sum process X(t), we get
E{N} = E{T}.   (15-62)
To relate the higher order moments of N and T, we can use their characteristic functions. From (15-60),
E{e^(jωT)} = E{e^(jω ∑_{i=1}^{N} Yi)} = E{E[e^(jω ∑_{i=1}^{n} Yi) | N = n]} = E[(E{e^(jωYi)})^N].   (15-63)
But Yi ~ Exponential(1) and independent of N, so that
E{e^(jωYi) | N = n} = E{e^(jωYi)} = 1/(1 − jω),
and hence from (15-63)
E{e^(jωT)} = ∑_{n=0}^{∞} [E{e^(jωYi)}]^n P(N = n) = E{(1/(1 − jω))^N} = E{(1 − jω)^(−N)},
which gives (expanding both sides)
∑_{k=0}^{∞} [(jω)^k / k!] E{T^k} = ∑_{k=0}^{∞} [(jω)^k / k!] E{N(N + 1) ⋯ (N + k − 1)}
or
E{T^k} = E{N(N + 1) ⋯ (N + k − 1)},   (15-64)
a key identity. From (15-62) and (15-64), we get
var{N} = var{T} − E{T}.   (15-65)
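The identity (15-64) can be sanity-checked in the degenerate case where N = n is deterministic: then T is Gamma(n, 1), and its kth moment Γ(n + k)/Γ(n) is exactly the rising factorial n(n + 1)⋯(n + k − 1). A minimal sketch (the helper names are ours):

```python
import math

# Degenerate check of (15-64): if N = n with certainty, T = Y_1 + ... + Y_n
# is Gamma(n, 1), whose k-th moment is Gamma(n+k)/Gamma(n).
def gamma_moment(n, k):
    return math.gamma(n + k) / math.gamma(n)

def rising(n, k):
    # n(n+1)...(n+k-1), the right-hand side of (15-64)
    out = 1
    for j in range(k):
        out *= n + j
    return out

max_rel_err = max(abs(gamma_moment(n, k) - rising(n, k)) / rising(n, k)
                  for n in range(1, 8) for k in range(1, 5))
```

Taking expectations over a random N then yields the general statement, since the Yi are independent of N.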
As an application of the Poisson quota problem, we can reexamine the birthday pairing problem discussed in Example 2-20, Text.
"Birthday Pairing" as Poisson Processes:
In the birthday pairing case (refer to Example 2-20, Text), we may assume that the n = 365 possible birthdays in a year correspond to n independent, identically distributed Poisson processes, each with parameter 1/n, and in that context each individual is an "arrival" corresponding to his/her particular "birthday process". It follows that the birthday pairing problem (i.e., two people have the same birth date) corresponds to the first occurrence of the 2nd return for any one of the 365 processes. Hence
m1 = m2 = ⋯ = mn = 2,  r = 1,   (15-66)
so that from (15-52), for each process,
Ti ~ Gamma(2, 1/n).   (15-67)
Since λi = μ/n = 1/n, from (15-57) and (15-66) the stopping time in this case satisfies
T = min(T1, T2, …, Tn).   (15-68)
Thus the distribution function for the "birthday pairing" stopping time turns out to be
F_T(t) = P{T ≤ t} = 1 − P{T > t}
       = 1 − P{min(T1, T2, …, Tn) > t}
       = 1 − [P{Ti > t}]^n = 1 − [1 − F_Ti(t)]^n
       = 1 − (1 + t/n)^n e^(−t),   (15-69)
where we have made use of (15-59) with mi = 2 and λi = 1/n.
As before, let N represent the number of random arrivals up to the stopping time T. Notice that in this case N represents the number of people required in a crowd for at least two people to have the same birth date. Since T and N are related as in (15-60), using (15-62) we get
E{N} = E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ {1 − F_T(t)} dt = ∫₀^∞ (1 + t/n)^n e^(−t) dt.   (15-70)
To obtain an approximate value for the above integral, we can expand ln(1 + t/n)^n = n ln(1 + t/n) in a Taylor series. This gives
n ln(1 + t/n) = n(t/n − t²/2n² + t³/3n³ − ⋯) = t − t²/2n + t³/3n² − ⋯,
and hence
(1 + t/n)^n ≈ e^(t − t²/2n + t³/3n²),
so that
(1 + t/n)^n e^(−t) = e^(−t²/2n) e^(t³/3n²) ≈ e^(−t²/2n) (1 + t³/3n²),   (15-71)
and substituting (15-71) into (15-70) we get the mean number of people in a crowd for a two-person birthday coincidence to be
E{N} ≈ ∫₀^∞ e^(−t²/2n) dt + (1/3n²) ∫₀^∞ t³ e^(−t²/2n) dt
     = (1/2)√(2πn) + (2n²/3n²) ∫₀^∞ x e^(−x) dx = √(πn/2) + 2/3
     = 24.612.   (15-72)
On comparing (15-72) with the mean value obtained in Lecture 6 (Eq. (6-60)) using entirely different arguments (E{X} ≈ 24.44),
we observe that the two results are essentially equal. Notice that the probability that there will be a coincidence among 24-25 people is about 0.55. To compute the variance of T and N, we can make use of the expression (see (15-64))
E{N(N + 1)} = E{T²} = ∫₀^∞ 2t P(T > t) dt
            = ∫₀^∞ 2t (1 + t/n)^n e^(−t) dt ≈ 2 ∫₀^∞ t e^(−t²/2n) dt + (2/3n²) ∫₀^∞ t⁴ e^(−t²/2n) dt
            = 2n ∫₀^∞ e^(−x) dx + (2/3n²)(3/8)√π (2n)^(5/2) = 2n + √(2πn),   (15-73)
which gives (use (15-65))
σ_T ≈ 13.12,  σ_N ≈ 12.146.   (15-74)
The high values for the standard deviations indicate that in reality the crowd size could vary considerably around the mean value.
Unlike Example 2-20 in the Text, the method developed here can be used to derive the distribution and average value for a variety of "birthday coincidence" problems.
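The exact integral in (15-70) can also be evaluated numerically, as a check on the Taylor approximation 24.612. A sketch using a plain trapezoidal rule (truncation point and step count are our choices):

```python
import math

n = 365
def tail(t):
    # P{T > t} = (1 + t/n)^n e^(-t), from (15-69)
    return (1.0 + t / n)**n * math.exp(-t)

# the tail behaves like e^(-t^2/2n), so it is negligible well before t = 200
m, upper = 200_000, 200.0
h = upper / m
EN = h * (0.5 * (tail(0.0) + tail(upper)) + sum(tail(i * h) for i in range(1, m)))
```

The numeric value lands within a few thousandths of the approximate 24.612 of (15-72), confirming that the dropped higher-order Taylor terms are indeed negligible.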
Three-person birthday coincidence:
For example, if we are interested in the average crowd size for which three people have the same birthday, then, arguing as above, we obtain
m1 = m2 = ⋯ = mn = 3,  r = 1,   (15-75)
so that
Ti ~ Gamma(3, 1/n)   (15-76)
and T is as in (15-68), which gives
F_T(t) = 1 − [1 − F_Ti(t)]^n = 1 − (1 + t/n + t²/2n²)^n e^(−t),  t ≥ 0   (15-77)
(use (15-59) with mi = 3, λi = 1/n) to be the distribution of the stopping time in this case. As before, the average crowd size for a three-person birthday coincidence equals
E{N} = E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ (1 + t/n + t²/2n²)^n e^(−t) dt.
By Taylor series expansion,
ln(1 + t/n + t²/2n²) = (t/n + t²/2n²) − (1/2)(t/n + t²/2n²)² + (1/3)(t/n + t²/2n²)³ − ⋯ ≈ t/n − t³/6n³,
so that
E{N} ≈ ∫₀^∞ e^(−t³/6n²) dt = (6^(1/3) n^(2/3) / 3) ∫₀^∞ x^((1/3) − 1) e^(−x) dx
     = 6^(1/3) Γ(4/3) n^(2/3) ≈ 82.85.   (15-78)
Thus for a three-person birthday coincidence the average crowd size should be around 82 (which corresponds to 0.44 probability). Notice that other generalizations, such as "two distinct birthdays to have a pair of coincidences in each case" (mi = 2, r = 2), can easily be worked out in the same manner.
We conclude this discussion with the other extreme case, where the crowd size needs to be determined so that "all days in the year are birthdays" among the persons in a crowd.
All days are birthdays:
Once again from the above analysis, in this case we have
m1 = m2 = ⋯ = mn = 1,  r = n = 365,   (15-79)
so that the stopping time statistic T satisfies
T = max(T1, T2, …, Tn),   (15-80)
where the Ti are independent exponential random variables with common parameter λ = 1/n. This situation is similar to the coupon collecting problem discussed in (15-32)-(15-34), and from (15-35) the distribution function of T in (15-80) is given by
F_T(t) = (1 − e^(−t/n))^n,  t ≥ 0,   (15-81)
and the mean values of T and N are given by (see (15-37)-(15-41))
E{N} = E{T} ≈ n(ln n + γ) = 2,364.14.   (15-82)
Thus for "every day to be a birthday" for someone in a crowd, the average crowd size should be 2,364, in which case there is 0.57 probability that the event actually happens.
For a more detailed analysis of this problem using Markov chains, refer to Examples 15-12 and 15-18 in Chapter 15, Text. From there (see Eq. (15-80), Text), to be quite certain (with 0.98 probability) that all 365 days are birthdays, the crowd size should be around 3,500.
Bulk Arrivals and Compound Poisson Processes
In an ordinary Poisson process X(t), only one event occurs at any arrival instant (Fig. 15.9a). Suppose instead that a random number of events Ci occur simultaneously as a cluster at every arrival instant of a Poisson process (Fig. 15.9b). If X(t) represents the total number of all occurrences in the interval (0, t), then X(t) represents a compound Poisson process, or a bulk arrival process. Inventory orders, arrivals at an airport queue, tickets purchased for a show, etc., follow this process (when things happen, they happen in bulk, or a bunch of items are involved).

Fig. 15.9 (a) Poisson process: single arrivals at the instants t1, t2, …, tn. (b) Compound Poisson process: clusters of random sizes (e.g., C1 = 3, C2 = 2, …, Ci = 4) at the arrival instants.

Let
pk = P{Ci = k},  k = 0, 1, 2, …,   (15-83)
represent the common probability mass function for the number of occurrences in any cluster Ci. Then the compound process X(t) satisfies
X(t) = ∑_{i=1}^{N(t)} Ci,   (15-84)
where N(t) represents an ordinary Poisson process with parameter λ. Let
P(z) = E{z^Ci} = ∑_{k=0}^{∞} pk z^k   (15-85)
represent the moment generating function associated with the cluster statistics in (15-83). Then the moment generating function of the compound Poisson process X(t) in (15-84) is given by
φ_X(z) = ∑_{n=0}^{∞} z^n P{X(t) = n} = E{z^X(t)}
       = E{E[z^X(t) | N(t) = k]} = E[E{z^(∑_{i=1}^{k} Ci) | N(t) = k}]
       = ∑_{k=0}^{∞} (E{z^Ci})^k P{N(t) = k}
       = ∑_{k=0}^{∞} P^k(z) e^(−λt) (λt)^k / k! = e^(−λt(1 − P(z))).   (15-86)
If we let
P^k(z) = (∑_{n=0}^{∞} pn z^n)^k ≜ ∑_{n=0}^{∞} pn^(k) z^n,   (15-87)
where {pn^(k)} represents the k-fold convolution of the sequence {pn} with itself, we obtain
P{X(t) = n} = ∑_{k=0}^{∞} e^(−λt) [(λt)^k / k!] pn^(k),   (15-88)
which follows by substituting (15-87) into (15-86). Eq. (15-88) represents the probability that there are n arrivals in the interval (0, t) for a compound Poisson process X(t).
Substituting (15-85) into (15-86), we can rewrite φ_X(z) also as
φ_X(z) = e^(−λ1 t(1 − z)) e^(−λ2 t(1 − z²)) ⋯ e^(−λk t(1 − z^k)) ⋯,   (15-89)
where λk = pk λ, which shows that the compound Poisson process can be expressed as the sum of integer-scaled independent Poisson processes m1(t), m2(t), …. Thus

X(t) = ∑_{k=1}^{∞} k mk(t).   (15-90)
More generally, every linear combination of independent Poisson processes represents a compound Poisson process (see Eqs. (10-120)-(10-124), Text).
Here is an interesting problem involving compound Poisson processes and coupon collecting: suppose a cereal manufacturer inserts either one or two coupons randomly – from a set consisting of n types of coupons – into every cereal box. How many boxes should one buy on the average to collect at least one coupon of each type? We leave it to the reader to work out the details.
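Formula (15-88) lends itself to direct computation: mix the k-fold convolutions of the cluster pmf with Poisson weights. The sketch below uses an illustrative cluster pmf and λt of our choosing, and checks the result against two known properties (total mass one, and mean λt·E{Ci}):

```python
import math

# Compound Poisson pmf via (15-88): P{X(t)=n} = sum_k e^(-lam t)(lam t)^k/k! * p_n^{(k)}.
p = [0.0, 0.5, 0.3, 0.2]              # cluster pmf: P{C_i = 1,2,3} = 0.5, 0.3, 0.2 (illustrative)
lam_t = 1.5                            # lambda * t (illustrative)

def convolve(a, b, size):
    # truncated convolution of two pmf sequences
    out = [0.0] * size
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < size:
                out[i + j] += ai * bj
    return out

size = 25
pmf = [0.0] * size
pk = [1.0] + [0.0] * (size - 1)        # p^{(0)} is the unit sequence (delta at n = 0)
for k in range(40):
    w = math.exp(-lam_t) * lam_t**k / math.factorial(k)
    for n in range(size):
        pmf[n] += w * pk[n]
    pk = convolve(pk, p, size)         # advance to the next k-fold convolution

total = sum(pmf)
mean = sum(n * pmf[n] for n in range(size))
mean_theory = lam_t * sum(k * pk_val for k, pk_val in enumerate(p))   # lam*t*E{C_i}
```

The small defect in `total` is just truncation at n = 24; differentiating (15-86) at z = 1 gives the mean formula used for comparison.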
