
Module-5

Some Special Absolutely Continuous Distributions


Recall that a random variable (r.v.) $X$ is said to be of absolutely continuous type if there exists a function $f_X:\mathbb{R}\to[0,\infty)$ such that the distribution function (d.f.) of $X$ is given by
$$F_X(x)=\int_{-\infty}^{x}f_X(t)\,dt,\quad x\in\mathbb{R}.$$
The function $f_X(\cdot)$ is called a probability density function (p.d.f.) of the r.v. $X$, and the set $S_X=\{x\in\mathbb{R}: f_X(x)>0\}$ is called the support of the p.d.f. $f_X(\cdot)$ (or of the r.v. $X$).

We have seen that the probability distribution of an absolutely continuous type r.v. is completely determined by its p.d.f. (or its d.f.). Recall that a function $f:\mathbb{R}\to\mathbb{R}$ is a p.d.f. of some r.v. if, and only if, $f(x)\ge 0$ for every $x\in\mathbb{R}$ and
$$\int_{-\infty}^{\infty}f(x)\,dx=1.$$


1. Uniform or Rectangular Distribution

Let $\alpha$ and $\beta$ be real numbers such that $-\infty<\alpha<\beta<\infty$. An absolutely continuous type r.v. $X$ is said to have the uniform (or rectangular) distribution over the interval $(\alpha,\beta)$ (written as $X\sim U(\alpha,\beta)$) if the p.d.f. of $X$ is given by
$$f_X(x)=\begin{cases}\dfrac{1}{\beta-\alpha}, & \text{if } \alpha<x<\beta\\[4pt] 0, & \text{otherwise}\end{cases}.$$
Clearly $f_X(x)>0$ for $x\in S_X=(\alpha,\beta)$ and
$$\int_{-\infty}^{\infty}f_X(x)\,dx=1,$$
i.e., $f_X(\cdot)$ is a proper p.d.f. with support $S_X=(\alpha,\beta)$.

We have a family $\{U(\alpha,\beta): -\infty<\alpha<\beta<\infty\}$ of uniform distributions corresponding to different choices of $\alpha$ and $\beta$ $(-\infty<\alpha<\beta<\infty)$.

Suppose that $X\sim U(\alpha,\beta)$, for some $-\infty<\alpha<\beta<\infty$. Then, for $r\in\{1,2,\ldots\}$,
$$\mu_r'=E(X^{r})=\int_{-\infty}^{\infty}x^{r}f_X(x)\,dx=\int_{\alpha}^{\beta}\frac{x^{r}}{\beta-\alpha}\,dx=\frac{\beta^{r+1}-\alpha^{r+1}}{(r+1)(\beta-\alpha)}=\frac{\beta^{r}}{r+1}\left[1+\frac{\alpha}{\beta}+\left(\frac{\alpha}{\beta}\right)^{2}+\cdots+\left(\frac{\alpha}{\beta}\right)^{r}\right],$$
and
$$\mu_r=E\left[(X-\mu_1')^{r}\right]=E\left[\left(X-\frac{\alpha+\beta}{2}\right)^{r}\right]=\int_{\alpha}^{\beta}\frac{1}{\beta-\alpha}\left(x-\frac{\alpha+\beta}{2}\right)^{r}dx=\begin{cases}0, & \text{if } r=1,3,5,\ldots\\[4pt] \dfrac{(\beta-\alpha)^{r}}{2^{r}(r+1)}, & \text{if } r=2,4,6,\ldots\end{cases}.$$
Thus we have
$$\mu_r'=E(X^{r})=\frac{\beta^{r}}{r+1}\left[1+\frac{\alpha}{\beta}+\left(\frac{\alpha}{\beta}\right)^{2}+\cdots+\left(\frac{\alpha}{\beta}\right)^{r}\right],\quad r=1,2,\ldots,$$
and
$$\mu_r=\begin{cases}0, & \text{if } r=1,3,5,\ldots\\[4pt] \dfrac{(\beta-\alpha)^{r}}{2^{r}(r+1)}, & \text{if } r=2,4,6,\ldots\end{cases}.$$
Consequently,
$$\text{Mean}=E(X)=\mu_1'=\frac{\alpha+\beta}{2},\qquad \text{Var}(X)=\mu_2=\frac{(\beta-\alpha)^{2}}{12},$$
$$\mu_3=0,\qquad \mu_4=\frac{(\beta-\alpha)^{4}}{80},$$
$$\text{Coefficient of skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=0,$$
and
$$\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=\frac{9}{5}=1.8.$$

Thus the uniform distribution is highly platykurtic (i.e., in comparison with a normal distribution having mean $(\alpha+\beta)/2$, the p.d.f. of the $U(\alpha,\beta)$ distribution has a flatter peak around its mean). The flatness of the p.d.f. around the mean is due to the distribution being less concentrated around its mean. Moreover, the value $\beta_1=0$ of the coefficient of skewness suggests that the distribution of $X$ may be symmetric about the mean $\mu_1'$. Clearly
$$f_X(\mu_1'-x)=f_X(\mu_1'+x),\quad x\in\mathbb{R},$$
and, therefore, the distribution of $X\sim U(\alpha,\beta)$ is symmetric about $\mu_1'=(\alpha+\beta)/2$.
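The moment formulas above can be checked by simulation; a minimal Python sketch (the choice $\alpha=2$, $\beta=5$, the seed and the sample size are all illustrative):

```python
import random

random.seed(0)
alpha, beta = 2.0, 5.0               # illustrative choice of (alpha, beta)
n = 200_000
xs = [random.uniform(alpha, beta) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

# Exact values: mean (alpha+beta)/2 = 3.5, variance (beta-alpha)^2/12 = 0.75
print(mean, var)
```

The empirical mean and variance land close to the exact values $3.5$ and $0.75$.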

Figure 1.1. Plot of p.d.f. of $U(\alpha,\beta)$ distribution.

Since the distribution of $X\sim U(\alpha,\beta)$ is symmetric about $(\alpha+\beta)/2$, we have
$$X-\frac{\alpha+\beta}{2}\stackrel{d}{=}\frac{\alpha+\beta}{2}-X,$$
or equivalently
$$X\stackrel{d}{=}\alpha+\beta-X.$$

The distribution function of $X\sim U(\alpha,\beta)$ is given by
$$F_X(x)=\int_{-\infty}^{x}f_X(t)\,dt,\quad\text{i.e.,}\quad F_X(x)=\begin{cases}0, & \text{if } x<\alpha\\[4pt] \dfrac{x-\alpha}{\beta-\alpha}, & \text{if } \alpha\le x<\beta\\[4pt] 1, & \text{if } x\ge\beta\end{cases}.$$

Since the distribution of $X\sim U(\alpha,\beta)$ is symmetric about $\mu_1'=(\alpha+\beta)/2$, we have
$$\text{Median}=\text{Mean}=\frac{\alpha+\beta}{2}.$$
One may directly check that
$$F_X\left(\frac{\alpha+\beta}{2}\right)=\frac{1}{2},$$
implying that $(\alpha+\beta)/2$ is the median of $X\sim U(\alpha,\beta)$.

The lower quartile $q_1$ and the upper quartile $q_3$ of $X\sim U(\alpha,\beta)$ are given by
$$F_X(q_1)=\frac{1}{4}\quad\text{and}\quad F_X(q_3)=\frac{3}{4},$$
i.e.,
$$q_1=\frac{\beta+3\alpha}{4}\quad\text{and}\quad q_3=\frac{3\beta+\alpha}{4}.$$
Also,
$$\text{Quartile deviation}=\text{QD}=\frac{q_3-q_1}{2}=\frac{\beta-\alpha}{4}.$$

The moment generating function of $X\sim U(\alpha,\beta)$ is given by
$$M_X(t)=E\left(e^{tX}\right)=\int_{-\infty}^{\infty}e^{tx}f_X(x)\,dx=\int_{\alpha}^{\beta}\frac{e^{tx}}{\beta-\alpha}\,dx,$$
i.e.,
$$M_X(t)=\begin{cases}\dfrac{e^{t\beta}-e^{t\alpha}}{t(\beta-\alpha)}, & \text{if } t\ne 0\\[4pt] 1, & \text{if } t=0\end{cases}.$$

The following theorem provides a characterization of $X\sim U(\alpha,\beta)$ in terms of the property that, for any interval $I\subseteq[\alpha,\beta]$, $P(X\in I)$ depends only on the length of the interval $I$ and not on the location of $I$ in $[\alpha,\beta]$.

Theorem 1.1

Let $\alpha$ and $\beta$ be real constants such that $-\infty<\alpha<\beta<\infty$ and let $X$ be a random variable of absolutely continuous type with $P(\alpha\le X\le\beta)=1$. Then $X\sim U(\alpha,\beta)$ if, and only if, $P(X\in I)=P(X\in J)$ for any pair of intervals $I,J\subseteq[\alpha,\beta]$ having the same length.

Proof.

First suppose that $X\sim U(\alpha,\beta)$ and $\alpha\le a<b\le\beta$. Then
$$P(X\in(a,b))=P(X\in[a,b))=P(X\in(a,b])=P(X\in[a,b])=F_X(b)-F_X(a)=\frac{b-a}{\beta-\alpha},$$
which depends only on the length $(=b-a)$ of the interval $(a,b)/[a,b)/(a,b]/[a,b]$.

Conversely, suppose that $P(X\in I)=P(X\in J)$ for any pair of intervals $I,J\subseteq[\alpha,\beta]$ having the same length. For $0<s\le 1$, let
$$G(s)=P\left(\alpha<X\le\alpha+(\beta-\alpha)s\right)=F_X\left(\alpha+(\beta-\alpha)s\right),$$
where $F_X(\cdot)$ is the d.f. of $X$. Then, for $0<s_1\le 1$, $0<s_2\le 1$, $0<s_1+s_2\le 1$,
$$P\left(\alpha+(\beta-\alpha)s_1<X\le\alpha+(\beta-\alpha)(s_1+s_2)\right)=P\left(\alpha<X\le\alpha+(\beta-\alpha)s_2\right),$$
and therefore
$$G(s_1+s_2)=P\left(\alpha<X\le\alpha+(\beta-\alpha)(s_1+s_2)\right)$$
$$=P\left(\alpha<X\le\alpha+(\beta-\alpha)s_1\right)+P\left(\alpha+(\beta-\alpha)s_1<X\le\alpha+(\beta-\alpha)(s_1+s_2)\right)$$
$$=P\left(\alpha<X\le\alpha+(\beta-\alpha)s_1\right)+P\left(\alpha<X\le\alpha+(\beta-\alpha)s_2\right)$$
$$=G(s_1)+G(s_2).$$

By induction, for $0<s_i\le 1$, $i=1,\ldots,k$, and $0<\sum_{i=1}^{k}s_i\le 1$, we have
$$G(s_1+s_2+\cdots+s_k)=G(s_1)+G(s_2)+\cdots+G(s_k).$$
Consequently,
$$G(ms)=m\,G(s),\quad 0<ms\le 1,\ m\in\{1,2,\ldots\},\tag{1.1}$$
and
$$G(s)=G\left(\frac{s}{n}+\cdots+\frac{s}{n}\right)=n\,G\left(\frac{s}{n}\right),\quad 0<s\le 1,\ n\in\{1,2,\ldots\}.\tag{1.2}$$
Also, for $m,n\in\{1,2,\ldots\}$, $m\le n$,
$$G\left(\frac{m}{n}\right)=G\left(\frac{1}{n}+\cdots+\frac{1}{n}\right)=m\,G\left(\frac{1}{n}\right)\quad(\text{using (1.1)})=\frac{m}{n}\,G(1)\quad(\text{using (1.2)}).$$
Since $P(\alpha\le X\le\beta)=1$ and $X$ is of continuous type, $G(1)=P(\alpha<X\le\beta)=1$, and therefore
$$G(r)=r,\quad r\in\mathbb{Q}\cap(0,1],\tag{1.3}$$
where $\mathbb{Q}$ denotes the set of rational numbers. Now let $x\in(0,1)$. Choose a sequence $\{r_n: n=1,2,\ldots\}$ of rationals in $(0,1]$ such that $r_n\downarrow x$ (the existence of such a sequence is guaranteed). Then
$$G(x)=\lim_{n\to\infty}G(r_n)\quad\left(\text{since } G(s)=F_X(\alpha+(\beta-\alpha)s)\text{ is right continuous}\right)$$
$$=\lim_{n\to\infty}r_n\quad(\text{using (1.3)})$$
$$=x.$$
It follows that
$$F_X\left(\alpha+(\beta-\alpha)x\right)=x,\quad x\in(0,1],$$
i.e.,
$$F_X(x)=\frac{x-\alpha}{\beta-\alpha},\quad x\in(\alpha,\beta].$$
Also, since $F_X$ is continuous on $\mathbb{R}$ and $P(\alpha\le X\le\beta)=1$, we have
$$F_X(x)=\begin{cases}0, & \text{if } x<\alpha\\[4pt] \dfrac{x-\alpha}{\beta-\alpha}, & \text{if } \alpha\le x<\beta\\[4pt] 1, & \text{if } x\ge\beta\end{cases}
\;\Rightarrow\; X\sim U(\alpha,\beta).$$

Theorem 1.2

Suppose that $X\sim U(\alpha,\beta)$, for some real constants $\alpha$ and $\beta$ such that $-\infty<\alpha<\beta<\infty$. Then
$$Y=\frac{X-\alpha}{\beta-\alpha}\sim U(0,1).$$

Proof.

The p.d.f. of $X$ is
$$f_X(x)=\begin{cases}\dfrac{1}{\beta-\alpha}, & \text{if } \alpha<x<\beta\\[4pt] 0, & \text{otherwise}\end{cases}.$$
Let $Y=\dfrac{X-\alpha}{\beta-\alpha}=h(X)$, say. Clearly $h(x)=\dfrac{x-\alpha}{\beta-\alpha}$, $x\in S_X=(\alpha,\beta)$, is strictly increasing on $S_X$. Therefore the r.v. $Y=h(X)$ is of absolutely continuous type with support $S_Y=h(S_X)=(0,1)$ and p.d.f.
$$f_Y(y)=f_X\left(h^{-1}(y)\right)\left|\frac{d}{dy}h^{-1}(y)\right|I_{(0,1)}(y).$$
We have $h^{-1}(y)=\alpha+(\beta-\alpha)y$ and $\frac{d}{dy}h^{-1}(y)=\beta-\alpha$, $y\in(0,1)$. Therefore
$$f_Y(y)=f_X\left(\alpha+(\beta-\alpha)y\right)|\beta-\alpha|\,I_{(0,1)}(y)=\begin{cases}1, & \text{if } 0<y<1\\ 0, & \text{otherwise}\end{cases},$$
i.e., $Y=\dfrac{X-\alpha}{\beta-\alpha}\sim U(0,1)$.

Example 1.1

Let $a>0$ be a real constant. A point $X$ is chosen at random on the interval $(0,a)$ (i.e., $X\sim U(0,a)$).

(i) If $Y$ denotes the area of an equilateral triangle having sides of length $X$, find the mean and variance of $Y$.

(ii) If the point $X$ divides the interval $(0,a)$ into subintervals $I_1=(0,X)$ and $I_2=(X,a)$, find the probability that the larger of these two subintervals is at least double the size of the smaller subinterval.

Solution.

(i) In the equilateral triangle with sides of length $X$ the height is $\frac{\sqrt{3}}{2}X$.

Figure 1.2

Therefore
$$Y=\frac{1}{2}\cdot X\cdot\frac{\sqrt{3}}{2}X=\frac{\sqrt{3}}{4}X^{2}.$$
Since $E(X^{2})=a^{2}/3$ and $E(X^{4})=a^{4}/5$, we get
$$E(Y)=\frac{\sqrt{3}}{4}E(X^{2})=\frac{\sqrt{3}}{12}a^{2},\qquad E(Y^{2})=\frac{3}{16}E(X^{4})=\frac{3}{80}a^{4},$$
and
$$\text{Var}(Y)=E(Y^{2})-\left(E(Y)\right)^{2}=\frac{3a^{4}}{80}-\frac{a^{4}}{48}=\frac{a^{4}}{60}.$$

(ii) The required probability is
$$p=P\left(\max(X,a-X)\ge 2\min(X,a-X)\right)$$
$$=P\left(a-X\ge 2X,\ X\le\frac{a}{2}\right)+P\left(X\ge 2(a-X),\ X>\frac{a}{2}\right)$$
$$=P\left(X\le\frac{a}{3}\right)+P\left(X\ge\frac{2a}{3}\right)$$
$$=F_X\left(\frac{a}{3}\right)+1-F_X\left(\frac{2a}{3}\right)=\frac{1}{3}+1-\frac{2}{3}=\frac{2}{3}.$$

Remark 1.1

The uniform distribution is applicable in situations where the outcome of a random experiment is a number $x$ chosen at random from an interval $[\alpha,\beta]$, in the sense that if $I\subseteq[\alpha,\beta]$ is any interval then $P(X\in I)$ depends only on the length of $I$ and not on its location in $[\alpha,\beta]$.

1.1. Quantile Function and Uniform Distribution

We begin this section with the definition of the quantile function.

Definition 1.1

Let $X$ be a random variable (not necessarily of absolutely continuous type) with distribution function $F_X(\cdot)$.

(i) The function $Q_X:(0,1)\to\mathbb{R}$, defined by $Q_X(p)=\inf\{s\in\mathbb{R}: F_X(s)\ge p\}$, $0<p<1$, is called the quantile function (q.f.) of the random variable $X$ (or of the distribution function $F_X(\cdot)$).

(ii) For a fixed $p\in(0,1)$, the quantity $Q_X(p)=\inf\{s\in\mathbb{R}: F_X(s)\ge p\}$ is called the $p$-th quantile of $X$ (or of $F_X(\cdot)$).

Figure 1.3. Plot of quantile function of Exp(1) distribution.



Figure 1.4. (a)

Figure 1.4. (b)

Figure 1.4. (c)

Remark 1.2

If the distribution function $F_X(\cdot)$ is strictly increasing on $\mathbb{R}$ then $Q_X(p)=F_X^{-1}(p)$, $0<p<1$.

Lemma 1.1

Let $X$ be a random variable having distribution function $F_X(\cdot)$ and quantile function $Q_X(\cdot)$. Let $x\in\mathbb{R}$, $p\in(0,1)$ and $0<p_1<p_2<1$. Then

(i) $Q_X(F_X(x))\le x$, provided $0<F_X(x)<1$;
(ii) $F_X(Q_X(p))\ge p$;
(iii) $F_X(Q_X(p))=p$, provided there exists an $x\in\mathbb{R}$ such that $F_X(x)=p$. In particular, if $F_X(\cdot)$ is continuous then $F_X(Q_X(p))=p$;
(iv) $Q_X(p)\le x$ if, and only if, $p\le F_X(x)$;
(v) $Q_X(p)=F_X^{-1}(p)$, provided $F_X^{-1}$ exists;
(vi) $Q_X(p_1)\le Q_X(p_2)$.

Proof. For $p\in(0,1)$, define
$$A_p=\{s\in\mathbb{R}: F_X(s)\ge p\},$$
so that $Q_X(p)=\inf A_p$, $p\in(0,1)$.

(i) Let $x\in\mathbb{R}$ be such that $0<F_X(x)<1$. Then $x\in A_{F_X(x)}=\{s\in\mathbb{R}: F_X(s)\ge F_X(x)\}$ and, therefore, $x\ge\inf A_{F_X(x)}=Q_X(F_X(x))$, i.e., $Q_X(F_X(x))\le x$.

(ii) Let $p\in(0,1)$. Then $Q_X(p)=\inf A_p$. Thus there exists a sequence $\{x_n: n=1,2,\ldots\}$ in $A_p$ such that $\lim_{n\to\infty}x_n=Q_X(p)$. Consequently $x_n\ge Q_X(p)$, $n=1,2,\ldots$, and $F_X(x_n)\ge p$, $n=1,2,\ldots$. This implies that $\lim_{n\to\infty}F_X(x_n)\ge p$. Since $F_X(\cdot)$ is right continuous, $x_n\ge Q_X(p)$, $n=1,2,\ldots$, and $\lim_{n\to\infty}x_n=Q_X(p)$, we get
$$F_X(Q_X(p))=\lim_{n\to\infty}F_X(x_n)\ge p.$$

(iii) Let $x\in\mathbb{R}$ be such that $F_X(x)=p$. Then $x\in A_p=\{s\in\mathbb{R}: F_X(s)\ge p\}$ and therefore $x\ge\inf A_p=Q_X(p)$. Now using (ii) and the fact that $F_X(\cdot)$ is non-decreasing, we get
$$p\le F_X(Q_X(p))\le F_X(x)=p\;\Rightarrow\;F_X(Q_X(p))=p.$$
Note that $\lim_{x\to-\infty}F_X(x)=0$ and $\lim_{x\to\infty}F_X(x)=1$. Thus if $F_X(\cdot)$ is continuous then the intermediate value property of continuous functions implies that there exists an $x\in\mathbb{R}$ such that $F_X(x)=p\in(0,1)$, and therefore $F_X(Q_X(p))=p$.

(iv) First suppose that $Q_X(p)=\inf A_p\le x$. Then, since $F_X(\cdot)$ is non-decreasing, we have
$$F_X(x)\ge F_X(Q_X(p))\ge p\quad(\text{using (ii)}).$$
Now suppose that $F_X(x)\ge p$. Then $x\in A_p=\{s\in\mathbb{R}: F_X(s)\ge p\}$ and, therefore, $x\ge\inf A_p=Q_X(p)$.

(v) Part (v) is immediate from (iii) and (iv).

(vi) Since $p_1<p_2$, we have
$$A_{p_2}=\{s\in\mathbb{R}: F_X(s)\ge p_2\}\subseteq\{s\in\mathbb{R}: F_X(s)\ge p_1\}=A_{p_1}$$
$$\Rightarrow\;Q_X(p_1)=\inf A_{p_1}\le\inf A_{p_2}=Q_X(p_2).$$

Theorem 1.3

Let $X$ be a random variable with distribution function $F_X(\cdot)$ and quantile function $Q_X(\cdot)$.

(i) (Probability Integral Transformation) If the random variable $X$ is of continuous type then $F_X(X)\sim U(0,1)$;

(ii) Let $U\sim U(0,1)$. Then $Q_X(U)\stackrel{d}{=}X$.

Proof.

(i) Let $G$ be the d.f. of $F_X(X)$, i.e.,
$$G(u)=P\left(F_X(X)\le u\right),\quad u\in\mathbb{R}.$$
Clearly, for $u<0$, $G(u)=0$ and, for $u\ge 1$, $G(u)=1$. Now suppose that $u\in(0,1)$. By Lemma 1.1 (iv) we have
$$\{s\in\mathbb{R}: F_X(s)\ge u\}=\{s\in\mathbb{R}: s\ge Q_X(u)\}$$
$$\Rightarrow\;P\left(F_X(X)\ge u\right)=P\left(X\ge Q_X(u)\right)$$
$$\Rightarrow\;P\left(F_X(X)<u\right)=P\left(X<Q_X(u)\right)$$
$$\Rightarrow\;P\left(F_X(X)<u\right)=P\left(X\le Q_X(u)\right)\quad(\text{since } F_X \text{ is continuous}).\tag{1.4}$$
Since $F_X(\cdot)$ is continuous, $\{x\in\mathbb{R}: F_X(x)=u\}=[a_1,a_2]$, for some real numbers $a_1$ and $a_2$ such that $-\infty<a_1\le a_2<\infty$ (see Figures 1.5 (a) & (b)).

Figure 1.5 (a)

Figure 1.5 (b)

Thus, for $u\in(0,1)$,
$$P\left(F_X(X)=u\right)=P(a_1\le X\le a_2)=F_X(a_2)-F_X(a_1)=u-u=0.\tag{1.5}$$
Using (1.4), (1.5) and Lemma 1.1 (iii) we get, for $u\in(0,1)$,
$$G(u)=P\left(F_X(X)\le u\right)=P\left(F_X(X)<u\right)=P\left(X\le Q_X(u)\right)=F_X(Q_X(u))=u.$$
Also, right continuity of the d.f. $G$ implies that
$$G(0)=\lim_{u\downarrow 0}G(u)=\lim_{u\downarrow 0}u=0.$$
Therefore we have
$$G(u)=\begin{cases}0, & \text{if } u\le 0\\ u, & \text{if } 0<u<1\\ 1, & \text{if } u\ge 1\end{cases},\quad\text{i.e.,}\quad F_X(X)\sim U(0,1).$$

(ii) Let $U\sim U(0,1)$, so that $P(U\le u)=u$, $u\in[0,1]$, and $P(0<U<1)=1$. Then the d.f. of $Q_X(U)$ is
$$P\left(Q_X(U)\le x\right)=P\left(Q_X(U)\le x,\ 0<U<1\right)\quad(\text{since } P(0<U<1)=1)$$
$$=P\left(U\le F_X(x)\right)\quad(\text{using Lemma 1.1 (iv)})$$
$$=F_X(x),\quad x\in\mathbb{R},$$
i.e., $Q_X(U)\stackrel{d}{=}X$.

Remark 1.3

The above theorem provides a method to generate observations from an arbitrary continuous distribution using observations from the $U(0,1)$ distribution. Suppose that we require an observation $x$ from a distribution having known continuous d.f. $F$ and quantile function $Q$. The above theorem suggests generating an observation $u$ from the $U(0,1)$ distribution and taking $x=Q(u)$.
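This recipe can be put into code; a minimal Python sketch of inverse-transform sampling for the Exp(1) distribution (an illustrative choice, as in Figure 1.3), whose d.f. $F(x)=1-e^{-x}$, $x\ge 0$, gives $Q(u)=-\ln(1-u)$:

```python
import math
import random

random.seed(2)

def exp1_quantile(u):
    # Quantile function of Exp(1): Q(u) = -ln(1 - u) inverts F(x) = 1 - e^{-x}
    return -math.log(1.0 - u)

# x = Q(U) with U ~ U(0, 1) has d.f. F, by Theorem 1.3 (ii)
n = 200_000
sample = [exp1_quantile(random.random()) for _ in range(n)]
m = sum(sample) / n
print(m)   # close to 1, the mean of Exp(1)
```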

Example 1.2

Using a random observation $U\sim U(0,1)$, describe a method to generate a random observation $x$ from the distribution having

(i) probability density function
$$f(x)=\frac{e^{-|x|}}{2},\quad -\infty<x<\infty;$$

(ii) probability mass function
$$f(x)=\begin{cases}\dbinom{n}{x}p^{x}(1-p)^{n-x}, & \text{if } x\in\{0,1,\ldots,n\}\\ 0, & \text{otherwise}\end{cases},$$
where $n\in\{1,2,\ldots\}$ and $p\in(0,1)$ are constants.

Solution.

(i) For $x<0$, we have
$$F(x)=P(X\le x)=\int_{-\infty}^{x}\frac{e^{t}}{2}\,dt=\frac{e^{x}}{2},$$
and, for $x\ge 0$, we have
$$F(x)=P(X\le x)=\int_{-\infty}^{0}\frac{e^{t}}{2}\,dt+\int_{0}^{x}\frac{e^{-t}}{2}\,dt=1-\frac{e^{-x}}{2}.$$
Thus the d.f. of $X$ is given by
$$F(x)=\begin{cases}\dfrac{e^{x}}{2}, & \text{if } x<0\\[4pt] 1-\dfrac{e^{-x}}{2}, & \text{if } x\ge 0\end{cases},$$
and the q.f. of $X$ is given by
$$Q(u)=F^{-1}(u)=\begin{cases}\ln(2u), & \text{if } 0<u<\frac{1}{2}\\[4pt] -\ln\left(2(1-u)\right), & \text{if } \frac{1}{2}\le u<1\end{cases}.$$
Using Theorem 1.3 (ii), the desired random observation is given by
$$x=Q(U)=\begin{cases}\ln(2U), & \text{if } 0<U<\frac{1}{2}\\[4pt] -\ln\left(2(1-U)\right), & \text{if } \frac{1}{2}\le U<1\end{cases}.$$

(ii) The distribution function of $X$ is given by
$$G(x)=\begin{cases}0, & \text{if } x<0\\[4pt] \displaystyle\sum_{j=0}^{[x]}\binom{n}{j}p^{j}(1-p)^{n-j}, & \text{if } 0\le x<n\\[4pt] 1, & \text{if } x\ge n\end{cases},$$
where $[x]$ denotes the largest integer not exceeding $x$, and the quantile function of $X$ is given by
$$Q(t)=\inf\{s\in\mathbb{R}: G(s)\ge t\}=\begin{cases}0, & \text{if } 0<t\le(1-p)^{n}\\[4pt] k, & \text{if } \displaystyle\sum_{j=0}^{k-1}\binom{n}{j}p^{j}(1-p)^{n-j}<t\le\sum_{j=0}^{k}\binom{n}{j}p^{j}(1-p)^{n-j},\ k\in\{1,\ldots,n-1\}\\[4pt] n, & \text{if } \displaystyle\sum_{j=0}^{n-1}\binom{n}{j}p^{j}(1-p)^{n-j}<t<1\end{cases}.$$
Now, using Theorem 1.3 (ii), the desired random observation is given by $x=Q(U)$.
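Part (i) of the solution translates directly into code; a minimal Python sketch (the seed and the sample size are illustrative):

```python
import math
import random

random.seed(3)

def laplace_quantile(u):
    # Quantile function from part (i) for the density e^{-|x|}/2
    return math.log(2 * u) if u < 0.5 else -math.log(2 * (1 - u))

n = 200_000
sample = [laplace_quantile(random.random()) for _ in range(n)]
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / n
print(mean, var)   # close to 0 and 2, the exact mean and variance here
```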

2. Gamma and Related Distributions

We begin this section with the definition of the gamma function.

Definition 2.1

The function $\Gamma:(0,\infty)\to(0,\infty)$, defined by
$$\Gamma(\alpha)=\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt,\quad \alpha>0,$$
is called the gamma function.

To examine the convergence of the integral
$$\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt,\quad \alpha\in\mathbb{R},$$
consider the following cases.

Case I: $\alpha\le 0$

In this case the integral $\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$ will converge if, and only if, both the integrals
$$\int_{0}^{1}e^{-t}t^{\alpha-1}\,dt\quad\text{and}\quad\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt$$
converge. Note that, for $\alpha\le 0$,
$$e^{-t}t^{\alpha-1}\ge\frac{t^{\alpha-1}}{e}\ge\frac{t^{-1}}{e},\quad t\in(0,1],$$
and the integral $\int_{0}^{1}t^{-1}\,dt$ diverges. This implies that, for $\alpha\le 0$, the integral $\int_{0}^{1}e^{-t}t^{\alpha-1}\,dt$ diverges. Consequently the integral $\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$ diverges for $\alpha\le 0$.

Case II: $0<\alpha<1$

In this case again the integral $\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$ will converge if, and only if, both the integrals
$$\int_{0}^{1}e^{-t}t^{\alpha-1}\,dt\quad\text{and}\quad\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt$$
converge. Note that, for $\alpha>0$,
$$0\le e^{-t}t^{\alpha-1}\le t^{\alpha-1},\quad t\in(0,1],$$
and the integral $\int_{0}^{1}t^{\alpha-1}\,dt$ is convergent. Therefore the integral $\int_{0}^{1}e^{-t}t^{\alpha-1}\,dt$ is convergent for any $\alpha>0$.

Now let us examine the convergence of the integral
$$\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt.$$
Fix $\alpha\in\mathbb{R}$ and choose $k\in\{1,2,\ldots\}$ such that $k>\alpha$. Then, since $e^{t}\ge t^{k}/k!$ for $t>0$, we have
$$0\le e^{-t}t^{\alpha-1}\le\frac{k!}{t^{k-\alpha+1}},\quad t\ge 1.$$
Also $k-\alpha+1>1$ and, therefore, the integral
$$\int_{1}^{\infty}\frac{1}{t^{k-\alpha+1}}\,dt$$
converges. Consequently $\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt$ converges for any $\alpha\in\mathbb{R}$. From the above discussion it follows that the integral $\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$ converges for $0<\alpha<1$.

Case III: $\alpha\ge 1$

In this case the integrand is bounded on $[0,1]$, so the integral $\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$ will converge if, and only if, the integral $\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt$ converges. We have seen in Case II above that the integral $\int_{1}^{\infty}e^{-t}t^{\alpha-1}\,dt$ converges for any $\alpha\in\mathbb{R}$.

On combining Cases I-III we conclude that the integral
$$\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt$$
converges if, and only if, $\alpha>0$.

Using integration by parts, for $\alpha>0$, we have
$$\Gamma(\alpha+1)=\int_{0}^{\infty}e^{-t}t^{\alpha}\,dt=\left[-e^{-t}t^{\alpha}\right]_{0}^{\infty}+\alpha\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt=\alpha\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt,$$
i.e.,
$$\Gamma(\alpha+1)=\alpha\,\Gamma(\alpha),\quad \alpha>0.\tag{2.1}$$
Note that
$$\Gamma(1)=\int_{0}^{\infty}e^{-t}\,dt=1.\tag{2.2}$$
For $n\in\{1,2,\ldots\}$, using (2.1) and (2.2), we have
$$\Gamma(n+1)=n\,\Gamma(n)=n(n-1)\Gamma(n-1)=\cdots=n(n-1)\cdots 3\cdot 2\cdot 1\cdot\Gamma(1)=n!.\tag{2.3}$$
On combining (2.1), (2.2) and (2.3) we get
$$\Gamma(n)=(n-1)!,\quad n\in\{1,2,\ldots\},\tag{2.4}$$
with the convention that $0!=1$.
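The identities (2.1)-(2.4) can be checked against the standard library's gamma function:

```python
import math

# Identity (2.1): Gamma(alpha + 1) = alpha * Gamma(alpha), alpha > 0
for alpha in (0.5, 1.7, 3.2):
    assert math.isclose(math.gamma(alpha + 1), alpha * math.gamma(alpha))

# Identity (2.4): Gamma(n) = (n - 1)!, with the convention 0! = 1
for n in range(1, 10):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

print("identities (2.1) and (2.4) hold numerically")
```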

We have
$$\Gamma\left(\frac{1}{2}\right)=\int_{0}^{\infty}e^{-t}t^{-1/2}\,dt=2\int_{0}^{\infty}e^{-x^{2}}\,dx\quad(\text{on substituting } t=x^{2})$$
$$\Rightarrow\;\left(\Gamma\left(\frac{1}{2}\right)\right)^{2}=4\int_{0}^{\infty}\int_{0}^{\infty}e^{-(x^{2}+y^{2})}\,dx\,dy.$$
On making the transformation $x=r\cos\theta$ and $y=r\sin\theta$, $r>0$, $0<\theta<\pi/2$, in the above integral (so that the Jacobian of the transformation is $r$), we have
$$\left(\Gamma\left(\frac{1}{2}\right)\right)^{2}=4\int_{0}^{\pi/2}\int_{0}^{\infty}e^{-r^{2}}r\,dr\,d\theta=4\cdot\frac{\pi}{2}\cdot\frac{1}{2}=\pi.$$
Since
$$\Gamma\left(\frac{1}{2}\right)=\int_{0}^{\infty}e^{-t}t^{-1/2}\,dt\ge 0,$$
we get
$$\Gamma\left(\frac{1}{2}\right)=\sqrt{\pi}.\tag{2.5}$$
Also, using (2.1),
$$\Gamma\left(\frac{3}{2}\right)=\frac{1}{2}\,\Gamma\left(\frac{1}{2}\right)=\frac{\sqrt{\pi}}{2},$$
and
$$\Gamma\left(\frac{5}{2}\right)=\frac{3}{2}\cdot\frac{1}{2}\cdot\Gamma\left(\frac{1}{2}\right)=\frac{3\sqrt{\pi}}{4}.$$
In general,
$$\Gamma\left(\frac{2n+1}{2}\right)=\frac{1\cdot 3\cdot 5\cdots(2n-1)}{2^{n}}\sqrt{\pi},\quad n\in\{1,2,\ldots\},\tag{2.6}$$
i.e., for $n\in\{1,2,\ldots\}$,
$$\Gamma\left(\frac{2n+1}{2}\right)=\frac{(2n)!}{4^{n}\,n!}\sqrt{\pi},\quad n\in\{1,2,\ldots\}.\tag{2.7}$$

Definition 2.2

An absolutely continuous type random variable $X$ is said to follow a gamma distribution with shape parameter $\alpha>0$ and scale parameter $\theta>0$ (written as $X\sim G(\alpha,\theta)$) if its probability density function is given by
$$f(x\,|\,\alpha,\theta)=\begin{cases}\dfrac{1}{\theta^{\alpha}\,\Gamma(\alpha)}e^{-x/\theta}x^{\alpha-1}, & \text{if } x>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$

Note that, for $\alpha>0$ and $\theta>0$, $f(x\,|\,\alpha,\theta)\ge 0$ for every $x\in\mathbb{R}$, and
$$\int_{-\infty}^{\infty}f(x\,|\,\alpha,\theta)\,dx=\frac{1}{\theta^{\alpha}\,\Gamma(\alpha)}\int_{0}^{\infty}e^{-x/\theta}x^{\alpha-1}\,dx=\frac{1}{\Gamma(\alpha)}\int_{0}^{\infty}e^{-t}t^{\alpha-1}\,dt=1\quad(\text{on substituting } t=x/\theta).$$

Theorem 2.1

Suppose that $X\sim G(\alpha,\theta)$, for some $\alpha>0$ and $\theta>0$. Define the random variable $Z=X/\theta$. Then $Z\sim G(\alpha,1)$, i.e., the p.d.f. of $Z$ is given by
$$f_Z(z)=\begin{cases}\dfrac{e^{-z}z^{\alpha-1}}{\Gamma(\alpha)}, & \text{if } z>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$

Proof. We have $S_X=\{x\in\mathbb{R}: f(x\,|\,\alpha,\theta)>0\}=(0,\infty)$, and the p.d.f. of $X$ is
$$f(x\,|\,\alpha,\theta)=\begin{cases}\dfrac{1}{\theta^{\alpha}\,\Gamma(\alpha)}e^{-x/\theta}x^{\alpha-1}, & \text{if } x>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$
We have $Z=X/\theta=h(X)$, say. Clearly the transformation $h(x)=x/\theta$, $x\in S_X$, is strictly increasing on $S_X$ with $h'(x)=1/\theta$, $x\in S_X$. Also $S_Z=h(S_X)=(0,\infty)$ and $h^{-1}(z)=\theta z$, $z\in S_Z$. Thus $Z=h(X)=X/\theta$ has the p.d.f.
$$f_Z(z)=f_X\left(h^{-1}(z)\right)\left|\frac{d}{dz}h^{-1}(z)\right|I_{(0,\infty)}(z)=f_X(\theta z)\,\theta\,I_{(0,\infty)}(z)=\begin{cases}\dfrac{e^{-z}z^{\alpha-1}}{\Gamma(\alpha)}, & \text{if } z>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$

Note that if $Z\sim G(\alpha,1)$, for some $\alpha>0$, then
$$E(Z^{r})=\int_{-\infty}^{\infty}z^{r}f_Z(z)\,dz=\frac{1}{\Gamma(\alpha)}\int_{0}^{\infty}e^{-z}z^{\alpha+r-1}\,dz=\frac{\Gamma(\alpha+r)}{\Gamma(\alpha)},\quad r>-\alpha.$$
Thus
$$Z\sim G(\alpha,1)\;\Rightarrow\;E(Z^{r})=\frac{\Gamma(\alpha+r)}{\Gamma(\alpha)},\quad r>-\alpha.\tag{2.8}$$
Clearly, for $r\in\{1,2,\ldots\}$,
$$E(Z^{r})=\alpha(\alpha+1)\cdots(\alpha+r-1).\tag{2.9}$$
Also if $X\sim G(\alpha,\theta)$, so that $Z=X/\theta\sim G(\alpha,1)$, then, for $r>-\alpha$,
$$E(X^{r})=E\left((\theta Z)^{r}\right)=\theta^{r}E(Z^{r}).$$
Using (2.8) we get
$$E(X^{r})=\frac{\Gamma(\alpha+r)}{\Gamma(\alpha)}\,\theta^{r},\quad r>-\alpha.$$
Therefore,
$$\mu_r'=E(X^{r})=\alpha(\alpha+1)\cdots(\alpha+r-1)\,\theta^{r},\quad r\in\{1,2,\ldots\},$$
$$\text{Mean}=\mu_1'=E(X)=\alpha\theta,$$
$$\mu_2'=E(X^{2})=\alpha(\alpha+1)\theta^{2},$$
$$\text{Variance}=\mu_2=E(X^{2})-\left(E(X)\right)^{2}=\alpha\theta^{2},$$
$$\mu_3=E\left[(X-\mu_1')^{3}\right]=\mu_3'-3\mu_1'\mu_2'+2\mu_1'^{3}=2\alpha\theta^{3},$$
$$\mu_4=E\left[(X-\mu_1')^{4}\right]=\mu_4'-4\mu_1'\mu_3'+6\mu_1'^{2}\mu_2'-3\mu_1'^{4}=3\alpha(\alpha+2)\theta^{4},$$
$$\text{Coefficient of skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=\frac{2}{\sqrt{\alpha}},$$
and
$$\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=\frac{3(\alpha+2)}{\alpha}=3+\frac{6}{\alpha}.$$

Note that, as $\alpha\to\infty$, $\beta_1\to 0$ and $\beta_2\to 3$. Also $\beta_1>0$ and $\beta_2>3$. Thus the gamma distribution is positively skewed and has a sharper peak than the normal distribution. For $\alpha\to 0$ the distribution is heavily (positively) skewed. For large $\alpha$ the gamma distribution very much behaves like the normal distribution.

Figure 2.1. Plots of p.d.f.s of $G(\alpha,1)$ distribution

The m.g.f. of $X\sim G(\alpha,\theta)$ is given by
$$M_X(t)=E\left(e^{tX}\right)=E\left(e^{t\theta Z}\right)=M_Z(t\theta),$$
where $Z=X/\theta\sim G(\alpha,1)$ (see Theorem 2.1). Thus
$$M_Z(t)=\frac{1}{\Gamma(\alpha)}\int_{0}^{\infty}e^{-(1-t)z}z^{\alpha-1}\,dz=(1-t)^{-\alpha},\quad t<1,$$
and therefore
$$M_X(t)=(1-\theta t)^{-\alpha},\quad t<\frac{1}{\theta}.\tag{2.10}$$

The following theorem provides a relationship between gamma probabilities and Poisson probabilities.

Theorem 2.2

For a positive integer $n$ and for real constants $\theta>0$ and $x>0$, let $X\sim G(n,\theta)$ and $Y\sim\text{P}(x/\theta)$ (Poisson with mean $x/\theta$). Then
$$P(X>x)=P(Y\le n-1),$$
i.e.,
$$\int_{x}^{\infty}\frac{e^{-t/\theta}t^{n-1}}{\theta^{n}(n-1)!}\,dt=\sum_{j=0}^{n-1}\frac{e^{-x/\theta}(x/\theta)^{j}}{j!},\quad x>0,\ \theta>0.$$

Proof. Let $Z=X/\theta$, so that, by Theorem 2.1, $Z\sim G(n,1)$. Then, for $x>0$,
$$P(X>x)=P\left(Z>\frac{x}{\theta}\right).\tag{2.11}$$
For $y>0$, we have
$$P(Z>y)=\int_{y}^{\infty}\frac{e^{-t}t^{n-1}}{(n-1)!}\,dt=I_{n}(y)\ \text{(say)},\quad n\in\{1,2,\ldots\},$$
with the convention that $0!=1$. On integrating by parts we get
$$I_{n}(y)=\frac{1}{(n-1)!}\left[e^{-y}y^{n-1}+(n-1)\int_{y}^{\infty}e^{-t}t^{n-2}\,dt\right]$$
$$=\frac{e^{-y}y^{n-1}}{(n-1)!}+I_{n-1}(y),\quad n\ge 2,$$
$$=\frac{e^{-y}y^{n-1}}{(n-1)!}+\frac{e^{-y}y^{n-2}}{(n-2)!}+I_{n-2}(y),\quad n\ge 3,$$
$$\vdots$$
$$=\sum_{j=1}^{n-1}\frac{e^{-y}y^{j}}{j!}+I_{1}(y)$$
$$=\sum_{j=1}^{n-1}\frac{e^{-y}y^{j}}{j!}+\int_{y}^{\infty}e^{-t}\,dt$$
$$=\sum_{j=0}^{n-1}\frac{e^{-y}y^{j}}{j!}.$$
Thus, for $y>0$,
$$P(Z>y)=\sum_{j=0}^{n-1}\frac{e^{-y}y^{j}}{j!}$$
$$\Rightarrow\;P(X>x)=P\left(Z>\frac{x}{\theta}\right)=\sum_{j=0}^{n-1}\frac{e^{-x/\theta}(x/\theta)^{j}}{j!},\quad x>0\quad(\text{using (2.11)}).$$
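Theorem 2.2 can be checked numerically; a minimal Python sketch (the values $n=4$, $\theta=2$, $x=5$ are illustrative, and the gamma survival probability is computed by naive midpoint integration rather than any closed form):

```python
import math

def gamma_sf(x, n, theta, steps=200_000):
    # P(X > x) for X ~ G(n, theta): numerical integration of the p.d.f.
    # over (x, x + 50*theta); the remaining tail is negligible here.
    b = x + 50 * theta
    h = (b - x) / steps
    total = 0.0
    for i in range(steps):
        t = x + (i + 0.5) * h
        total += math.exp(-t / theta) * t ** (n - 1)
    return total * h / (theta ** n * math.factorial(n - 1))

def poisson_cdf(k, lam):
    # P(Y <= k) for Y ~ P(lam)
    return sum(math.exp(-lam) * lam ** j / math.factorial(j) for j in range(k + 1))

n, theta, x = 4, 2.0, 5.0
lhs = gamma_sf(x, n, theta)
rhs = poisson_cdf(n - 1, x / theta)
print(lhs, rhs)   # the two probabilities agree, per Theorem 2.2
```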

Example 2.1

For a positive integer $n$ and $\theta>0$, let $X\sim G(n,\theta)$. Define the random variable
$$Y=\sum_{j=0}^{n-1}\frac{e^{-X/\theta}(X/\theta)^{j}}{j!}.$$
Find the probability distribution of the random variable $Y$.

Solution. Note that $P(X>0)=1$ and, by Theorem 2.2, the distribution function of $X\sim G(n,\theta)$ is given by
$$F_X(x)=\begin{cases}0, & \text{if } x<0\\[4pt] 1-\displaystyle\sum_{j=0}^{n-1}\frac{e^{-x/\theta}(x/\theta)^{j}}{j!}, & \text{if } x\ge 0\end{cases}.$$
Clearly $P\left(Y=1-F_X(X)\right)=P(X>0)=1$, and therefore $Y\stackrel{d}{=}1-F_X(X)$. Also, by the probability integral transformation, $F_X(X)\sim U(0,1)$. Moreover $U\sim U(0,1)\Rightarrow 1-U\sim U(0,1)$. From the above discussion it follows that $Y\stackrel{d}{=}1-F_X(X)\sim U(0,1)$.

Definition 2.3

For $\theta>0$, a $G(1,\theta)$ distribution is called an exponential distribution with scale parameter $\theta$, denoted by $\text{Exp}(\theta)$.

The p.d.f. of $X\sim\text{Exp}(\theta)$ is given by
$$f_X(x)=\begin{cases}\dfrac{1}{\theta}e^{-x/\theta}, & \text{if } x>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$
Note that if $X\sim\text{Exp}(\theta)$, then
$$\text{Mean}=\mu_1'=E(X)=\theta,\qquad\text{Variance}=\mu_2=\theta^{2},$$
$$\mu_r'=E(X^{r})=r!\,\theta^{r},\quad r\in\{1,2,\ldots\},$$
$$\text{Coefficient of skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=2,\qquad\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=9.$$
The m.g.f. of $X\sim\text{Exp}(\theta)$ is given by
$$M_X(t)=(1-\theta t)^{-1},\quad t<\frac{1}{\theta},$$
and the d.f. of $X\sim\text{Exp}(\theta)$ is given by
$$F_X(x)=\int_{-\infty}^{x}f_X(t)\,dt=\begin{cases}0, & \text{if } x<0\\ 1-e^{-x/\theta}, & \text{if } x\ge 0\end{cases}.$$
Clearly, for every $s>0$ and $t>0$,
$$P(X>s+t\,|\,X>s)=\frac{P(X>s+t)}{P(X>s)}=e^{-t/\theta}=P(X>t),\tag{2.12}$$
i.e.,
$$P(X>s+t)=P(X>s)\,P(X>t),\quad s>0,\ t>0.\tag{2.13}$$

Let $X\sim\text{Exp}(\theta)$ denote the lifetime of a component. Then the property (2.12) (or equivalently the property (2.13)) about the lifetime of the component has the following interesting interpretation. Given that the component has survived $s$ units of time, the probability that it will survive an additional $t$ units of time is the same as the probability that a fresh unit (of age 0) will survive $t$ units of time. In other words, the component is not aging with time (i.e., the used component is as good as the new one). This property of a continuous type random variable is also known as the lack of memory property (at each stage the component forgets its age and behaves like a fresh component).
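The lack of memory property (2.12) can be illustrated by simulation; a minimal Python sketch ($\theta=2$ and the pair $(s,t)=(1,1.5)$ are illustrative):

```python
import math
import random

random.seed(4)
theta = 2.0                                              # illustrative scale
n = 400_000
xs = [random.expovariate(1 / theta) for _ in range(n)]   # Exp(theta) sample

s, t = 1.0, 1.5
past_s = [x for x in xs if x > s]
lhs = sum(1 for x in past_s if x > s + t) / len(past_s)  # est. P(X > s+t | X > s)
rhs = sum(1 for x in xs if x > t) / n                    # est. P(X > t)
print(lhs, rhs, math.exp(-t / theta))   # all three close, cf. (2.12)
```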
In the following theorem it is shown that the lack of memory property characterizes the exponential distribution among all continuous distributions having mass concentrated on $(0,\infty)$.

Theorem 2.3

Let $X$ be a random variable of continuous type with $F_X(0)=0$, where $F_X$ is the distribution function of $X$. Then $X$ has the lack of memory property if, and only if, $X\sim\text{Exp}(\theta)$, for some $\theta>0$.

Proof. We have seen that if $X\sim\text{Exp}(\theta)$, for some $\theta>0$, then
$$P(X>s+t\,|\,X>s)=P(X>t),\quad s,t>0,$$
i.e., $X$ has the lack of memory property.

Conversely, suppose that $X$ has the lack of memory property, i.e.,
$$P(X>s+t)=P(X>s)\,P(X>t),\quad s,t>0.$$
Let $\overline{F}(x)=1-F_X(x)$, $x\in\mathbb{R}$. Then we have
$$\overline{F}(s+t)=\overline{F}(s)\,\overline{F}(t),\quad s,t>0$$
$$\Rightarrow\;\overline{F}(s_1+s_2+\cdots+s_m)=\overline{F}(s_1)\,\overline{F}(s_2)\cdots\overline{F}(s_m),\quad s_1,s_2,\ldots,s_m>0.\tag{2.14}$$
Therefore, for $m,n\in\{1,2,\ldots\}$,
$$\overline{F}\left(\frac{m}{n}\right)=\overline{F}\left(\frac{1}{n}+\cdots+\frac{1}{n}\right)=\left[\overline{F}\left(\frac{1}{n}\right)\right]^{m}\quad(\text{using (2.14)}),\tag{2.15}$$
and
$$\overline{F}(1)=\left[\overline{F}\left(\frac{1}{n}\right)\right]^{n},\quad n\in\{1,2,\ldots\}.\tag{2.16}$$
Using (2.15) and (2.16), we get
$$\overline{F}\left(\frac{m}{n}\right)=\left[\overline{F}\left(\frac{1}{n}\right)\right]^{m}=\left[\overline{F}(1)\right]^{m/n},\quad m,n\in\{1,2,\ldots\}.\tag{2.17}$$
Let $c=\overline{F}(1)$, so that $0\le c\le 1$. Clearly, if $c=0$, then by (2.16)
$$\overline{F}\left(\frac{1}{n}\right)=0,\quad n\in\{1,2,\ldots\}$$
$$\Rightarrow\;F_X\left(\frac{1}{n}\right)=1,\quad n\in\{1,2,\ldots\}$$
$$\Rightarrow\;\lim_{n\to\infty}F_X\left(\frac{1}{n}\right)=1$$
$$\Rightarrow\;F_X(0)=1\quad(\text{since } F_X \text{ is continuous}),$$
which is not true, as $F_X(0)=0$. Therefore $c=\overline{F}(1)>0$. Similarly, if $c=\overline{F}(1)=1$, then, using (2.14),
$$\overline{F}(n)=\overline{F}(1+1+\cdots+1)=\left[\overline{F}(1)\right]^{n}=1,\quad n\in\{1,2,\ldots\}$$
$$\Rightarrow\;F_X(n)=0,\quad n\in\{1,2,\ldots\}$$
$$\Rightarrow\;\lim_{n\to\infty}F_X(n)=0,$$
which is not true, as $\lim_{n\to\infty}F_X(n)=1$. Thus we have $c=\overline{F}(1)\in(0,1)$. Then $c=\overline{F}(1)=e^{-1/\theta}$, for some $\theta>0$ (namely $\theta=-1/\ln c$). Using (2.17) we have
$$\overline{F}(r)=e^{-r/\theta},\quad r\in\mathbb{Q}\cap(0,\infty),$$
where $\mathbb{Q}$ denotes the set of rational numbers. Now let $x\in(0,\infty)$. Then there exists a sequence $\{r_n: n=1,2,\ldots\}$ of rational numbers in $(0,\infty)$ such that $\lim_{n\to\infty}r_n=x$. Therefore
$$\overline{F}(x)=\lim_{n\to\infty}\overline{F}(r_n)\quad\left(\text{since } \overline{F}(x)=1-F_X(x)\text{ is continuous on }\mathbb{R}\right)$$
$$=\lim_{n\to\infty}e^{-r_n/\theta}=e^{-x/\theta}.$$
Also $F_X(0)=0$ implies that $F_X(x)=0$ for every $x\le 0$, i.e., $\overline{F}(x)=1$, $x\le 0$. Therefore
$$\overline{F}(x)=\begin{cases}1, & \text{if } x\le 0\\ e^{-x/\theta}, & \text{if } x>0\end{cases}
\;\Rightarrow\;F_X(x)=\begin{cases}0, & \text{if } x<0\\ 1-e^{-x/\theta}, & \text{if } x\ge 0\end{cases}
\;\Rightarrow\;X\sim\text{Exp}(\theta).$$

Example 2.2

The waiting time for the occurrence of an event $E$ (say the repair time of a machine) is exponentially distributed with a mean of 30 minutes. Find the conditional probability that the waiting time for the occurrence of event $E$ is at least 5 hours, given that it has not occurred in the first 3 hours.

Solution. Let $X$ be the waiting time (in hours) for the occurrence of event $E$. Then $X\sim\text{Exp}(1/2)$. By the lack of memory property of the exponential distribution, the required probability is
$$P(X>5\,|\,X>3)=P(X>2)=e^{-4}.$$

Definition 2.4

For a positive integer $n$, a $G\left(\frac{n}{2},2\right)$ distribution is called the chi-squared distribution with $n$ degrees of freedom (d.f.) (denoted by $\chi_n^{2}$).

The p.d.f. of $X\sim\chi_n^{2}$ is given by
$$f_X(x)=\begin{cases}\dfrac{e^{-x/2}\,x^{\frac{n}{2}-1}}{2^{n/2}\,\Gamma\!\left(\frac{n}{2}\right)}, & \text{if } x>0\\[4pt] 0, & \text{otherwise}\end{cases}.$$
Note that if $X\sim\chi_n^{2}$ then
$$\text{Mean}=\mu_1'=E(X)=n,\qquad\text{Variance}=\mu_2=2n,$$
$$\text{Coefficient of skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=2\sqrt{\frac{2}{n}},$$
and
$$\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=3+\frac{12}{n}.$$
Moreover, the m.g.f. of $X\sim\chi_n^{2}$ is given by
$$M_X(t)=E\left(e^{tX}\right)=(1-2t)^{-n/2},\quad t<\frac{1}{2}.$$

Figure 2.2. Plots of p.d.f.s of $\chi_n^{2}$ distribution

3. Beta Distribution

We will first provide the definition of the beta function.

Definition 3.1

The function $B:(0,\infty)\times(0,\infty)\to(0,\infty)$, defined by
$$B(\mu,\nu)=\int_{0}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx,$$
is called the beta function.

Clearly the integral
$$\int_{0}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx$$
converges for $\mu\ge 1$ and $\nu\ge 1$. For $\mu\in(0,1)$ or $\nu\in(0,1)$ the integral will converge if, and only if, both the integrals
$$\int_{0}^{1/2}x^{\mu-1}(1-x)^{\nu-1}\,dx\quad\text{and}\quad\int_{1/2}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx$$
converge. Since, for $0<x<1$,
$$\lim_{x\to 0^{+}}\frac{x^{\mu-1}(1-x)^{\nu-1}}{x^{\mu-1}}=1,$$
and the integral $\int_{0}^{1/2}x^{\mu-1}\,dx$ converges, it follows that the integral
$$\int_{0}^{1/2}x^{\mu-1}(1-x)^{\nu-1}\,dx$$
converges for $\mu,\nu\in(0,1)$. Similarly, for $0<x<1$,
$$\lim_{x\to 1^{-}}\frac{x^{\mu-1}(1-x)^{\nu-1}}{(1-x)^{\nu-1}}=1,$$
and the integral $\int_{1/2}^{1}(1-x)^{\nu-1}\,dx$ converges. Consequently the integral
$$\int_{1/2}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx$$
converges for $\mu,\nu\in(0,1)$.

From the above discussion it follows that the integral
$$\int_{0}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx$$
converges if $\mu>0$ and $\nu>0$. Using the above arguments it can also be seen that the integral diverges if $\mu\le 0$ or $\nu\le 0$. Thus the beta function $B:(0,\infty)\times(0,\infty)\to(0,\infty)$ is well defined.

For $\mu>0$ and $\nu>0$, consider
$$\Gamma(\mu)\,\Gamma(\nu)=\left(\int_{0}^{\infty}e^{-s}s^{\mu-1}\,ds\right)\left(\int_{0}^{\infty}e^{-t}t^{\nu-1}\,dt\right)=\int_{0}^{\infty}\int_{0}^{\infty}e^{-(s+t)}s^{\mu-1}t^{\nu-1}\,ds\,dt.$$
Let us make the transformation $s=uv$ and $t=u(1-v)$ in the above integral. Then the Jacobian of the transformation is
$$\frac{\partial(s,t)}{\partial(u,v)}=\begin{vmatrix}v & u\\ 1-v & -u\end{vmatrix}=-u.$$
Also,
$$0<s<\infty\text{ and }0<t<\infty\;\Leftrightarrow\;u>0\text{ and }0<v<1.$$
Therefore,
$$\Gamma(\mu)\,\Gamma(\nu)=\int_{0}^{1}\int_{0}^{\infty}e^{-u}(uv)^{\mu-1}\left(u(1-v)\right)^{\nu-1}|-u|\,du\,dv$$
$$=\int_{0}^{1}v^{\mu-1}(1-v)^{\nu-1}\,dv\int_{0}^{\infty}e^{-u}u^{\mu+\nu-1}\,du$$
$$=\Gamma(\mu+\nu)\,B(\mu,\nu),$$
i.e.,
$$B(\mu,\nu)=\int_{0}^{1}x^{\mu-1}(1-x)^{\nu-1}\,dx=\frac{\Gamma(\mu)\,\Gamma(\nu)}{\Gamma(\mu+\nu)},\quad \mu>0,\ \nu>0.$$
Note that $B(\mu,\nu)=B(\nu,\mu)$ for all $(\mu,\nu)\in(0,\infty)\times(0,\infty)$.

Definition 3.2

Let $X$ be a random variable of absolutely continuous type and let $a>0$ and $b>0$ be real constants. The random variable $X$ is said to follow the beta distribution with shape parameters $a,b$ (written as $X\sim\text{Be}(a,b)$) if its probability density function is given by
$$f_X(x)=\begin{cases}\dfrac{x^{a-1}(1-x)^{b-1}}{B(a,b)}, & \text{if } 0<x<1\\[4pt] 0, & \text{otherwise}\end{cases}.$$
Clearly $f_X(x)\ge 0$, $x\in\mathbb{R}$, and
$$\int_{-\infty}^{\infty}f_X(x)\,dx=\frac{1}{B(a,b)}\int_{0}^{1}x^{a-1}(1-x)^{b-1}\,dx=1.$$

Figure 3.1. Plots of p.d.f.s of $\text{Be}(a,b)$ distributions

Note that the $\text{Be}(1,1)$ distribution is nothing but the $U(0,1)$ distribution.

Suppose that $X\sim\text{Be}(a,b)$, for some positive constants $a$ and $b$. Then, for $r>-a$,
$$E(X^{r})=\frac{1}{B(a,b)}\int_{0}^{1}x^{r}\,x^{a-1}(1-x)^{b-1}\,dx=\frac{1}{B(a,b)}\int_{0}^{1}x^{a+r-1}(1-x)^{b-1}\,dx,$$
i.e.,
$$E(X^{r})=\frac{B(a+r,b)}{B(a,b)}=\frac{\Gamma(a+r)\,\Gamma(a+b)}{\Gamma(a)\,\Gamma(a+b+r)},\quad r>-a.$$
Therefore,
$$\text{Mean}=\mu_1'=E(X)=\frac{a}{a+b},$$
$$\mu_2'=E(X^{2})=\frac{a(a+1)}{(a+b)(a+b+1)},$$
$$\text{Variance}=\mu_2=E(X^{2})-\left(E(X)\right)^{2}=\frac{ab}{(a+b)^{2}(a+b+1)},$$
$$\mu_3=E\left[(X-\mu_1')^{3}\right]=\mu_3'-3\mu_1'\mu_2'+2\mu_1'^{3}=\frac{2ab(b-a)}{(a+b)^{3}(a+b+1)(a+b+2)},$$
$$\mu_4=E\left[(X-\mu_1')^{4}\right]=\mu_4'-4\mu_1'\mu_3'+6\mu_1'^{2}\mu_2'-3\mu_1'^{4}=\frac{3ab\left[2(b-a)^{2}+ab(a+b+2)\right]}{(a+b)^{4}(a+b+1)(a+b+2)(a+b+3)},$$
$$\text{Coefficient of skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=\frac{2(b-a)\sqrt{a+b+1}}{(a+b+2)\sqrt{ab}},$$
and
$$\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=\frac{3(a+b+1)\left[2(b-a)^{2}+ab(a+b+2)\right]}{ab(a+b+2)(a+b+3)}.$$

The m.g.f. of $X\sim\text{Be}(a,b)$ is given by
$$M_X(t)=E\left(e^{tX}\right)=\frac{1}{B(a,b)}\int_{0}^{1}e^{tx}x^{a-1}(1-x)^{b-1}\,dx$$
$$=\frac{1}{B(a,b)}\int_{0}^{1}\left(\sum_{k=0}^{\infty}\frac{(tx)^{k}}{k!}\right)x^{a-1}(1-x)^{b-1}\,dx$$
$$=\sum_{k=0}^{\infty}\frac{t^{k}}{k!\,B(a,b)}\int_{0}^{1}x^{a+k-1}(1-x)^{b-1}\,dx$$
$$=\sum_{k=0}^{\infty}\frac{B(a+k,b)}{B(a,b)}\,\frac{t^{k}}{k!},$$
i.e.,
$$M_X(t)=\sum_{k=0}^{\infty}\frac{\Gamma(a+b)\,\Gamma(a+k)}{\Gamma(a)\,\Gamma(a+b+k)}\,\frac{t^{k}}{k!},\quad t\in\mathbb{R}.$$

For $a=b=\alpha>0$, say, and $x\in(0,1)$,
$$P(X\le x)=\frac{1}{B(\alpha,\alpha)}\int_{0}^{x}t^{\alpha-1}(1-t)^{\alpha-1}\,dt=\frac{1}{B(\alpha,\alpha)}\int_{1-x}^{1}t^{\alpha-1}(1-t)^{\alpha-1}\,dt\quad(\text{on substituting } t\mapsto 1-t)$$
$$=P(X\ge 1-x)=P(1-X\le x).$$
It follows that
$$P(X\le x)=P(1-X\le x),\quad x\in\mathbb{R}.$$
Therefore
$$X\sim\text{Be}(\alpha,\alpha)\;\Rightarrow\;X\stackrel{d}{=}1-X\;\Rightarrow\;X-\frac{1}{2}\stackrel{d}{=}\frac{1}{2}-X.$$
Thus if $X\sim\text{Be}(\alpha,\alpha)$, for some $\alpha>0$, then the distribution of $X$ is symmetric about $\frac{1}{2}$.

Figure 3.2. Plots of p.d.f.s of $\text{Be}(\alpha,\alpha)$ distributions

In the following theorem we establish a relationship between the beta and the binomial probabilities.

Theorem 3.1

Let $X\sim\text{Be}(m,n)$, for some positive integers $m$ and $n$. Then, for $x\in(0,1)$,
$$P(X\le x)=P(Y\ge m),$$
where $Y\sim\text{Bin}(m+n-1,x)$. Equivalently,
$$\frac{1}{B(m,n)}\int_{0}^{x}t^{m-1}(1-t)^{n-1}\,dt=\sum_{j=m}^{m+n-1}\binom{m+n-1}{j}x^{j}(1-x)^{m+n-1-j},\quad x\in(0,1).$$

Proof. Fix $x\in(0,1)$ and define
$$I_{m,n}=P(X\le x)=\frac{1}{B(m,n)}\int_{0}^{x}t^{m-1}(1-t)^{n-1}\,dt=\frac{(m+n-1)!}{(m-1)!\,(n-1)!}\int_{0}^{x}t^{m-1}(1-t)^{n-1}\,dt.$$
On integrating by parts we get
$$I_{m,n}=\frac{(m+n-1)!}{(m-1)!\,(n-1)!}\left[\frac{x^{m}(1-x)^{n-1}}{m}+\frac{n-1}{m}\int_{0}^{x}t^{m}(1-t)^{n-2}\,dt\right]$$
$$=\binom{m+n-1}{m}x^{m}(1-x)^{n-1}+\frac{(m+n-1)!}{m!\,(n-2)!}\int_{0}^{x}t^{m}(1-t)^{n-2}\,dt$$
$$=\binom{m+n-1}{m}x^{m}(1-x)^{n-1}+I_{m+1,n-1}$$
$$=\binom{m+n-1}{m}x^{m}(1-x)^{n-1}+\binom{m+n-1}{m+1}x^{m+1}(1-x)^{n-2}+I_{m+2,n-2}$$
$$\vdots$$
$$=\sum_{j=m}^{m+n-2}\binom{m+n-1}{j}x^{j}(1-x)^{m+n-1-j}+I_{m+n-1,1}.$$
Since
$$I_{m+n-1,1}=\frac{(m+n-1)!}{(m+n-2)!\,0!}\int_{0}^{x}t^{m+n-2}\,dt=\binom{m+n-1}{m+n-1}x^{m+n-1},$$
we get
$$P(X\le x)=I_{m,n}=\sum_{j=m}^{m+n-1}\binom{m+n-1}{j}x^{j}(1-x)^{m+n-1-j}=P(Y\ge m).$$
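Theorem 3.1 can also be checked numerically; a minimal Python sketch (the values $m=3$, $n=5$, $x=0.4$ are illustrative):

```python
import math

def beta_cdf(m, n, x, steps=200_000):
    # P(X <= x) for X ~ Be(m, n), m and n positive integers, by midpoint rule
    h = x / steps
    coef = math.factorial(m + n - 1) / (math.factorial(m - 1) * math.factorial(n - 1))
    s = sum(((i + 0.5) * h) ** (m - 1) * (1.0 - (i + 0.5) * h) ** (n - 1)
            for i in range(steps))
    return coef * s * h

def binom_tail(m, n, x):
    # P(Y >= m) for Y ~ Bin(m + n - 1, x)
    N = m + n - 1
    return sum(math.comb(N, j) * x ** j * (1.0 - x) ** (N - j) for j in range(m, N + 1))

m, n, x = 3, 5, 0.4
lhs = beta_cdf(m, n, x)
rhs = binom_tail(m, n, x)
print(lhs, rhs)   # equal, per Theorem 3.1
```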

Example 3.1

The time (in hours) to finish a job follows a beta distribution with mean $1/3$ and variance $2/63$. Find the probability that the job will be finished in 30 minutes.

Solution. Let $X$ denote the time (in hours) to finish the job. Then $X\sim\text{Be}(a,b)$, for some $a>0$ and $b>0$. We have
$$\text{Mean}=E(X)=\frac{a}{a+b}=\frac{1}{3}\quad\text{and}\quad\text{Variance}=\text{Var}(X)=\frac{ab}{(a+b)^{2}(a+b+1)}=\frac{2}{63}$$
$$\Rightarrow\;a=2\text{ and }b=4,\quad\text{i.e.,}\quad X\sim\text{Be}(2,4),$$
and therefore the required probability is
$$P\left(X\le\frac{1}{2}\right)=\frac{1}{B(2,4)}\int_{0}^{1/2}x(1-x)^{3}\,dx=20\int_{0}^{1/2}\left(x-3x^{2}+3x^{3}-x^{4}\right)dx=\frac{13}{16}.$$

4. Normal Distribution

Recall that
$$\Gamma\left(\frac{1}{2}\right)=\sqrt{\pi}=\int_{0}^{\infty}e^{-t}t^{-1/2}\,dt=\sqrt{2}\int_{0}^{\infty}e^{-z^{2}/2}\,dz\quad(\text{on substituting } t=z^{2}/2)$$
$$\Rightarrow\;\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}\,dz=1$$
$$\Rightarrow\;\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}\,dx=1,\quad \mu\in\mathbb{R},\ \sigma>0\quad(\text{on substituting } z=(x-\mu)/\sigma).\tag{4.1}$$

Definition 4.1

(i) Let $\mu\in\mathbb{R}$ and $\sigma>0$ be real constants. An absolutely continuous type random variable $X$ is said to follow a normal distribution with parameters $\mu$ and $\sigma^{2}$ (written as $X\sim N(\mu,\sigma^{2})$) if its probability density function is given by
$$f_X(x)=\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}},\quad -\infty<x<\infty.$$

(ii) The $N(0,1)$ distribution is called the standard normal distribution.

The p.d.f. and the d.f. of the $N(0,1)$ distribution will be denoted by $\phi(\cdot)$ and $\Phi(\cdot)$ respectively, i.e.,
$$\phi(z)=\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2},\quad -\infty<z<\infty,\quad\text{and}\quad\Phi(z)=\int_{-\infty}^{z}\phi(t)\,dt=\int_{-\infty}^{z}\frac{1}{\sqrt{2\pi}}e^{-t^{2}/2}\,dt.$$
Clearly if $X\sim N(\mu,\sigma^{2})$ then
$$f_X(\mu-x)=f_X(\mu+x)=\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{x^{2}}{2\sigma^{2}}},\quad x\in\mathbb{R},$$
i.e., if $X\sim N(\mu,\sigma^{2})$ then $X-\mu\stackrel{d}{=}\mu-X$.

Thus the distribution of $X\sim N(\mu,\sigma^{2})$ is symmetric about $\mu$. Since the p.d.f. $f_X(\cdot)$ of $X\sim N(\mu,\sigma^{2})$ is strictly increasing on $(-\infty,\mu)$ and strictly decreasing on $(\mu,\infty)$, the distribution of $X\sim N(\mu,\sigma^{2})$ is unimodal with mode at $\mu$.

Figure 4.1. Plots of p.d.f.s of $N(0,\sigma^{2})$ distributions.

Figure 4.2. Plots of p.d.f.s of $N(\mu,1)$ distributions

Since the distribution of $X\sim N(\mu,\sigma^{2})$ is symmetric about $\mu$ (i.e., $f_X(\mu-x)=f_X(\mu+x)$) we have, for $x\in\mathbb{R}$,
$$P(X-\mu\le -x)=P(\mu-X\le -x)$$
$$\Rightarrow\;P(X\le\mu-x)=P(X\ge\mu+x)$$
$$\Rightarrow\;F_X(\mu-x)=1-F_X(\mu+x)\quad(\text{since } X \text{ is of continuous type}).$$
Thus,
$$X\sim N(\mu,\sigma^{2})\;\Rightarrow\;F_X(\mu-x)=1-F_X(\mu+x),\ x\in\mathbb{R},\ \text{and}\ F_X(\mu)=\frac{1}{2}.$$

Figure 4.3

In particular,
$$\Phi(-z)=1-\Phi(z),\ z\in\mathbb{R},\ \text{and}\ \Phi(0)=\frac{1}{2}.$$
It follows that the median of $X\sim N(\mu,\sigma^{2})$ is $\mu$.

Suppose that $X\sim N(\mu,\sigma^{2})$. Then the p.d.f. of $Z=\dfrac{X-\mu}{\sigma}$ is given by
$$f_Z(z)=f_X(\mu+\sigma z)\,|\sigma|=\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2},\quad -\infty<z<\infty,$$
i.e.,
$$X\sim N(\mu,\sigma^{2})\;\Rightarrow\;Z=\frac{X-\mu}{\sigma}\sim N(0,1).$$
Also, if $Z\sim N(0,1)$ then the d.f. of $Y=Z^{2}$ is given by
$$F_Y(y)=P(Z^{2}\le y).$$
Clearly, for $y<0$, $F_Y(y)=0$, and, for $y\ge 0$,
$$F_Y(y)=P\left(-\sqrt{y}\le Z\le\sqrt{y}\right)=\int_{-\sqrt{y}}^{\sqrt{y}}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}\,dz=2\int_{0}^{\sqrt{y}}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}\,dz=\int_{0}^{y}\frac{1}{\sqrt{2\pi}}t^{-1/2}e^{-t/2}\,dt\quad(\text{on substituting } z=\sqrt{t}).$$
Therefore, if $Z\sim N(0,1)$ then a p.d.f. of $Y=Z^{2}$ is given by
$$f_Y(y)=\begin{cases}\dfrac{1}{\sqrt{2\pi}}\,y^{-1/2}e^{-y/2}, & \text{if } y>0\\[4pt] 0, & \text{otherwise}\end{cases},$$
which is the p.d.f. of a $\chi_1^{2}$ distribution. Thus
$$Z\sim N(0,1)\;\Rightarrow\;Y=Z^{2}\sim\chi_1^{2}.$$
We have the following result.

Theorem 4.1

(i) Let $X\sim N(\mu,\sigma^{2})$, for some $\mu\in\mathbb{R}$ and $\sigma>0$. Then

(a) $X-\mu\stackrel{d}{=}\mu-X$, i.e., the distribution of $X$ is symmetric about $\mu$;

(b) $Z=\dfrac{X-\mu}{\sigma}\sim N(0,1)$;

(ii) If $Z\sim N(0,1)$ then $Y=Z^{2}\sim\chi_1^{2}$.

In the following theorem we derive some more properties of the normal distribution.

Theorem 4.2

Let $X\sim N(\mu,\sigma^{2})$, for some $\mu\in\mathbb{R}$ and $\sigma>0$.

(i) The moment generating function of $X$ is given by
$$M_X(t)=e^{\mu t+\frac{t^{2}\sigma^{2}}{2}},\quad t\in\mathbb{R}.$$

(ii) Let $Y=aX+b$, where $a\ne 0$ and $b\in\mathbb{R}$. Then $Y\sim N(a\mu+b,\,a^{2}\sigma^{2})$.

(iii) Let $Z=\dfrac{X-\mu}{\sigma}$, so that $Z\sim N(0,1)$ (Theorem 4.1 (i)). Then
$$E(Z^{r})=\begin{cases}0, & \text{if } r \text{ is odd}\\[4pt] \dfrac{r!}{2^{r/2}\left(\frac{r}{2}\right)!}, & \text{if } r \text{ is even}\end{cases}.$$

(iv) Mean $=\mu_1'=E(X)=\mu$, Variance $=\mu_2=\text{Var}(X)=\sigma^{2}$, coefficient of skewness $=\beta_1=0$, and kurtosis $=\beta_2=3$.

Proof.

(i) For $t\in\mathbb{R}$,
$$M_X(t)=E\left(e^{tX}\right)=\int_{-\infty}^{\infty}e^{tx}\,\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}\,dx$$
$$=e^{\mu t+\frac{t^{2}\sigma^{2}}{2}}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x-\mu-t\sigma^{2})^{2}}{2\sigma^{2}}}\,dx$$
$$=e^{\mu t+\frac{t^{2}\sigma^{2}}{2}},\quad t\in\mathbb{R}\quad(\text{using (4.1)}).$$

(ii) The m.g.f. of $Y=aX+b$ is given by
$$M_Y(t)=E\left(e^{tY}\right)=E\left(e^{t(aX+b)}\right)=e^{bt}E\left(e^{atX}\right)=e^{bt}M_X(at)=e^{bt}e^{\mu at+\frac{a^{2}t^{2}\sigma^{2}}{2}}=e^{(a\mu+b)t+\frac{t^{2}a^{2}\sigma^{2}}{2}},\quad t\in\mathbb{R},$$
which is the m.g.f. of the $N(a\mu+b,\,a^{2}\sigma^{2})$ distribution. Thus, by the uniqueness of m.g.f.s, it follows that $Y\sim N(a\mu+b,\,a^{2}\sigma^{2})$.

(iii) Let $Z=\dfrac{X-\mu}{\sigma}$. Then, by Theorem 4.1 (i), $Z\sim N(0,1)$ and, by (i), the m.g.f. of $Z$ is
$$M_Z(t)=e^{t^{2}/2}=\sum_{k=0}^{\infty}\frac{t^{2k}}{2^{k}\,k!},\quad t\in\mathbb{R}.$$
For $r\in\{1,2,\ldots\}$,
$$E(Z^{r})=\text{coefficient of }\frac{t^{r}}{r!}\text{ in the expansion of }M_Z(t)=\begin{cases}0, & \text{if } r \text{ is odd}\\[4pt] \dfrac{r!}{2^{r/2}\left(\frac{r}{2}\right)!}, & \text{if } r \text{ is even}\end{cases}.$$

(iv) Let $Z=\dfrac{X-\mu}{\sigma}$. Then, by (iii), $E(Z)=0$ and $E(Z^{2})=1$, i.e.,
$$E\left(\frac{X-\mu}{\sigma}\right)=0\quad\text{and}\quad E\left[\left(\frac{X-\mu}{\sigma}\right)^{2}\right]=1$$
$$\Rightarrow\;E(X)=\mu\quad\text{and}\quad\mu_2=\text{Var}(X)=E\left[(X-\mu)^{2}\right]=\sigma^{2}.$$
Also, by (iii),
$$\mu_3=E\left[(X-\mu)^{3}\right]=\sigma^{3}E(Z^{3})=0\;\Rightarrow\;\text{Skewness}=\beta_1=\frac{\mu_3}{\mu_2^{3/2}}=0.$$
Moreover,
$$\mu_4=E\left[(X-\mu)^{4}\right]=\sigma^{4}E(Z^{4})=3\sigma^{4}\;\Rightarrow\;\text{Kurtosis}=\beta_2=\frac{\mu_4}{\mu_2^{2}}=3.$$

Remark 4.1

In the $N(\mu,\sigma^{2})$ distribution the parameters $\mu\in\mathbb{R}$ and $\sigma^{2}$ ($\sigma>0$) are respectively the mean and the variance of the distribution, and $\sigma$ is the standard deviation of the distribution. Moreover, for the $N(\mu,\sigma^{2})$ distribution, the mean, the median and the mode coincide at $\mu$.

If $X\sim N(\mu,\sigma^{2})$ then $Z=\dfrac{X-\mu}{\sigma}\sim N(0,1)$ and therefore
$$F_X(x)=P(X\le x)=P\left(Z\le\frac{x-\mu}{\sigma}\right)=\Phi\left(\frac{x-\mu}{\sigma}\right),\quad x\in\mathbb{R}.$$
Thus,
$$X\sim N(\mu,\sigma^{2})\;\Rightarrow\;F_X(x)=\Phi\left(\frac{x-\mu}{\sigma}\right),\quad x\in\mathbb{R}.$$

Figure 4.4. Plot of p.d.f. of $N(\mu,\sigma^{2})$ distribution

Let $z_{\alpha}$ be such that $\Phi(z_{\alpha})=1-\alpha$. Then $z_{\alpha}$ is called the $(1-\alpha)$-th quantile of $N(0,1)$. Clearly $\Phi(-z_{\alpha})=1-\Phi(z_{\alpha})=\alpha$.

Figure 4.5. $(1-\alpha)$-th quantile of $N(0,1)$ distribution

The following table provides various quantiles of the $N(0,1)$ distribution.

Table 4.1. $(1-\alpha)$-th quantile $z_{\alpha}$ of the $N(0,1)$ distribution for selected values of $\alpha$

  alpha   :  .001     .005     .01     .025    .05      .1      .25
  z_alpha :  3.0902   2.5758   2.326   1.96    1.6449   1.282   .675

Values of $\Phi(z)$ for selected values of $z$ are given in the following table.

Table 4.2. Values of $\Phi(z)$ for selected values of $z$

  z      :  -3.5     -3.0     -2.5     -2.0     -1.5     -1.0     -0.5     0.0
  Phi(z) :  .0002    .0013    .0062    .0228    .0668    .1587    .3085    0.5

  z      :   0.5      1.0      1.5      2.0      2.5      3.0      3.5
  Phi(z) :  .6915    .8413    .9332    .9772    .9938    .9987    .9998

Example 4.1

Let $X\sim N(2,4)$. Find $P(X\le 0)$, $P(|X|\ge 2)$, $P(1<X\le 3)$ and $P(X\le 3\,|\,X>1)$.

Solution. We have
$$P(X\le 0)=\Phi\left(\frac{0-2}{2}\right)=\Phi(-1)=1-\Phi(1)=.1587;$$
$$P(|X|\ge 2)=P(X\ge 2)+P(X\le -2)=1-\Phi\left(\frac{2-2}{2}\right)+\Phi\left(\frac{-2-2}{2}\right)=1-\Phi(0)+\Phi(-2)=.0228+0.5=0.5228;$$
$$P(1<X\le 3)=\Phi\left(\frac{3-2}{2}\right)-\Phi\left(\frac{1-2}{2}\right)=\Phi(0.5)-\Phi(-0.5)=2\,\Phi(0.5)-1\quad\left(\text{since }\Phi(z)+\Phi(-z)=1,\ z\in\mathbb{R}\right)$$
$$=2(.6915)-1=0.383;$$
and
$$P(X\le 3\,|\,X>1)=\frac{P(1<X\le 3)}{P(X>1)}=\frac{0.383}{1-\Phi(-0.5)}=\frac{0.383}{1-0.3085}\approx .5539.$$
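The computations of Example 4.1 can be reproduced with the standard library's error function, since $\Phi(z)=\left(1+\operatorname{erf}(z/\sqrt{2})\right)/2$:

```python
import math

def Phi(z):
    # Standard normal d.f. via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 2.0, 2.0                 # X ~ N(2, 4)
p1 = Phi((0 - mu) / sigma)           # P(X <= 0)
p2 = 1 - Phi(0.0) + Phi(-2.0)        # P(|X| >= 2)
p3 = Phi(0.5) - Phi(-0.5)            # P(1 < X <= 3)
p4 = p3 / (1 - Phi(-0.5))            # P(X <= 3 | X > 1)
print(p1, p2, p3, p4)
```

The printed values match the table-based answers $.1587$, $.5228$, $.383$ and $.5539$ to the displayed accuracy.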

Example 4.2

Let $Z\sim N(0,1)$ and let $Y=[|Z|]$, where, for a real number $x$, $[x]$ denotes the largest integer not exceeding $x$.

(i) Find $E|Z|^{r}$, $r>-1$;

(ii) Show that $E(Y)=2\sum_{i=1}^{\infty}\left[1-\Phi(i)\right]$.

Solution.

(i) Let $T=Z^{2}$. Then, by Theorem 4.1 (ii), $T\sim\chi_1^{2}$. Therefore
$$E|Z|^{r}=E\left(T^{r/2}\right)=\int_{0}^{\infty}t^{\frac{r}{2}}\,\frac{1}{\sqrt{2\pi}}\,t^{-1/2}e^{-t/2}\,dt=\frac{1}{\sqrt{2\pi}}\int_{0}^{\infty}e^{-t/2}\,t^{\frac{r+1}{2}-1}\,dt=\frac{2^{\frac{r+1}{2}}\,\Gamma\left(\frac{r+1}{2}\right)}{\sqrt{2\pi}}=\frac{2^{r/2}\,\Gamma\left(\frac{r+1}{2}\right)}{\sqrt{\pi}}.$$

(ii) Note that $P\left(Y\in\{0,1,2,\ldots\}\right)=1$. Therefore, by Theorem 3.1 (iv) of Module 3,
$$E(Y)=\sum_{n=1}^{\infty}P(Y\ge n)=\sum_{n=1}^{\infty}P(|Z|\ge n)=\sum_{n=1}^{\infty}\left[P(Z\le -n)+P(Z\ge n)\right]=\sum_{n=1}^{\infty}\left[\Phi(-n)+1-\Phi(n)\right]=2\sum_{n=1}^{\infty}\left[1-\Phi(n)\right].$$

Problems

1. Let $X\sim U(0,k)$, where $k$ is a positive integer, and let $Y=X-[X]$, where $[x]$ is the largest integer $\le x$. Show that $Y\sim U(0,1)$.

2. Let $F_X$ be the d.f. of a r.v. $X$, where $P(X=1)=p=1-P(X=0)$. Find the distribution of $Y=F_X(X)$. Does $Y\sim U(0,1)$? Interpret your findings in light of Theorem 1.3 (i).

3. Let the r.v. $X$ have the p.d.f.
$$f(x)=\begin{cases}6x(1-x), & \text{if } 0<x<1\\ 0, & \text{otherwise}\end{cases}.$$
Show that $Y=X^{2}(3-2X)\sim U(0,1)$.

4. Let $X\sim U(0,1)$ and let $Y=\min(X,1-X)$. Find the p.d.f. of $Z=1/Y$. Does $Z$ have finite expectation?

5. (i) If $X\sim U(0,1)$, find the distribution of $Y=-\theta\ln X$, where $\theta>0$;
(ii) Let the r.v. $X$ have the Cauchy p.d.f. $f(x)=\dfrac{1}{\pi(1+x^{2})}$, $-\infty<x<\infty$. Find the p.d.f. of $Y=X^{-1}$.

6. If $X\sim U(0,\theta)$, where $\theta>0$, find the distribution of $Y=\min(X,\theta/2)$. Calculate $P(Y<\theta/2)$.

7. Let $X\sim N(0,1)$ and let
$$Y=\begin{cases}X, & \text{if } |X|\le 1\\ -X, & \text{if } |X|>1\end{cases}.$$
Find the distribution of $Y$.

8. Let $X\sim N(\mu,\sigma^{2})$. Find the distribution function and probability density function of $Y=X^{2}$.

9. (i) If $X\sim N(12,16)$, find $P(X\ge 20)$ (use $\Phi(2)=0.9772$);
(ii) If $X\sim N(\mu,\sigma^{2})$, $P(9.6\le X\le 13.8)=0.7008$ and $P(X\ge 9.6)=0.8159$, find $\mu$, $\sigma^{2}$ and $P(X\le 13.8\,|\,X\ge 9.6)$ (use $\Phi(0.9)=0.8159$ and $\Phi(1.2)=0.8849$).

10. For $x>0$, show that
$$\left(\frac{1}{x}-\frac{1}{x^{3}}\right)\phi(x)<1-\Phi(x)<\frac{\phi(x)}{x}.$$
(Hint: Use integration by parts in $1-\Phi(x)=\int_{x}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-t^{2}/2}\,dt$, writing the integrand as $t^{-1}\cdot t\,e^{-t^{2}/2}/\sqrt{2\pi}$.)

11. Let $Z\sim N(0,1)$. Find $E\left(Z\,\Phi(Z)\right)$ and $E\left(Z^{2}\,\Phi(Z)\right)$. (Hint: Use the fact that $\Phi'(z)=\phi(z)$ and integrate by parts.)