4 Random Variables 107
Introduction 107
Section 1. The Notion of a Random Variable 107
Section 2. The Distribution Function 115
2.1 The Definition of a Distribution Function, 2.2 Properties of a Distribution Function
Section 3. Classification of Random Variables 130
3.1 Discrete Random Variables, 3.2 Absolutely Continuous Random Variables, 3.3 Mixed Distributions, 3.4 Singular Distributions

5 Some Special Distributions 157
Introduction 157
Section 1. Discrete Distributions 157
1.1 Bernoulli Distribution, 1.2 The Binomial Distribution, 1.3 The Hypergeometric Distribution, 1.4 The Geometric Distribution, 1.5 The Negative Binomial Distribution, 1.6 The Poisson Distribution
Section 2. Absolutely Continuous Distributions 175
2.1 The Uniform Distribution, 2.2 The Normal Distribution, 2.3 The Gamma Distribution, 2.4 The Cauchy Distribution, 2.5 The Laplace Distribution

6 Functions of a Random Variable 195
Introduction 195
Section 1. The Mathematical Formulation 195
Section 2. The Distribution of a Function of a Random Variable 198
2.1 The Discrete Case, 2.2 The Continuous Case

7 Expectation-A Single Variable 221
Introduction 221
Section 1. Definitions and Basic Results 221
1.1 The Definition of Expectation, 1.2 The Expectation of a Function of a Random Variable, 1.3 Some Properties of Expectation, 1.4 The Variance of a Random Variable, 1.5 Conditional Expectation
Section 2. Expectations of Some Special Distributions 242

8 Joint and Marginal Distributions 261
Introduction 261
Section 1. Joint Distributions 261
1.1 The Notion of a Random Vector, 1.2 The Definition of a Joint Distribution Function, 1.3 Properties of Joint Distribution Functions, 1.4 Classification of Joint Distributions
Section 2. Marginal Distributions 292
2.1 A General Discussion, 2.2 The Discrete Case, 2.3 The Absolutely Continuous Case

9 Conditional Distributions and Independent Random Variables 311
Introduction 311
Section 1. Conditional Distributions 311
1.1 Conditional Distribution Given an Event of Positive Probability, 1.2 Conditional Distribution Given a Specific Value
Section 2. Independent Random Variables 327
Section 3. More Than Two Random Variables 343
3.1 The Joint Distribution Function, 3.2 The Discrete Case, 3.3 The Absolutely Continuous Case

10 Functions of Several Random Variables 351
Introduction 351
Section 1. The Discrete Case 352
Section 2. The Continuous Case 357
2.1 Distribution of the Sum, 2.2 Distribution of the Product, 2.3 Distribution of the Quotient, 2.4 Distribution of the Maximum, 2.5 Distribution of the Minimum
Section 3. Miscellaneous Examples 373

11 Expectation-Several Random Variables 389
Introduction 389
Section 1. Expectation of a Function of Several Random Variables 389
1.1 The Definition, 1.2 Basic Properties of Expectation, 1.3 Covariance and the Correlation Coefficient, 1.4 The Variance of a Linear Combination, 1.5 The Method of Indicator Random Variables, 1.6 Bounds on the Correlation Coefficient
Section 2. Conditional Expectation 422
2.1 The Definition of Conditional Expectation, 2.2 The Expected Value of a Random Variable by Conditioning, 2.3 Probabilities by Conditioning

12 Generating Functions 435
Introduction 435
Section 1. The Moment Generating Function 436
1.1 The Definition, 1.2 How Moments are Generated, 1.3 Some Important Results, 1.4 Reproductive Properties
Section 2. The Factorial Moment Generating Function 452

Preface

... measure and their interpretations. I find this particularly desirable since the student can be made to realize that some problems which seem inaccessible on first appearance can in fact be attempted in a routine way.

Part 2, consisting of Chapters 4 through 7, deals with single random variables, and part 3, consisting of Chapters 8 through 11, treats several random variables. There is a common theme adopted in the development of these two parts. I have found that considerable mileage can be gained if, before embarking on part 3, the student is made aware that the broad approach adopted in part 2 is maintained in part 3. This approach uses the following sequential developments: (1) mathematical description of a function defined on the sample space; (2) introduction of the concept of a distribution function along with its properties; (3) classification of random variables on the basis of the nature of the distribution function; (4) treatment of functions of random variables; and (5) the treatment of expectation. It is also helpful to make the student aware of how, for instance, the definitions of random vector, distribution functions, and so on mimic those in part 2.

Part 4 consists of Chapter 12, treating generating functions, and Chapter 13, which involves the study of limit theorems in probability.

There are a wide variety of illustrative examples throughout the text, and I consider this to be one of its strong points. Thorough explanations are given so that the student can read these on his own, thereby allowing the instructor more time to discuss questions of a more fundamental nature. Some of the examples contain important results developed in subsequent chapters. Such examples are indicated by marking them with a solid circle. The reader would be well advised to familiarize himself with their essence. In Section 1 of Chapter 4, some examples are marked with an asterisk. These might be omitted at first reading, especially if the interest of the reader is nonmathematical.

No mathematical book at this level is complete without an adequate number of exercises. I have met this requirement by providing a wealth of exercises which touch on every aspect of the theory discussed in the text. They are given at the end of each section, and, as far as possible, are arranged in the order in which the material is developed in the particular section. No important results which are needed for further development of the subject are relegated to the exercises. The exercises are initiated with simple routine problems which increase in complexity, but none should be considered beyond the prowess of a diligent student. Hints are appended for problems which might call for undue insight.

The extent of coverage in a semester or a quarter will depend largely on the level and background of the students. Even so, it is inconceivable that the entire book would be covered in a one-semester offering. Based on my own experience, a one-semester course can be outlined as follows: most of the topics in Chapters 1 through 9, with varying degrees of emphasis; Section 1 of Chapter 11; a brief touch on the contents of Chapter 12; finally, Chebyshev's inequality and the central limit theorem in Chapter 13. In a two-quarter course the pace could be more leisurely, allowing more time to discuss topics in Chapters 12 and 13. In this type of offering, the first [...] chapters, with the balance of the book to be covered in the second quarter.

There is no denying the fact that I have derived heavily from the existing literature on the subject, and I acknowledge my indebtedness to these sources. Some are mentioned at the end of the text; the interested reader might consult these to broaden his perspective.

On a personal note, during the typing of this manuscript the author lost, in the death of Paul Van Wulven, a good friend and a typist of uncanny genius. The final chapters were typed by Cheryl Richards, who, in spite of no previous experience with mathematical typing, rose to great heights and did a superb job.

Ramakant Khazanie

Basic Probability Theory and Applications

Building Blocks of the Probability Structure

The classical theory was not equipped to handle problems of loaded dice or biased coins. Consideration of problems of this type led to the axiomatic theory. A giant step in this direction was taken as a result of the pioneering work of A. Kolmogorov, who provided a sound mathematical foundation for the subject of probability.

How does one devise a set of axioms in a mathematical discipline? Of course, it is always possible to propose a system of axioms and derive results from them. The only requirement would be that the axioms be consistent. However, if the axioms are such that they have no connection with reality, then the whole exercise becomes purely academic and of very little practical use. To have any relevance at all, the axioms should be motivated by our experience in the real world and should reflect it as closely as possible. In other words, the axioms should serve to provide an idealization of what we observe in nature. Such an axiomatic presentation governing the behavior of chance phenomena was given by A. Kolmogorov (1933) in The Foundations of Probability Theory. Our introduction to the subject will be mainly axiomatic. The classical theory will turn out to be a special case.

1. ELEMENTS OF SET THEORY

1.1 The Notion of a Set

Since the concepts of set theory are at the very heart of the treatment of probability, we shall begin by presenting a detailed outline of the basic ideas.

The word set is meant to indicate a gathering of objects which we choose to isolate because they have some common characteristic. However, any attempt to define a set is fraught with logical difficulties. For our purpose, we shall adopt the intuitively familiar notion and regard a set as a collection of objects, requiring only that it be possible to determine unambiguously whether or not any given object is a member of the collection.

When a complete list of the members of a set is given, it is customary to write them within braces, separated by commas. For example, a set that contains the four letters a, b, c, and d may be written as {a, b, c, d}. Since we are talking only about the objects in the set, there is no reason why the members should be written in any particular order. For example, the sets {a, b, c, d}, {d, b, a, c}, {d, c, a, b} represent the same collection and consequently the same set. Hence order is irrelevant in listing members of a set. Also, no purpose is served by repeating the same element, so only distinct elements are listed in a set.

We shall denote sets by upper case letters and the elements of these sets by lower case letters. If x is in the set A we shall write

x ∈ A

to mean "x is an element of the set A."

If x is not in A, we write

x ∉ A

to mean "x is not an element of the set A."

For example, if A = {Tom, Dick, Mary}, then Tom ∈ A and Jack ∉ A.

If a set has a large number of elements, it might be tedious or sometimes impossible to specify the set by a complete list of its elements. A notation that has been devised to describe such sets is the so-called set-builder notation. If we represent a typical member of the set by x, then the set of all elements x such that x has some property, say property P, is written as

{x | x has the property P}

For example, we could write the set of real numbers greater than 4 as

{x | x a real number, x > 4}

As another example, the set consisting of pairs of real numbers where the first component is twice the second component can be written as

{(u, v) | u, v real numbers, and u = 2v}

The braces should be read as "the set of all ..." and the vertical bar as "such that."

A very important set is the set of all the real numbers. We shall denote it by R. Using the set-builder notation,

R = {x | x a real number, -∞ < x < ∞}

In the sequel we shall also need the following sets. Suppose a and b are real numbers with a < b. Then

[a, b] = {x | x ∈ R, a ≤ x ≤ b}   (closed interval)
(a, b) = {x | x ∈ R, a < x < b}   (open interval)
[a, b) = {x | x ∈ R, a ≤ x < b}
(a, b] = {x | x ∈ R, a < x ≤ b}
[a, ∞) = {x | x ∈ R, a ≤ x < ∞}
(a, ∞) = {x | x ∈ R, a < x < ∞}
(-∞, a] = {x | x ∈ R, -∞ < x ≤ a}
(-∞, a) = {x | x ∈ R, -∞ < x < a}

A set which has no elements in it is called the empty set, or the void set. We shall denote it by the symbol ∅. The following are examples of empty sets:

(i) The set of all equilateral triangles with one angle equal to 45°
(ii) The set of all odd integers divisible by 4
(iii) {(x, y) | x, y ∈ R, |x| + |y| < 0}
(iv) {x | x ∈ R, x² = -1}

A set which is not empty is called nonempty.
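As a computational aside, Python's built-in sets mirror these conventions closely; the following sketch (with arbitrarily chosen elements) illustrates membership, the irrelevance of order, and set-builder notation via comprehensions.

```python
# An illustrative sketch of the notation using Python's built-in sets:
# membership tests play the role of x ∈ A and x ∉ A, and comprehensions
# play the role of set-builder notation {x | x has property P}.
A = {"Tom", "Dick", "Mary"}
print("Tom" in A)        # True:  Tom ∈ A
print("Jack" not in A)   # True:  Jack ∉ A

# Order and repetition are irrelevant, as in the text:
print({"a", "b", "c", "d"} == {"d", "b", "a", "c"})  # True

# {x | x an integer, 1 <= x <= 12, x even}:
evens = {x for x in range(1, 13) if x % 2 == 0}
print(sorted(evens))     # [2, 4, 6, 8, 10, 12]
```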
Suppose A and B are two sets such that every member of B is also a member of A. Then B is called a subset of A, and we write B ⊂ A.

B ⊂ A means that if x ∈ B then x ∈ A

If B ⊂ A and there is some member of A which is not in B, then B is called a proper subset of A. We give the following examples:

(i) If A = {Tom, Dick, Harry}, then {Tom, Dick} is a proper subset of A.
(ii) The set of equilateral triangles is a subset of the set of all the isosceles triangles; in fact, a proper subset.
(iii) The set of spades in a bridge deck of 52 cards is a subset of the deck.
(iv) {(x, y) | x, y ∈ R, and x = y} ⊂ {(x, y) | x, y ∈ R}

Since the empty set has nothing in it, there is nothing in ∅ which is not in A. In view of this comment, we shall agree that

∅ ⊂ A   for any set A

since the empty set has nothing in it that A does not have.

Two sets A and B are said to be equal if they represent the same set; that is, they contain the same elements. Thus

A = B means A ⊂ B and B ⊂ A

For example, if A represents the set of triangles with all sides equal, and B the set of triangles with all angles equal, then A = B.

If set A is not equal to set B, we write A ≠ B. For instance, if A = {a, b, c} and B = {a, b, c, d}, then A ≠ B.

1.2 Set Operations

We shall now consider how sets can be combined to produce new sets. There are three main operations: set intersection, set union, and set difference.

Intersection of sets

The intersection of two sets A and B, written A ∩ B, is the set of elements that are common to A and B; that is,

A ∩ B = {x | x ∈ A and x ∈ B}

The notation AB, juxtaposing the two letters A and B, is also used in place of A ∩ B. We shall find this latter notation more convenient for our purpose.

Consider the following examples:

(i) If A = {a, b, c, d} and B = {a, d, c, f, g}, then AB = {a, c, d}.
(ii) If A is the set of people who wear a tie, and B the set of people who wear a jacket, then AB represents the set of people who wear a tie and a jacket.
(iii) Let

A = {(x, y) | x, y ∈ R, x > 3}
B = {(x, y) | x, y ∈ R, y > -1}

Then

AB = {(x, y) | x, y ∈ R, x > 3 and y > -1}

Two sets A and B are said to be disjoint if they have no elements in common; that is, if AB = ∅. More generally, if A₁, A₂, ... is a collection of sets, then we say that the sets are pairwise disjoint if, whenever i ≠ j, Aᵢ and Aⱼ are disjoint.

The concept of intersection carries over to any arbitrary collection of sets and, in particular, to a countable collection of sets. Thus, if A₁, A₂, ... represents a countable collection of subsets of S, their intersection, symbolically ∩_{i=1}^∞ Aᵢ, is the set whose elements belong to every set Aᵢ; that is,

∩_{i=1}^∞ Aᵢ = {x | x ∈ Aᵢ for every i}

As an example, if Aᵢ = {x | 2 - 1/i < x ≤ 5 + 1/i}, i = 1, 2, ..., then it can be shown that

∩_{i=1}^∞ {x | 2 - 1/i < x ≤ 5 + 1/i} = {x | 2 ≤ x ≤ 5} = [2, 5]

Notice that 5 is in the intersection since 5 ∈ Aᵢ for every i. For the same reason, 2 is also in the intersection. As another example, we have

∩_{i=1}^∞ (a - 1/i, a] = {a}

a set consisting of a single element.

From the definition of intersection it is obvious that

AB = BA   (commutativity)
(AB)C = A(BC)   (associativity)
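For readers who like to experiment, the intersection examples above can be reproduced with Python sets, where the `&` operator plays the role of the book's juxtaposition AB; this is an illustrative sketch only, and the set C below is chosen arbitrarily.

```python
# An illustrative sketch: Python's & operator is set intersection, so the
# book's juxtaposition AB corresponds to A & B below.
A = {"a", "b", "c", "d"}
B = {"a", "d", "c", "f", "g"}
print(sorted(A & B))      # example (i): AB = {a, c, d}

# Commutativity and associativity, as noted above:
C = {"c", "d", "e"}
print((A & B) == (B & A))              # True
print(((A & B) & C) == (A & (B & C)))  # True

# Both endpoints 2 and 5 lie in every A_i = (2 - 1/i, 5 + 1/i],
# consistent with the countable intersection being [2, 5]:
assert all(2 > 2 - 1 / i and 5 <= 5 + 1 / i for i in range(1, 1000))
```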

EXERCISES-SECTION 1

1. Suppose S = {1, 2, 3, ..., 12}, A = {1, 3, 4, 6, 10, 11}, B = {2, 3, 5, 6, 9, 10, 12}. Find:
(a) A ∪ B  (b) AB  (c) A - B  (d) B - A  (e) A′

2. Suppose I⁺ is the set of positive integers and A, B, C are three subsets of it defined as follows:
A = {2k | k ∈ I⁺}
B = {2k - 1 | k ∈ I⁺}
C = {3k | k ∈ I⁺}
Describe the sets in words.

3. Let A, B, C be as described in exercise 2. Describe in words the following sets:
(a) A ∪ B  (b) (A ∪ C)B  (c) A ∪ C  (d) A ∪ (BC)  (e) AB  (f) BC

4. Use Venn diagrams to establish the following:
(a) If A ⊂ B and B ⊂ C, then A ⊂ C.
(b) A(B - A) = ∅.
(c) If A ⊂ B, then A = AB.
(d) If AB = ∅ and C ⊂ B, then AC = ∅.
(e) If AB = ∅, then A′B = B.
(f) (A ∪ B) - B = A - AB.
(g) (A - AB) ∪ B = A ∪ B.
(h) (A ∪ B) - AB = (AB′) ∪ (A′B).

5. State whether the following are true or false for all sets A and B.
(a) If A ∪ B = ∅, then A = B = ∅.
(b) If AB = ∅, then A - B = ∅.
(c) If A = ∅ or B = ∅, then AB = ∅.
(d) If A ⊂ B, then A ∪ B = A.

6. Prove that A ⊂ B if and only if B′ ⊂ A′.

7. Prove the following identities:
(a) (A′)′ = A for any set A.
(b) S′ = ∅ and ∅′ = S.
(c) A ∪ A′ = S for any set A.
(d) AA′ = ∅ for any set A.

8. If S is the universal set, and if A is a subset of S, find:
(a) SA  (b) ∅ ∪ A  (c) S′A′  (d) SA′  (e) (∅A)′  (f) (A′ ∪ A)′

9. Let S = {x ∈ R | 0 ≤ x ≤ 4}, and let A and B be subsets of S defined as A = {x | 1 < x ≤ 3}, B = {x | x ≥ 2}. Find:
(a) A′  (b) B′  (c) A ∪ B  (d) A′ ∪ B′  (e) (A ∪ B)′  (f) A′B′  (g) (AB)′

10. Suppose S = R × R where R denotes the set of real numbers. Let A = {(x, y) ∈ S | x ≤ u}, B = {(x, y) ∈ S | y ≤ v}.
(a) Find AB, A ∪ B, and A - B.
(b) Using the Cartesian plane and the appropriate regions of the plane, indicate (as Venn diagrams) the sets AB, A ∪ B, and A - B.

11. Let A = {a, b, c}. Answer true or false:
(a) ∅ ∉ A, but ∅ ∈ 𝒫(A).
(b) a ∈ A, but a ∉ 𝒫(A).
(c) {a, b} ∉ A, but {a, b} ∈ 𝒫(A).
(d) A is an element of 𝒫(A) but is not a subset of it.
(e) {{a, b}, {c}} ⊂ 𝒫(A).

12. Let A = {a, b, c}, B = {b, 7}, C = {7, 8}. Find:
(a) A × (B ∪ C)  (b) (A × B) ∪ (A × C)  (c) A × (BC)  (d) (A × B)(A × C)

13. Let A = {1, 2, 3} and B = {1, 2, 5, 6}. List the elements in the following sets:
(a) (A - B) × (A - B)  (b) A × (A - B)  (c) (A × A) - (B × B)

14. Consider the following sets which are subsets of R × R. Indicate which ones are Cartesian product sets, and write them in product form.
(a) {(x, y) | x < 2}
(b) {(x, y) | 0 < x < 1, y < 2}
(c) {(x, y) | 0 < x < 1, -2 < y < 3}
(d) {(x, y) | y > 2}
(e) {(x, y) | x = 2y}

15. Let S = {1, 2, 3, ...} and Aₙ = {x ∈ S | x > n}. Explain why lim(n→∞) Aₙ exists. Find it.

16. Consider the following sequences {Aₙ} of subsets of R. In each case state whether the sequence is contracting or expanding and find lim Aₙ.
(a) Aₙ = {x | 1 + 1/n < x < 4 - 1/n²}
(b) Aₙ = {x | 1 - 1/n < x < 4 + 1/n}
(c) Aₙ = {x | x > 2 - 1/n}
(d) Aₙ = {x | x > 2 + 1/n}
(e) Aₙ = {x | 4 - 1/n < x < 4 + 1/n}

2. SAMPLE SPACE AND EVENTS

From the viewpoint of probability theory, we shall be interested in experiments and in the set of all their possible outcomes. The universal set will be called the sample space. As a mathematical entity the sample space is only a set. Throughout we shall denote this set by S. Every conceivable outcome of the experiment judged pertinent to a discussion is in this set, and there is no member in this set which is not a possible outcome of the experiment. Members of S are called sample points. Consider the following examples:


(i) When we toss a coin, using the letter H for heads and the letter T for tails, we could represent S as S = {H, T}.

(ii) If we roll a die once, we could write S as S = {1, 2, 3, 4, 5, 6}.

(iii) If we roll a die and toss a coin, writing the outcomes as pairs with the first component representing the outcome on the die and the second the outcome on the coin, we could write

S = {(1, H), (2, H), (3, H), (4, H), (5, H), (6, H), (1, T), (2, T), (3, T), (4, T), (5, T), (6, T)}

(iv) If we pick one card from a standard deck of 52 cards, then we have

S = {A_sp, K_sp, ..., 2_sp, A_h, K_h, ..., 2_h, A_d, K_d, ..., 2_d, A_cl, K_cl, ..., 2_cl}

with an obvious notation. For example, Q_sp denotes the queen of spades.

(v) Suppose a basketball player keeps throwing a ball at the basket until he makes the basket for the first time. Once he scores he quits. Let us agree to write m if he misses and s if he scores the basket. What are the possibilities? He could score on the first attempt, or miss on the first attempt and score on the second, or miss on the first two and score on the third attempt, and so on. Writing these possibilities as s, ms, mms, and so on, we have

S = {s, ms, mms, mmms, ...}

(vi) Suppose a person speculates on the length of time he will have to wait at the bus stop for the bus to come. In this case, we have S = {x | x ≥ 0, x ∈ R}.

It should be noted that there is no unique way of describing a sample space, and considerable skill and insight is called upon in deciding what outcomes will be relevant. Faulty descriptions of the underlying sample spaces have led to several paradoxes in the theory of probability. Let us consider the following examples:

(vii) Suppose a coin is flipped. In this case the sample space can be given as S = {head, tail}, assuming that the coin will land either heads or tails. However, if we entertain the possibility that the coin might land on its edge, then S = {head, tail, edge} would be a more appropriate sample space.

(viii) Suppose a box contains three letters a, b, c and we pick two of these letters, one by one, without returning the first letter to the box before the second one is picked. Here one might give the sample space as S₁ = {ab, ba, ac, ca, bc, cb}, listing the order in which the letters are picked. For example, ba would represent that the letter b was picked first and the letter a was picked next. With this description of the sample space we are in a position to answer a question such as "Which outcomes resulted in picking the letter b first?" Now suppose we choose to give the sample space as S₂ = {ab, ac, bc}, recording the outcomes on the basis of which two distinct letters are picked, but with no regard to the order. This second description, though satisfactory to answer some questions, is totally inadequate to answer the question "Which outcomes resulted in picking the letter b first?" since we failed to record the outcomes according to the order in which the letters are picked. Of the two sample spaces, it would seem that S₁ is more appropriate for our purpose.

In practice, it is not always possible to decide ahead of time the kinds of questions that one might want answered. To be on the safe side, it is advisable to provide a description of the sample space as fully as possible, so that it is adequate to answer all the possible questions one might pose.

Suppose we roll a die and pose a question of the following type: What is the probability that an even number will show up? We will say that we are interested in the event "an even number will show up." This event will occur if and only if 2 shows up, or 4 shows up, or 6 shows up. In other words, we are interested in the members of the set {2, 4, 6}. Thus any verbal description of an event has a set-theoretic representation to it. On account of this, an event is defined as a subset of the sample space. An event consisting of a single outcome is called an elementary event, or a simple event.

An axiomatic treatment of probability requires a more formal definition of an event; this aspect will be discussed in the next section. From now on, we shall identify a verbal description of an event with the underlying set.

An event E will be said to have occurred if one of the outcomes belonging to the set E takes place.

Understanding this is crucial for a good grasp of the algebra of events, where we combine two or more events to get a new event. The algebra of events, as we shall soon see, is precisely the algebra of sets that we have already discussed.

The event consisting of all members of S is called the sure event. The name is quite suggestive because one of the outcomes belonging to S is bound to occur when the experiment is performed. Hence the sure event is the set S.

The event which contains no members of S, namely ∅, is called the impossible event. This makes sense because the event ∅ will never occur, since it has no outcomes in it.

Example 2.1. Suppose we roll two dice. Write as sets the following verbal descriptions of events.
(i) E₁ = The sum on the two dice is 7.
(ii) E₂ = The two dice show the same number.
(iii) E₃ = The sum on the two dice is a prime number.

Solution. The sample space can be written as the following set of ordered pairs:

S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
     (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
     (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
     (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
     (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
     (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
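The events of Example 2.1 can also be enumerated mechanically; the sketch below (an illustrative check, not part of the book's solution) builds S with `itertools.product` and each event as a subset of S.

```python
from itertools import product

# Enumerating the two-dice sample space of Example 2.1 and its three
# events, each event being a subset of S.
S = set(product(range(1, 7), repeat=2))
E1 = {(i, j) for (i, j) in S if i + j == 7}                  # sum is 7
E2 = {(i, j) for (i, j) in S if i == j}                      # same number
E3 = {(i, j) for (i, j) in S if i + j in {2, 3, 5, 7, 11}}   # sum is prime
print(len(S), len(E1), len(E2), len(E3))  # 36 6 6 15
```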

..., {a, b, c, d}} is a sigma field. Because of the way we have constructed it, it is the smallest sigma field containing the original collection of sets; it is called the sigma field generated by that collection.

As a further illustration, if S = {a, b, c, d}, then it can be easily verified that (i) the sigma field generated by {{a, b}} is {∅, {a, b}, {c, d}, {a, b, c, d}}; (ii) that generated by {{a}, {b}} is {∅, {a}, {b}, {a, b}, {c, d}, {a, c, d}, {b, c, d}, {a, b, c, d}}; and (iii) that generated by {{a}, {b}, {c}} is the power set of {a, b, c, d}.

We are now in a position to discuss the Borel field ℬ of the real line. To construct it, we use the following procedure:

(i) We begin by including all the intervals (-∞, a], where a is any real number.

(ii) For ℬ to be a sigma field we now require, on account of the sigma field axioms, that it contain complements of the intervals that we included under (i). Since the complement of (-∞, a] is (a, ∞), the collection ℬ will contain all the intervals of the type (a, ∞), where a is any real number. For example, intervals of the type (1, ∞), (2, ∞), (√3, ∞) will be members of ℬ.

(iii) Suppose a and b are any two real numbers with a < b. Since by (i) (-∞, b] ∈ ℬ, and by (ii) (a, ∞) ∈ ℬ, their intersection, (-∞, b] ∩ (a, ∞) = (a, b], is also in ℬ. In other words, all the intervals of the type (a, b], where a and b are real numbers with a < b, are in ℬ. For example, (2, 3], (-2, √2], and so on, are in ℬ.

The collection ℬ is thus a very large collection of subsets of R which, starting with the sets of the type (-∞, a], is obtained by repeatedly carrying out the operations of union, intersection, and complementation. That is, the Borel field of the real line is the smallest sigma field containing all sets of the form (-∞, a]. The members of ℬ are called Borel sets of the real line.

Do singletons belong to ℬ? For example, is it true that {5} ∈ ℬ? To answer this, we note from (iii) that sets of the type (5 - 1/n, 5] are in ℬ for every natural number n. Hence, due to closure of ℬ under countable intersection, we must have

∩_{n=1}^∞ (5 - 1/n, 5] = {5} ∈ ℬ

That is, {5} ∈ ℬ. In general, {a} ∈ ℬ for any real number a. This in turn implies that closed intervals [a, b], open intervals (a, b), and intervals like [a, b) are in ℬ for any real numbers a, b with a < b. (Why?)

In summary, ℬ is a very large collection of subsets of the real line and it includes "just about" every subset of R that we can conceive of. However, it should be mentioned that ℬ is not quite the power set of R. That is, there do exist subsets of R which are not Borel sets. The proof of this statement requires rather sophisticated tools in mathematics and will not be attempted here.

We have discussed the notion of a sigma field in some detail. It might be a moot point to enquire, "Why not take the power set of S as the underlying sigma field and be done with it?" After all, the power set is the largest sigma field and contains every conceivable subset of S. In most cases, where it does not lead to inconsistencies in defining probabilities, this is precisely what we do. But there are situations where inconsistencies crop up. This happens when S has an uncountable number of sample points. We shall take this problem up in Chapter 2.

EXERCISES-SECTION 3

1. If S = {a, b, c, d, e}, give three distinct sigma fields of subsets of S.

2. If ℱ is a sigma field, show that for any A ∈ ℱ, B ∈ ℱ, it follows that (A - B) ∪ (B - A) ∈ ℱ.

3. Suppose A and B are subsets of S, neither of which is S or the empty set. Find the sigma field generated by {A, B}.

4. Discuss the Borel field of the subsets of the interval [0, 1]. Hint: Start with the set of intervals of the form {x | 0 ≤ x ≤ b}, where 0 ≤ b ≤ 1.

5. Discuss the Borel field of the subsets of S = R × R = {(x, y) | x, y ∈ R}. Hint: Start with the collection of sets of the form I_{a,b} = {(x, y) | -∞ < x ≤ a, -∞ < y ≤ b}, where a and b are any real numbers.
Definition of Probability

(C1) The probability of the impossible event is zero. That is,

P(∅) = 0

To prove this, we have for any set A,

A = A ∪ ∅ ∪ ∅ ∪ ...

Since A∅ = ∅ and ∅∅ = ∅, the sets in this union are mutually exclusive, and we get, using axiom (P3),

P(A) = P(A) + P(∅) + P(∅) + ...

Now, using the fact that P(A) is finite, this gives P(∅) = 0.

Comment. This result incidentally proves that P is finitely additive. That is, if A₁, A₂, ..., Aₙ are n mutually exclusive events, then P(∪_{i=1}^n Aᵢ) = Σ_{i=1}^n P(Aᵢ). This follows simply from the fact that

P(∪_{i=1}^n Aᵢ) = P(A₁ ∪ ... ∪ Aₙ ∪ ∅ ∪ ∅ ∪ ...)
= P(A₁) + ... + P(Aₙ) + P(∅) + P(∅) + ...
= Σ_{i=1}^n P(Aᵢ)

by axiom (P3), since P(∅) = 0.

Note: A collection of events is said to be mutually exclusive if no two of them can occur simultaneously.

(C2) For any event A,

P(A′) = 1 - P(A)

In words, the probability that an event will not occur is equal to one minus the probability that it will occur.

We can prove this as follows: From the definition of A′ we know that S = A ∪ A′. Now, by axiom (P2), P(S) = 1. Hence

P(A ∪ A′) = 1

Therefore, since A and A′ are mutually exclusive,

P(A) + P(A′) = 1

Consequently,

P(A′) = 1 - P(A)

For example:

(i) If the probability of precipitation is 0.80, then the probability of no precipitation is 0.20.
(ii) If the probability that all of ten children are sons is 0.05, then the probability that at least one is a daughter is 0.95.

(C3) If A and B are any two events, then

P(A - B) = P(A) - P(AB)

(Recall that A - B represents the event that A occurs but not B; that is, of the two events A and B, only A will occur.)

To prove this, we note that we can write the event A as the union of the two mutually exclusive events A - B and AB. (See the Venn diagram in Figure 1.1.) Hence

P(A) = P(A - B) + P(AB)

and consequently

P(A - B) = P(A) - P(AB)

Comment. We must be careful to take note of the fact that, in general, P(A - B) is not equal to P(A) - P(B). However, in the special case when B ⊂ A, the result is true. That is,

if B ⊂ A then P(A - B) = P(A) - P(B)

(C4) (Monotone Property) If events A and B are such that B ⊂ A, then P(A) ≥ P(B).

The proof of this relies on the comment at the end of the proof of (C3). We know that if B ⊂ A, then

P(A - B) = P(A) - P(B)

But P(A - B) ≥ 0, by axiom (P1). Therefore P(A) - P(B) ≥ 0, and consequently

P(A) ≥ P(B)

[Figure 1.1. Venn diagram: the event A split into the mutually exclusive pieces A - B and AB]
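The consequences (C1) through (C4) can be verified exactly on a small model; the sketch below (an illustrative check, with arbitrarily chosen events) takes S to be a fair die and defines P by counting, using exact fractions so every identity holds with equality.

```python
from fractions import Fraction

# A numerical sanity check of (C1)-(C4) on a finite sample space:
# a fair die, with P(E) defined by counting outcomes.
S = {1, 2, 3, 4, 5, 6}

def P(E):
    return Fraction(len(E & S), len(S))

A = {2, 4, 6}   # "even number shows up"
B = {4, 5, 6}   # "number greater than 3 shows up"

assert P(set()) == 0                  # (C1)
assert P(S - A) == 1 - P(A)           # (C2)
assert P(A - B) == P(A) - P(A & B)    # (C3)
assert P(A) >= P({4, 6})              # (C4), since {4, 6} ⊂ A
print(P(A - B), P(A), P(A & B))       # 1/6 1/2 1/3
```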

P(Ai <JA1 \J A i ) - l \ A l U{A1 V A 3))


x flA r i+ W tV A J -P iM A iU A i))
by consequence (C5). Now, by the distributive \i *.A ,(A 7 U A ,)-( A ,A 2)
U (AtA j), and consequently

I\A ,(A t K J A , ) ) - m , A t )U (A ,A 3))


^ F i A M +P i A M - r i A ^ A . A , ) ,
= ftA ,A 2) +P(A,A3} - ! ,iA ,A ,A 3)

Therefore,
P{A, U A i U A ,) =P(A,) + [f(A7)+ P (A ,)-JK A ,A })]
-I J X A .A j + f C A t A j - P f A ^ A , ) ]
=JXA, )+P(A2)+ /K A ,)-/X A ,A ,)
-P ( A l A 3) - i ,{A1A , ) + f( A , A 2A 3)

We could write the result more compactly as

P(A I U A ! U A 1) = ;X P(A,) - . r , ■
H A iA j) + K A ' A i ■
A 3)
i<i

An astute observation might lead one to conjecture that, if A₁, A₂, . . . , Aₙ are n events, then

P(A₁ ∪ A₂ ∪ . . . ∪ Aₙ) = Σᵢ₌₁ⁿ P(Aᵢ) − Σ_{i<j} P(AᵢAⱼ) + Σ_{i<j<k} P(AᵢAⱼAₖ) − . . . + (−1)^{n+1} P(A₁A₂ . . . Aₙ)

This conjecture is indeed correct and can be proved by induction. The proof of this result will be left to the exercises.
Comment. The preceding formula is useful for finding the probability that at least one of the events A₁, A₂, . . . , Aₙ will occur, if we can find the probability of the simultaneous occurrence of any subcollection of the events. Also note that if the events are mutually exclusive, then the probability of the simultaneous occurrence of two or more events is zero, and consequently P(A₁ ∪ . . . ∪ Aₙ) will equal Σᵢ₌₁ⁿ P(Aᵢ), as it should.
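For a finite, equally likely sample space, the inclusion–exclusion formula can be checked by brute force. The sketch below is my own verification (the three events are arbitrary choices, not from the text): it forms the alternating sum over every nonempty subcollection and compares it with the probability of the union.

```python
from itertools import combinations
from fractions import Fraction

S = set(range(20))
events = [{i for i in S if i % 2 == 0},   # A1: even outcomes
          {i for i in S if i % 3 == 0},   # A2: multiples of 3
          {i for i in S if i < 7}]        # A3: small outcomes

def P(E):
    # Classical probability on the equally likely space S.
    return Fraction(len(E), len(S))

# Right-hand side: alternating sum over all nonempty subcollections.
n = len(events)
rhs = Fraction(0)
for r in range(1, n + 1):
    for idx in combinations(range(n), r):
        inter = set.intersection(*(events[i] for i in idx))
        rhs += (-1) ** (r + 1) * P(inter)

lhs = P(set.union(*events))
assert lhs == rhs
```

Exact rationals (`Fraction`) make the equality test exact rather than approximate.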
Example 1.1. Suppose A and B are two events for which P(A) = 0.6, P(B) = 0.7, and P(AB) = 0.4. Find the following probabilities:
(a) P(A ∪ B)   (b) P(AB')   (c) P(BA')
(d) P((AB)')   (e) P((A ∪ B)')   (f) P(A'B')

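All six parts of Example 1.1 follow from consequences (C2), (C3), and (C5). As a quick numeric check (my own, not part of the text):

```python
# Given data from Example 1.1.
P_A, P_B, P_AB = 0.6, 0.7, 0.4

P_a = P_A + P_B - P_AB        # (a) P(A ∪ B), by (C5)
P_b = P_A - P_AB              # (b) P(AB'), by (C3)
P_c = P_B - P_AB              # (c) P(BA')
P_d = 1 - P_AB                # (d) P((AB)'), by (C2)
P_e = 1 - P_a                 # (e) P((A ∪ B)'), by (C2)
P_f = P_e                     # (f) P(A'B') = P((A ∪ B)'), by De Morgan's law

assert abs(P_a - 0.9) < 1e-9
assert abs(P_b - 0.2) < 1e-9
assert abs(P_c - 0.3) < 1e-9
assert abs(P_e - 0.1) < 1e-9
```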
40 / Basic Probability Theory and Applications

If {Aₙ} is an expanding sequence with lim Aₙ = S, then

lim_{n→∞} P(Aₙ) = P(lim_{n→∞} Aₙ) = P(S) = 1

Example 1.9. Suppose probabilities are assigned to the subsets of S = [0, 1] in such a way that, if 0 ≤ a < b ≤ 1, then P((a, b]) = b − a. (There are some mathematical subtleties as to whether this can be done for every subset, but such considerations are extraneous to the thrust of our discussion and will not concern us here. We shall allude to these briefly in the digression that follows.) Find:
(a) P({r}), for any real number r ∈ [0, 1]
(b) P(Q), where Q is the set of rational numbers
(c) P([a, b]), where 0 ≤ a < b ≤ 1

Solution
(a) For any real number r we can write {r} = ⋂_{n=1}^∞ (r − 1/n, r]. Since {(r − 1/n, r]} is a contracting sequence of intervals, we get

P({r}) = lim_{n→∞} P((r − 1/n, r]) = lim_{n→∞} 1/n = 0

Hence singleton sets are assigned zero probability.
(b) Since the set of rational numbers is countable, we can enumerate them as r₁, r₂, . . . Hence

P(Q) = P(⋃_{n=1}^∞ {rₙ}) = Σ_{n=1}^∞ P({rₙ}) = 0

by countable additivity. Therefore, P(Q) = 0.
(c) We shall find P([a, b]) in two ways. One way is simply to note that P([a, b]) = P({a} ∪ (a, b]) = P({a}) + P((a, b]) = b − a, since P({a}) = 0 by part (a).
An alternate way is to note that [a, b] = ⋂_{n=1}^∞ (a − 1/n, b], which yields

P([a, b]) = lim_{n→∞} P((a − 1/n, b])   (why?)
          = lim_{n→∞} (b − a + 1/n) = b − a

Digression. We have dwelled at length on the concept of a sigma field. Is this concept essential? The following discussion will indicate its merits.
Suppose we pick a point at random in the interval [0, 1]. Hence, the sample space consists of all real numbers x such that 0 ≤ x ≤ 1; that is, S = [0, 1]. If (a, b], where 0 ≤ a < b ≤ 1, is an interval, then it seems reasonable to assume that the probability that the point lies in the interval (a, b] is equal to the length of the interval. Let us therefore define a function P by

P((a, b]) = b − a

Now, using a highly sophisticated argument which relies heavily on what is called the axiom of choice, it is possible to express the interval [0, 1] as a countable union of disjoint sets Eᵢ, i = 1, 2, . . . , with P(Eᵢ) the same for each set. Thus we have

[0, 1] = ⋃_{i=1}^∞ Eᵢ

where EᵢEⱼ = ∅ if i ≠ j, and P(Eᵢ) is the same for each i = 1, 2, . . .
However, by axiom (P3) of probability, we ought to have

P([0, 1]) = Σ_{i=1}^∞ P(Eᵢ)

This leads to an inconsistency because P([0, 1]) = 1, whereas Σ_{i=1}^∞ P(Eᵢ) is either zero (when P(Eᵢ) = 0) or infinity (when P(Eᵢ) > 0).
Hence, if S = [0, 1], there does not exist a set function that coincides with the length on the subintervals and which at the same time satisfies the third axiom (P3) in a consistent way. We would not face this situation if we had confined ourselves to the Borel field of subsets of [0, 1]. The sets Eᵢ mentioned above are not members of the Borel field. However, they are, of course, members of the power set of [0, 1].

EXERCISES—SECTION 1
1. Suppose A and B are mutually exclusive events for which P(A) = 0.4 and P(B) = 0.3. Find the following probabilities:
(a) P(A')   (b) P(AB)   (c) P(A ∪ B)
(d) P(AB')   (e) P(A'B')   (f) P((A ∪ B)')
2. Suppose A, B, and C are mutually exclusive events for which A ∪ B ∪ C = S. If P(A) = 2P(B) = 3P(C), find:
(a) P(A ∪ B)   (b) P(AB')   (c) P(A'B'C)
(d) P(A' ∪ B' ∪ C)   (e) P(A(B ∪ C))   (f) P(A(B' ∪ C'))
3. (a) If P(ABC) = 0.2 and P(A) = 0.8, find P(A(B' ∪ C')).
(b) If P(A) = 0.6, P(AB) = P(AC) = 0.35, and P(ABC) = 0.2, find P(AB'C).
4. Show that P(A) = P(B) if and only if P(AB') = P(A'B).
5. If A, B, C are three events, show that P(ABC) = P(AC) + P(BC) − P((A ∪ B)C).
6. If A and B are any two events, show that |P(A) − P(B)| ≤ P((AB') ∪ (A'B)).
7. The probability that a person is a lawyer is 0.64, the probability that he is a liar is 0.75, and the probability that he is a liar but not a lawyer is 0.25. Find the probability that—
(a) he is a lawyer and a liar
(b) he is a lawyer or a liar
(c) he is neither a lawyer nor a liar.
8. A student is taking two courses, History and English. If the probability that he will pass either of the courses is 0.7, that he will pass both the courses is 0.2, and that he will fail in History is 0.6, find the probability that—
(a) he will pass History
(b) he will pass English
(c) he will pass exactly one course.
9. Suppose A, B, C, D are four events. Derive an expression for the probability that exactly k of the events occur (k = 1, 2, 3, 4) in terms of the probabilities of their intersections.
10. Ann, Betty, Cathy, and Dorothy are invited to attend a party. Let A, B, C, and D represent respectively the events that Ann, Betty, Cathy, and Dorothy attend the party. If P(A) = P(B) = P(C) = P(D) = 0.6, P(AB) = P(AC) = P(AD) = P(BC) = P(BD) = P(CD) = 0.36, P(ABC) = P(ABD) = P(ACD) = P(BCD) = 0.216, and P(ABCD) = 0.1296, find the probability that exactly k girls attend the party, k = 0, 1, 2, 3, 4.
11. Suppose S = {1, 2, . . .} and P({i}) = k/3^i for all i ∈ S, where k is a constant.
(a) Determine k.
(b) Find the probability of (i) the set of even numbers, (ii) the set of odd numbers.
12. Prove by induction that

P(A₁ ∪ A₂ ∪ . . . ∪ Aₙ) = Σᵢ₌₁ⁿ P(Aᵢ) − Σ_{i<j} P(AᵢAⱼ) + Σ_{i<j<k} P(AᵢAⱼAₖ) − . . . + (−1)^{n+1} P(A₁A₂ . . . Aₙ)

13. Establish the following inequalities:
(a) P(AB) ≥ 1 − P(A') − P(B')
(b) P(A₁ ∪ . . . ∪ Aₙ) ≤ Σᵢ₌₁ⁿ P(Aᵢ)
(c) P(A₁A₂ . . . Aₙ) ≥ 1 − Σᵢ₌₁ⁿ P(Aᵢ')
14. Suppose {Aᵢ} is a sequence of events where P(Aᵢ) ≥ 1 − (0.1)^i, i = 1, 2, . . . Determine a lower bound for P(⋂ᵢ Aᵢ).

2. FINITE SAMPLE SPACES
We shall devote the rest of this chapter to the discussion of finite sample spaces; that is, sample spaces which have only a finite number of outcomes. Consider a sample space S with N outcomes so that we can write S as S = {s₁, s₂, . . . , s_N}. Recall that the power set of S has 2^N members. In other words, there are 2^N possible events. If we define a function P which assigns numerical values to these events in a way which is consistent with the three probability axioms ((P1), (P2), and (P3)), then the function P is a probability measure. Thus, in order to define a probability function on a sample space with N outcomes, we need specify at most 2^N values. In practice, this is accomplished by assigning probabilities to the elementary events. This is sufficient. Probabilities are then assigned in a natural way to all the events as follows:
Suppose A = {s_{i1}, s_{i2}, . . . , s_{ik}} has k outcomes. Then A can be expressed as the union of k mutually exclusive elementary events as

A = {s_{i1}} ∪ {s_{i2}} ∪ . . . ∪ {s_{ik}}

Using axiom (P3), we therefore get

P(A) = P({s_{i1}}) + P({s_{i2}}) + . . . + P({s_{ik}})

Thus, for a finite sample space, the probability of an event A is equal to the sum of the probabilities assigned to each of the outcomes that make up the event A.

The classical definition of the probability of an event is based on two fundamental assumptions. One of these is to assume that the performance of an experiment results in a finite number of outcomes. The other is to assume that all the elementary events have the same probability; that is, the outcomes are equally likely or equiprobable.
In what follows let us assume that the outcomes are equally likely; that is, P({sᵢ}) = p for each i. Then, since Σᵢ₌₁^N P({sᵢ}) = 1, we get Np = 1, so that p = 1/N.
Hence, if the outcomes of a sample space S with N outcomes are equally likely, then the probability of each elementary event is 1/N, the reciprocal of the number of outcomes in S.
Next, suppose A is an event with k outcomes, s_{i1}, s_{i2}, . . . , s_{ik}. Then, as we have already seen,

P(A) = P({s_{i1}}) + P({s_{i2}}) + . . . + P({s_{ik}}) = k/N

Thus we have a very fundamental formula of classical probability:
If the sample space is finite and the outcomes are equally likely, then the probability of an event A is equal to the ratio of the number of outcomes in A to the number of outcomes in the sample space.
Comment. We shall have occasion to use phrases like "an unbiased coin is tossed," "a fair die is rolled," "an object is picked at random," and so on. They are all meant to suggest that the outcomes in the sample space are equally likely.
Example 2.1. Let the sample space S = {s₁, s₂, s₃, s₄, s₅, s₆} be given. Probabilities are assigned to some events as follows:


. . . in each case, whether the sampling is carried out with or without replacement. We may or may not be interested in the order in which the objects are picked. As a result, we have the following four situations:

Case 1: without replacement, with order
Case 2: without replacement, without order
Case 3: with replacement, with order
Case 4: with replacement, without order

Let us consider the following illustration which brings out the essential ingredients of our discussion: Suppose there are four distinct objects represented by the letters a, b, c, d, and two of these letters are picked. The following four cases are possible:

Case 1. Without replacement, with order:
ab, ac, ad, ba, bc, bd, ca, cb, cd, da, db, dc

Case 2. Without replacement, without order:
ab, ac, ad, bc, bd, cd

Case 3. With replacement, with order:
aa, ab, ac, ad, ba, bb, bc, bd, ca, cb, cc, cd, da, db, dc, dd

Case 4. With replacement, without order:
aa, ab, ac, ad, bb, bc, bd, cc, cd, dd

In cases 1 and 2, the sampling is carried out without replacement, and consequently there are no possibilities like aa, bb, cc, dd. This explains why there are no entries along the diagonal in these cases. In case 2, moreover, we are not interested in order so that, for example, ab is listed, but not ba.
In case 3, we list all sixteen possibilities. In case 4, since the sampling is with replacement, we certainly have outcomes like aa, bb, cc, dd. However, since order is not relevant, ab is the same as ba, and so on. Hence there are no entries below the diagonal.

Comment. When order matters each possibility is called an arrangement, or a permutation. If order does not matter, it is called a combination.

Comment. When n objects are picked and the order is important, it is convenient to write the sample points as ordered n-tuples (x₁, x₂, . . . , xₙ), where the ith component xᵢ represents the ith object picked. Thus x₁ represents the result of the first draw, x₂ of the second draw, and so on.
We shall now provide a general formula in each of the above four cases. Towards this, we state the following basic rule of counting techniques.

The Basic Counting Principle: If a certain experiment can be performed in r ways and, corresponding to each of these ways, a second experiment can be performed in k ways, then the combined experiment can be performed in rk ways.

To understand this principle, suppose the outcomes of the first experiment are written as A = {a1, a2, . . . , ar} and those of the second experiment as B = {b1, b2, . . . , bk}. Then the outcomes of the combined experiment can be represented in a rectangular array as ordered pairs (ai, bj):

(a1, b1)  (a1, b2)  . . .  (a1, bk)
(a2, b1)  (a2, b2)  . . .  (a2, bk)
  . . .
(ar, b1)  (ar, b2)  . . .  (ar, bk)

In other words, the outcomes of the combined experiment can be represented as the Cartesian product A × B. Clearly, there are rk pairs. Indeed, this shows that n(A × B) = n(A) × n(B).
Another way of illustrating the above principle is by a tree diagram, as shown in Figure 3.1. First we list all the outcomes of one experiment, and then, corresponding to each of these, those of the other experiment. The total number of branches, namely rk, gives all the combined possibilities.
The basic counting principle can be extended to any number of experiments in an obvious way. We shall now give some examples.

(i) If a die is tossed twice, then there are 6 × 6 = 36 possible outcomes.
(ii) If a person has 8 different shirts, 6 different ties, and 5 different jackets, then he can get dressed for an occasion in 8 × 6 × 5 = 240 ways.
(iii) If the purchaser of an automobile has a choice of 3 makes, 5 body styles, and 6 colors, then he can choose from 3 × 5 × 6 = 90 different models.
(iv) Suppose license plates are formed with three distinct letters followed by three distinct digits. Then there are 26 choices for the first letter, 25 for the second, and 24 for the third. Also, there are 10 choices for the first digit, 9 for the second, and 8 for the third. Therefore, there are 26 × 25 × 24 × 10 × 9 × 8 = 11,232,000 different license plates.

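The four cases for the letters a, b, c, d can be generated mechanically; Python's `itertools` happens to provide one constructor per sampling scheme (a sketch under that correspondence, not part of the text):

```python
from itertools import permutations, combinations, product, combinations_with_replacement

letters = "abcd"

case1 = list(permutations(letters, 2))                   # without replacement, with order
case2 = list(combinations(letters, 2))                   # without replacement, without order
case3 = list(product(letters, repeat=2))                 # with replacement, with order
case4 = list(combinations_with_replacement(letters, 2))  # with replacement, without order

assert len(case1) == 12   # 4 * 3
assert len(case2) == 6    # C(4, 2)
assert len(case3) == 16   # 4^2
assert len(case4) == 10   # C(4 + 2 - 1, 2)
```

The four counts match the general formulas (M)_n, (M choose n), M^n, and (M+n−1 choose n) derived in this section.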
Case 2: Without replacement, without order (combinations)
We shall discuss this case in conjunction with case 1. We have seen that if we pick three letters out of a, b, c, and d, and if order is important, then we get 24 permutations. In the present case, however, we are not interested in order, and as such there are just 4 possibilities, namely, abc, abd, acd, and bcd. Each of these possibilities is called a combination. Among the 24 permutations of case 1, the first column consists of the permutations of the letters a, b, c and, as we know, there are 3! of these. This is why there are 3! = 6 arrangements in column 1. The same is true of columns 2, 3, and 4. Consequently, we get from our example that the number of combinations, multiplied by 3!, is the number of permutations.
Let us now take up the general case where we pick n objects without replacement from M distinct objects, where order is not important. Symbolically, we shall denote the number of ways of doing this by (M choose n) and call it the number of combinations of n objects from a set of M. Our objective is to derive an expression for (M choose n).
Towards this, we see that if a combination has n elements, then there are n! possible arrangements of its elements. Each combination gives rise to n! arrangements, thereby giving rise to all the permutations, namely, M(M − 1) . . . (M − n + 1). Hence we have

(M choose n) · n! = M(M − 1) . . . (M − n + 1) = M!/(M − n)!

Therefore,

(M choose n) = M!/(n!(M − n)!)

is the number of unordered samples of size n that can be drawn without replacement from M distinct objects.
For example:

(i) The number of ways of choosing a set of 3 books to read from a set of 8 books is (8 choose 3) = 56. (Note that we are not interested in the order in which the books are read.)
(ii) The number of ways in which a five-card poker hand can be dealt from a deck of 52 cards is (52 choose 5) = 2,598,960.
(iii) From a group of 8 seniors, 6 juniors, and 4 sophomores, there are (18 choose 5) ways of picking a five-member committee.

For convenience, the following conventions are adopted:

(M choose 0) = 1,  and  (M choose n) = 0 if n < 0 or n > M

Comments. (1) Picking n objects out of M to form a group is tantamount to picking M − n objects out of M not to belong to the group. Thus, for example, the number of ways of choosing 3 books to read from a set of 8 books is the same as the number of ways of picking 5 books not to read from the 8. Therefore we always have

(M choose n) = (M choose M − n)

This can also be seen by observing that (M choose n) and (M choose M − n) are both equal to M!/(n!(M − n)!).
(2) For any two real numbers x and y, the expansion of (x + y)^M can be written as

(x + y)^M = Σ_{n=0}^M (M choose n) x^n y^{M−n}

This is called the binomial expansion. Since (M choose n) occurs as the coefficient of x^n y^{M−n} in the binomial expansion, (M choose n), n = 0, 1, . . . , M, are called the binomial coefficients.
(3) If a set has M objects, then the number of different subsets of size n is (M choose n). This is because, as we know, order is not important in listing the members of a set.
(4) We have mentioned above the following identity, which holds for any real numbers x, y:

(x + y)^M = Σ_{n=0}^M (M choose n) x^n y^{M−n}

In particular, if we set x = y = 1 we get

2^M = (M choose 0) + (M choose 1) + . . . + (M choose M)

This shows that the total number of subsets that can be formed from a set with M elements is 2^M. (Recall that we mentioned in Chapter 1 that the power set of a set with n elements has 2^n members.)
Case 3: With replacement, with order
The number of ways of picking n objects from M distinct objects is M^n when the objects are picked with replacement and when order is important. This is easy to see because at every draw there are M different choices.
For example:

(i) With the eight digits 1, 2, 3, 4, 5, 7, 8, 9, one can form 8³ distinct three-digit numbers.
(ii) If there are M cells, then n objects can be placed in them in M^n ways. (We are assuming that a cell can have more than one object.) Placing an object in a cell amounts to picking one of the M cells, and allowing a cell to have more than one object amounts to sampling with replacement.
(iii) If 10 people are in a train which stops at 6 stations, then there are 6^10 possible ways that the 10 can get off the train. Notice that a person can get off at any one of the 6 stations so that he has 6 choices. This is true of each of the 10 people. Also, if one person gets off at a station, it does not preclude other persons from getting off at that same station.

Case 4: With replacement, without order
The derivation of a general formula in this case is rather tricky and we shall not pursue the matter here. For our purpose it will suffice to know that the number of unordered samples of size n when objects are picked with replacement from M distinct objects is (M + n − 1 choose n).
For example, the number of ways of placing n nondistinguishable balls into M cells is (M + n − 1 choose n). (Try to see the analogy between the indistinguishable balls and the irrelevance of order.)

In summary, the number of ways of picking n objects from M distinct objects is:

(1) (M)_n = M(M − 1) . . . (M − n + 1), if sampling is ordered, without replacement
(2) (M choose n), if sampling is unordered, without replacement
(3) M^n, if sampling is ordered, with replacement
(4) (M + n − 1 choose n), if sampling is unordered, with replacement.

Example 3.1. Suppose 5 cards are picked without replacement from a standard deck of 52 cards. What is the probability that there are 3 black cards and 2 red cards? We shall consider the following two cases:
(a) the cards are seen one by one;
(b) the cards are seen all at once.

Solution
(a) In this case we are interested in the order. Since we are picking 5 cards without replacement and the order is relevant, there are (52)_5 = 52 × 51 × 50 × 49 × 48 possible outcomes in the sample space. How many of these (52)_5 outcomes are favorable to the event that there are 3 black cards and 2 red cards? Let us call this event A. First of all we observe that there are 5 locations, of which 3 are to be assigned to the black cards and 2 to the red cards. This can be done in (5 choose 3) = 10 ways. Consider just one of these, and say we have black cards in the first, third, and fourth locations, and red cards in the second and fifth. There are 26 × 25 × 24 ways of filling the first, third, and fourth locations with the black cards and, corresponding to any of these, there are 26 × 25 ways to fill locations two and five with the red cards. Hence, by the basic rule of counting, there are (5 choose 3) × 26 × 25 × 24 × 26 × 25 outcomes favorable to A. Hence

P(A) = (5 choose 3)(26)_3(26)_2 / (52)_5

(b) In this case order is not of interest. Hence there are (52 choose 5) possible samples of size 5. Now there are (26 choose 3) ways of picking 3 black cards out of the 26 black cards, and corresponding to each of these ways there are (26 choose 2) ways of picking the 2 red cards. Therefore, by the basic counting rule, there are (26 choose 3)(26 choose 2) possible samples of size 5 where each sample has exactly 3 black cards and 2 red cards. Consequently,

P(A) = (26 choose 3)(26 choose 2) / (52 choose 5)

Comment. In Example 3.1 we see that P(A) is the same in both cases so that, for finding P(A), it does not make any difference whether we observe the cards all at once or one by one. It should be borne in mind that the event A specifies that there are so many cards of one kind (black) and so many of the other kind (red).

Example 3.2. If a person is dealt 13 cards from a standard deck of cards, what is the probability that he is dealt:
(a) the complete suit of spades;
(b) a complete suit?

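The comment under Example 3.1 can be confirmed numerically: the ordered and unordered counts give exactly the same probability. A sketch of my own using `math.comb` and `math.perm`:

```python
from math import comb, perm

# (a) ordered, without replacement: choose locations for the black cards,
#     then fill each group of locations with ordered draws.
p_ordered = comb(5, 3) * perm(26, 3) * perm(26, 2) / perm(52, 5)

# (b) unordered: hypergeometric-style count.
p_unordered = comb(26, 3) * comb(26, 2) / comb(52, 5)

assert abs(p_ordered - p_unordered) < 1e-12
assert round(p_unordered, 4) == 0.3251
```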
. . . outcomes favorable to it. Hence . . .

Example 3.7. If 10 cards are dealt from a standard deck . . . in each suit, the probability is equal to . . .

Example 3.8. Consider a lottery that sells 100 tickets and offers five cars and eight motorcycles as prizes. Suppose a person buys 4 tickets. Find the probability of each of the following events:
(a) A = He wins one car and two motorcycles.
(b) B = He wins no prize.
(c) C = He wins at least one prize.
(d) D = He wins exactly one prize.
(e) E = He wins with every ticket.

Solution. There are 13 tickets that carry prizes and, consequently, 87 that do not.
(a) If a person wins one car and two motorcycles, then he has to pick one of the tickets that offer a car, two of the tickets that offer a motorcycle, and one nonwinning ticket. Hence

P(A) = (5 choose 1)(8 choose 2)(87 choose 1) / (100 choose 4)

(b) The event B specifies that the person picks 0 tickets from among those that offer a car, 0 from those that offer a motorcycle, and 4 from those that offer no prize. Hence

P(B) = (87 choose 4) / (100 choose 4)

(c) The event C is simply the complement of the event B above. Therefore,

P(C) = 1 − (87 choose 4) / (100 choose 4)

(d) Let A₁ = He wins one car and no other prize, and A₂ = He wins one motorcycle and no other prize. Then A₁ and A₂ are mutually exclusive and D = A₁ ∪ A₂. Also, it can be easily seen that

P(A₁) = (5 choose 1)(87 choose 3) / (100 choose 4)  and  P(A₂) = (8 choose 1)(87 choose 3) / (100 choose 4)

Consequently,

P(D) = P(A₁) + P(A₂) = (13 choose 1)(87 choose 3) / (100 choose 4)

noting that there are 13 prize-winning tickets and 87 non-prize-winning tickets, of which he has to pick 3.
(e) The event E signifies that the person picks 4 winning tickets from among the 13 tickets and 0 nonwinning tickets from the remaining 87. Hence

P(E) = (13 choose 4)(87 choose 0) / (100 choose 4)

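Example 3.8's answers are easy to evaluate. The sketch below (my own verification, not in the text) also checks that the decomposition P(D) = P(A₁) + P(A₂) collapses to the single hypergeometric-style term:

```python
from math import comb

total = comb(100, 4)                                  # ways to buy 4 of 100 tickets

p_A = comb(5, 1) * comb(8, 2) * comb(87, 1) / total   # one car, two motorcycles
p_B = comb(87, 4) / total                             # no prize
p_C = 1 - p_B                                         # at least one prize
p_D = comb(13, 1) * comb(87, 3) / total               # exactly one prize
p_E = comb(13, 4) / total                             # four winning tickets

# (d) via the two mutually exclusive cases A1 (a car) and A2 (a motorcycle):
p_A1 = comb(5, 1) * comb(87, 3) / total
p_A2 = comb(8, 1) * comb(87, 3) / total
assert abs((p_A1 + p_A2) - p_D) < 1e-12
```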
In the following, we present a slight variation of the hypergeometric probabilities, in that the sampling is carried out with replacement. To start with, consider the following example:

Example 3.9. Suppose 5 cards are picked from a standard deck of 52 cards with replacement. What is the probability that there are 3 black cards and 2 red cards?
Solution. Since we are picking 5 cards with replacement, there are 52^5 ways of doing this. We shall find the number of outcomes favorable to the event A representing 3 black cards and 2 red cards. To find this number, we note that there are 5 locations, of which 3 are to be assigned to the black cards and 2 to the red cards. There are (5 choose 3) = 10 ways of doing this. Consider one of these ways, and suppose we have black cards in the first, third, and fourth locations, and red cards in the second and fifth. There are 26 × 26 × 26 ways of filling the first, third, and fourth locations with the black cards, and corresponding to any of these there are 26 × 26 ways of filling locations two and five with red cards. Applying the basic counting rule, the number of outcomes favorable to the event is

(5 choose 3) · 26³ · 26²

(Why?) Hence the probability is

(5 choose 3) · 26³ · 26² / 52^5

This probability can be rewritten as

(5 choose 3) (26/52)³ (1 − 26/52)²

In general, if n objects are picked with replacement from M objects of which M₁ are of a special kind (say, defective), then the probability of exactly k defective objects in the sample is

(n choose k) (M₁/M)^k (1 − M₁/M)^{n−k},   k = 0, 1, . . . , n

We shall say more on this when we discuss the binomial probabilities in a later chapter.

Example 3.10. . . . the probability that 3 students will . . .
Solution. The problem boils down to the following: there are 3 courses and we are sampling with replacement. Notice that there is only 1 course in probability for the 7 students to pick from, and there are 2 courses in statistics for the 7 students to pick from. The probability of the desired event is therefore equal to

(7 choose 3) (1/3)³ (2/3)⁴

In the rest of this section we shall consider miscellaneous examples which unify different ideas developed thus far.

Example 3.11. Find the probability that in a bridge game North, East, South, and West get, respectively, i, j, k, and l spades (i + j + k + l = 13).
Solution. The number of ways of dealing 13 cards to one player is (52 choose 13). There are 39 cards left from which the second player can receive 13 cards in (39 choose 13) ways. Continuing the argument, the third player can be dealt 13 cards in (26 choose 13) ways, and, finally, the fourth player can be dealt the remaining cards in (13 choose 13) ways. By the basic counting rule, there are (52 choose 13)(39 choose 13)(26 choose 13)(13 choose 13) ways to deal four bridge hands.
The number of deals favorable to the event is

(13 choose i)(39 choose 13−i) · (13−i choose j)(26+i choose 13−j) · (13−i−j choose k)(13+i+j choose 13−k)

Hence the probability is equal to

(13 choose i)(39 choose 13−i)(13−i choose j)(26+i choose 13−j)(13−i−j choose k)(13+i+j choose 13−k) / [(52 choose 13)(39 choose 13)(26 choose 13)]

Example 3.12. In a bridge game, find the probability that North gets exactly k aces, k = 0, 1, 2, 3, 4.
Solution. From Example 3.11, we know that there are (52 choose 13)(39 choose 13)(26 choose 13) ways of dealing cards to the four players. Now it can be easily seen that the number of hands where North gets exactly k aces is

(4 choose k)(48 choose 13−k)(39 choose 13)(26 choose 13)

Hence the probability that North gets exactly k aces is equal to

(4 choose k)(48 choose 13−k)(39 choose 13)(26 choose 13) / [(52 choose 13)(39 choose 13)(26 choose 13)] = (4 choose k)(48 choose 13−k) / (52 choose 13)

We observe that this probability is the same as the probability that an arbitrary hand of 13 cards contains exactly k aces.

Example 3.13. Find the probability that eight players on a team will all have their birthdays on—
(a) Monday or Tuesday (but not all on one day)
(b) exactly two days of the week.
Solution. There are 7 days of the week on which each of the players could be born. Hence there are 7^8 possibilities.
(a) If each person is born on Monday or Tuesday, then each person has two choices of days, and as a result there are 2^8 possible ways this can happen. However, the men cannot all have birthdays on Monday, nor all on Tuesday. Therefore, there are 2^8 − 2 outcomes favorable to the event, and consequently the desired probability is equal to (2^8 − 2)/7^8.
(b) There are (7 choose 2) ways of picking 2 days out of 7. Hence the probability of having all of the birthdays on exactly 2 days of the week is

(7 choose 2)(2^8 − 2)/7^8

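Example 3.12's probabilities behave as they should: summed over k = 0, . . . , 4 they give 1. A quick check of my own (the summation identity is the Vandermonde convolution):

```python
from math import comb

def p_aces(k):
    # Probability that a random 13-card hand holds exactly k of the 4 aces.
    return comb(4, k) * comb(48, 13 - k) / comb(52, 13)

# The five cases are exhaustive and mutually exclusive.
assert abs(sum(p_aces(k) for k in range(5)) - 1.0) < 1e-12

# k = 0 reduces to the plain "no aces" count.
assert p_aces(0) == comb(48, 13) / comb(52, 13)
```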
Example 3.14. (The matching problem) Suppose n people attend a party. Before the party starts, each person deposits his coat in the checkroom, and, at the end of the party, picks one coat at random. Find the probability that at least one person picks his own coat.
Solution. The n coats can be returned in all possible ways. Hence there are n! possible arrangements.
Now let A_k be the event that person π_k picks his own coat. What we are interested in finding is P(A₁ ∪ A₂ ∪ . . . ∪ Aₙ), which, as we know, is given by

P(A₁ ∪ . . . ∪ Aₙ) = Σᵢ P(Aᵢ) − Σ_{i<j} P(AᵢAⱼ) + . . . + (−1)^{n+1} P(A₁A₂ . . . Aₙ)

To find P(Aᵢ), we note that if person πᵢ gets his coat, the remaining n − 1 coats can be in any order. Hence there are (n − 1)! possibilities where person πᵢ gets his coat. Therefore,

P(Aᵢ) = (n − 1)!/n! = 1/n

(Observe that P(Aᵢ) = 1/n no matter what i is.)
Next, to find P(AᵢAⱼ) we see that if persons πᵢ and πⱼ get their respective coats, the remaining n − 2 coats can be in any order, and, therefore, there are (n − 2)! possibilities favorable to AᵢAⱼ. Hence

P(AᵢAⱼ) = (n − 2)!/n! = 1/(n(n − 1))

for any combination i, j. In general,

P(A_{i1}A_{i2} . . . A_{ir}) = (n − r)!/n!

Finally, since there are (n choose 1) terms in the sum Σᵢ P(Aᵢ), (n choose 2) terms in the sum Σ_{i<j} P(AᵢAⱼ), and, in general, (n choose r) terms in the sum Σ P(A_{i1} . . . A_{ir}), we get

P(A₁ ∪ . . . ∪ Aₙ) = (n choose 1)(n − 1)!/n! − (n choose 2)(n − 2)!/n! + . . . + (−1)^{n+1}(n choose n)(n − n)!/n!

Or, simplifying,

P(A₁ ∪ . . . ∪ Aₙ) = 1 − 1/2! + 1/3! − . . . + (−1)^{n+1}/n!

Thus if n is large, this probability is approximately equal to 1 − e^{−1}.

The following example, called the birthday problem, indicates a context in which one might be interested in finding the probability that there is no repetition when sampling is carried out with replacement.
Example 3.15. If there are n people in a room, what is the probability that no two of them will have the same birthday? (Assume n ≤ 365.)
Solution. We shall ignore the fact that there are leap years and assume 365 days to a year. When we consider the birthdays of n people, in essence we pick n days with replacement from the 365 days. (Remember, if Tom is born on January 21, it does not preclude Jane from being born on that date. Also, order is relevant because Tom being born on February 3 and Jane on October 9 is different from Tom being born on October 9 and Jane on February 3.) Hence there are 365^n possibilities in the sample space.
If no two people have the same birthday, then the first person has 365 choices for his birthday, the second 364 days, and so on, and, eventually, the last person has (365 − n + 1) choices. (Here first, second, and so on are used in the sense of writing the outcomes as n-tuples.) Hence there are 365 × 364 × . . . × (365 − n + 1) = (365)_n outcomes favorable to the event. Therefore, the probability that no two people have the same birthday is

(365)_n / 365^n = (1 − 1/365)(1 − 2/365) . . . (1 − (n − 1)/365)

In general, prompted by Example 3.15, if we pick n objects with replacement from M distinct objects (with n ≤ M), then the probability of no repetition is (M)_n / M^n.
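Both closing examples lend themselves to direct computation. The sketch below (function names are my own) evaluates the matching-problem series against its limit 1 − e⁻¹ and the birthday probability for a few group sizes:

```python
from math import exp, factorial, perm

def p_at_least_one_match(n):
    # 1 - 1/2! + 1/3! - ... + (-1)^(n+1)/n!   (Example 3.14)
    return sum((-1) ** (r + 1) / factorial(r) for r in range(1, n + 1))

def p_no_shared_birthday(n):
    # (365)_n / 365^n   (Example 3.15)
    return perm(365, n) / 365 ** n

# The series converges rapidly to 1 - 1/e.
assert abs(p_at_least_one_match(10) - (1 - exp(-1))) < 1e-6

# With 23 people, a shared birthday is already more likely than not.
assert p_no_shared_birthday(22) > 0.5
assert p_no_shared_birthday(23) < 0.5
```

The n = 23 threshold is the familiar "birthday paradox" consequence of the (M)_n / M^n formula.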
EXERCISES—SECTION 3
1. Find n if—
(a) . . .
(b) . . .
2. Show that

(m choose r) = (m − 1 choose r − 1) + (m − 1 choose r)

Use this result to complete the following pattern for m = 5, 6, 7.

m = 0:  1
m = 1:  1  1
m = 2:  1  2  1
m = 3:  1  3  3  1
m = 4:  1  4  6  4  1
In the following, n is a nonnegative integer. Use the fact that

(1 + x)^n = Σ_{m=0}^n (n choose m) x^m   for any x

to establish the following identities:
3. (a) Σ_{m=0}^n (−1)^m (n choose m) = 0
(b) Σ_{m=0}^n m (n choose m) = n · 2^{n−1}   (Hint: Differentiate and set x = 1.)
(c) . . . = 0
(d) Σ_{m=0}^n (n choose m)² = (2n choose n)   (Hint: Consider (1 + x)^{2n} = (1 + x)^n (1 + x)^n.)
4. A die is tossed six times and a coin is tossed four times. How many outcomes are there in the sample space?
5. A test gives a choice of answering eight out of ten questions.
(a) How many ways are there to answer the test?
(b) How many ways are there if questions 1 and 2 are obligatory?
6. Find how many positive integral divisors 3500 has.
7. Find the number of ways of arranging the letters of the word SUCCESSION.
8. If the letters of the word VOLUMES are arranged in all possible ways, find the probability that—
(a) the word ends with a vowel
(b) the word starts with a consonant and ends with a vowel.
9. Two digits are picked at random, without replacement, from the digits 1 through 9. Find the probability that the digits are consecutive digits.
10. Two numbers are picked without replacement from the following numbers: 8, 9, 11, 12, 17, 18. Find the probability that they are relatively prime.
11. A box contains eight balls marked 1, 2, 3, . . . , 8. If four balls are picked at random, find the probability that the balls marked 1 and 5 are among the four selected balls.
12. A laundry bag contains four black and eight white gloves. If two gloves are picked one by one at random, what is the probability that—
(a) they are both black
(b) they are of the same color.
13. A box contains ten items, of which four are defective. If three items are picked without replacement, find the probability that—
(a) all are defective
(b) exactly two are defective
(c) at most two are defective.
15. From an ordinary deck of 52 cards, 4 cards are picked at random. Find the probability that—
(a) exactly one is an ace
(b) exactly one is a face card
(c) all are black cards
(d) each is from a different suit
(e) at least two are aces.
16. Find the probability that a hand of five cards selected from a standard deck has—
(a) an ace, king, queen, jack, and ten of spades
(b) an ace, king, queen, jack, and ten of the same suit
(c) an ace, king, queen, jack, and ten.
17. Two cards are drawn, with replacement, from an ordinary deck of 52 cards. Find the probability that both cards belong to the same suit.
18. Work exercise 17, this time assuming that the cards are drawn without replacement.
19. In a five-card poker hand, find the probability that—
(a) there are three kings and two aces
(b) there are exactly three kings
(c) there are exactly three kings and one ace
(d) there is one ace and at least three kings
(e) there are at least three kings.
20. From a group of 10 lawyers, 8 doctors, 6 businessmen, and 9 professors, a committee of six is selected at random. Find the probability that—
(a) the committee consists of 2 lawyers, 2 doctors, one businessman, and one professor
(b) the committee contains no lawyers
(c) the committee contains at least one lawyer.
21. From a group of 5 lawyers, 7 accountants, and 9 doctors, a committee of three is selected at random. What is the probability that the committee has more lawyers than doctors?
22. Suppose eight books are arranged on a shelf in a random order. What is the probability that three particular books will be next to each other?
23. The numbers 1, 2, . . . , n are arranged in all possible ways. Assuming that all the arrangements are equally likely, find the probability that—
(a) 1, 2, 3, and 4 appear next to each other in the order indicated
(b) 1, 2, 3, and 4 appear next to each other.
24. Suppose 4 letters are placed at random in 4 addressed envelopes. What is the probability that exactly k of the letters are in their correct envelope, k = 0, 1, 2, 3, 4.
25. Eight cards are dealt from a standard deck of 52 cards. Find the probability of obtaining either 3 aces or 3 kings or 3 queens. (This does not preclude, for instance, getting 3 aces and 3 kings.)
26. Suppose 13 cards are dealt from a standard deck of 52 cards. Find the prob-
(d ) a t least tw o are defective.
ability th a t the hand will contain all the face cards in at least one suit.
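Many of these card problems reduce to ratios of binomial coefficients. A small hypergeometric helper of the following sort can be used to check answers (the function name is ours, not the text's); the all-black-cards case of exercise 15 is shown as an illustration.

```python
from fractions import Fraction
from math import comb

def hypergeom(total, special, drawn, hits):
    # P(exactly `hits` of the `special` cards appear in a random draw
    # of `drawn` cards from `total`), as an exact fraction
    return Fraction(comb(special, hits) * comb(total - special, drawn - hits),
                    comb(total, drawn))

# Exercise 15(c): all 4 cards black means 4 "hits" among the 26 black cards.
p_all_black = hypergeom(52, 26, 4, 4)   # C(26,4)/C(52,4)
```

The same helper covers any "k successes in a sample without replacement" question by changing the parameters.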

Scanned by CamScanner
Basic Probability Theory and Applications

27. On a ten-question true-false test a student guesses on every question. Find the probability that the student answers:
(a) no question correctly
(b) at least one question correctly
(c) exactly r questions correctly, r = 0, 1, . . . , 10.
28. An elevator stops at ten floors. If there are six people in the elevator, find the probability that:
(a) no two of them will get off on the same floor
(b) all of them will get off on the same floor.
29. A box contains six pencils of lengths 1, 3, 4, 5, 7, 8 inches. If three pencils are picked at random, find the probability that a triangle can be formed with them.
30. (a) What is the probability of being able to form a rectangle with them?
(b) What is the probability of being able to form a square?
31. A party is attended by twenty males and twenty females. If the party is divided at random into two equal groups, find the probability that there are an equal number of males and females in each group.
32. A party consists of n males and n females. If these persons are seated at random in a row, find the probability that no two members of the same sex will be seated next to each other.
33. A box contains four books on mathematics and twelve books on history. If the books are distributed equally at random among four students, find the probability that each student will get a book on mathematics.

3
Conditional Probability and Independent Events

INTRODUCTION
The groundwork for an understanding of basic probability was laid down in the previous two chapters. In this chapter we shall consider principally two topics which come under the purview of probability theory. The first of these topics will cover conditional probability, and the second, independent events.

1. CONDITIONAL PROBABILITY
To discuss conditional probability, suppose we pick a person at random and pose the following three questions:

(i) What is the probability that the person is in the United States? Assuming that there are 200 million people in the United States and 3 billion people in the world, the answer would be 200/3000, that is, 1/15.
(ii) Given that the person is in Australia, what is the probability that he is in the United States? The answer would be, obviously, 0.
(iii) Given that the person is in Iowa, what is the probability that he is in the United States? Here the probability is, of course, 1.

We see from the above three situations that prior knowledge of the person's location influences the probability of his being found in the United States. Thus it often happens that partial information is available about the outcome of the underlying experiment and this, in turn, leads to appropriate adjustment of the probabilities of the associated events. In summary, the notion of conditional probability involves the probability of an event, say A, given the information that an event B has occurred.

Example 1.7. Three balls are picked at random one by one and without replacement from a box containing four white and eight black balls. Let
A = The first ball is white
B = The second ball is white
C = The third ball is white
Find: (a) P(A) (b) P(B|A) (c) P(C|AB)
Solution
(a) There are 12 x 11 x 10 equally likely ordered ways of picking the three balls, of which 4 x 11 x 10 are favorable to the event A (why?). Hence P(A) = (4 x 11 x 10)/(12 x 11 x 10) = 4/12 = 1/3.
(b) There are 4 x 3 x 10 outcomes in the event AB, and consequently P(B|A) = P(AB)/P(A) = (4 x 3 x 10)/(4 x 11 x 10) = 3/11.
(c) Here P(C|AB) = P(ABC)/P(AB). Now there are 4 x 3 x 2 outcomes in the event ABC, so that
P(C|AB) = [(4 x 3 x 2)/(12 x 11 x 10)] / [(4 x 3 x 10)/(12 x 11 x 10)] = 2/10 = 1/5
Alternatively, one might observe that, given that the first 2 balls are white, there are 10 balls left of which 2 are white. Wherever convenient, the reader should adopt this type of argument. It is assumed that he is aware of the background leading to such an argument.

Example 1.8. Three balls are picked at random one by one from a box containing four white and eight black balls. Find the following probabilities:
(a) The first and third ball are white.
(b) The third ball is white given that the first ball is white.
Solution. Let A = The first ball is white, B = The second ball is white, and C = The third ball is white. Then
(a) P(AC) = P((ABC) U (AB'C)) (why?)
= P(ABC) + P(AB'C) (why?)
= P(A) P(B|A) P(C|AB) + P(A) P(B'|A) P(C|AB')
= (4/12)(3/11)(2/10) + (4/12)(8/11)(3/10) = 1/11
(b) Here we want to find P(C|A). We have
P(C|A) = P(AC)/P(A) = (1/11)/(1/3) = 3/11

Suppose (S, F, P) is a probability space and B is a fixed event with P(B) > 0. We have just defined the concept of conditional probability. In essence we have introduced a function P(.|B) which, for any event A (that is, A in F), is given by P(A|B) = P(AB)/P(B). To make the following result more appealing, we write P_B(A) in place of P(A|B).

The function P_B has the following properties:
(i) 0 <= P_B(A) <= 1 for any event A
(ii) P_B(B) = 1
(iii) If A1, A2, . . . is a sequence of mutually exclusive events, then
P_B(A1 U A2 U . . .) = P_B(A1) + P_B(A2) + . . .

The properties (i) and (ii) are obvious and we shall prove only (iii). We have
P_B(A1 U A2 U . . .) = P(A1 U A2 U . . . | B) = P((A1 U A2 U . . .)B)/P(B), by definition
= P((A1 B) U (A2 B) U . . .)/P(B), by the distributive rule
Now since the events A1, A2, . . . are mutually exclusive, so are the events A1 B, A2 B, . . . . Hence
P_B(A1 U A2 U . . .) = [P(A1 B) + P(A2 B) + . . .]/P(B) = P(A1|B) + P(A2|B) + . . . = P_B(A1) + P_B(A2) + . . .

It follows from properties (i), (ii), (iii) that the function P_B, that is, P(.|B), satisfies all the axioms of a probability function, and hence is a probability measure.

Comment. For completeness, we should add the following, keeping in mind that our treatment of the subject will not be impaired if such fine details are relegated to obscurity: Conditioning on the event B amounts to choosing B as the new sample space. As such, the appropriate sigma field is a sigma field of subsets of B. Denoting this sigma field by F_B, it is given by
F_B = {AB | A in F}
where F is the original sigma field.

Since for a given event B, P(.|B) is a probability function, all the consequences appropriate for a probability function follow. For example, if A1 and A2 are any two events, then
P(A1 U A2|B) = P(A1|B) + P(A2|B) - P(A1 A2|B)

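The conditional probabilities in Examples 1.7 and 1.8 can be checked by enumerating all ordered draws; a minimal sketch with the same ball counts (four white, eight black):

```python
from fractions import Fraction
from itertools import permutations

# Balls 0-3 are white, 4-11 black: four white and eight black balls.
WHITE = set(range(4))
draws = list(permutations(range(12), 3))   # all 12*11*10 ordered draws, equally likely

def prob(event):
    return Fraction(sum(1 for d in draws if event(d)), len(draws))

first_white  = lambda d: d[0] in WHITE     # event A
second_white = lambda d: d[1] in WHITE     # event B
third_white  = lambda d: d[2] in WHITE     # event C

p_A = prob(first_white)
p_AB = prob(lambda d: first_white(d) and second_white(d))
p_B_given_A = p_AB / p_A
p_C_given_AB = prob(lambda d: first_white(d) and second_white(d) and third_white(d)) / p_AB
p_AC = prob(lambda d: first_white(d) and third_white(d))
```

Exact rational arithmetic (fractions.Fraction) reproduces the textbook values 1/3, 3/11, 1/5, and 1/11 without rounding.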
Example 1.9. Suppose A and B are two events with P(AB') = 0.4 and P(B) = 0.2. Find: (a) P(A|B') (b) P(A'|B') (c) P(A'B') (d) P(A U B).
Solution. We have:
(a) P(A|B') = P(AB')/P(B') = 0.4/0.8 = 0.5
(b) P(A'|B') = 1 - P(A|B') = 1 - 0.5 = 0.5
(c) P(A'B') = P(B') P(A'|B') = 0.8 x 0.5 = 0.4
(d) P(A U B) = 1 - P((A U B)') = 1 - P(A'B') = 1 - 0.4 = 0.6
We could have found P(A U B) in an alternate way by noting that P(A U B) = P(AB') + P(B). (Draw a Venn diagram to see this.)

Example 1.10. A number is picked at random from the integers 1, 2, 3, . . . , 1000. If the number is known to be divisible by four, what is the probability that:
(a) it is divisible by six or eight?
(b) it is divisible by six, but not by eight?
(c) it is divisible by exactly one of the integers six, eight?
Solution. Let B represent the event that the number is a multiple of 4, A1 that it is a multiple of 6, and A2 that it is a multiple of 8.
(a) We want to find P(A1 U A2|B) which, as we know, is equal to P(A1|B) + P(A2|B) - P(A1 A2|B). Now
P(A1|B) = P(A1 B)/P(B) = (83/1000)/(250/1000) = 83/250
P(A2|B) = P(A2 B)/P(B) = (125/1000)/(250/1000) = 125/250
P(A1 A2|B) = P(A1 A2 B)/P(B) = (41/1000)/(250/1000) = 41/250
Hence
P(A1 U A2|B) = 83/250 + 125/250 - 41/250 = 167/250
(b) In this case we want to find P(A1 A2'|B); this is equal to P(A1|B) - P(A1 A2|B). Hence
P(A1 A2'|B) = 83/250 - 41/250 = 42/250 = 21/125
(The reader might wish to prove these results directly.)
(c) Given B, the probability that the number is divisible by exactly one of the integers six, eight is equal to
P(A1|B) + P(A2|B) - 2 P(A1 A2|B) = 83/250 + 125/250 - 82/250 = 126/250 = 63/125

Example 1.11. The probability that a person passes a proficiency test on the first attempt is 0.5, that he passes on the second attempt is 0.7 (of course, given that he failed on the first attempt), and the probability that he passes on the third attempt is 0.8 (given that he failed on the first two attempts). If the person is allowed three attempts, what is the probability that he will pass the test?
Solution. Let Ai represent the event that the person passes the test on the ith attempt, i = 1, 2, 3. Then A1 U (A1'A2) U (A1'A2'A3) is the event that the person passes the test. The probability of this event is equal to P(A1) + P(A1'A2) + P(A1'A2'A3) (why?). Next,
P(A1) = 0.5, P(A1'A2) = P(A1') P(A2|A1') = 0.5 x 0.7 = 0.35
and
P(A1'A2'A3) = P(A1') P(A2'|A1') P(A3|A1'A2') = 0.5 x 0.3 x 0.8 = 0.12
Hence the probability of passing the test is equal to
0.5 + 0.35 + 0.12 = 0.97

EXERCISES-SECTION 1
1. Suppose probabilities are assigned to the simple events of S = {s1, s2, s3, s4, s5, s6} as follows:
P({s1}) = 2P({s2}) = 3P({s3}) = 4P({s4}) = 5P({s5}) = 6P({s6})
Find the conditional probability of:
(a) {s1, s3, s4} given {s2, s3}
(b) {s2, s3} given {s1, s3, s4}
(c) {s1, s2, s3, s4} given {s2, s4, s5, s6}
(d) {s1, s4} given {s1, s2, s3, s4}.
2. Suppose A and B are two events with P(A|B) = 0.3, P(A'|B') = 0.4, and P(B) = 0.7. Find:
(a) P(AB') (b) P(A) (c) P(B|A)
3. Suppose P(A1|B) = 0.7, P(A2|B) = 0.4, and P(A1 A2|B) = 0.3. Given that B has occurred, find the probability that:
(a) at least one of the events A1, A2 occurs
(b) exactly one of the events A1, A2 occurs
(c) only A1 occurs.
4. Suppose P( . . . |B) = 0.5, P(A1 A2|B) = 0.3, and P( . . . |B) = 0. Given the event B, find the probability that . . . of the events A1, A2 . . . occurs.

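The answers in Example 1.10 can be checked by direct counting over the integers 1 through 1000; a small sketch:

```python
from fractions import Fraction

B = [n for n in range(1, 1001) if n % 4 == 0]   # the conditioning event: divisible by 4

def given_B(pred):
    # P(pred | B) by direct counting within B
    return Fraction(sum(1 for n in B if pred(n)), len(B))

p_a = given_B(lambda n: n % 6 == 0 or n % 8 == 0)       # divisible by six or eight
p_b = given_B(lambda n: n % 6 == 0 and n % 8 != 0)      # by six but not eight
p_c = given_B(lambda n: (n % 6 == 0) != (n % 8 == 0))   # by exactly one of six, eight
```

Counting inside the conditioning event B is exactly what the conditional probability formula P(A|B) = P(AB)/P(B) amounts to here.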
Example. There are two batches of students. In batch one, one third of the students have been inoculated, and in batch two, two thirds of the students have been inoculated. A batch is picked at random, and a student is picked from it at random. He turns out to be inoculated. If the student is returned to his batch, what is the probability that a second student picked from the same batch will not have been inoculated?
Solution. Let
A = The first student has been inoculated
C = The second student has not been inoculated
and Bi = Batch i is picked, i = 1, 2
We want to find P(C|A). Now
P(C|A) = P(AC)/P(A) = [P(AC|B1) P(B1) + P(AC|B2) P(B2)] / [P(A|B1) P(B1) + P(A|B2) P(B2)]
by applying the theorem of total probability to the events AC and A. We are given
P(B1) = P(B2) = 1/2, P(A|B1) = 1/3, P(A|B2) = 2/3
Also, P(AC|Bi) = P(A|Bi) P(C|ABi), where P(C|AB1) = 2/3 and P(C|AB2) = 1/3 (since the first student is returned to the batch). Thus, finally,
P(C|A) = [(1/3)(2/3)(1/2) + (2/3)(1/3)(1/2)] / [(1/3)(1/2) + (2/3)(1/2)] = (2/9)/(1/2) = 4/9

EXERCISES-SECTION 2
1. Three boxes contain white, red, and black balls in the numbers given below.

Box  White  Black  Red
1    8      2      6
2    4      5      3
3    2      7      3

A box is picked at random, and a ball is drawn from it at random.
(a) Find the probability that the ball is red.
(b) Given that the ball is red, what is the probability that it came from box i (i = 1, 2, 3)?
2. Suppose 3 cards are picked without replacement from a standard deck of 52 cards. Find the probability that the third card is a spade.
3. The probability that a student studies for a test is 0.7. Given that he studies, the probability is 0.8 that he will pass the test. Given that he does not study, the probability is 0.3 that he will pass the test. What is the probability that the student will pass the test?
4. On a true-false test, the probability that a student knows the answer to a question is equal to 0.7. If he knows the answer, he checks the correct answer; otherwise, he answers the question by flipping a fair coin.
(a) What is the probability that he answers a question correctly?
(b) Given that he answers the question correctly, what is the probability that he knew the answer?
5. A bowl contains n white chips and n black chips. A number is picked at random from the even integers 2, 4, 6, . . . , 2n and that many chips are drawn at random from the bowl, without replacement.
(a) Find the probability that the same number of chips of each color are picked.
(b) Given that the same number of chips of each color are picked, what is the probability that 2k chips were picked, where 1 <= 2k <= 2n?
6. A box contains ten white and eight black objects. A fair die is rolled. If the number is even, then as many white objects are added to the box, and if it is odd, then as many black objects are added. From the new composition of the box three objects are picked at random without replacement.
(a) Find the probability of picking two white and one black object.
(b) Given that two white and one black object were picked, what is the probability that (i) a 4 showed up on the die, (ii) an even number showed up on the die?
7. In a high school the sophomore class has 6 girls and 6 boys, the junior class has 8 girls and 10 boys, and the senior class has 3 girls and 9 boys. Suppose a student is picked at random from each class. Given that the sample contains exactly 2 boys, find the probability that the student picked from the sophomore class is a boy.
8. A party is divided into two groups, one of which has 6 males and 4 females . . .
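The two-batch computation above is an instance of the theorem of total probability; a minimal sketch, assuming batch one is one-third inoculated and batch two two-thirds inoculated:

```python
from fractions import Fraction

F = Fraction
p_batch = [F(1, 2), F(1, 2)]   # P(B1), P(B2): a batch is picked at random
p_inoc  = [F(1, 3), F(2, 3)]   # P(A | Bi): assumed fraction inoculated in batch i

# Theorem of total probability for A (first student inoculated) and for AC
# (first student inoculated, second student -- drawn after the first is
# returned to the batch -- not inoculated).
p_A  = sum(pb * pa for pb, pa in zip(p_batch, p_inoc))
p_AC = sum(pb * pa * (1 - pa) for pb, pa in zip(p_batch, p_inoc))

p_C_given_A = p_AC / p_A
```

Because the lists are parallel, the same two lines handle any number of batches with any compositions.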


P(A) = 1/3, P(B) = 1/2, and P(AB) = 0
Thus P(AB) is not equal to P(A) P(B), so that A and B are not independent events.

Example. A coin is tossed twice, and the four outcomes are assigned the probabilities
P({HH}) = 0.16, P({HT}) = 0.24, P({TH}) = 0.24, P({TT}) = 0.36
Let A = heads on the first toss and B = tails on the second toss. Are A and B independent?
Solution. It follows that
P(A) = P({HH, HT}) = 0.16 + 0.24 = 0.40
P(B) = P({HT, TT}) = 0.24 + 0.36 = 0.60
P(AB) = P({HT}) = 0.24
Hence P(AB) = P(A) P(B), and, consequently, A and B are independent.

Example. It is known from past experience that the probability that a person has cancer is 0.2 and the probability that he has heart disease is 0.1. If the two events are independent, what is the probability that a person has (a) at least one ailment? (b) precisely one ailment?
Solution. Let A = The person has cancer, and B = The person has heart disease.
(a) We want to find P(A U B). Now
P(A U B) = P(A) + P(B) - P(AB)
= P(A) + P(B) - P(A) P(B), since the events are independent
= 0.2 + 0.1 - 0.2 x 0.1 = 0.28
(b) The probability of precisely one ailment is equal to P(A) + P(B) - 2 P(A) P(B) = 0.26.

Comment. The reader should be aware of the distinction between independent events and mutually exclusive events. The two concepts are often confused. Two events are mutually exclusive when they are not compatible; that is, they cannot occur together, so that P(AB) = 0. "Mutually exclusive" is a property of sets: the occurrence of one event precludes the occurrence of the other. Two events are independent when the occurrence of one event does not influence the occurrence of the other, nor does the nonoccurrence of one influence the nonoccurrence of the other. Thus no inference can be drawn regarding the occurrence of one event on the basis of the knowledge of the occurrence of the other. Independence is a property of the probability measure: P(AB) = P(A) P(B). As a matter of fact, the following result shows how divergent the two concepts are:

If A and B are two events with P(A) > 0 and P(B) > 0, then, as can be seen immediately,
(i) if A and B are independent, they cannot be mutually exclusive; and
(ii) if A and B are mutually exclusive, they cannot be independent.

Example 3.4. Suppose we draw a card from a standard bridge deck of cards. Give two events B and D which are:
(a) mutually exclusive and independent
(b) mutually exclusive, but not independent
(c) not mutually exclusive, but independent
(d) not mutually exclusive and not independent.
Solution. In each case we shall give the events B and D. The reader is asked to provide the justifications.
(a) Note that one of the events has to have zero probability since, by the previous discussion, mutually exclusive events can be independent only if one of the events has zero probability. Let
D = The card is an ace
B = The card is a black card of hearts
(b) Let
D = The card is an ace of spades
B = The card is a 4 of a red suit
(c) Let
D = The card is an ace
B = The card is a spade
(d) Let
D = The card is a king
B = The card is a face card

If A and B are independent events, then
(i) A and B' are independent
(ii) A' and B are independent
(iii) A' and B' are independent
We shall prove (i) and leave the other cases to the reader. We want to show that

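Example 3.4(c) and the result just stated can both be checked numerically over the 52-card deck; a minimal sketch:

```python
from fractions import Fraction
from itertools import product

RANKS = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
SUITS = ['spades', 'hearts', 'diamonds', 'clubs']
deck = list(product(RANKS, SUITS))        # 52 equally likely cards

def pr(event):
    return Fraction(sum(1 for c in deck if event(c)), len(deck))

is_ace   = lambda c: c[0] == 'A'          # event D of Example 3.4(c)
is_spade = lambda c: c[1] == 'spades'     # event B of Example 3.4(c)

# D, B independent: P(DB) = P(D) P(B) = (1/13)(1/4) = 1/52
independent = pr(lambda c: is_ace(c) and is_spade(c)) == pr(is_ace) * pr(is_spade)
# and then D, B' are independent as well: P(DB') = P(D)(1 - P(B))
complement_indep = (pr(lambda c: is_ace(c) and not is_spade(c))
                    == pr(is_ace) * (1 - pr(is_spade)))
```

The events are clearly not mutually exclusive (the ace of spades belongs to both), which is exactly the point of part (c).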
With this assignment we see that
P(A) = p^2 + p(1 - p) = p
and, similarly, P(B) = p, while P(AB) = p^2, so that the events A and B are independent. As a matter of fact, it can be seen that any event determined only by the first toss and any event determined only by the second toss are independent. In order for the trials to be independent, this is the only way to assign probabilities to the outcomes of the composite experiment. If, instead, the probability of heads on any toss depended on the outcomes of the previous tosses, we would intuitively regard the trials as not independent, and in that case this product assignment would not be appropriate.

To generalize from the above discussion, consider an experiment consisting of n identical trials, each trial defined by the sample space S with a finite number of outcomes. Let P1 be the probability measure of the events of S. The sample space appropriate for the composite experiment consisting of n trials is the Cartesian product S^n where
S^n = {(s1, s2, . . . , sn) | si is the outcome of the ith trial, i = 1, 2, . . . , n}
An event B (that is, a subset of S^n) is said to be determined by the ith trial if
B = S x S x . . . x S x C x S x . . . x S
(C in the ith place) where C is some subset of S.
We define the n trials to be independent if every set of events B1, B2, . . . , Bn, where Bi is determined by the ith trial, is a set of mutually independent events.
To define a probability measure on S^n, it suffices to define it for each sample point in S^n. Let us denote this probability measure by P. If the trials are to be independent, this can be accomplished in one and only one way:
P({(s1, s2, . . . , sn)}) = P1({s1}) P1({s2}) . . . P1({sn})
This assignment of probabilities is acceptable, since it can be easily verified that the probabilities add to unity. Furthermore, with this assignment, it can be shown that if Bi is determined by the ith trial, that is, if
B1 = C1 x S x S x . . . x S
B2 = S x C2 x S x . . . x S
. . .
Bn = S x S x . . . x S x Cn
then
P(B1) = P1(C1), P(B2) = P1(C2), . . . , P(Bn) = P1(Cn)
and
P(B1 and B2 and . . . and Bn) = P1(C1) P1(C2) . . . P1(Cn)
The verification of these results is omitted.

Comment. What is the benefit of all this discussion of independent trials? The important fact is that if the trials are independent, then we can compute the probabilities of the events in the composite experiment on the basis of the probabilities of the events in the basic experiment. For instance, if we want to find the probability that, in rolling a fair die three times, we get an even number on the first toss, a 5 on the second toss, and a multiple of 3 on the third toss, we do not have to consider the set of triplets {(x, 5, z) | x an even number, z a multiple of 3} from among the 6^3 outcomes in the composite experiment. Instead, we can argue as follows: the probability of getting an even number on a roll of a die is 1/2, of getting a 5 is 1/6, and of getting a multiple of 3 is 1/3, and, consequently, the probability of the desired event is (1/2)(1/6)(1/3) = 1/36.

At least one, and exactly k, of n independent events
We open this discussion with the following example:
Example 3.11. Suppose A, B, C are mutually independent events with P(A) = P(B) = P(C) = p. Find the probability that (a) exactly k (k = 0, 1, 2, 3) of the events occur, (b) at least one of the events occurs.
Solution
(a) We shall calculate only the case k = 2. We see that
P(exactly two of the events A, B, C occur) = P((ABC') U (AB'C) U (A'BC))
= P(ABC') + P(AB'C) + P(A'BC)
= P(A)P(B)P(C') + P(A)P(B')P(C) + P(A')P(B)P(C), since the events are independent
= p^2(1 - p) + p^2(1 - p) + p^2(1 - p) = 3p^2(1 - p)
There is an alternate approach. Recall that
P(exactly two of the events A, B, C occur) = P(AB) + P(AC) + P(BC) - 3P(ABC)
= P(A)P(B) + P(A)P(C) + P(B)P(C) - 3P(A)P(B)P(C)
= 3p^2 - 3p^3 = 3p^2(1 - p)

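The die-rolling computation and Example 3.11 can both be checked by enumeration; a sketch (the value p0 below is illustrative, not from the text):

```python
from fractions import Fraction
from itertools import product

# Product rule for three independent rolls of a fair die.
rolls = list(product(range(1, 7), repeat=3))        # the 6**3 equally likely triples
fav = [t for t in rolls if t[0] % 2 == 0 and t[1] == 5 and t[2] % 3 == 0]
p_product_rule = Fraction(len(fav), len(rolls))     # should equal (1/2)(1/6)(1/3)

# "Exactly two of three" independent events, each of probability p0.
p0 = Fraction(2, 5)                                 # illustrative value only
p_exactly_two = Fraction(0)
for pattern in product([True, False], repeat=3):    # occurrence patterns for A, B, C
    if sum(pattern) == 2:
        term = Fraction(1)
        for occurred in pattern:
            term *= p0 if occurred else 1 - p0      # independence: multiply the factors
        p_exactly_two += term
```

The loop over occurrence patterns is the first method of Example 3.11 written out mechanically; it should agree with the closed form 3 p0^2 (1 - p0).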

Some Special Distributions


Example. A company claims that its vaccine is 90 percent effective, while a federal drug agency asserts that it is only 40 percent effective. To settle the issue, the following procedure is decided: The vaccine will be given to ten people. If eight or more people develop immunity, the company claim will be accepted; otherwise, the agency claim will be accepted. Find the probability that:
(a) the company claim will be accepted when the federal agency is correct in its assertion
(b) the agency claim will be accepted incorrectly (that is, when the vaccine is indeed 90 percent effective).
Solution. Let X denote the number of people, among the ten who are given the vaccine, who develop immunity. If the company claim is valid, X is binomial with n = 10, p = 0.9. On the other hand, if the federal agency claim is valid, then X is binomial with n = 10, p = 0.4.
(a) Here we want the probability that X >= 8 when p = 0.4:
P(X >= 8) = b(8; 10, 0.4) + b(9; 10, 0.4) + b(10; 10, 0.4) = 0.0106 + 0.0016 + 0.0001 = 0.0123
(b) Here we want the probability that X < 8 when p = 0.9:
P(X <= 7) = 1 - [b(8; 10, 0.9) + b(9; 10, 0.9) + b(10; 10, 0.9)] = 1 - 0.9298 = 0.0702

Example. Suppose X has a binomial distribution with n = 6, and suppose it is known that P(X >= 1) = 0.999936.
(a) Find p.
(b) Find P(X >= 3).
Solution
(a) P(X >= 1) = 1 - P(X = 0) = 1 - (1 - p)^6 (why?). Hence we are given that
1 - (1 - p)^6 = 0.999936
that is,
(1 - p)^6 = 0.000064
Hence 1 - p = 0.2 and, consequently, p = 0.8.
(b) The probability function of X is given as
P(X = k) = C(6, k)(0.8)^k (0.2)^(6-k)
Here we want P(X >= 3). We have
P(X >= 3) = 1 - P(X <= 2) = 0.9830
using the table.

The most probable number
For fixed n and p, the binomial probabilities b(k; n, p) depend on k. We now propose to investigate the behavior of these probabilities as k goes from 0 to n. Towards this we shall use the following identity, which can be verified without much difficulty:
b(k; n, p) = [(n - k + 1)p / (k(1 - p))] b(k - 1; n, p) = [1 + ((n + 1)p - k)/(k(1 - p))] b(k - 1; n, p)
Now:
(i) If k < (n + 1)p, then ((n + 1)p - k)/(k(1 - p)) > 0, so that b(k; n, p)/b(k - 1; n, p) > 1. Hence, if k < (n + 1)p, the terms b(k; n, p) increase with k.
(ii) If k > (n + 1)p, then ((n + 1)p - k)/(k(1 - p)) < 0, so that b(k; n, p)/b(k - 1; n, p) < 1. Consequently, if k > (n + 1)p, the terms b(k; n, p) decrease with k.
(iii) If (n + 1)p is an integer, then (n + 1)p - k = 0 for some k, say k = m. For such m we then have b(m - 1; n, p) = b(m; n, p).
In conclusion:
b(k; n, p) increases with k if k < (n + 1)p and decreases with k if k > (n + 1)p. If (n + 1)p is an integer, say equal to m, then b(m - 1; n, p) = b(m; n, p). The integral part of the number (n + 1)p represents the most probable number of successes. If (n + 1)p is an integer m, the largest value of the probability b(k; n, p) is attained for two integers, m - 1 and m.
For instance:
(a) Suppose n = 20 and p = 0.30. Then (n + 1)p = 6.3, so that b(k; 20, 0.3) increases monotonically as k goes from 0 to 6 and then decreases as k goes from 7 to 20.
(b) Suppose n = 24 and p = 0.4. Since (n + 1)p = 10, an integer, b(k; 24, 0.4) increases as k goes from 0 to 9 and decreases as k goes from 10 to 24, with b(9; 24, 0.4) = b(10; 24, 0.4).
(c) Consider the graphs of binomial probabilities in Figure 1.3. Figure 1.3(a) corresponds to n = 10, p = 0.25; in this case, (n + 1)p = 2.75 and the maximum value is attained for k = 2, the integral part of 2.75. Figure 1.3(b) corresponds to n = 5, p = 0.50. Here (n + 1)p = 3.0, an integer, and the maximum value is attained for k = 2 and k = 3.

Example 1.6. Thirteen machines are in operation. The probability that, at the end of one day, a machine is still in operation is 0.6. If the machines function independently, find the most probable number of machines in operation at the end of that day and the probability that these many machines are operating.
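The rule that the most probable number of successes is the integral part of (n + 1)p, with a tie at m - 1 and m when (n + 1)p = m is an integer, can be verified directly; a sketch using exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def b(k, n, p):
    # binomial probability b(k; n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

def modes(n, p):
    # all k at which b(k; n, p) attains its maximum
    probs = [b(k, n, p) for k in range(n + 1)]
    top = max(probs)
    return [k for k, q in enumerate(probs) if q == top]

# (n+1)p not an integer: a single mode at the integral part of (n+1)p.
m1 = modes(20, Fraction(3, 10))    # (n+1)p = 6.3
m2 = modes(10, Fraction(1, 4))     # (n+1)p = 2.75
# (n+1)p an integer m: the maximum is attained at both m-1 and m.
m3 = modes(24, Fraction(2, 5))     # (n+1)p = 10
m4 = modes(5, Fraction(1, 2))      # (n+1)p = 3
```

Using Fraction instead of floats makes the equality b(m - 1; n, p) = b(m; n, p) exact rather than approximate, so the tie cases are detected reliably.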

(b) We want to find P(X = 0); this is equal to a ratio of two binomial coefficients: the number of samples containing no defective tube divided by the total number of samples.
(c) The probability of at least one defective tube is P(X >= 1) = P(X = 1) + P(X = 2) + . . . ; however, an easier way to compute this is to note that P(X >= 1) = 1 - P(X = 0).

Example 1.8. A bowl contains M beads of which W are white and M - W are black. Suppose n beads are picked at random. Let X denote the number of white beads in the sample. Find the distribution of X assuming that:
(a) the sample is drawn without replacement
(b) the sample is drawn with replacement.
Solution
(a) In this case the distribution of X is clearly hypergeometric and is given by
P(X = k) = C(W, k) C(M - W, n - k) / C(M, n),  k = 0, 1, . . . , n
(b) Here n beads are picked with replacement. As a result, we have n independent trials each with the probability of success equal to W/M. ("Success" stands for "getting a white bead on a pick.") Hence the distribution of X is binomial with
P(X = k) = C(n, k) (W/M)^k (1 - W/M)^(n-k),  k = 0, 1, . . . , n
This result was derived in Chapter 2 (see page 60) using a purely combinatorial argument.

1.4 The Geometric Distribution
The geometric distribution finds applications in situations of the following nature: A person tosses a coin until heads show up for the first time; or a basketball player attempts a basket until he scores one; or a billiards player keeps shooting until he misses a shot. Thus, as an idealization describing these situations, the experiment consists of a sequence of independent Bernoulli trials with probability of success p on any trial, where 0 < p < 1, and the random variable X represents the number of trials required for the first success to occur. The random variable is commonly called a geometric random variable; it is also referred to as the waiting time for the first success. It should be realized that, unlike the binomial distribution, where the number of trials is fixed, in the present case the number of trials is the random variable of interest.
The possible values of X are obviously 1, 2, 3, . . . , and
P(X = r) = P(the first r - 1 trials are failures and the rth trial is a success)
Therefore, since the trials are independent,
P(X = r) = (1 - p)^(r-1) p,  r = 1, 2, 3, . . .
The distribution is called the geometric distribution because the terms p(1 - p)^(r-1), r = 1, 2, 3, . . . , represent the successive terms of a geometric series. Observe that we have a genuine assignment of probabilities because:
(i) For r = 1, 2, 3, . . . , P(X = r) = (1 - p)^(r-1) p > 0.
(ii) Since the series Σ_{r=1}^∞ (1 - p)^(r-1) is a geometric series with 0 < 1 - p < 1,
Σ_{r=1}^∞ P(X = r) = p Σ_{s=0}^∞ (1 - p)^s = p / (1 - (1 - p)) = 1

Example 1.9. In order to attract customers, a grocery store has started a SAVE game. Any person who collects all four letters of the word SAVE gets a prize. A diligent Mrs. Y who has three letters S, A, and E keeps going to the store until she gets the fourth letter V. The probability that she gets the letter V on any visit is 0.002 and remains the same from visit to visit. Let X denote the number of times she visits the store until she gets the letter V for the first time. Find:
(a) the probability function of X
(b) the probability that she gets the letter V for the first time on the twentieth visit
(c) the probability that she will not have to visit more than three times.
Solution
(a) The distribution of X is clearly geometric. Since p = 0.002, we have
P(X = r) = (1 - 0.002)^(r-1)(0.002) = (0.998)^(r-1)(0.002),  r = 1, 2, . . .
(b) P(X = 20) = (0.998)^19 (0.002) = 0.0019.
(c) We want P(X <= 3); this is equal to 1 - (0.998)^3 = 0.006.
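The figures in Example 1.9 can be reproduced in a few lines; a minimal sketch:

```python
def geom_pmf(r, p):
    # P(X = r) = (1 - p)**(r - 1) * p for r = 1, 2, 3, ...
    return (1 - p) ** (r - 1) * p

p = 0.002
p_20   = geom_pmf(20, p)                              # part (b)
p_le_3 = 1 - (1 - p) ** 3                             # part (c): P(X <= 3)
head   = sum(geom_pmf(r, p) for r in range(1, 10001)) # partial sum approaching 1
```

The partial sum illustrates point (ii) above: with p = 0.002 the first ten thousand terms already account for all but a few parts in a billion of the total probability.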

Hence
F(x) = 1/2 + (1/π) arctan((x - b)/a)
The pdf and the D.F. for the Cauchy distribution with b = 0 and a = 1 are drawn in Figure 2.13. The reader will see a close resemblance between the above graphs and those for the normal distribution. However, it should be realized that the two distributions are quite different.

Figure 2.13
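Under the parametrization used here (location b, scale a), the D.F. can be sketched as follows; this is a hedged sketch of the standard formula, not code quoted from the text:

```python
from math import atan, pi

def cauchy_df(x, b=0.0, a=1.0):
    # distribution function of the Cauchy distribution with location b, scale a
    return 0.5 + atan((x - b) / a) / pi
```

Like the normal D.F., the curve rises from 0 to 1 and is symmetric about b, but its tails approach the limits far more slowly, which is one way the two distributions differ in spite of the similar-looking graphs.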


Functions of a Random Variable

Let h be a piecewise continuous function from R to R. For any point s ∈ S, X(s) is a real number, and for this number h(X(s)) is defined. We shall denote the function s → h(X(s)) by h(X).¹

With this definition of h(X), it can be shown that if h is a piecewise continuous function and X is a random variable, then for any Borel set C of the real line the set {s | h(X)(s) ∈ C} is a member of 𝒮, the sigma field of subsets of S. (See Figure 1.2.) We shall not verify this in general; but since this is precisely the defining condition of an r.v., it follows that h(X) is a random variable.

Let us denote the random variable h(X) by Y. Now, as we are aware, an r.v. induces a probability measure on the Borel sets of the real line. In our discussion, there are two random variables involved, namely X and Y, and these will induce two probability measures which, using our previous notation, we shall denote respectively by P_X and P_Y, as indicated in Figure 1.1. Thus, for any Borel set B of the real line, we have

P_X(B) = P({s | X(s) ∈ B})

and

P_Y(B) = P({s | Y(s) ∈ B}) = P({s | h(X)(s) ∈ B})

Figure 1.1

The question is, how do the two probability measures P_X and P_Y relate to each other? To answer this, suppose C is a Borel set of the real line. Let

B = {x ∈ R | h(x) ∈ C},  A = {s ∈ S | X(s) ∈ B}

A is a subset of S that consists precisely of preimages (under X) of members of B which, in turn, is a subset of R and consists precisely of preimages (under h) of the members of C. (See Figure 1.3.)

Figure 1.3

As a natural consequence, A consists precisely of preimages (under h(X), that is, Y) of members of C. Thus

A = {s ∈ S | h(X)(s) ∈ C} = {s ∈ S | Y(s) ∈ C}

In summary, since A = {s ∈ S | X(s) ∈ B} = {s ∈ S | Y(s) ∈ C}, the probability measures P_X and P_Y are related to each other by the following assignment of probabilities:

P_Y(C) = P_X(B),  where B = {x ∈ R | h(x) ∈ C}

Figure 1.2

¹h(X) is what is called in mathematics a composite function defined on S.
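The relation P_Y(C) = P_X({x | h(x) ∈ C}) can be illustrated by simulation. The sketch below is not from the text; it is a minimal illustration assuming X uniform on (−2, 2), h(x) = x², and C = [0, 1], so that the preimage B = h⁻¹(C) is [−1, 1]:

```python
import random

random.seed(1)

def h(x):
    # a piecewise continuous function of x (here: squaring)
    return x * x

# X uniform on (-2, 2); Y = h(X) = X^2
N = 200_000
xs = [random.uniform(-2.0, 2.0) for _ in range(N)]

# C = [0, 1]; its preimage under h is B = [-1, 1]
p_Y_C = sum(0.0 <= h(x) <= 1.0 for x in xs) / N   # estimates P_Y(C)
p_X_B = sum(-1.0 <= x <= 1.0 for x in xs) / N     # estimates P_X(B)

# {h(X) in C} and {X in B} are the same event, sample by sample
assert p_Y_C == p_X_B
print(p_Y_C)
```

The two relative frequencies agree exactly (not just approximately), because {s | h(X)(s) ∈ C} and {s | X(s) ∈ B} are literally the same set of outcomes.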
Basic Probability Theory and Applications

Expectation-A Single Variable / 243

. . . expected number of defective items, given that there are at most three defective items.

If X has a continuous distribution with the pdf given by

f(x) = . . . , 0 < x < 2;  0, elsewhere

find E(X^n | X > 1) for any nonnegative integer n.

The length of a telephone conversation (measured in minutes) has the pdf

f(x) = . . . , 0 < x < 2;  . . . , x > 2;  0, elsewhere

(a) Find the expected length of a conversation.
(b) Find the expected length of a conversation, given that it lasts at least one minute.

Let Y = 0 if |X| < a, and Y = a² if |X| ≥ a. Show that E(Y) = a² P(|X| ≥ a).

11. Let X be a continuous random variable with finite range [a, b]. Show that E(X) = b − ∫_a^b F(x) dx.

12. If X is a continuous random variable, show that E(2F(X) − 1) = 0, where F is the D.F. of X.

13. Suppose X is an absolutely continuous random variable having a unique median m. If b is a real number, show that E(|X − b|) is a minimum when b = m. Hint: First show that E(|X − b|) = E(|X − m|) + 2∫_m^b (b − x)f(x) dx. Then consider the two cases m < b and m > b, and show that ∫_m^b (b − x)f(x) dx ≥ 0 and that this integral is zero when b = m.

EXPECTATIONS OF SOME SPECIAL DISTRIBUTIONS

In Chapter 5 we discussed some important discrete distributions (the binomial, the Poisson, etc.) and continuous distributions (the uniform, the normal, etc.). We referred to the constants associated with these distributions as their parameters. We are now in a position to provide physical meanings to these constants. Before embarking on this, we shall prove a result which applies to symmetric distributions.

If the distribution of X is symmetric about a, and if E(X) exists, then E(X) = a, the point of symmetry.

The proof depends on the easily proved fact that if the distribution of X is symmetric about a, then X − a and −X + a have the same distribution. (See Chapter 6, exercise 2 on page 217.) Therefore,

E(X − a) = E(−X + a)

That is,

E(X) − a = −E(X) + a

Consequently, since E(X) is finite, we get E(X) = a.

Comment. The Cauchy distribution provides an example of a distribution which is symmetric, but the point of symmetry is not the mean. The mean does not exist for the Cauchy distribution.

The Bernoulli distribution
If X has the Bernoulli distribution, then

P(X = x) = p^x (1 − p)^(1−x),  x = 0, 1

where 0 < p < 1. Therefore,

E(X) = 0 · (1 − p) + 1 · p = p
E(X²) = 0² · (1 − p) + 1² · p = p

Hence,

Var(X) = E(X²) − [E(X)]² = p − p² = p(1 − p)

Thus

E(X) = p
Var(X) = p(1 − p)

Since p(1 − p) = 1/4 − (p − 1/2)², we see that the variance is the largest when p = 1/2. This stands to reason in view of the fact that the outcome of the experiment is least predictable when p = 1/2.

The binomial distribution
Suppose X has the binomial distribution consisting of n independent trials with probability of success equal to p, 0 < p < 1. Then the probability function of X is
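The Bernoulli computations above are easy to reproduce mechanically. The sketch below (an illustrative check, not part of the text) evaluates E(X), E(X²), and Var(X) directly from the probability function and confirms that the variance p(1 − p) peaks at p = 1/2:

```python
def bernoulli_moments(p):
    # E(X) and E(X^2) computed directly from P(X=0) = 1-p, P(X=1) = p
    mean = 0 * (1 - p) + 1 * p
    second = 0**2 * (1 - p) + 1**2 * p
    var = second - mean**2
    return mean, var

for p in (0.1, 0.3, 0.5, 0.9):
    mean, var = bernoulli_moments(p)
    assert mean == p and abs(var - p * (1 - p)) < 1e-12

# Var(X) = 1/4 - (p - 1/2)^2, so the variance is maximized at p = 1/2
variances = {p / 100: bernoulli_moments(p / 100)[1] for p in range(1, 100)}
assert max(variances, key=variances.get) == 0.5
print("maximum variance", variances[0.5], "attained at p = 0.5")
```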


Table 2.1. A Glossary of the More Common Discrete Distributions

Bernoulli
  Parameters: 0 < p < 1
  Probability function: p(x) = p^x (1 − p)^(1−x), x = 0, 1
  Mean: p.  Variance: p(1 − p)

Binomial
  Parameters: n = 1, 2, . . . ; 0 < p < 1
  Probability function: p(x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, . . . , n
  Mean: np.  Variance: np(1 − p)

Geometric
  Parameters: 0 < p < 1
  Probability function: p(x) = p(1 − p)^(x−1), x = 1, 2, . . .
  Mean: 1/p.  Variance: (1 − p)/p²

Negative binomial* (Pascal)
  Parameters: r = 1, 2, . . . ; 0 < p < 1
  Probability function: p(x) = C(x − 1, r − 1) p^r (1 − p)^(x−r), x = r, r + 1, . . .
  Mean: r/p.  Variance: r(1 − p)/p²

Poisson
  Parameters: λ > 0
  Probability function: p(x) = e^(−λ) λ^x / x!, x = 0, 1, . . .
  Mean: λ.  Variance: λ

Hypergeometric*
  Parameters: N = 1, 2, . . . ; n = 1, 2, . . . , N; 0 < p < 1 with Np an integer
  Probability function: p(x) = C(Np, x) C(N − Np, n − x) / C(N, n), x = 0, 1, . . . , n
  Mean: np.  Variance: np(1 − p)(N − n)/(N − 1)

*The computation of the expectation and variance are left to the exercises. (Also, an alternate method of computation will be given for these cases in Chapter 11.)

Table 2.2. A Glossary of the More Common Continuous Distributions

Uniform over the interval [a, b]
  Parameters: −∞ < a < b < ∞
  Probability density function: f(x) = 1/(b − a), a < x < b; 0, elsewhere
  Mean: (a + b)/2.  Variance: (b − a)²/12

Normal
  Parameters: −∞ < a < ∞; b > 0
  Probability density function: f(x) = (1/(b√(2π))) e^(−(x−a)²/2b²), −∞ < x < ∞
  Mean: a.  Variance: b²

Exponential
  Parameters: λ > 0
  Probability density function: f(x) = λe^(−λx), x > 0; 0, elsewhere
  Mean: 1/λ.  Variance: 1/λ²

Gamma*
  Parameters: λ > 0; p > 0
  Probability density function: f(x) = (λ^p/Γ(p)) x^(p−1) e^(−λx), x > 0; 0, elsewhere
  Mean: p/λ.  Variance: p/λ²

*The computation of the expectation and variance are left to the exercises.
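The tabulated means and variances are easy to spot-check numerically. The sketch below (an illustrative check, not part of the text) sums the binomial and Poisson probability functions and compares the results with the table entries; the Poisson series is truncated far into its tail:

```python
from math import comb, exp

def moments(probs):
    # mean and variance of a pmf supported on 0, 1, 2, ...
    mean = sum(x * p for x, p in enumerate(probs))
    var = sum((x - mean) ** 2 * p for x, p in enumerate(probs))
    return mean, var

# Binomial(n, p): table says mean np, variance np(1 - p)
n, p = 12, 0.3
binom = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
b_mean, b_var = moments(binom)
assert abs(b_mean - n * p) < 1e-9 and abs(b_var - n * p * (1 - p)) < 1e-9

# Poisson(lam): table says mean lam, variance lam (series truncated at x = 100)
lam = 4.5
poisson, term = [], exp(-lam)
for x in range(100):
    poisson.append(term)
    term *= lam / (x + 1)   # recurrence avoids huge factorials
p_mean, p_var = moments(poisson)
assert abs(p_mean - lam) < 1e-9 and abs(p_var - lam) < 1e-9

print("table entries for the binomial and Poisson check out")
```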


► îîttîiiiiil

Scanned by CamScanner
Scanned by CamScanner
M
f U tttlU l\ì\ììììn \\
Scanned by CamScanner
u m u u u a u u iilli/ttö z flö n

Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
?î m m f11111fmum\\\\\vvs\v
Scanned by CamScanner
\\\\\\\^
m
m
im
n
n
u
u
u

Scanned by CamScanner
Scanned by CamScanner
Scanned by CamScanner
Joint and Marginal Distributions / 291

10. If X and Y have a joint distribution F, show that

F_X(x) + F_Y(y) − 1 ≤ F(x, y) ≤ √(F_X(x) · F_Y(y))

for all x, y. Hint: Consider the regions of the xy-plane.

11. If the distribution function of X and Y is such that F(x, y) = u(x) · v(y) for every x, y (u is a function of x only and v is a function of y only), show that

F_X(x) = u(x) · lim_{y→∞} v(y)

12. The joint D.F. of X and Y is given by

F(x, y) = 0, x ≤ 0 or y ≤ 0
        = (1/4)(x² + x)(y² + y), 0 < x ≤ 1 and 0 < y ≤ 1
        = (1/2)(x² + x), 0 < x ≤ 1 and y > 1
        = (1/2)(y² + y), x > 1 and 0 < y ≤ 1
        = 1, x > 1 and y > 1

Find:
(a) P(. . . < X ≤ 1)
(b) P(1/4 < X ≤ 1/2, Y > 1/2)
(c) the joint pdf of X and Y
(d) P(X > Y)
(e) P(X + Y ≤ 1)

13. For each of the discrete probability functions given below, obtain the joint distribution function.
(a) p(x, y) = (1/15)(2x − y + 1), x = 0, 1, 2, y = 0, 1; 0, elsewhere
(b) p(x, y) = (1/39) x(x + y), x = 2, 3, y = −1, 0, 1; 0, elsewhere
(c) p(x, y) = . . . , x = 1, 2, 3, y = 1, 2; 0, elsewhere

14. For each of the joint pdf's given below, obtain the joint distribution function.
(a) f(x, y) = . . . , 0 < x < 1, 0 < y < 1; 0, elsewhere
(b) f(x, y) = . . . , 0 < y < 2x < 1; 0, elsewhere
(c) f(x, y) = . . . , 0 < y < x ≤ 1; 0, elsewhere

15. Show why the function p given below cannot represent a joint probability function for any choice of c.

p(x, y) = cx(2x − y), x = 0, 1, 2, y = 0, 3; 0, elsewhere

16. Show why the function f defined by

f(x, y) = cx(3x − y), 0 < x < 2, −x < y < 4x; 0, elsewhere

cannot represent a pdf for any choice of c.

17. The functions p defined below represent joint probability functions for appropriate choice of the constant c. Determine c.
(a) p(x, y) = . . . , x = 0, 2, y = −2, −1; 0, elsewhere
(b) p(x, y) = c|x − y|, x = −2, 0, 2, y = −2, 3; 0, elsewhere

18. Determine for what constant c the functions given below will represent pdf's.
(a) f(x, y) = c(x + 3y), 0 < x < 1, 0 < y < 1; 0, elsewhere
(b) f(x, y) = . . . ; 0, elsewhere
(c) f(x, y) = cxy, 0 < y < 2, . . . ; 0, elsewhere
(d) f(x, y) = . . . ; 0, elsewhere
(e) f(x, y) = ce^. . . , x < 2y; 0, elsewhere
(f) f(x, y) = . . . ; 0, elsewhere

19. Two random variables X and Y have a joint discrete distribution with probability function p(x, y) as described in the following table:

          y = −3   y = 0   y = 2   y = 4
x = −4      —        —       —       —
x =  3      —        —       —       —
x =  5      —        —       —       —

Find:
(a) P(X > Y)
(b) P(X + Y ≤ 0)
(c) P(X² > Y²)
(d) P(XY > 0)

20. Suppose the joint pdf of X and Y is given by

f(x, y) = e^(−(x+y)), x > 0, y > 0; 0, elsewhere

Find:
(a) P(X + Y ≤ 2)
(b) P(X > 2Y)



21. If the joint pdf of X and Y is given by

f(x, y) = . . . x(1 − x), 0 < x < 1, 0 < y < . . . ; 0, elsewhere

find:
(a) P(X = Y)
(b) P(X < 2Y)
(c) . . .
(d) P(. . . < X < 1)
(e) . . .
(f) P(1/4 < X < 1/2 or Y < 1/2)

22. If the joint distribution of X and Y is described by the pdf

f(x, y) = . . . , x² + y² ≤ 1; 0, elsewhere

find the indicated probabilities. Hint: Use polar coordinates.

23. Suppose X and Y have a joint absolutely continuous distribution with the following pdf:

f(x, y) = . . . , 0 < y < . . . ; 0, elsewhere

Let U = X + Y. What can you say about the distribution of U? Hint: Use polar coordinates.

2. MARGINAL DISTRIBUTIONS

2.1 A General Discussion

As before, suppose (S, 𝒮, P) is a probability space, and X and Y are two real-valued functions defined on S. Several questions are pertinent:

Question 1. Suppose (X, Y) is a random vector. What can be said about X and Y individually? Are they random variables? If they are, can we find their distributions if the joint distribution of X and Y is known?

Question 2. Conversely, suppose X and Y are random variables. Then, is (X, Y) a random vector? If it is, can we find the joint distribution when the individual distributions of X and Y are known?

The answer to Question 1 is contained in the following:

If (X, Y) is a bivariate random vector, then X and Y are each random variables. Furthermore, the distributions of X and Y are given by

F_X(u) = F(u, ∞) = lim_{v→∞} F(u, v)
F_Y(v) = F(∞, v) = lim_{u→∞} F(u, v)

Let us prove this. Since (X, Y) is a bivariate random vector, by definition,

{s | X(s) ≤ u, Y(s) ≤ v} ∈ 𝒮

for any pair of real numbers u, v. In particular, then, {s | X(s) ≤ u, Y(s) ≤ n} ∈ 𝒮 for every positive integer n, and hence so is the union of these sets over n, namely {s | X(s) ≤ u, Y(s) < ∞}. However,

{s | X(s) ≤ u, Y(s) < ∞} = {s | X(s) ≤ u} ∩ {s | Y(s) < ∞}
                         = {s | X(s) ≤ u} ∩ S, since Y is real valued
                         = {s | X(s) ≤ u}

Hence, if (X, Y) is a random vector, then {s | X(s) ≤ u} ∈ 𝒮 for every real number u and, consequently, X is a random variable. A similar argument shows that Y is a random variable.

Incidentally, the equality

{s | X(s) ≤ u, Y(s) < ∞} = {s | X(s) ≤ u}

yields

F(u, ∞) = F_X(u)

Similarly,

{s | X(s) < ∞, Y(s) ≤ v} = {s | Y(s) ≤ v}

gives

F(∞, v) = F_Y(v)

It follows from the above discussion that the distributions of X and Y can be obtained from the knowledge of their joint distribution. The individual distributions of X and Y are called their marginal distributions. Thus F_X, F_Y are called respectively the marginal distribution functions of X and Y. There is nothing wrong if the qualifier "marginal" is omitted because, after all, these are the distributions of X and Y in the usual sense.

Example 2.1. Consider the joint distribution function of X and Y given by

F(u, v) = 0, u < 0 or v < 0
        = 1 − 2e^(−v) + e^(−2v), u ≥ v and v ≥ 0
        = 1 − e^(−2u) + 2e^(−(u+v)) − 2e^(−v), u ≥ 0 and v > u

Find the distributions of X and Y.

Solution

F_X(u) = F(u, ∞) = lim_{v→∞} F(u, v)

F_Y(v) = F(∞, v) = lim_{u→∞} F(u, v)
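The limits in Example 2.1 can be checked numerically by pushing v to a large value. The sketch below is illustrative, not from the text; it uses the joint D.F. as reconstructed above (1 − 2e^(−v) + e^(−2v) for u ≥ v ≥ 0, and 1 − e^(−2u) + 2e^(−(u+v)) − 2e^(−v) for v > u ≥ 0) and compares F(u, "∞") with the closed-form marginal F_X(u) = 1 − e^(−2u):

```python
from math import exp

def F(u, v):
    # Joint D.F. of Example 2.1 (as reconstructed here)
    if u < 0 or v < 0:
        return 0.0
    if u >= v:
        return 1 - 2 * exp(-v) + exp(-2 * v)
    return 1 - exp(-2 * u) + 2 * exp(-(u + v)) - 2 * exp(-v)

BIG = 50.0  # stands in for the limit v -> infinity

for u in (0.2, 0.7, 1.5, 3.0):
    approx = F(u, BIG)           # F(u, "infinity")
    exact = 1 - exp(-2 * u)      # marginal F_X(u)
    assert abs(approx - exact) < 1e-12

print("F(u, v) tends to 1 - e^(-2u) as v grows, for each fixed u")
```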

Joint and Marginal Distributions / 299

The probability function of X is obtained by summing the joint probabilities with respect to all the possible values of y, and similarly for Y. The computation is conveniently arranged in a table:

            y₁           y₂           . . .   yⱼ           . . .  | P(X = xᵢ)
x₁          p(x₁, y₁)    p(x₁, y₂)    . . .   p(x₁, yⱼ)    . . .  | Σⱼ p(x₁, yⱼ)
x₂          p(x₂, y₁)    p(x₂, y₂)    . . .   p(x₂, yⱼ)    . . .  | Σⱼ p(x₂, yⱼ)
. . .
xᵢ          p(xᵢ, y₁)    p(xᵢ, y₂)    . . .   p(xᵢ, yⱼ)    . . .  | Σⱼ p(xᵢ, yⱼ)
. . .
P(Y = yⱼ)   Σᵢ p(xᵢ, y₁)  Σᵢ p(xᵢ, y₂)  . . .

As can be seen, the totals in the vertical and horizontal margins in fact represent, respectively, the probability functions of X and Y. It is because of this feature that the individual distributions of X and Y are often called the marginal distributions.

In Section 2.1, we saw that distinct joint distributions can give rise to the same marginal distributions. As another example of this, consider the family of joint probability functions given below (where 0 < ε/2 < 1/18 (why?)): a table of joint probabilities whose entries are fixed fractions with denominator 18 shifted by ±ε/2.

This table describes a family of joint probability functions for different values of ε. But no matter what ε is (as long as 0 < ε/2 < 1/18), we always get the same marginal probability function of X and the same marginal probability function of Y.

Example 2.3. If X and Y have the joint probability function given by

P(X = x, Y = y) = (1/26)(x + y),  x = 1, 4,  y = −1, 0, 1, 3

find the marginal distributions of X and Y.

Solution. The probability function of X is obtained by summing the joint probabilities with respect to all the possible values of y. Therefore, for x = 1, 4,

P(X = x) = (1/26)[(x − 1) + (x + 0) + (x + 1) + (x + 3)]

Thus

P(X = x) = (4x + 3)/26,  x = 1, 4

Similarly, the probability function of Y is given by

P(Y = y) = (1/26)[(1 + y) + (4 + y)] = (5 + 2y)/26,  y = −1, 0, 1, 3

Example 2.4 (The trinomial distribution). Suppose X and Y have the trinomial distribution with n trials and parameters p, q. (See Example 1.9.) Find the marginal distributions of X and Y.

Solution. We know that the joint probability function of X and Y is

P(X = i, Y = j) = [n!/(i! j! (n − i − j)!)] p^i q^j (1 − p − q)^(n−i−j)

where i = 0, 1, . . . , n; j = 0, 1, . . . , n − i. Therefore, for i = 0, 1, . . . , n, we have

P(X = i) = Σ_{j=0}^{n−i} P(X = i, Y = j)
         = [n!/(i! (n − i)!)] p^i Σ_{j=0}^{n−i} [(n − i)!/(j! (n − i − j)!)] q^j (1 − p − q)^(n−i−j)
         = C(n, i) p^i [q + (1 − p − q)]^(n−i)
         = C(n, i) p^i (1 − p)^(n−i)

Thus, if X and Y have the trinomial distribution with n trials and parameters p, q, then X has the binomial distribution with n trials and the probability of success p. Similarly, Y has the binomial distribution with n trials and the probability of success q.
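The conclusion of Example 2.4 can be verified mechanically: summing the trinomial joint probabilities over j reproduces binomial probabilities. A small illustrative sketch (not part of the text):

```python
from math import comb, factorial

def trinomial_pmf(n, p, q, i, j):
    # P(X = i, Y = j) = n!/(i! j! (n-i-j)!) p^i q^j (1-p-q)^(n-i-j)
    coef = factorial(n) // (factorial(i) * factorial(j) * factorial(n - i - j))
    return coef * p**i * q**j * (1 - p - q) ** (n - i - j)

n, p, q = 10, 0.2, 0.5
for i in range(n + 1):
    marginal = sum(trinomial_pmf(n, p, q, i, j) for j in range(n - i + 1))
    binom = comb(n, i) * p**i * (1 - p) ** (n - i)
    assert abs(marginal - binom) < 1e-12

print("marginal of X is Binomial(n, p)")
```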

2.3 The Absolutely Continuous Case

If X and Y have a joint absolutely continuous distribution with a joint pdf f, then X and Y each have an absolutely continuous distribution, and the pdf's of X and Y are given by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,  −∞ < x < ∞

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx,  −∞ < y < ∞

That is, each marginal distribution function is obtained by appropriately integrating a nonnegative function, which then serves as the probability density function. Now we already know that

F_X(u) = F(u, ∞) = ∫_{−∞}^{u} ∫_{−∞}^{∞} f(x, y) dy dx

Therefore,

F_X(u) = ∫_{−∞}^{u} [∫_{−∞}^{∞} f(x, y) dy] dx

shows that the distribution function of X can be obtained by integrating the nonnegative function ∫_{−∞}^{∞} f(x, y) dy. Therefore, by definition, X has an absolutely continuous distribution with the pdf ∫_{−∞}^{∞} f(x, y) dy.

A similar argument shows that Y is absolutely continuous with pdf ∫_{−∞}^{∞} f(x, y) dx.

Example 2.5. Suppose the joint pdf of X and Y is given by

f(x, y) = (2/3)(1 + x), 0 < x < 1, 0 < y < 1; 0, elsewhere

Find the marginal distribution of X.

Solution. We have f_X(x) = ∫_{−∞}^{∞} f(x, y) dy.

If x < 0 or x > 1, then f(x, y) = 0 for every y. Therefore,

f_X(x) = ∫_{−∞}^{∞} 0 dy = 0,  if x < 0 or x > 1

If 0 < x < 1, as can be seen from Figure 2.1,

f(x, y) = 0, if y < 0 or y > 1;  (2/3)(1 + x), if 0 < y < 1

Figure 2.1

Therefore,

f_X(x) = ∫_{−∞}^{0} 0 dy + ∫_0^1 (2/3)(1 + x) dy + ∫_1^{∞} 0 dy = (2/3)(1 + x),  0 < x < 1

Combining the above, we get the pdf of X as

f_X(x) = (2/3)(1 + x), 0 < x < 1; 0, elsewhere

Example 2.6. Let

f(x, y) = 10x²y, 0 < y < x < 1; 0, elsewhere

(a) Find the marginal pdf's of X and Y.
(b) Compute P(Y ≤ 1/2).

Solution
(a) We first find the pdf of X.
If x < 0 or x > 1, then f(x, y) = 0 for every y, and therefore

f_X(x) = ∫_{−∞}^{∞} 0 dy = 0

If 0 < x < 1, it can be seen from Figure 2.2 that

f_X(x) = ∫_{−∞}^{0} 0 dy + ∫_0^x 10x²y dy + ∫_x^{∞} 0 dy = 5x⁴

Combining these results, we get

f_X(x) = 5x⁴, 0 < x ≤ 1; 0, elsewhere
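The marginal in Example 2.6 can be checked by numerically integrating the joint pdf over y. The sketch below (illustrative, not part of the text) uses a midpoint rule and compares the estimate of ∫ f(x, y) dy with 5x⁴:

```python
def f(x, y):
    # Joint pdf of Example 2.6: 10 x^2 y on the triangle 0 < y < x < 1
    return 10 * x * x * y if 0 < y < x < 1 else 0.0

def marginal_x(x, steps=20000):
    # Midpoint rule for f_X(x) = integral of f(x, y) dy over (0, 1)
    h = 1.0 / steps
    return sum(f(x, (k + 0.5) * h) for k in range(steps)) * h

for x in (0.25, 0.5, 0.9):
    assert abs(marginal_x(x) - 5 * x**4) < 1e-3

print("f_X(x) = 5x^4 on (0, 1), as computed in the text")
```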

Conditional Distributions and Independent Random Variables / 313

Let u be a real number. Then the function of u defined by

F(u | A) = P(X ≤ u | A)

is called the conditional distribution function of the random variable X given the event A.

Example 1.1. Let X and Y have a joint distribution given by the following pdf:

f(x, y) = 10x²y, 0 < y < x < 1; 0, elsewhere

Find:
(a) the conditional distribution function of X given 0 < Y < 1/2
(b) P(1/3 < X ≤ 2/3 | 0 < Y < 1/2)
(c) P(. . . < X ≤ . . . | 0 < Y < 1/2)

Solution
(a) We want to find P(X ≤ u | 0 < Y < 1/2) for any real number u. This is given by

P(X ≤ u | 0 < Y < 1/2) = P(X ≤ u, 0 < Y < 1/2) / P(0 < Y < 1/2)

and P(0 < Y < 1/2) = 19/48, from Example 2.6 in Chapter 8.

There are four cases that need to be considered, depending upon the location of u on the real line. These are u ≤ 0, 0 < u ≤ 1/2, 1/2 < u ≤ 1, and u > 1. The last three cases are shown in Figures 1.1(a), (b), and (c). It can be seen from these figures that

P(X ≤ u, 0 < Y < 1/2) = 0, u ≤ 0
 = ∫_0^u ∫_y^u 10x²y dx dy, 0 < u ≤ 1/2
 = ∫_0^{1/2} ∫_y^u 10x²y dx dy, 1/2 < u ≤ 1
 = ∫_0^{1/2} ∫_y^1 10x²y dx dy, u > 1

Figure 1.1

Therefore, the conditional distribution function of X given 0 < Y < 1/2 is obtained as

F(u | 0 < Y < 1/2) = 0, u ≤ 0
 = (48/19) u⁵, 0 < u ≤ 1/2
 = (1/19)(20u³ − 1), 1/2 < u ≤ 1
 = 1, u > 1

(b) To find P(1/3 < X ≤ 2/3 | 0 < Y < 1/2), let us write P(X ≤ u | 0 < Y < 1/2) = F(u | 0 < Y < 1/2). Then, recalling that P(a < Z ≤ b) = F_Z(b) − F_Z(a), we get

P(1/3 < X ≤ 2/3 | 0 < Y < 1/2) = F(2/3 | 0 < Y < 1/2) − F(1/3 | 0 < Y < 1/2)
 = (1/19)(20(2/3)³ − 1) − (48/19)(1/3)⁵
 = 383/1539

(c) As in (b), the desired probability is obtained as a difference of two values of the conditional distribution function found in (a).

1.2 Conditional Distribution Given a Specific Value

Lest the reader be misled by the heading, our goal is to find, specifically, the conditional distribution of one random variable given that the other random variable has assumed a specific value. First we will consider the discrete case, then the absolutely continuous case.

The discrete case
Suppose X and Y have a joint discrete distribution where the possible values of X are x₁, x₂, . . . , and those of Y are y₁, y₂, . . . . Then, by directly applying the definition of conditional probability, we can find the probability of the event that X assumes a value xᵢ, given that Y = yⱼ, as

P(X = xᵢ | Y = yⱼ) = P(X = xᵢ, Y = yⱼ) / P(Y = yⱼ)

Solution
(a) It can be shown that the marginal pdf of X is

f_X(x) = e^(−x), x > 0; 0, x ≤ 0

Since f_{Y|X}(y | x₀) is defined only when f_X(x₀) > 0, we can consider the conditional pdf only if x₀ > 0. Now,

f(x₀, y) = e^(−y), y > x₀; 0, y < x₀

Therefore,

f_{Y|X}(y | x₀) = f(x₀, y)/f_X(x₀) = e^(−y)/e^(−x₀) = e^(−(y−x₀)), y > x₀; 0, y < x₀

(b) The conditional distribution function of Y given X = x₀ is

P(Y ≤ u | X = x₀) = ∫_{−∞}^{u} f_{Y|X}(y | x₀) dy
 = 0, u < x₀
 = ∫_{x₀}^{u} e^(−(y−x₀)) dy = 1 − e^(−(u−x₀)), u ≥ x₀

(c) Since we have already obtained the conditional distribution function in part (b), we can find P(3 < Y ≤ 4 | X = 2) as follows. Writing F_{Y|X}(u | x₀) for P(Y ≤ u | X = x₀),

P(3 < Y ≤ 4 | X = 2) = F_{Y|X}(4 | 2) − F_{Y|X}(3 | 2)  (why?)
 = (1 − e^(−(4−2))) − (1 − e^(−(3−2))) = e^(−1) − e^(−2)

Example 1.7. Suppose the joint distribution function of X and Y is given by

F(x, y) = 0, x ≤ 0 or y ≤ 0
 = xy(x + y − xy²), 0 < x ≤ 1 and 0 < y ≤ 1
 = y(1 + y − y²), x > 1 and 0 < y ≤ 1
 = x, 0 < x ≤ 1 and y > 1
 = 1, x > 1 and y > 1

Find:
(a) the conditional pdf of Y given X = x₀
(b) the conditional pdf of X given Y = y₀
(c) P(X > 1/2 | Y = 1/2)

Solution. Right away, we get the marginal distribution functions of X and Y:

F_X(x) = lim_{y→∞} F(x, y) = 0, x ≤ 0; x, 0 < x ≤ 1; 1, x > 1

and

F_Y(y) = lim_{x→∞} F(x, y) = 0, y ≤ 0; y(1 + y − y²), 0 < y ≤ 1; 1, y > 1

In order to find the conditional pdf's, we require the joint pdf and the marginal pdf's. These are obtained as follows:

f(x, y) = ∂²F/∂x∂y = 2(x + y − 3xy²), 0 < x < 1, 0 < y < 1; 0, elsewhere

f_X(x) = (d/dx) F_X(x) = 1, 0 < x < 1; 0, elsewhere

f_Y(y) = (d/dy) F_Y(y) = 1 + 2y − 3y², 0 < y < 1; 0, elsewhere

(a) The conditional pdf of Y given X = x₀ is defined provided 0 < x₀ < 1. In that case,

f_{Y|X}(y | x₀) = 2(x₀ + y − 3x₀y²), 0 < y < 1; 0, elsewhere

(b) The conditional pdf f_{X|Y}(x | y₀) is defined only if 0 < y₀ < 1, and we have

f_{X|Y}(x | y₀) = 2(x + y₀ − 3xy₀²)/(1 + 2y₀ − 3y₀²), 0 < x < 1; 0, elsewhere

(c) We shall first find P(X > 1/2 | Y = y₀) for any 0 < y₀ < 1.

P(X > 1/2 | Y = y₀) = 1 − P(X ≤ 1/2 | Y = y₀)
 = 1 − ∫_0^{1/2} [2(x + y₀ − 3xy₀²)/(1 + 2y₀ − 3y₀²)] dx
 = 1 − (1 + 4y₀ − 3y₀²)/(4(1 + 2y₀ − 3y₀²))

Hence, substituting y₀ = 1/2,

P(X > 1/2 | Y = 1/2) = 1 − (9/4)/5 = 11/20
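Part (c) of Example 1.7 can be confirmed numerically from the conditional pdf. The sketch below (illustrative, using the joint pdf reconstructed above, f(x, y) = 2(x + y − 3xy²) on the unit square) estimates the numerator and denominator integrals by a midpoint rule:

```python
def f(x, y):
    # Joint pdf of Example 1.7 on the unit square (as reconstructed here)
    return 2 * (x + y - 3 * x * y * y) if 0 < x < 1 and 0 < y < 1 else 0.0

def cond_prob_x_above(a, y0, steps=100_000):
    # P(X > a | Y = y0) = [int_a^1 f(x, y0) dx] / [int_0^1 f(x, y0) dx]
    h = 1.0 / steps
    num = sum(f((k + 0.5) * h, y0) for k in range(steps) if (k + 0.5) * h > a) * h
    den = sum(f((k + 0.5) * h, y0) for k in range(steps)) * h
    return num / den

p = cond_prob_x_above(0.5, 0.5)
assert abs(p - 11 / 20) < 1e-3   # the text's answer: 11/20
print(round(p, 4))
```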

Conditional Distributions and Independent Random Variables / 323

If the joint pdf of X and Y is given as f(x, y) = K·u(x, y), where K is a constant, there is no need to compute the constant K for finding the conditional pdf's. This follows because, for instance,

f_{X|Y}(x | y) = f(x, y)/∫ f(x, y) dx = K·u(x, y)/(K ∫ u(x, y) dx) = u(x, y)/∫ u(x, y) dx

where K cancels out.

Example 1.8. Suppose the joint distribution of X and Y is given by the following joint pdf:

f(x, y) = K(x² + y²), 0 < x² + y² ≤ 1; 0, elsewhere

Find:
(a) the conditional pdf of X given Y = y₀
(b) P(|X| > 1/2 | Y = . . .)

Solution
(a) First we need the marginal pdf of Y, which we shall obtain in terms of K:

f_Y(y) = ∫ f(x, y) dx = (2K/3)√(1 − y²)(1 + 2y²), −1 < y < 1; 0, elsewhere

Next, we note that

f(x, y₀) = K(x² + y₀²), −√(1 − y₀²) < x < √(1 − y₀²); 0, elsewhere

Therefore, for −1 < y₀ < 1,

f_{X|Y}(x | y₀) = 3(x² + y₀²)/(2√(1 − y₀²)(1 + 2y₀²)), −√(1 − y₀²) < x < √(1 − y₀²); 0, elsewhere

(b) Substituting the given value of y₀ into the conditional pdf obtained in (a), the desired probability is found by integrating it over |x| > 1/2.

Example 1.9. (The standard bivariate normal distribution) Let X and Y have the joint pdf

f(x, y) = [1/(2π√(1 − ρ²))] e^{−[2(1 − ρ²)]⁻¹ (x² − 2ρxy + y²)},  −∞ < x < ∞, −∞ < y < ∞

where −1 < ρ < 1. Find the conditional pdf of Y given X = x.

Solution. We have seen in Example 2.8 of Chapter 8 that X has a standard normal distribution. Hence,

f_{Y|X}(y | x) = f(x, y)/f_X(x)
 = [1/(2π√(1 − ρ²))] e^{−[2(1 − ρ²)]⁻¹ (x² − 2ρxy + y²)} ÷ [(1/√(2π)) e^{−x²/2}]
 = [1/√(2π(1 − ρ²))] e^{−[2(1 − ρ²)]⁻¹ (x² − 2ρxy + y² − (1 − ρ²)x²)}
 = [1/√(2π(1 − ρ²))] e^{−[2(1 − ρ²)]⁻¹ (ρ²x² − 2ρxy + y²)}
 = [1/√(2π(1 − ρ²))] e^{−(y − ρx)²/[2(1 − ρ²)]},  −∞ < y < ∞

Therefore, from the functional form of the conditional pdf, we recognize that the conditional distribution of Y, given X = x, is normal with mean ρx and variance 1 − ρ². Similarly, the conditional distribution of X, given Y = y, is normal with mean ρy and variance 1 − ρ².

It follows immediately from the definition of the conditional probability density function that

f(x, y) = f_X(x) · f_{Y|X}(y | x),  −∞ < x < ∞, −∞ < y < ∞

Example 1.10. A nonnegative number X is picked with the probability law given by the following pdf:

f_X(x) = xe^(−x), x > 0; 0, elsewhere

If X = x, a number Y is picked with the uniform distribution over the interval [0, x]. Find:
(a) the joint pdf of X and Y
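The conclusion of Example 1.9 can be illustrated by simulation. The sketch below is not from the text; it generates Y as ρX + √(1 − ρ²)·Z with Z standard normal (the standard construction of the bivariate normal pair), keeps the draws with X near a fixed x₀, and checks that their mean and variance are close to ρx₀ and 1 − ρ²:

```python
import random

random.seed(7)
rho, x0 = 0.6, 1.0
N = 400_000

ys = []
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    # Given X = x, Y = rho*x + sqrt(1 - rho^2) * Z with Z ~ N(0, 1)
    y = rho * x + (1 - rho**2) ** 0.5 * random.gauss(0.0, 1.0)
    if abs(x - x0) < 0.05:           # keep draws with X near x0
        ys.append(y)

m = sum(ys) / len(ys)
v = sum((y - m) ** 2 for y in ys) / len(ys)
assert abs(m - rho * x0) < 0.05      # conditional mean ~ rho * x0
assert abs(v - (1 - rho**2)) < 0.05  # conditional variance ~ 1 - rho^2
print(round(m, 3), round(v, 3))
```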

Definition 2. Two random variables X and Y are said to be independent if for every pair of real numbers x and y the two events {s | X(s) ≤ x} and {s | Y(s) ≤ y} are independent. In other words,

P(X ≤ x, Y ≤ y) = P(X ≤ x) · P(Y ≤ y)

for every x and y. This criterion, expressed in terms of the distribution functions, states that two random variables are independent if and only if F(x, y) = F_X(x) · F_Y(y) for every pair of real numbers x, y.

That definition 1 implies definition 2 is easy to verify: one has only to take B₁ = (−∞, x] and B₂ = (−∞, y]. To prove that definition 2 implies definition 1 is much harder, and is omitted.

In Chapter 8 we saw that joint distributions uniquely determine the marginal distributions. Also, we showed through examples that the converse is false, in that it is not possible to determine the joint distribution from knowledge of the marginal distributions. We find now that an important exception to this is provided when the random variables are independent, and that, as a matter of fact, in this case the joint distribution is given by F(x, y) = F_X(x) · F_Y(y) for any real numbers x, y.

We give yet another definition of independence.

Definition 3. Two random variables X and Y are said to be independent if for every choice of real numbers a, b (a < b) and c, d (c < d) the pairs of events {a < X ≤ b}, {c < Y ≤ d} are independent; that is,

P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b) · P(c < Y ≤ d)

It is left to the reader to show that definition 2 and definition 3 are equivalent. Random variables which are not independent are said to be dependent random variables.

Example 2.1. The joint D.F. of X and Y is given by

F(x, y) = 0, x < 0 or y < 0; (1 − e^(−x))(1 − e^(−y)), x ≥ 0 and y ≥ 0

Are X and Y independent?

Solution. The marginal D.F.'s of X and Y are given by

F_X(x) = lim_{y→∞} F(x, y) = 0, x < 0; 1 − e^(−x), x ≥ 0

and

F_Y(y) = lim_{x→∞} F(x, y) = 0, y < 0; 1 − e^(−y), y ≥ 0

Since F(x, y) = F_X(x) · F_Y(y) for every x, y, the random variables X and Y are independent.

Example 2.2. Suppose two random variables X and Y have the following joint distribution function:

F(x, y) = 0, x < 0 or y < 0
 = 1 − 2e^(−y) + e^(−2y), x ≥ y and y ≥ 0
 = 1 − e^(−2x) + 2e^(−(x+y)) − 2e^(−y), x ≥ 0 and y > x

Show that X and Y are not independent.

Solution. In Example 2.1 of Chapter 8 the marginal distribution functions were found as

F_X(x) = 0, x < 0; 1 − e^(−2x), x ≥ 0

and

F_Y(y) = 0, y < 0; 1 − 2e^(−y) + e^(−2y), y ≥ 0

Now, if x, y > 0,

F_X(x) · F_Y(y) = (1 − e^(−2x)) · (1 − 2e^(−y) + e^(−2y)) ≠ F(x, y)

Hence X and Y are not independent.

To determine whether two random variables are independent or not, there is actually no need to find the marginal D.F.'s explicitly and then check whether F(x, y) = F_X(x) · F_Y(y). It suffices to check whether the joint D.F. can be factored as a product of two functions (not necessarily D.F.'s), one depending on x only, and the other on y only (see exercise 1). On the basis of this, we could decide immediately that the random variables in Example 2.2 are not independent, without finding the marginal D.F.'s.

Sometimes one can tell if two random variables are not independent pictorially, simply by looking at the domain of the definition of the probability distribution. To see this, suppose X and Y have a joint pdf which is positive over the shaded region indicated in Figure 2.1, and zero outside it. (For convenience we are assuming that X and Y have a joint absolutely continuous distribution.)

Figure 2.1
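The factorization test can be applied to Example 2.2 numerically. The sketch below is illustrative, not from the text; it uses the joint D.F. as reconstructed above and shows that F(x, y) differs from F(x, ∞)·F(∞, y) at a single counterexample point, which is enough to rule out independence:

```python
from math import exp

def F(x, y):
    # Joint D.F. of Example 2.2 (as reconstructed here)
    if x < 0 or y < 0:
        return 0.0
    if x >= y:
        return 1 - 2 * exp(-y) + exp(-2 * y)
    return 1 - exp(-2 * x) + 2 * exp(-(x + y)) - 2 * exp(-y)

BIG = 60.0  # stands in for +infinity
x, y = 1.0, 2.0
joint = F(x, y)
product = F(x, BIG) * F(BIG, y)      # F_X(x) * F_Y(y)
assert abs(joint - product) > 0.01   # factorization fails, so not independent
print(round(joint, 4), round(product, 4))
```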

It can be shown that this definition is equivalent to the other three. We shall be content to show that it is equivalent to definition 2. Before proving the equivalence, we shall establish the following result:

If X and Y are independent random variables, then

F(a, b⁻) = F_X(a) · F_Y(b⁻)

This is true because

F(a, b⁻) = lim_{n→∞} F(a, b − 1/n)
 = lim_{n→∞} F_X(a) · F_Y(b − 1/n), since X and Y are independent
 = F_X(a) · lim_{n→∞} F_Y(b − 1/n) = F_X(a) · F_Y(b⁻)

We shall now show that definitions 2 and 4(a) are equivalent.

(i) Assume X and Y are independent as per definition 2, that is, F(x, y) = F_X(x) · F_Y(y) for every x, y. Then

P(X = xᵢ, Y = yⱼ) = F(xᵢ, yⱼ) − F(xᵢ⁻, yⱼ) − F(xᵢ, yⱼ⁻) + F(xᵢ⁻, yⱼ⁻)
 = F_X(xᵢ)F_Y(yⱼ) − F_X(xᵢ⁻)F_Y(yⱼ) − F_X(xᵢ)F_Y(yⱼ⁻) + F_X(xᵢ⁻)F_Y(yⱼ⁻)
 = [F_X(xᵢ) − F_X(xᵢ⁻)] · [F_Y(yⱼ) − F_Y(yⱼ⁻)]

That is,

P(X = xᵢ, Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ)

This implies independence according to definition 4(a).

(ii) Conversely, assume independence according to definition 4(a), that is, P(X = xᵢ, Y = yⱼ) = P(X = xᵢ) · P(Y = yⱼ). Then, by the definition of a joint distribution function,

F(x, y) = Σ_{yⱼ ≤ y} Σ_{xᵢ ≤ x} P(X = xᵢ, Y = yⱼ)
 = Σ_{yⱼ ≤ y} Σ_{xᵢ ≤ x} P(X = xᵢ) · P(Y = yⱼ), by assumption
 = [Σ_{xᵢ ≤ x} P(X = xᵢ)] · [Σ_{yⱼ ≤ y} P(Y = yⱼ)]
 = F_X(x) · F_Y(y)

This gives independence according to definition 2. Hence, from (i) and (ii), we have the equivalence of the two definitions.

Comment. If X and Y are independent random variables, then, for any xᵢ, yⱼ,

P(X = xᵢ | Y = yⱼ) = P(X = xᵢ)

because

P(X = xᵢ | Y = yⱼ) = P(X = xᵢ, Y = yⱼ)/P(Y = yⱼ) = p_X(xᵢ) · p_Y(yⱼ)/p_Y(yⱼ) = p_X(xᵢ)

That is, the conditional distribution of X, given Y = yⱼ, is the same as the distribution of X and hence does not depend on Y. It is this consequence that might explain the use of the term "independence."

Similarly, if X and Y are independent random variables,

P(Y = yⱼ | X = xᵢ) = P(Y = yⱼ) = p_Y(yⱼ)

for any yⱼ.

Example 2.3. If P(X = i, Y = j) = 1/2^(i+j), i, j = 1, 2, . . . , show that X and Y are independent.

Solution. In Example 1.3 we showed that

p_X(i) = 1/2^i, i = 1, 2, . . .

and

p_Y(j) = 1/2^j, j = 1, 2, . . .

Since p(i, j) = p_X(i) · p_Y(j) for every i, j = 1, 2, . . . , the random variables X and Y are independent.

Example 2.4. Show that the random variables X and Y which have the trinomial distribution with n trials and parameters p, q are not independent.

Solution. We know from Example 2.4 of Chapter 8 that the marginal distribution of X is binomial with n trials and the probability of success p. Also we saw in Example 1.4 that the conditional distribution of X given Y = j is binomial with n − j trials and the probability of success p/(1 − q). Since the conditional distribution of X given Y = j is not the same as the marginal distribution of X, we can conclude that the random variables are not independent.

Example 2.5. Consider a sample space S = {s₁, s₂, s₃, s₄} where all the outcomes are equally likely. Let X and Y be two random variables defined on S as shown below, by a table giving the values of X and Y at s₁, s₂, s₃, s₄.

i/ Beac Probability Theory end Applications Venables /
Con*ditiona/ Distnbutions and Independent Random
Show that X and Y are independent, and find
(a) P(· < X < ·)   (b) P(· < Y < 1)

The area of the region E equals the area of the square minus the areas of the two triangles, that is, 25 − 2(½ · 3 · 3) = 16. Since the height at each point of the region E is 1/25, the desired probability is 16/25. (Figure 2.4: the region E inside the square with vertices (0, 0), (5, 0), ….)

If X and Y are independent, then so are h(X) and g(Y). Hence, if X and Y are independent (standard normal) random variables, then X² and Y² are independent random variables. Therefore, writing f_{X², Y²} for the joint pdf of X² and Y²,

f_{X², Y²}(u, v) = f_{X²}(u) · f_{Y²}(v)

for any u, v. From Example 2.10 in Chapter 6 we know that X² and Y² each have chi-square distributions with 1 degree of freedom. Therefore

f_{X², Y²}(u, v) = (1/(2π√(uv))) e^{−(u+v)/2},  u > 0, v > 0
                 = 0, elsewhere

EXERCISES-SECTION 2
1. Show that the criterion for the independence of two random variables amounts to being able to factor the joint D.F. F(x, y) as a product of two functions, one depending on x only, and the other on y only. That is, F(x, y) = u(x) · v(y) for every x, y.
2. If the joint D.F. of X and Y is given by

F(x, y) = 0,     x < 0 or y < 0
        = xy/4,  0 ≤ x < 2 and 0 ≤ y < 2
        = x/2,   0 ≤ x < 2 and y ≥ 2
        = y/2,   x ≥ 2 and 0 ≤ y < 2
        = 1,     x ≥ 2 and y ≥ 2

show that X and Y are independent.
3. Random variables X and Y have a joint discrete distribution with the probability mass distributed at four points as follows: P(X = 2, Y = 3) = ·, P(X = 2, Y = −1) = a, P(X = −1, Y = 3) = b, P(X = −1, Y = −1) = ·. If X and Y are independent, determine a and b.
4. Suppose X and Y are two random variables where Y is a degenerate random variable. Show that X and Y are independent.
5. If X and Y have a joint probability function which is positive at precisely three points, under what conditions will X and Y be independent? Can you generalize this?
6. A number is picked at random from the set of integers {1, 2, …, 126}. Suppose X represents the remainder when the number is divided by 7, Y the remainder when it is divided by 6, and Z the remainder when it is divided by 9. Answer the following:
(a) Are X and Y independent?
(b) Are X and Z independent?
(c) Are Y and Z independent?
7. Suppose random variables X and Y are independent, where X is uniformly distributed over the interval (0, 3] and Y has the pdf given by

f_Y(y) = ·,  0 < y < 2
       = 0, elsewhere

Find the joint D.F. of X and Y.
8. The amount X (in dollars) that Tom earns in a day has the probability function

P(X = x) = (10 + |x − 25|)/50,  x = 10, 20, 25

and the amount Y (in dollars) that Jane earns in a day has the following probability function:

P(Y = y) = (9 + |y − 16|)/50,  y = 9, 12, 13, 16

If the amounts that Tom and Jane earn are independent, find the probability that Jane earns more than Tom.
9. The random vector (X, Y) is distributed on two points, (0, 1) and (2, 3), with respective probabilities · and ·. Find the joint distribution of two random variables U and V which are independent and have distributions identical with those of X and Y, respectively.
10. Suppose X and Y are independent random variables with P(· < X < ·) = · …. Find:
(a) P(A ∪ B)   (b) P(A − B)


(v) (The conditional distribution) We shall define this by considering the random variables X, Y, Z, W which have a joint distribution. The conditional (joint) pdf of X and Y given Z = z₀, W = w₀ is defined by

f_{X,Y|Z,W}(x, y | z₀, w₀) = f_{X,Y,Z,W}(x, y, z₀, w₀) / f_{Z,W}(z₀, w₀)

The conditional (joint) pdf of X, Y, and W given Z = z₀ is defined by

f_{X,Y,W|Z}(x, y, w | z₀) = f_{X,Y,Z,W}(x, y, z₀, w) / f_Z(z₀)

and so on.

(vi) (Mutual independence) If X₁, X₂, …, Xₙ have a joint absolutely continuous distribution, they are said to be mutually independent if

f(x₁, x₂, …, xₙ) = f_{X₁}(x₁) f_{X₂}(x₂) ⋯ f_{Xₙ}(xₙ)

for every x₁, x₂, …, xₙ. In the context of mutually independent random variables, the word "mutually" is often omitted.

Example 3.1. Suppose X, Y, and Z have the following joint pdf:

f(x, y, z) = 12x²yz, if 0 < x < 1, 0 < y < 1, 0 < z < 1
           = 0, elsewhere

Find:
(a) the pdf's of X, Y, and Z
(b) the joint pdf of X and Y
(c) the conditional pdf of X and Y given Z = z₀, 0 < z₀ < 1
(d) P(X < Y)
(e) P(X < Y < Z)

Solution
(a) The pdf of X is obtained by integrating, as follows:

f_X(x) = ∫∫ f(x, y, z) dy dz = ∫₀¹ ∫₀¹ 12x²yz dy dz = 3x², 0 < x < 1
       = 0, elsewhere

Similarly,

f_Y(y) = ∫₀¹ ∫₀¹ 12x²yz dx dz = 2y, 0 < y < 1
       = 0, elsewhere

and

f_Z(z) = 2z, 0 < z < 1
       = 0, elsewhere

Notice, incidentally, that f(x, y, z) = f_X(x) f_Y(y) f_Z(z) for every x, y, z, so that X, Y, and Z are mutually independent random variables.

(b) The joint pdf of X and Y is given by

f_{X,Y}(x, y) = ∫₀¹ 12x²yz dz = 6x²y, 0 < x < 1, 0 < y < 1
             = 0, elsewhere

(c) The conditional pdf of X and Y given Z = z₀, 0 < z₀ < 1, is obtained as

f_{X,Y|Z}(x, y | z₀) = f(x, y, z₀)/f_Z(z₀) = 12x²yz₀/(2z₀) = 6x²y, 0 < x < 1, 0 < y < 1
                     = 0, elsewhere

The answer in (b) is the same as in (c). Is this surprising?

(d) To find P(X < Y), we use the joint pdf of X and Y obtained in (b):

P(X < Y) = ∫₀¹ ∫ₓ¹ 6x²y dy dx = 2/5

(e) We have

P(X < Y < Z) = ∫∫∫_{(x,y,z): x<y<z} f(x, y, z) dx dy dz = ∫₀¹ ∫₀^z ∫₀^y 12x²yz dx dy dz = 4/35

Having discussed the general nature of the distributions of several random variables, we now give a result which deserves special mention:

If X₁, X₂, …, Xₙ are (mutually) independent random variables, then any subset of these random variables is independent.

The reader will get the idea of the proof in the general case if we show in particular that X₁, X₂, …, X_k (where k < n) are independent whenever X₁, X₂, …, X_k, X_{k+1}, …, Xₙ are. We observe first that, for any u₁, u₂, …, u_k,

{X₁ ≤ u₁, …, X_k ≤ u_k, X_{k+1} < ∞, …, Xₙ < ∞}
= {X₁ ≤ u₁} ∩ … ∩ {X_k ≤ u_k} ∩ {X_{k+1} < ∞} ∩ … ∩ {Xₙ < ∞}
= {X₁ ≤ u₁, …, X_k ≤ u_k}

since {X_i < ∞} = S for any i.
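The probabilities found in parts (d) and (e) of Example 3.1 can be cross-checked by simulation. A sketch (seed and sample size arbitrary): since the joint pdf factors as (3x²)(2y)(2z), the three coordinates can be sampled independently by inverse transform, using F_X(x) = x³ and F_Y(y) = F_Z(y) = y².

```python
import random

# Monte Carlo check of P(X < Y) = 2/5 and P(X < Y < Z) = 4/35 for the pdf
# f(x, y, z) = 12 x^2 y z on the unit cube.
random.seed(0)
n = 200_000
count_xy = count_xyz = 0
for _ in range(n):
    x = random.random() ** (1 / 3)   # inverse transform for f_X(x) = 3x^2
    y = random.random() ** 0.5       # inverse transform for f_Y(y) = 2y
    z = random.random() ** 0.5       # inverse transform for f_Z(z) = 2z
    if x < y:
        count_xy += 1
        if y < z:
            count_xyz += 1

p_xy, p_xyz = count_xy / n, count_xyz / n
assert abs(p_xy - 2 / 5) < 0.01
assert abs(p_xyz - 4 / 35) < 0.01
print(round(p_xy, 3), round(p_xyz, 3))
```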

Functions of Several Random Variables


In certain situations, it is possible to find the distribution of h(X, Y) by using what are called the moment generating functions.

1. THE DISCRETE CASE

We shall give various examples to illustrate the essence of the approach used in the discrete case.

Example 1.1. X and Y have a joint distribution with the probability function given below.

           y = 3    y = 5    y = 7
  x = 2     1/24     1/12     1/12
  x = 4      0       1/12     1/4
  x = 6     1/6       ·        ·

Find the distributions of:
(a) Z = X + Y
(b) V = max(X, Y)
(c) W = min(X, Y)

Solution
(a) The probability function of Z is obtained as follows. We have, for example,

{Z = 7} = {X = 4, Y = 3} ∪ {X = 2, Y = 5}

Therefore,

P(Z = 7) = P(X = 4, Y = 3) + P(X = 2, Y = 5)

Continuing this argument, the probability function of Z is obtained; Z takes the values 5, 7, 9, 11, 13.
(b) The random variable V = max(X, Y) takes, at each sample point, the larger of the two values X(s) and Y(s). We have, for example,

{V = 7} = {X = 2, Y = 7} ∪ {X = 4, Y = 7} ∪ {X = 6, Y = 7}

Therefore,

P(V = 7) = P(X = 2, Y = 7) + P(X = 4, Y = 7) + P(X = 6, Y = 7)

Continuing, the probability function of V is obtained.
(c) The random variable W = min(X, Y) takes the smaller of the two values; for example, if X(s) = 6 and Y(s) = 3, then W(s) = 3. We have, for example,

{W = 4} = {X = 4, Y = 5} ∪ {X = 4, Y = 7}

Therefore,

P(W = 4) = P(X = 4, Y = 5) + P(X = 4, Y = 7)

Continuing, the probability function of W is obtained; W takes the values 2, 3, 4, 5, 6.

Example 1.2. X and Y are independent random variables with X binomial B(n, p) and Y binomial B(m, p). Find the distribution of Z = X + Y.

Solution. Since the possible values of X are 0, 1, …, n and those of Y are 0, 1, …, m, the possible values of Z are 0, 1, 2, …, n + m. Now

{X + Y = k} = {X = 0, Y = k} ∪ {X = 1, Y = k − 1} ∪ ⋯ ∪ {X = k, Y = 0} = ∪_{i=0}^{k} {X = i, Y = k − i}

Therefore,

P(Z = k) = P(X + Y = k) = P(∪_{i=0}^{k} {X = i, Y = k − i})
= Σ_{i=0}^{k} P(X = i, Y = k − i), since the events are mutually exclusive
= Σ_{i=0}^{k} P(X = i) P(Y = k − i), since X and Y are independent
= Σ_{i=0}^{k} C(n, i) p^i (1 − p)^{n−i} C(m, k − i) p^{k−i} (1 − p)^{m−(k−i)}

since X is B(n, p) and Y is B(m, p).
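The sum obtained in Example 1.2 is a Vandermonde convolution: Σ C(n, i) C(m, k − i) = C(n + m, k), so it collapses to C(n + m, k) p^k (1 − p)^{n+m−k}, that is, Z is B(n + m, p). A quick numerical check of this fact (n, m, p chosen arbitrarily):

```python
from math import comb

# Verify the convolution of Bin(n, p) and Bin(m, p) equals Bin(n + m, p).
n, m, p = 4, 6, 0.3   # example parameters

def binom_pmf(k, size, prob):
    # binomial probability function, zero outside 0..size
    if k < 0 or k > size:
        return 0.0
    return comb(size, k) * prob**k * (1 - prob)**(size - k)

for k in range(n + m + 1):
    conv = sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p)
               for i in range(k + 1))
    assert abs(conv - binom_pmf(k, n + m, p)) < 1e-12
print("Bin(4, 0.3) + Bin(6, 0.3) = Bin(10, 0.3) verified term by term")
```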


Comment. There are some situations where one can easily identify the random variables V = max(X, Y), W = min(X, Y). To be specific, consider a distribution given by a pdf f(x, y) which is positive for 0 ≤ y < x < ∞ and zero elsewhere. In this case, max(X, Y) = X and min(X, Y) = Y, in view of the fact that f(x, y) is positive only for 0 ≤ y < x < ∞. Hence, finding the distributions of max(X, Y) and min(X, Y) amounts to finding the marginal distributions of X and Y respectively.

The random variables max(X₁, X₂, …, Xₙ) and min(X₁, X₂, …, Xₙ) are referred to as the extremes. Their distributions are very important in reliability and renewal theory. To realize the importance of the distribution of these random variables, consider the following situation: Suppose a machine runs on n electronic components. Let X_i denote the life of the ith component. If the machine breaks down as soon as one component goes bad, then we would be interested in the distribution of min(X₁, X₂, …, Xₙ). On the other hand, if the machine breaks down when the last component goes bad, then our interest would lie in finding the distribution of max(X₁, X₂, …, Xₙ).

EXERCISES-SECTION 2
Note: Since computation of the integrals in many of the following problems can be quite involved, it will be sufficient to just set the integrals up.
1. If X and Y are independent random variables with the pdf's

f_X(x) = 1, 1 < x < 2; 0, elsewhere
f_Y(y) = e^{−(y−1)}, y > 1; 0, elsewhere

find the distribution functions of:
(a) X + Y  (b) XY  (c) X/Y  (d) max(X, Y)
2. Suppose X and Y are independent, identically distributed random variables, each having the following pdf:

f_X(u) = f_Y(u) = 100/u², u > 100; 0, elsewhere

Find the distribution functions of:
(a) X + Y  (b) XY  (c) X/Y  (d) max(X, Y)  (e) min(X, Y)
3. Find the distribution functions of
(a) X + Y  (b) XY  (c) max(X, Y)
if X and Y are independent with the following pdf's:

f_X(x) = 2x, 0 < x < 1; 0, elsewhere
f_Y(y) = 2(1 − y), 0 < y < 1; 0, elsewhere

4. Find the distribution functions of
(a) X + Y  (b) XY  (c) X/Y
if X and Y are independent random variables with the following pdf's:

f_X(x) = ·, 0 < x < 1; 0, elsewhere
f_Y(y) = 100/y², y > 100; 0, elsewhere

5. The joint pdf of X and Y is given by

f(x, y) = k/(1 + x + y)³, x > 0, y > 0; 0, elsewhere

where k is a constant. Evaluate k and obtain the distribution function of Z = X + Y.
6. The random variables X and Y have the following joint pdf:

f(x, y) = ¼[1 + xy(x² − y²)], −1 < x < 1, −1 < y < 1; 0, elsewhere

Find the distribution of Z = X + Y.
7. Find the pdf of Z = XY if the joint pdf of X and Y is given by

f(x, y) = e^{−(x+y)}, x > 0, y > 0; 0, elsewhere

8. Find the distribution function of V = max(X, Y) if the joint pdf of X and Y is given by

f(x, y) = xe^{−x(1+y)}, x > 0, y > 0; 0, elsewhere

3. MISCELLANEOUS EXAMPLES

We present the following miscellaneous examples in the continuous case as a separate section in view of the importance of the results contained therein.

Example 3.1. Let X and Y be independent and uniformly distributed over the interval [0, 1]. Find the distributions of the following random variables:
(a) X + Y  (b) XY  (c) X/Y  (d) max(X, Y)
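Before working the integrals of Example 3.1, the answers can be spot-checked by simulation at a single point. A sketch (seed, sample size, and evaluation point t are arbitrary); the closed forms in the comments are the standard results for a uniform pair: P(X + Y ≤ 1) = 1/2, P(XY ≤ t) = t − t ln t, P(X/Y ≤ t) = t/2 for t ≤ 1, and P(max(X, Y) ≤ t) = t².

```python
import math
import random

# Spot check of Example 3.1's four distributions at t = 0.5,
# for X, Y independent Uniform(0, 1).
random.seed(2)
n, t = 200_000, 0.5
c_sum = c_prod = c_quot = c_max = 0
for _ in range(n):
    x, y = random.random(), random.random()
    c_sum += (x + y) <= 1
    c_prod += x * y <= t
    c_quot += x / y <= t
    c_max += max(x, y) <= t

assert abs(c_sum / n - 0.5) < 0.01                      # P(X + Y <= 1)
assert abs(c_prod / n - (t - t * math.log(t))) < 0.01   # P(XY <= t)
assert abs(c_quot / n - t / 2) < 0.01                   # P(X/Y <= t), t <= 1
assert abs(c_max / n - t * t) < 0.01                    # P(max <= t)
print("uniform spot checks pass at t =", t)
```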

since X and Y are independent. Now

P(X ≤ t) = P(Y ≤ t) = 0, t < 0
                    = 1 − e^{−t}, t > 0

Therefore,

F_{max(X,Y)}(t) = (1 − e^{−t})², t > 0
               = 0, t ≤ 0

And, consequently,

f_{max(X,Y)}(t) = 2e^{−t}(1 − e^{−t}), t > 0
               = 0, elsewhere

(e) The distribution of min(X, Y).
We have P(min(X, Y) ≤ t) = 1 − P(X > t) · P(Y > t), since X and Y are independent. Now

P(X > t) = e^{−t}, t > 0
         = 1, t < 0

Therefore,

F_{min(X,Y)}(t) = 1 − e^{−2t}, t > 0
               = 0, t ≤ 0

Hence

f_{min(X,Y)}(t) = 2e^{−2t}, t > 0
               = 0, elsewhere

Example 3.3. Suppose X and Y have the joint bivariate normal distribution given by

f(x, y) = (1/(2πσ₁σ₂√(1 − ρ²))) exp{−(1/(2(1 − ρ²)))(x²/σ₁² − 2ρxy/(σ₁σ₂) + y²/σ₂²)},  −∞ < x < ∞, −∞ < y < ∞

Find the distribution of the quotient X/Y.

Solution. As we have seen, the distribution of the quotient is given by

f_{X/Y}(t) = ∫₀^∞ f(ty, y) y dy − ∫_{−∞}^0 f(ty, y) y dy

Hence

f_{X/Y}(t) = ∫_{−∞}^{∞} (1/(2πσ₁σ₂√(1 − ρ²))) exp{−(y²/(2(1 − ρ²)))(t²/σ₁² − 2ρt/(σ₁σ₂) + 1/σ₂²)} |y| dy

To evaluate the above integral, let

u = (y²/(2(1 − ρ²)))(t²/σ₁² − 2ρt/(σ₁σ₂) + 1/σ₂²)

After simplifying, we get

f_{X/Y}(t) = (σ₁σ₂√(1 − ρ²)/π) · 1/(σ₂²t² − 2ρσ₁σ₂t + σ₁²),  −∞ < t < ∞

Comment. In Example 3.3 if, in particular, we set ρ = 0, the distribution of X/Y has the pdf

f_{X/Y}(t) = (σ₁σ₂/π)/(σ₁² + σ₂²t²) = ((σ₁/σ₂)/π) · 1/((σ₁/σ₂)² + t²)

which, as will be recalled from Section 2.4 of Chapter 5, is the Cauchy distribution. In other words, if X is N(0, σ₁²), Y is N(0, σ₂²), and if X and Y are independent, then X/Y has a Cauchy distribution with parameters a = σ₁/σ₂ and b = 0.

The Student's t-distribution

Consider two independent random variables X and Y, where X has the standard normal distribution and Y the chi-square distribution with n degrees of freedom. We are going to find below the distribution of a new random variable T where

T = X/√(Y/n)

Letting Z = √(Y/n), we can write T as T = X/Z. As can be easily verified, the distribution of √(Y/n) is given as

f_Z(z) = (n^{n/2}/(2^{(n/2)−1} Γ(n/2))) z^{n−1} e^{−nz²/2}, z > 0
       = 0, elsewhere
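The Cauchy conclusion stated in the comment above is easy to check by simulation. A sketch (the σ's, seed, and evaluation point are arbitrary); it compares the empirical D.F. of X/Y with the Cauchy D.F. F(t) = 1/2 + (1/π) arctan(t/a), a = σ₁/σ₂:

```python
import math
import random

# Monte Carlo check: X ~ N(0, s1^2), Y ~ N(0, s2^2) independent
# implies X/Y is Cauchy with scale a = s1/s2 (and location 0).
random.seed(3)
s1, s2, n, t = 2.0, 1.0, 200_000, 1.5
a = s1 / s2
hits = sum(random.gauss(0, s1) / random.gauss(0, s2) <= t for _ in range(n))

cauchy_cdf = 0.5 + math.atan(t / a) / math.pi
assert abs(hits / n - cauchy_cdf) < 0.01
print(round(hits / n, 3), round(cauchy_cdf, 3))
```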


Therefore, substituting for f_X(t): if 0 < r < 1, it follows that

n ∫₀¹ [F_X(t + r) − F_X(t)]^{n−1} f_X(t) dt = n ∫₀^{1−r} [F_X(t + r) − F_X(t)]^{n−1} dt + n ∫_{1−r}^{1} [F_X(t + r) − F_X(t)]^{n−1} dt
= n ∫₀^{1−r} r^{n−1} dt + n ∫_{1−r}^{1} (1 − t)^{n−1} dt
= n r^{n−1}(1 − r) + r^n

(If 0 < t < 1 − r, then 0 < t + r < 1, and consequently F_X(t + r) = t + r and F_X(t) = t. If 1 − r < t < 1, then 1 < t + r, and consequently F_X(t + r) = 1.)
If r > 1, it follows that

n ∫₀¹ [F_X(t + r) − F_X(t)]^{n−1} dt = n ∫₀¹ (1 − t)^{n−1} dt = 1

(Since r > 1 and 0 < t < 1, we have t + r > 1. Hence F_X(t + r) = 1 and F_X(t) = t.)
In summary,

F_R(r) = 0, r ≤ 0
       = n r^{n−1}(1 − r) + r^n, 0 < r < 1
       = 1, r ≥ 1

EXERCISES-SECTION 3
1. Suppose X and Y are independent and identically distributed random variables each having an exponential distribution with parameter λ. Find the distribution of:
(a) X + Y  (b) X/Y  (c) max(X, Y)  (d) min(X, Y)
2. Let 0 < a < b. Two numbers are picked independently, one at random in the interval [a, b], and the other at random in the interval [−b, −a]. If X represents the number in [a, b], and Y the number in [−b, −a], find the distribution of:
(a) the sum of the numbers, X + Y
(b) the product of the numbers, XY
3. In exercise 2, having obtained the distributions of X + Y and XY, find E(X + Y) and E(XY).
4. If X and Y are independent random variables each having an exponential distribution with parameter λ, find the distribution of Z = X − Y.
5. Suppose the current I in a circuit is a random variable with the following pdf:

f_I(i) = ·, ·; 0, elsewhere

Also, suppose the resistance R is a random variable with the pdf

f_R(u) = 2u, 0 < u < 1; 0, elsewhere

Find the distribution of the voltage E given by E = IR.
6. The sides X and Y of a rectangular region are given to be independent random variables with the following pdf's:

f_X(x) = ·, 1 ≤ x ≤ 3; 0, elsewhere
f_Y(y) = ·, 1 ≤ y ≤ 2; 0, elsewhere

Find:
(a) the distribution of the area of the region
(b) the expected area
7. The length X of a rectangular region is a random variable with the pdf given by

f_X(x) = (1/3)(3x² − 2x − 1), 1 < x < 2; 0, elsewhere

Given that the length is x, the distribution of the breadth Y is given by the following pdf:

f_{Y|X}(y | x) = 2(x + y)/(3x² − 2x − 1), 1 < y < x; 0, elsewhere

Find the distribution of:
(a) the perimeter
(b) the area
8. If X and Y are independent, normally distributed random variables, each N(μ, σ²), find the distribution of U = √(X² + Y²).
9. The joint pdf of X and Y is given by

f(x, y) = k(1 − √(x² + y²)), x² + y² ≤ 1; 0, elsewhere

where k is the normalizing constant. Find the distribution of Z = √(X² + Y²).
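The D.F. summarized above is that of the range, the maximum minus the minimum, of n observations from the uniform distribution on (0, 1); this is the standard interpretation of the integral n ∫ [F(t + r) − F(t)]^{n−1} f(t) dt. A simulation sketch (n, seed, and evaluation point arbitrary):

```python
import random

# Check of F_R(r) = n r^(n-1) (1 - r) + r^n for the range R = max - min
# of n i.i.d. Uniform(0, 1) observations, at a single point r.
random.seed(4)
n, trials, r = 5, 100_000, 0.6
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    hits += (max(xs) - min(xs)) <= r

F_R = n * r**(n - 1) * (1 - r) + r**n
assert abs(hits / trials - F_R) < 0.01
print(round(hits / trials, 3), round(F_R, 3))
```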


one as we have seen in Chapter 10. The following equivalent definition, using the joint distribution of X and Y, avoids this.

The expectation of Z = h(X, Y) is defined by

E(Z) = E(h(X, Y)) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) f(x, y) dy dx, in the continuous case
                  = Σ_i Σ_j h(x_i, y_j) P(X = x_i, Y = y_j), in the discrete case

If this second definition is used, there is no need to obtain the distribution of Z; it can be shown that the two definitions of expectation are equivalent. (The above formula generalizes in an obvious manner to situations involving more than two random variables.) The proofs of the assertions will be given assuming continuous joint distributions; the proofs in the discrete case follow analogously.

Example 1.1. Suppose X and Y have an absolutely continuous joint distribution. Show that the two definitions of expectation agree for Z = X + Y.

Solution. In Chapter 10 we saw that the distribution of Z is given by

f_Z(z) = ∫_{−∞}^{∞} f(x, z − x) dx, −∞ < z < ∞

Therefore,

E(Z) = ∫_{−∞}^{∞} z f_Z(z) dz = ∫_{−∞}^{∞} z [∫_{−∞}^{∞} f(x, z − x) dx] dz
     = ∫∫ z f(x, z − x) dx dz
     = ∫∫ (x + y) f(x, y) dx dy

letting y = z − x.

Example 1.2. Suppose X and Y are independent random variables, each with the uniform distribution over the interval [0, 1]. Find E(Z) where Z = max(X, Y).

Solution. We shall find E(Z) using two methods.

Method 1: In Example 3.1(d) of Chapter 10 we found the distribution of Z as

f_Z(t) = 2t, 0 < t < 1
       = 0, elsewhere

Therefore,

E(Z) = ∫₀¹ t f_Z(t) dt = ∫₀¹ t · 2t dt = 2/3

Method 2:

E(Z) = ∫₀¹ ∫₀¹ max(x, y) f(x, y) dy dx = ∫₀¹ ∫₀¹ max(x, y)(1) dy dx

since f(x, y) = 1 if 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and f(x, y) = 0 elsewhere. Hence

E(Z) = ∫₀¹ [∫₀ˣ max(x, y) dy + ∫ₓ¹ max(x, y) dy] dx

Now, if 0 < y < x, then max(x, y) = x, and if x < y < 1, then max(x, y) = y. Consequently,

E(Z) = ∫₀¹ [∫₀ˣ x dy + ∫ₓ¹ y dy] dx = ∫₀¹ [x² + (1 − x²)/2] dx = 2/3

Example 1.3. X and Y have the following joint probability function:

P(X = x, Y = y) = (x + y²)/42, x = 1, 4; y = −1, 0, 1, 3
                = 0, elsewhere

Find (a) E(Y²/X), (b) E(XY).

Solution
(a)

E(Y²/X) = Σ_x Σ_y (y²/x) P(X = x, Y = y) = Σ_x Σ_y (y²/x) · (x + y²)/42
        = (1/42)[(2 + 2 + 90) + (5/4 + 5/4 + 117/4)] = 503/168
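The two methods of Example 1.2 can be cross-checked numerically. A sketch of Method 2's double integral by the midpoint rule (grid size arbitrary); the result should agree with Method 1's value of 2/3:

```python
# Midpoint-rule evaluation of E(max(X, Y)) for X, Y independent
# Uniform(0, 1): the double integral of max(x, y) over the unit square.
N = 400
h = 1.0 / N
total = 0.0
for i in range(N):
    x = (i + 0.5) * h
    for j in range(N):
        y = (j + 0.5) * h
        total += max(x, y) * h * h   # f(x, y) = 1 on the unit square

assert abs(total - 2 / 3) < 1e-3
print(round(total, 4))
```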


E(XⁿY^k) = ∫∫ xⁿ y^k f(x, y) dy dx = …

Assigning appropriate values to n and k, this gives E(X) (setting n = 1, k = 0), E(X²) (setting n = 2, k = 0), E(Y) (setting n = 0, k = 1), and E(Y²) (setting n = 0, k = 2). Consequently,

Var(X) = E(X²) − [E(X)]²  and  Var(Y) = E(Y²) − [E(Y)]²

Example 1.7. Suppose X and Y have a continuous distribution with the joint pdf

f(x, y) = x + y, if 0 < x < 1 and 0 < y < 1
        = 0, elsewhere

Find E(min(X, Y)).

Solution

E(min(X, Y)) = ∫∫ min(x, y) f(x, y) dy dx
= ∫₀¹ ∫₀¹ min(x, y)(x + y) dy dx   (why?)
= ∫₀¹ [∫₀ˣ y(x + y) dy + ∫ₓ¹ x(x + y) dy] dx   (why?)

Computation of the integrals yields E(min(X, Y)) = 5/12.

1.2 The Basic Properties of Expectation

We shall establish below the following two main results: (1) the expected value of the sum of two random variables is equal to the sum of their expected values; (2) the expected value of the product of two random variables is equal to the product of their expected values provided the random variables are independent.

The Expectation of a Linear Combination. If h₁ and h₂ are real-valued functions of two real variables, and if a and b are any constants, then

E(a h₁(X, Y) + b h₂(X, Y)) = a E(h₁(X, Y)) + b E(h₂(X, Y))

The result follows since, by the definition of expectation,

E(a h₁(X, Y) + b h₂(X, Y)) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} [a h₁(x, y) + b h₂(x, y)] f(x, y) dy dx
= a ∫∫ h₁(x, y) f(x, y) dy dx + b ∫∫ h₂(x, y) f(x, y) dy dx
= a E(h₁(X, Y)) + b E(h₂(X, Y))

The expression a h₁(X, Y) + b h₂(X, Y) is called a linear combination of the random variables h₁(X, Y), h₂(X, Y), and a E(h₁(X, Y)) + b E(h₂(X, Y)) is called a linear combination of the real numbers E(h₁(X, Y)), E(h₂(X, Y)). The above result then states that the expected value of a linear combination of random variables is equal to the linear combination of their expected values. This property of the expectation operation is referred to as the linear property.

Some particular cases
(i) Setting h₁(X, Y) = X and h₂(X, Y) = 1 gives

E(aX + b) = a E(X) + b, for any constants a, b

For example, it now follows that E[(X − μ)/σ] = 0.
(ii) Setting h₁(X, Y) = X, h₂(X, Y) = Y, and a = b = 1 gives

E(X + Y) = E(X) + E(Y)

That is, the expected value of the sum of two random variables is equal to the sum of their expected values.

Comment. In proving that E(X + Y) = E(X) + E(Y), we have not made any assumption regarding the dependence or independence of X and Y. The result holds irrespective of such considerations.

The foregoing result regarding the expectation of a linear combination generalizes in an obvious way to the case of n random variables X₁, X₂, …, Xₙ.


1.3 The Covariance and the Correlation Coefficient

Two constants which provide a measure of relationship between random variables in the theory of joint distributions are the covariance and the correlation coefficient.
If X and Y are two random variables, then their covariance, denoted by Cov(X, Y), is defined as

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]

where μ_X = E(X) and μ_Y = E(Y).

In the above definition, if in particular we take X = Y, then the formula yields

Cov(X, X) = E[(X − μ_X)²] = Var(X)

In other words, the covariance of a random variable with itself is its variance.
The following version of the formula is often convenient for computing the covariance:

Cov(X, Y) = E(XY) − E(X)E(Y)

Using the properties of the expectation operation, this can be proved as follows:

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]
          = E(XY − μ_X Y − μ_Y X + μ_X μ_Y)
          = E(XY) − μ_X E(Y) − μ_Y E(X) + μ_X μ_Y   (why?)
          = E(XY) − E(X)E(Y)

Comments. (1) If X and Y are independent, then we know that E(XY) = E(X)E(Y) and, consequently, Cov(X, Y) = E(XY) − E(X)E(Y) = 0. The converse of this is, of course, false, as Examples 1.9 and 1.10 show. Hence, if two r.v.'s are independent, then it follows that Cov(X, Y) = 0. However, if Cov(X, Y) = 0, it is erroneous to conclude that X and Y are independent.
(2) Notice that x² ± 2xy + y² = (x ± y)² ≥ 0. Therefore, |xy| ≤ (x² + y²)/2. Hence

∫_{−∞}^{∞} ∫_{−∞}^{∞} |xy| f(x, y) dy dx ≤ ∫∫ ((x² + y²)/2) f(x, y) dy dx
= ½ [∫∫ x² f(x, y) dy dx + ∫∫ y² f(x, y) dy dx]

Consequently, E(XY) exists if E(X²) < ∞ and E(Y²) < ∞. Therefore, the definition of covariance is meaningful if E(X²) and E(Y²) are finite.
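Comment (1)'s caution, that zero covariance does not imply independence, deserves a concrete sketch. Below is the classical example (X uniform on {−1, 0, 1} with Y = X²), computed exactly:

```python
# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated but clearly dependent.
pts = [(-1, 1), (0, 0), (1, 1)]          # equally likely (X, Y) pairs
p = 1 / 3

E_X = sum(x * p for x, y in pts)         # = 0
E_Y = sum(y * p for x, y in pts)         # = 2/3
E_XY = sum(x * y * p for x, y in pts)    # = 0
cov = E_XY - E_X * E_Y
assert abs(cov) < 1e-12                  # Cov(X, Y) = 0 ...

# ... yet P(X = 0, Y = 0) != P(X = 0) P(Y = 0), so X and Y are dependent:
p_x0 = p                                 # P(X = 0)
p_y0 = p                                 # P(Y = 0)
p_joint = p                              # P(X = 0, Y = 0)
assert abs(p_joint - p_x0 * p_y0) > 0.1
print("Cov = 0, but X and Y are dependent")
```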


The correlation coefficient of X and Y, denoted by ρ(X, Y), is defined by

ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) = (E(XY) − E(X)E(Y))/√(Var(X) Var(Y))

provided, of course, neither variance equals zero. As a result, if X and Y are independent, then Cov(X, Y) = 0 and, consequently, ρ(X, Y) = 0 in this case. It is obvious to us that the converse is false; that is, ρ(X, Y) = 0 need not imply that X and Y are independent. (Consider again Examples 1.9 and 1.10.) In summary, independent random variables are uncorrelated, but uncorrelated random variables need not be independent.

Example 1.11. A box contains six beads of which three are red, two are white, and one is black. Two beads are picked at random without replacement. If X denotes the number of red beads and Y the number of white beads, find ρ(X, Y).

Solution. We know that

P(X = i, Y = j) = C(3, i) C(2, j) C(1, 2 − i − j)/C(6, 2)

where i = 0, 1, 2; j = 0, 1, 2, with the understanding that C(r, k) = 0 if k > r or k < 0.
Displaying the joint probability function in tabular form yields

              j = 0    j = 1    j = 2    P(X = i)
   i = 0        0       2/15     1/15      3/15
   i = 1       3/15     6/15      0        9/15
   i = 2       3/15      0        0        3/15
   P(Y = j)    6/15     8/15     1/15

Therefore,

E(X) = 0(3/15) + 1(9/15) + 2(3/15) = 1
E(X²) = 0(3/15) + 1(9/15) + 4(3/15) = 21/15
E(Y) = 0(6/15) + 1(8/15) + 2(1/15) = 2/3
E(Y²) = 0(6/15) + 1(8/15) + 4(1/15) = 12/15

and, since (1, 1) is the only point with xy ≠ 0 carrying positive probability,

E(XY) = (1)(1)(6/15) = 6/15

Hence

Var(X) = 21/15 − 1 = 6/15,  Var(Y) = 12/15 − 4/9 = 16/45,  Cov(X, Y) = 6/15 − 2/3 = −4/15

and

ρ(X, Y) = (−4/15)/√((6/15)(16/45)) = −1/√2

Example 1.12 (The trinomial distribution). Suppose X and Y have a joint trinomial distribution with n trials and parameters p and q, where 0 < q < 1, 0 < p < 1, and 0 < p + q < 1. Find (a) Cov(X, Y), (b) ρ(X, Y).

Solution. We are already familiar with the fact that X has the binomial distribution with n trials and the probability of success p, and Y has the binomial distribution with n trials and the probability of success q. Therefore,

E(X) = np, Var(X) = np(1 − p)
E(Y) = nq, Var(Y) = nq(1 − q)

Also, in Example 1.5 in this section, we saw that

E(XY) = n(n − 1)pq

Hence:
(a)

Cov(X, Y) = E(XY) − E(X)E(Y) = n(n − 1)pq − np · nq = −npq

Notice that the covariance of X and Y is negative. Is this intuitively reasonable?
(b)

ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) = −npq/√(np(1 − p) · nq(1 − q)) = −√(pq/((1 − p)(1 − q)))

Example 1.13. Suppose X and Y have a joint distribution with the joint pdf

f(x, y) = 10x²y, 0 < y < x < 1
        = 0, elsewhere

Find:
(a) Var(X) and Var(Y)
(b) Cov(X, Y)
(c) ρ(X, Y)

Solution. For any nonnegative integers n, k, we have

E(XⁿY^k) = ∫∫ xⁿ y^k f(x, y) dy dx = ∫₀¹ ∫₀ˣ xⁿ y^k · 10x²y dy dx = 10/((k + 2)(n + k + 5))

Therefore, assigning appropriate values to n and k,

E(X) = 5/6, E(X²) = 5/7, E(Y) = 5/9, E(XY) = 10/21, and E(Y²) = 5/14

Hence:
(a)

Var(X) = 5/7 − (5/6)² = 5/252
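The trinomial formulas of Example 1.12 can be verified exactly by brute-force enumeration of the joint pmf. A sketch (the parameter values n, p, q are chosen arbitrarily for illustration):

```python
from math import comb, sqrt

# Enumerate the trinomial pmf and check Cov(X, Y) = -npq and
# rho(X, Y) = -sqrt(pq / ((1-p)(1-q))).
n, p, q = 6, 0.2, 0.5
r = 1 - p - q   # probability of "neither" outcome on a trial

E_X = E_Y = E_XY = E_X2 = E_Y2 = 0.0
for i in range(n + 1):
    for j in range(n - i + 1):
        prob = comb(n, i) * comb(n - i, j) * p**i * q**j * r**(n - i - j)
        E_X += i * prob
        E_Y += j * prob
        E_XY += i * j * prob
        E_X2 += i * i * prob
        E_Y2 += j * j * prob

cov = E_XY - E_X * E_Y
rho = cov / sqrt((E_X2 - E_X**2) * (E_Y2 - E_Y**2))
assert abs(cov - (-n * p * q)) < 1e-9
assert abs(rho - (-sqrt(p * q / ((1 - p) * (1 - q))))) < 1e-9
print(round(cov, 4), round(rho, 4))
```

With these parameters the exact values are Cov(X, Y) = −0.6 and ρ(X, Y) = −0.5.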


1.5 The Method of Indicator Random Variables

For any event A, the indicator r.v. of A was defined in Chapter 4 as one which takes the value 1 at each sample point in A and the value 0 at each sample point in A′. Thus an indicator r.v. assumes only two values, namely, 0 and 1. It is so called because if the value of the random variable is 1, it indicates that the event A has occurred, and if the value is 0, it indicates that A has not occurred.
The following identities are immediate and can be proved routinely.

(i) I_{AB} = I_A · I_B and, in general, I_{A₁A₂⋯Aₙ} = I_{A₁} I_{A₂} ⋯ I_{Aₙ}
(ii) I_{A′} = 1 − I_A
(iii) I_{A∪B} = I_A + I_B − I_{AB} and, in general,

I_{A₁∪A₂∪⋯∪Aₙ} = Σ_i I_{A_i} − Σ_{i<j} I_{A_iA_j} + ⋯ + (−1)^{n+1} I_{A₁A₂⋯Aₙ}

If, in particular, A₁, A₂, …, Aₙ are mutually exclusive, then

I_{A₁∪⋯∪Aₙ} = Σ_{i=1}^{n} I_{A_i}

(iv) I_{AB′} = I_A − I_{AB}

Actually, (iii) and (iv) follow from (i) and (ii). For example,

I_{AB′} = I_A · I_{B′}, by (i)
        = I_A(1 − I_B), by (ii)
        = I_A − I_A · I_B
        = I_A − I_{AB}, by (i)

Let us now find E(I_A). Since I_A(s) = 1 if and only if s ∈ A, and I_A(s) = 0 if and only if s ∈ A′, it follows that

P(I_A = 1) = P(A) and P(I_A = 0) = P(A′)

Therefore,

E(I_A) = 1 · P(A) + 0 · P(A′) = P(A)

Hence,

E(I_A) = P(A) for any event A

This result shows that we can regard the probability of an event as the expected value of the corresponding indicator random variable. In other words, the concept of expectation is an extension of the concept of a probability measure.
This single fact now leads to the various results of the probability measure that are already familiar to us. For example, since I_{A∪B} = I_A + I_B − I_{AB}, using the linear property of expectation it follows that E(I_{A∪B}) = E(I_A) + E(I_B) − E(I_{AB}); that is, P(A ∪ B) = P(A) + P(B) − P(AB).
The notion of independence of indicator random variables is closely linked to that of the corresponding events. The following result can be shown to be true; the proof will be left to the exercise set.

If A₁, A₂, …, Aₙ are n events, then they are independent if and only if the indicator random variables I_{A₁}, I_{A₂}, …, I_{Aₙ} are independent.

We shall next find Var(I_A). We immediately have E(I_A²) = 1² · P(A) + 0² · P(A′) = P(A). Hence,

Var(I_A) = P(A)P(A′) for any event A

The covariance between two indicator random variables can be expressed in terms of the probabilities of the underlying events as follows: Suppose A and B are two events. Then

E(I_A · I_B) = E(I_{AB}) = P(AB)

and we get

Cov(I_A, I_B) = P(AB) − P(A)P(B), for any two events A, B

The method of indicator random variables turns out to be a very powerful tool in many instances, as the following examples will illustrate.

Example 1.18 (The binomial distribution). Suppose X represents the number of successes in n independent Bernoulli trials, with the probability of success p on each trial. In other words, X is B(n, p). Find E(X) and Var(X).

Solution. We previously found E(X) and Var(X) in Chapter 7 using the direct approach which, as will be recalled, involved some tedious algebraic steps. We now give a much simpler approach.
Let A_i represent the event that there is a success on the ith trial, i = 1, 2, …, n. Then clearly

X = I_{A₁} + I_{A₂} + ⋯ + I_{Aₙ}

where I_{A₁}, I_{A₂}, …, I_{Aₙ} are independent r.v.'s, since the events A₁, A₂, …, Aₙ are independent.
It follows that

E(X) = E(I_{A₁}) + ⋯ + E(I_{Aₙ}) = Σ_{i=1}^{n} P(A_i) = np
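The indicator computation E(X) = np, and the companion fact that Var(X) = Σ Var(I_{A_i}) = np(1 − p), can be checked directly against the binomial probability function. A sketch (parameters arbitrary):

```python
from math import comb

# Compare the indicator-method answers np and np(1-p) with the moments
# computed from the binomial pmf itself.
n, p = 7, 0.35
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))
second = sum(k * k * pk for k, pk in enumerate(pmf))

assert abs(mean - n * p) < 1e-9                       # E(X) = np
assert abs(second - mean**2 - n * p * (1 - p)) < 1e-9  # Var(X) = np(1-p)
print(mean, second - mean**2)
```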


I or - I when. jnd only when, the functional rel.lion between X and Y K |mca, -
Therefore, ρ(X, Y) = ±1 precisely when Y is, with probability 1, a linear function of X; that is, when there exist constants m ≠ 0 and c for which P(Y = mX + c) = 1.

What significance can be attached to −1 < ρ(X, Y) < 1? This would imply that there is a positive probability that the relation between X and Y is not linear; that is, P(Y ≠ mX + c) > 0 whatever the constants m and c. In other words, no matter what line we draw in the xy-plane, there is a positive probability that (X, Y) will not be on that line. The correlation coefficient therefore provides a measure of the linear relationship of X and Y. It is for this reason that it is sometimes called the coefficient of linear correlation.

Example 1.22. We give below three cases of joint distributions of X and Y:
(a) X has a standard normal distribution and Y = 2X + 1.
(b) X has a standard normal distribution and Y = X² + 2X.
(c) The joint pdf of X and Y is

    f(x, y) = (1/π) e^{−(x² + y²)/2},  if x < 0 and y > 0, or x > 0 and y < 0
            = 0,                       elsewhere

In each case, comment on the correlation coefficient in the light of the functional relation between X and Y.

Solution
(a) The joint distribution of X and Y is singular, since all the probability mass is distributed along the line y = 2x + 1. Consequently, there exists no joint pdf. Since the relation between X and Y is linear with m = 2 > 0, the correlation coefficient has to be equal to 1. We shall compute it directly anyway. Since X is a standard normal variable, E(X) = 0 and Var(X) = E(X²) = 1. Next,

    E(Y) = E(2X + 1) = 2E(X) + 1 = 1
    E(Y²) = E[(2X + 1)²] = E(4X² + 4X + 1) = 4E(X²) + 4E(X) + 1 = 5
    E(XY) = E(X(2X + 1)) = 2E(X²) + E(X) = 2

Hence Var(Y) = 5 − 1 = 4 and Cov(X, Y) = 2 − 0 = 2, so that

    ρ(X, Y) = 2/(1 · 2) = 1

as was anticipated.

(b) The joint distribution of X and Y is singular, since all the probability mass is distributed on the curve y = x² + 2x, that is, on a region whose area is zero. Hence there exists no joint probability density function. The relation between X and Y is not linear, so that we already know that ρ(X, Y) should be strictly between −1 and 1. Let us compute the actual correlation coefficient. Since X is N(0, 1), we have

    E(X) = 0,  E(X²) = 1,  E(X³) = 0,  E(X⁴) = 3

Therefore,

    E(Y) = E(X² + 2X) = E(X²) + 2E(X) = 1
    E(Y²) = E[(X² + 2X)²] = E(X⁴) + 4E(X³) + 4E(X²) = 7
    E(XY) = E(X(X² + 2X)) = E(X³) + 2E(X²) = 2

Hence Var(X) = 1, Var(Y) = 6, and Cov(X, Y) = 2. Consequently,

    ρ(X, Y) = 2/√6

which is strictly between −1 and 1, as we had anticipated.

(c) The joint pdf is positive over the shaded region in Figure 1.1 (the second and fourth quadrants) and is zero outside it. We anticipate here that −1 < ρ(X, Y) < 1. Not only that; we anticipate it to be negative (why?).

[Figure 1.1]

It can be easily verified that the marginal distributions of X and Y are standard normal. Hence E(X) = E(Y) = 0 and Var(X) = Var(Y) = 1, so that ρ(X, Y) = E(XY). Next,

    E(XY) = ∫₋∞⁰ ∫₀^∞ xy (1/π) e^{−(x² + y²)/2} dy dx + ∫₀^∞ ∫₋∞⁰ xy (1/π) e^{−(x² + y²)/2} dy dx

          = (1/π) [ ∫₋∞⁰ x e^{−x²/2} dx · ∫₀^∞ y e^{−y²/2} dy + ∫₀^∞ x e^{−x²/2} dx · ∫₋∞⁰ y e^{−y²/2} dy ]

Letting u²/2 = t, it follows that

    ∫₀^∞ u e^{−u²/2} du = ∫₀^∞ e^{−t} dt = 1

Hence

    E(XY) = (1/π)[(−1)(1) + (1)(−1)]
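The three correlation coefficients of Example 1.22 can be confirmed by Monte Carlo simulation. This is a rough sketch, not part of the text; for case (c), giving |Z₁| and −|Z₂| a common random sign reproduces the density (1/π)e^{−(x²+y²)/2} on the second and fourth quadrants.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.standard_normal(n)

# Case (a): Y = 2X + 1 is an increasing linear function of X, so rho = 1.
rho_a = np.corrcoef(x, 2 * x + 1)[0, 1]

# Case (b): Y = X**2 + 2X is nonlinear, and rho = 2/sqrt(6) ~ 0.816.
rho_b = np.corrcoef(x, x**2 + 2 * x)[0, 1]

# Case (c): opposite random signs put (X, Y) in the second/fourth quadrants.
s = rng.choice([-1.0, 1.0], size=n)
xc = s * np.abs(rng.standard_normal(n))
yc = -s * np.abs(rng.standard_normal(n))
rho_c = np.corrcoef(xc, yc)[0, 1]       # should be near -2/pi

print(rho_a, rho_b, rho_c)
```

The three estimates fall at 1, about 0.82, and about −0.64, in agreement with the example.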

418 / Basic Probability Theory and Applications
Thus, finally,

    ρ(X, Y) = −2/π

which is between −1 and 0. ■

EXERCISES—SECTION 1

1. Suppose X and Y have a joint distribution. Using the marginal pdf of X, the expected value of X is given by

    E(X) = ∫₋∞^∞ x f_X(x) dx

On the other hand, according to the definition given in this chapter,

    E(X) = ∫₋∞^∞ ∫₋∞^∞ x f(x, y) dy dx

How do you reconcile the two definitions?

2. If X and Y have the following joint probability function,

    x \ y |  ·    ·    ·
      −1  | 0.1  0.2  0.1
       0  | 0.1  0.1  0.2
       2  | 0.1  0.1  0

find—
(a) E(X)   (b) E(X/(Y + 1))

3. The amount X (in dollars) that a babysitter earns on a weekend and the amount Y that she spends in the following week have the following joint distribution:

    x \ y |  1    3    6
       2  |  ·    ·    0
       4  |  ·    ·    ·
       7  |  0    ·    ·

Find:
(a) the expected amount earned
(b) the expected amount spent
(c) the correlation coefficient between the amount earned and the amount spent

4. If X and Y have an absolutely continuous joint distribution with pdf given by

    f(x, y) = 24xy,  0 < x < 1, 0 < y < 1 − x
            = 0,     elsewhere

find:
(a) E(X)   (b) E(X²Y)

5. Two points are picked at random and independently inside the interval [0, a]. Find the expected distance and the variance of the distance between the points.

6. Suppose the distribution of X and Y is given by the following pdf:

    f(x, y) = 1,  0 < x < 1 and 0 < y < 1
            = 0,  elsewhere

Find E(max(X, Y)).

7. Let 0 < a < b. Suppose X and Y are uniformly distributed over the intervals [a, b] and [−b, −a], respectively. Find
(a) E(X + Y)
(b) E(XY) if X and Y are independent.
(Comment: Recall that in exercise 3 of Section 3, Chapter 10, E(X + Y) and E(XY) were found by actually finding the distributions of X + Y and XY. There is no need to go this route!)

8. If X and Y are independent random variables, why is it true that E(X/Y) = E(X) · E(1/Y)? What restriction would be required on Y? Use the above result to find E(X/Y) for the random variables X and Y described in exercise 7.

9. Suppose Z is uniformly distributed over the interval [0, 2π]. Define X and Y as follows:

    X = cos Z and Y = sin Z

Show that—
(a) X and Y are not independent
(b) X and Y are not correlated.

10. If X and Y are two random variables such that each assumes only two values, then show that Cov(X, Y) = 0 implies that X and Y are independent. Hint: There is no loss of generality in assuming that X assumes values 0, x and Y assumes values 0, y (why?).

11. Suppose X has a distribution which is symmetric about 0. Let Y = X². Show that X and Y are uncorrelated. Are they independent?

12. The dimensions X, Y, Z of a rectangular parallelepiped are known to be independent random variables with the following pdf's:

    f_X(x) = 1/3,           1 < x < 4;  0, elsewhere
    f_Y(y) = (1/2)(3 − y),  1 < y < 3;  0, elsewhere
    f_Z(z) = 1/4,           1 < z < 5;  0, elsewhere

Find:
(a) the expected surface area
(b) the expected volume of the parallelepiped
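Several of the exercises above (for instance 2 and 3) come down to the same bookkeeping over a joint probability table. A sketch of that computation in Python — the table values here are made up for illustration and are not those of any exercise:

```python
import numpy as np

# Hypothetical joint probability table: rows are values of X, columns of Y.
x_vals = np.array([1.0, 3.0, 6.0])
y_vals = np.array([2.0, 4.0, 7.0])
pxy = np.array([[0.10, 0.20, 0.10],
                [0.10, 0.10, 0.20],
                [0.10, 0.10, 0.00]])   # entries must sum to 1
assert np.isclose(pxy.sum(), 1.0)

px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginal distributions
ex, ey = px @ x_vals, py @ y_vals           # E(X), E(Y)
exy = x_vals @ pxy @ y_vals                 # E(XY) = sum x*y*p(x, y)
cov = exy - ex * ey
rho = cov / np.sqrt((px @ x_vals**2 - ex**2) * (py @ y_vals**2 - ey**2))
print(ex, ey, cov, rho)
```

Replacing the arrays with a given exercise's table yields its expectations and correlation directly.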

(Recall that if a ≤ b, then ac ≤ bc if c > 0, and ac ≥ bc if c < 0.)

Therefore,

    F_Z(t) = ∫₀^∞ P(X ≤ yt) f_Y(y) dy + ∫₋∞⁰ P(X ≥ yt) f_Y(y) dy

since X and Y are independent. Hence

    F_Z(t) = ∫₀^∞ F_X(yt) f_Y(y) dy + ∫₋∞⁰ [1 − F_X(yt)] f_Y(y) dy

Differentiating, this gives the expression that we obtained in Chapter 10 for the distribution of Z = X/Y when X and Y are independent.

Example 2.11 (Buffon's Needle Problem*). Suppose a floor is marked with parallel lines, all at a distance 2a apart. A needle of length 2l (l < a) is tossed at random on the floor. Find the probability that the needle will intersect one of the lines.

Solution. First of all, we need a more precise formulation of the phrase "at random." Let X represent the distance from the center of the needle to the nearest line, and Y the angle between the needle and the direction perpendicular to the given lines (see Figure 2.3). The phrase "at random" is meant to connote that the distribution of X is uniform over the interval [0, a], the distribution of Y is uniform over the interval [−π/2, π/2], and, furthermore, that X and Y are independent. Hence

    f_{X,Y}(x, θ) = 1/(πa),  if 0 ≤ x ≤ a and −π/2 ≤ θ ≤ π/2
                  = 0,       elsewhere

[Figure 2.3]

Now, as can be seen from Figure 2.3, the needle will intersect a line precisely when X ≤ l cos Y. Hence

    P(the needle intersects a line) = P(X ≤ l cos Y)
        = ∫₋π/₂^{π/2} P(X ≤ l cos Y | Y = θ) f_Y(θ) dθ
        = ∫₋π/₂^{π/2} P(X ≤ l cos Y | Y = θ) (1/π) dθ

Since X and Y are independent,

    P(X ≤ l cos Y | Y = θ) = P(X ≤ l cos θ) = (l cos θ)/a

(Notice that for −π/2 ≤ θ ≤ π/2 we have l cos θ ≤ a.) Consequently,

    P(the needle intersects a line) = ∫₋π/₂^{π/2} (l cos θ)/(πa) dθ
        = (l/(πa)) sin θ |₋π/₂^{π/2}
        = 2l/(πa)

EXERCISES—SECTION 2

1. Two discrete random variables X and Y have the following joint probability function:

    x \ y |  2     3     5
       1  | 1/18  1/9   1/18
       2  | 1/18  1/9   1/9
       3  |  0    1/3   1/6

Find:
(a) E(X | Y = i), i = 2, 3, 5
(b) E(Y | X = i), i = 1, 2, 3

2. The joint probability function of X and Y is given by

    P(X = x, Y = y) = 2/(n(n + 1)),  y = 1, 2, ..., x;  x = 1, 2, ..., n
                    = 0,             otherwise

*Count de Buffon was a French naturalist of the eighteenth century.
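The answer 2l/(πa) in the Buffon's needle example is easy to check by simulation; a sketch (the parameter values a = 2, l = 1 are arbitrary choices, not from the text):

```python
import numpy as np

def buffon_estimate(a=2.0, l=1.0, n=1_000_000, seed=0):
    """Estimate P(a needle of length 2l crosses a line), lines 2a apart (l < a)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, a, n)                     # distance of center to nearest line
    theta = rng.uniform(-np.pi / 2, np.pi / 2, n)  # angle to the perpendicular
    return np.mean(x <= l * np.cos(theta))         # crossing happens when X <= l cos(Y)

est = buffon_estimate()
print(est, 2 * 1.0 / (np.pi * 2.0))   # simulated vs exact 2l/(pi a)
```

Historically this is also a (slow) way of estimating π, by solving 2l/(πa) for π.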

432 / Basic Probability Theory and Applications — Expectation: Several Random Variables / 433

Find:
(a) the regression of Y on X
(b) the regression of X on Y

3. … Find—
(a) Cov(X, Y)
(b) E(X | Y = j), j = 1, 2, ..., n
(c) E(Y | X = i), i = 1, 2, ..., n

4. From a group of fifteen people consisting of four doctors, five lawyers, and six students, five people are picked at random.
(a) Find the expected number of lawyers.
(b) Given that there are two doctors in the sample, find the expected number of lawyers.

5. If the joint pdf of X and Y is given by

    f(x, y) = 24xy,  0 < x < 1, 0 < y < 1 − x
            = 0,     elsewhere

find—
(a) the regression of X on Y
(b) the regression of Y on X.

6. Find the regression of X on Y if the joint pdf of X and Y is given by

    f(x, y) = 2[(1 − x)(1 − y) + xy],  0 < x < 1, 0 < y < 1
            = 0,                       elsewhere

7. If X and Y have a joint pdf given by

    f(x, y) = kxy²,  0 < y < x < 1
            = 0,     elsewhere

find—
(a) E(Yⁿ | X = x)
(b) Var(Y | X = x)

8. Two random variables X and Y have the following joint pdf:

    f(x, y) = e^{−y},  0 < x < y < ∞
            = 0,       elsewhere

If y₀ > 0, find—
(a) E(Xⁿ | Y = y₀), where n is a positive integer
(b) Var(X | Y = y₀)

9. If X and Y have the joint uniform distribution over the circle with radius 1 and centered at the origin, find E(Xⁿ | Y = y₀), where −1 < y₀ < 1 and n is any nonnegative integer.

10. Suppose X and Y are independent random variables, each exponentially distributed with parameter λ. Find the pdf of Z = X/(X + Y).

11. A point X₁ is picked at random in the interval [0, 1]. A second point X₂ is then picked at random in the interval [0, X₁]. Show that the distribution of X₂ is identical with that of Y₁Y₂, where Y₁, Y₂ are independent random variables each having the uniform distribution over the interval [0, 1]. Hint: If 0 < u < 1,

    P(X₂ ≤ u) = ∫₀¹ P(X₂ ≤ u | X₁ = x) f_{X₁}(x) dx = ∫₀^u 1 dx + ∫_u¹ (u/x) dx

Simplify and compare with Example 3.1(b) of Chapter 10.

12. The number of emergency calls at a hospital on any day is a random variable X with the following distribution:

    x        | 100  150  200  300
    P(X = x) | 0.3  0.4  0.2  0.1

The probability that an emergency is due to a heart attack is 0.05. Find the expected number of calls due to heart attack.

13. A tennis pro gives 8, 10, or 12 lessons during a day with respective probabilities 0.3, 0.5, 0.2. The probability that a lesson is taken by a junior is 0.3, that it is taken by a regular adult student is 0.6, and that it is taken by an infrequent adult visitor is 0.1. If the charges per lesson are 5 dollars for the juniors, 8 dollars for the regular adult student, and 10 dollars for the infrequent adult customer, find the pro's expected earnings in a day.

14. A real number X is picked at random in the interval [0, 1]. If X = x, a coin with P(head) = x is tossed n times. Let Y represent the number of heads in n tosses. Find the probability distribution of Y and E(Y).
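Exercise 14 is an exercise in conditioning: E(Y) = E[E(Y | X)] = E(nX) = n/2, and integrating the binomial probabilities against the uniform density gives P(Y = k) = 1/(n + 1) for every k. A simulation sketch with n = 10 (my choice of n, not the text's):

```python
import numpy as np

rng = np.random.default_rng(1)
n_tosses, trials = 10, 500_000
p = rng.uniform(0.0, 1.0, trials)      # X = the randomly chosen bias of the coin
heads = rng.binomial(n_tosses, p)      # Y | X = x  is  B(n, x)

print(heads.mean())                    # E(Y) = n/2 = 5
freqs = np.bincount(heads, minlength=n_tosses + 1) / trials
print(freqs)                           # each value of Y occurs with prob ~ 1/(n+1)
```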

444 / Basic Probability Theory and Applications — Generating Functions / 445

…as the mgf of the binomial distribution. Recall that if X is B(n, p), then

    M_X(s) = (pe^s + q)^n,  where q = 1 − p

Alternatively, we can write X = X₁ + X₂ + ... + Xₙ, where the Xᵢ are independent Bernoulli random variables with P(success) = p. Since M_{Xᵢ}(s) = pe^s + q for each i, the mgf of X is the product of the n Bernoulli mgf's, and this is indeed the form obtained above.

These observations rest on two facts about moment generating functions, which we now state.

Theorem. If two random variables have the same mgf, then they have the same distribution, and conversely.

In other words, the mgf (when it exists) determines the distribution uniquely. For example, if the mgf of a random variable X is e^{2(e^s − 1)}, then X has the Poisson distribution with parameter λ = 2, and no other distribution.

The importance of the uniqueness theorem is seen from the following: Suppose we want to find the distribution of a random variable Z. If we can find the mgf of Z, and if it turns out that this mgf has a form that we recognize as belonging to a familiar distribution, then, by the uniqueness theorem, that is precisely the distribution of Z.

Next, suppose X₁, X₂, ..., Xₙ are independent random variables and Z = X₁ + X₂ + ... + Xₙ. Then

    M_Z(s) = E(e^{sZ}) = E(e^{s(X₁ + X₂ + ... + Xₙ)}) = E(e^{sX₁} e^{sX₂} ... e^{sXₙ})
           = E(e^{sX₁}) E(e^{sX₂}) ... E(e^{sXₙ}),  by independence
           = M_{X₁}(s) M_{X₂}(s) ... M_{Xₙ}(s)

That is, the mgf of the sum of independent random variables is equal to the product of their mgf's.
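The multiplicative property M_{X+Y}(s) = M_X(s) · M_Y(s) for independent X and Y can also be seen empirically. A sketch — the exponential distribution and the values of s are my arbitrary choices (for the exponential with parameter 1, M(s) = 1/(1 − s) for s < 1):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
x = rng.exponential(1.0, n)   # independent samples of X
y = rng.exponential(1.0, n)   # independent samples of Y

for s in (0.2, 0.4):
    lhs = np.mean(np.exp(s * (x + y)))                  # empirical M_{X+Y}(s)
    rhs = np.mean(np.exp(s * x)) * np.mean(np.exp(s * y))
    print(s, lhs, rhs, (1 / (1 - s)) ** 2)              # all three agree
```

(Values of s are kept below 1/2 so that the empirical averages have finite variance.)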

448 / Basic Probability Theory and Applications — Generating Functions / 449

Thus, if X₁, X₂, ..., Xₙ are independent random variables where Xᵢ is N(μᵢ, σᵢ²), then X̄ is N((1/n)Σμᵢ, (1/n²)Σσᵢ²). In particular, if the random variables are identically distributed with a common mean μ and a common variance σ², then X̄ is N(μ, σ²/n).

1.4 Reproductive Properties

We saw in Chapter 10 that if X and Y are independent binomial random variables with the same parameter p, then X + Y is also a binomial random variable. We also found that the same is true of the Poisson distribution, in that if X₁, X₂, ... are independent Poisson random variables, then so is X₁ + X₂ + ... + Xₙ. Moment generating functions on independent random variables allow us to establish results of this kind very easily, and we now do so for several of the standard distributions.

The reproductive property of the binomial distribution. Suppose X₁, X₂, ..., Xᵣ are independent random variables where Xᵢ is B(nᵢ, p), i = 1, 2, ..., r. Then M_{Xᵢ}(s) = (pe^s + q)^{nᵢ}, i = 1, 2, ..., r. Therefore, if Z = X₁ + X₂ + ... + Xᵣ, we get

    M_Z(s) = M_{X₁}(s) M_{X₂}(s) ... M_{Xᵣ}(s) = (pe^s + q)^{n₁ + n₂ + ... + nᵣ}

Since this is the mgf of B(n₁ + n₂ + ... + nᵣ, p), it follows that the distribution of Z is binomial with n₁ + n₂ + ... + nᵣ trials and probability of success p, thereby exhibiting the reproductive property.

The reproductive property of the Poisson distribution. Suppose Xᵢ, i = 1, 2, ..., r, are independent Poisson random variables where the parameter of Xᵢ is λᵢ. Then M_{Xᵢ}(s) = e^{λᵢ(e^s − 1)}, i = 1, 2, ..., r. Letting Z = X₁ + X₂ + ... + Xᵣ,

    M_Z(s) = e^{λ₁(e^s − 1)} ... e^{λᵣ(e^s − 1)} = e^{(λ₁ + λ₂ + ... + λᵣ)(e^s − 1)}

But this is the mgf of a random variable with a Poisson distribution with parameter λ₁ + λ₂ + ... + λᵣ.

The reproductive property of the normal distribution. Suppose X₁, X₂, ..., Xᵣ are independent random variables and that Xᵢ is N(μᵢ, σᵢ²), i = 1, 2, ..., r. Then

    M_{Xᵢ}(s) = e^{μᵢs + σᵢ²s²/2},  i = 1, 2, ..., r

Since X₁, X₂, ..., Xᵣ are independent, writing Z = X₁ + X₂ + ... + Xᵣ, we get

    M_Z(s) = e^{(μ₁ + μ₂ + ... + μᵣ)s + (σ₁² + σ₂² + ... + σᵣ²)s²/2}

But we recognize this as the mgf of a random variable which is normally distributed with mean Σμᵢ and variance Σσᵢ². Hence, if X₁, X₂, ..., Xᵣ are independent random variables where Xᵢ is N(μᵢ, σᵢ²), i = 1, 2, ..., r, then X₁ + X₂ + ... + Xᵣ is N(Σμᵢ, Σσᵢ²).

Comment. As far as the mean and variance of Z are concerned, these go by the rules E(ΣXᵢ) = ΣE(Xᵢ) and, since X₁, X₂, ..., Xᵣ are independent, Var(ΣXᵢ) = ΣVar(Xᵢ). The important fact that is brought out in the above discussion (hitherto not proved) is that the distribution of ΣXᵢ is normal.

The reproductive property of the chi-square distribution. Suppose X₁, X₂, ..., Xᵣ are independent random variables where the distribution of Xᵢ is chi-square with nᵢ degrees of freedom, i = 1, 2, ..., r. Then, from the comment which follows part (c) of Example 1.2, M_{Xᵢ}(s) = (1 − 2s)^{−nᵢ/2}, i = 1, 2, ..., r. Hence, letting Z = X₁ + ... + Xᵣ,

    M_Z(s) = (1 − 2s)^{−n₁/2} ... (1 − 2s)^{−nᵣ/2} = (1 − 2s)^{−(n₁ + ... + nᵣ)/2}

But this is easily recognized as the mgf of a random variable which has the chi-square distribution with n₁ + n₂ + ... + nᵣ degrees of freedom. Hence, in conclusion, if X₁, X₂, ..., Xᵣ are independent random variables where Xᵢ is chi-square with nᵢ degrees of freedom, i = 1, 2, ..., r, then X₁ + ... + Xᵣ is chi-square with n₁ + n₂ + ... + nᵣ degrees of freedom.

Example. The kinetic energy of a particle of mass m moving at a velocity v cm/sec is given by mv²/2. Ten particles, each of mass 2 grams, have velocities which are independent random variables, each distributed N(0, 9). Find the distribution of the total kinetic energy of all ten particles.

Solution. Let V₁, V₂, ..., V₁₀ represent the velocities of the particles. Then the total kinetic energy Z is given by

    Z = Σᵢ₌₁¹⁰ (2/2)Vᵢ² = Σᵢ₌₁¹⁰ Vᵢ²

Now Vᵢ is N(0, 9). Therefore, Vᵢ/3 is N(0, 1), and consequently (Vᵢ/3)² is chi-square with 1 degree of freedom. Since V₁, V₂, ..., V₁₀ are

independent, by the reproductive property of the chi-square distribution it follows that Σᵢ₌₁¹⁰ (Vᵢ/3)² has a chi-square distribution with 10 degrees of freedom. Hence the pdf of U = (1/9)Σᵢ₌₁¹⁰ Vᵢ² is given by

    f_U(u) = (1/(2⁵Γ(5))) u⁴ e^{−u/2},  u > 0
           = 0,                          elsewhere

From this it can easily be shown that the distribution of Z = 9U is given by

    f_Z(z) = (1/(18⁵Γ(5))) z⁴ e^{−z/18},  z > 0
           = 0,                            elsewhere

The verification of this is left to the reader.

EXERCISES—SECTION 1

1. For a random variable X with P(X = c) = 1, obtain the mgf of X.

2. Suppose X has the probability function defined by P(X = 1) = 1/2 and P(X = −2) = 1/2. For any positive integer n, find E(Xⁿ) in the following two ways:
(a) By using the basic definition of E(Xⁿ)
(b) By expanding the mgf of X as a power series

3. A random variable X assumes the three values −2, 3, 4 with respective probabilities … .
(a) Find the mgf of X.
(b) Compute E(X), E(X²), and E(X³) by differentiating the mgf.

4. A fair die is rolled repeatedly until a 1 or a 6 shows up. Find the mgf of the number of throws required.

5. If X has a negative binomial distribution with parameters r, p, use the moment generating function of X to find E(X) and E(X²).

6. If X has the pdf

    f(x) = |x|,  −1 < x < 1
         = 0,    elsewhere

find the mgf of X.

7. If the mgf of X is

    M(s) = (e^{6s} − 1)/(6s),  s ≠ 0
         = 1,                  s = 0

determine the distribution of X.

8. For the mgf's given in the following cases, identify the underlying distribution of the random variable:
(a) …
(b) …
(c) …
(d) …

9. Consider the following mgf's expressed as power series in s:

    M_X(s) = Σ_{r=0}^∞ s^r,  |s| < 1
    M_Y(s) = …

(a) Find E(X^r) and E(Y^r), r = 0, 1, 2, ...
(b) How are the random variables X and Y related?

10. If X is uniformly distributed over the interval [a, b], use the mgf of X to show that

    E(X^r) = (b^{r+1} − a^{r+1})/((r + 1)(b − a)),  r = 1, 2, ...

11. Suppose X has a continuous distribution with the following pdf:

    f(x) = …,  −∞ < x < ∞

(a) Obtain the mgf of X.
(b) Using the mgf, find E(X), E(Xe^{X/2}), and Var(X).

12. The mgf of a random variable X is given by M(s) = (1 − s)^{−3}, s < 1. Use the power series expansion of M(s) to obtain E(X^r) for any nonnegative integer r. Hint: Use 1/(1 − s) = Σ_{r=0}^∞ s^r and differentiate twice.

13. Suppose the mgf of X is given by

    M(s) = (0.2 + 0.8e^s)^{10}

Identify the distribution of X and compute P(4.3 < X < 7.8), using an appropriate table.

14. If the distribution of X is symmetric about c, show that the mgf of X (if it exists) is given by

    M_X(s) = e^{cs} ∫₀^∞ (e^{−sy} + e^{sy}) f(c + y) dy

15. If the distribution of X is symmetric about c, show that X − c and −X + c have the same distribution. (Here you are expected to prove this by using the mgf's. See exercise 2 of Chapter 6 for an alternate approach.) Hint: Use exercise 14 and the fact that M_{aX+b}(s) = e^{bs}M_X(as).
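The reproductive property of the Poisson distribution discussed in this section is easy to watch in simulation; a sketch (the parameter values 1.5 and 2.5 are arbitrary choices):

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
n = 300_000
z = rng.poisson(1.5, n) + rng.poisson(2.5, n)   # sum of independent Poissons

# The sum should behave like a single Poisson with parameter 1.5 + 2.5 = 4:
print(z.mean(), z.var())                        # both near 4
for k in range(5):
    pmf = exp(-4.0) * 4.0**k / factorial(k)     # Poisson(4) pmf
    print(k, np.mean(z == k), round(pmf, 4))    # empirical vs theoretical
```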

456 / Basic Probability Theory and Applications — Generating Functions / 457

However, in the absence of such knowledge, the probabilities can be obtained by differentiating and evaluating the derivatives at zero. For the binomial distribution, with g(s) = [ps + (1 − p)]^n, we obtain

    P(X = 0) = g(0) = (1 − p)^n

    P(X = 1) = g′(0) = n[ps + (1 − p)]^{n−1} p |_{s=0} = np(1 − p)^{n−1}

    P(X = 2) = g″(0)/2! = C(n, 2) p²(1 − p)^{n−2}

In general, for r = 0, 1, 2, ..., n, it can be seen that

    P(X = r) = g^{(r)}(0)/r! = C(n, r) p^r (1 − p)^{n−r}

EXERCISES—SECTION 2

1. In the following cases, obtain the factorial moment generating functions from the mgf's:
(a) X is Poisson with parameter λ.
(b) X is geometric with parameter p.
(c) X is negative binomial with parameters r, p.
(d) X is uniform over the interval [a, b].
(e) X has the mgf given by

    M(s) = (1/6)e^{−5s} + …

2. Consider the following factorial moment generating functions. In each case, determine the corresponding mgf:
(a) …
(b) …
(c) …

3. For any random variable X, show that

    g_{aX+b}(s) = s^b g_X(s^a)

where a and b are any constants.

4. If X and Y are independent random variables, show that

    g_{aX+bY}(s) = g_X(s^a) g_Y(s^b)

for any constants a, b.

5. For any random variable X, show that

    Var(X) = g″(1) + g′(1) − [g′(1)]²

6. Suppose an experiment consists of n independent trials where the probability of success on the ith trial is pᵢ, i = 1, 2, ..., n. Find the probability generating function of X, the number of successes in the n trials.

7. Using the factorial moment generating function, find E(X(X − 1)(X − 2)) if X is Poisson with parameter λ. Generalize and find E(X(X − 1)...(X − n + 1)) for any positive integer n.

8. Suppose 0 < a < 1. If the probability generating function of a random variable X is given by

    g(s) = …

find E(X).

9. If the probability generating function of a random variable X is given by

    g(s) = …

use the partial fraction decomposition to find the probability function of X.

10. Suppose a random variable X has the following probability generating function:

    g(s) = (1/64)(1 − 3s⁴ + 3s⁸ − s¹²)(1 − s)^{−3}

Find:
(a) P(X = 3)
(b) P(X = 5)
(c) P(X = 6)
(d) P(X = 9)

11. Suppose X, Y, and Z are independent random variables with the following probability functions:

    P(X = r) = …,  r = 1, 2, ...
    P(Y = r) = …,  r = 0, 1, 2, ...
    P(Z = r) = …,  r = 0, 1, 2, ...

Find g_X(s), g_Y(s), and g_Z(s), and use these to find the distribution of X + Y + Z. Hint: Decompose the generating function into partial fractions.
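The relation P(X = r) = g^{(r)}(0)/r! says that the probabilities are exactly the coefficients in the power-series expansion of g. For the binomial pgf g(s) = (ps + q)^n this can be checked by expanding the polynomial; a sketch (n = 5, p = 0.3 are arbitrary choices):

```python
import numpy as np
from math import comb

n, p = 5, 0.3
q = 1 - p

# Build the coefficients of g(s) = (q + p*s)**n by repeated polynomial
# multiplication; index r then holds the coefficient of s**r, i.e. P(X = r).
coeffs = np.array([1.0])
for _ in range(n):
    coeffs = np.convolve(coeffs, [q, p])   # multiply by (q + p*s)

for r in range(n + 1):
    print(r, coeffs[r], comb(n, r) * p**r * q**(n - r))   # two columns agree
```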

462 / Basic Probability Theory and Applications — Limit Theorems in Probability / 463

Chebyshev's inequality is of no value if 0 < h ≤ 1, because it does not tell us anything that we don't already know. For instance, with h = 1/2, Chebyshev's inequality would only assert that P(|X − μ| ≥ σ/2) ≤ 4, and we certainly know already that no probability can exceed 1.

We should realize that Chebyshev's inequality makes a very general, all-encompassing statement regardless of the precise form of the distribution of the random variable, and, consequently, can only provide a very crude bound. We may be able to improve upon the bound if more information is available about the distribution. For example, suppose it is given that X is N(μ, σ²). With the aid of Chebyshev's inequality alone, we are in a position to state that P(|X − μ| ≥ 2σ) ≤ 1/4; knowing that the distribution is normal, we can do even better and, from the standard normal table, provide the exact value of the probability. Nothing beats knowing the exact distribution; but in the absence of such knowledge, Chebyshev's inequality gives us a lot of information that we may not have otherwise.

Since P(|X − μ| ≥ ε) ≤ σ²/ε², we see that P(|X − μ| ≥ ε) will be small if the variance σ² is small. Thus, Chebyshev's inequality lends precision to the statement that a small variance means that large deviations from the mean are improbable and that the probability distribution tends to be concentrated around the mean. It thus indicates the sense in which the variance may be used as a measure of the scatter of the distribution about the mean.

Example 1.1. Suppose X is uniformly distributed over the interval [0, 2].
(a) Applying Chebyshev's inequality, find an upper bound on the probability P(X ≤ 0.2 or X ≥ 1.8) and compare it with the exact value.
(b) Find an upper bound on P(X ≤ 0.3 or X ≥ 1.8).

Solution. Since X is uniformly distributed over [0, 2], we know that E(X) = 1 and Var(X) = (2 − 0)²/12 = 1/3.
(a) We have

    P(X ≤ 0.2 or X ≥ 1.8) = P(|X − 1| ≥ 0.8)

and, by Chebyshev's inequality,

    P(|X − 1| ≥ 0.8) ≤ (1/3)/(0.8)² = 0.52

On the other hand, the exact probability is equal to (0.4)(1/2) = 0.2.
(b) In this case,

    P(X ≤ 0.3 or X ≥ 1.8) ≤ P(X ≤ 0.3 or X ≥ 1.7)
        = P(|X − 1| ≥ 0.7)
        ≤ (1/3)/(0.7)² = 0.68, by Chebyshev's inequality

The exact answer is (0.3)(1/2) + (0.2)(1/2) = 0.25. ■

Before we proceed with more examples, we introduce the following notation, which will be used in the rest of this chapter. We also state a result from calculus which we will find extremely useful.

Suppose X₁, X₂, ... is a sequence of random variables, that is, a countable collection of random variables. We shall often write Xₙ, n ≥ 1, to denote such a sequence. A sequence Xₙ, n ≥ 1, is said to constitute an independent sequence of random variables if any finite subcollection of these random variables is independent.

Let Sₙ = Σᵢ₌₁ⁿ Xᵢ. Then Sₙ is a random variable which represents the sum of a sample of size n, and Sₙ/n represents the sample mean. Whenever convenient, we shall write the sample mean as X̄ₙ, using the subscript n to emphasize the fact that the mean is based on n observations.

We know that if E(Xᵢ) = μᵢ and Var(Xᵢ) = σᵢ², then

    E(X̄ₙ) = (1/n) Σᵢ₌₁ⁿ μᵢ

and, if the random variables are independent,

    Var(X̄ₙ) = (1/n²) Σᵢ₌₁ⁿ σᵢ²

In particular, if μᵢ = μ and σᵢ² = σ² for every i, then

    E(X̄ₙ) = μ  and  Var(X̄ₙ) = σ²/n

The result from calculus that we shall find particularly helpful is the following: If a is a given real number and cₙ is a sequence of real numbers with lim_{n→∞} cₙ = 0, then

    lim_{n→∞} (1 + (a + cₙ)/n)ⁿ = e^a

We shall accept this result without proof. As a trivial special case of this result, we have

    lim_{n→∞} (1 + a/n)ⁿ = e^a

for any real number a.

Example 1.2. Suppose a fair die is rolled thirty times and the number showing on the die noted each time. Use Chebyshev's inequality to find a lower bound on the probability that the total score will be between 90 and 120, both inclusive.
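Example 1.2 can be worked as follows: one roll has mean 3.5 and variance 35/12, so the total S of thirty independent rolls has E(S) = 105 and Var(S) = 87.5. Since S is integer-valued, {90 ≤ S ≤ 120} is the complement of {|S − 105| ≥ 16}, and Chebyshev's inequality bounds the latter. A sketch (using the integer-valuedness is my choice; without it the cruder bound 1 − 87.5/15² ≈ 0.61 results):

```python
import numpy as np

mu = 30 * 3.5          # E(S) = 105
var = 30 * 35 / 12     # Var(S) = 87.5, since Var of one die is 35/12

bound = 1 - var / 16**2
print(bound)           # about 0.658: P(90 <= S <= 120) is at least this much

rng = np.random.default_rng(5)
s = rng.integers(1, 7, size=(200_000, 30)).sum(axis=1)
print(np.mean((s >= 90) & (s <= 120)))   # the actual probability is far higher
```

As the simulation suggests, the Chebyshev bound is valid but very conservative here.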


472 / Basic Probability Theory and Applications — Limit Theorems in Probability / 473

12. Consider the sequence of independent random variables Xᵢ where

    P(Xᵢ = 2ⁱ) = P(Xᵢ = −2ⁱ) = 2^{−(2i+1)}

and

    P(Xᵢ = 0) = 1 − 2^{−2i}

Show that the sequence obeys the WLLN.

13. Suppose Xᵢ, i ≥ 1, is a sequence of mutually independent random variables, with E(Xᵢ) = μ and Var(Xᵢ) = σ² for every i. Show that

    lim_{n→∞} P(X̄ₙ ≤ t) = 0 if t < μ,  and  lim_{n→∞} P(X̄ₙ ≤ t) = 1 if t > μ

Hint: Consider separately (i) t = μ − ε and (ii) t = μ + ε, where ε > 0, and use Chebyshev's theorem.

2. CONVERGENCE IN DISTRIBUTION

The main result of this section is the central limit theorem. We shall initiate our discussion with the general notion of convergence in distribution.

2.1 The General Notion of Convergence in Distribution

A sequence of random variables Xₙ, n ≥ 1, is said to converge in distribution (or in law) to a random variable X if

    lim_{n→∞} F_{Xₙ}(u) = F_X(u)

at each real number u where F_X is continuous. This is written compactly as Xₙ →ᵈ X. (The points where F_X is continuous are called the continuity points of F_X.)

The importance of the notion of convergence in distribution is to be seen in the following fact: Often one is interested in finding the probabilities associated with the distribution of Xₙ when n is large. However, the problem of finding the distribution of Xₙ may be quite complicated and, sometimes, even if the distribution of Xₙ is available, the actual computations may be quite involved. For example, suppose Xₙ is the sum of n independent Bernoulli random variables each with the probability of success p. The distribution of Xₙ is known for every n as being binomial, B(n, p). But even for moderately small values of n, the computations of the binomial probabilities are messy. Now, if it is known that the sequence X₁, X₂, ... converges in distribution to X, then one can use the distribution of X to obtain the approximate probabilities. (Of course, one assumes that the distribution of X is such that the probabilities can be obtained from it easily.) This can be seen as follows:

Suppose X₁, X₂, ... converges in distribution to X, that is, F_{Xₙ} → F_X at the continuity points of F_X. Suppose a and b (a < b) are continuity points of F_X. Then, by the definition,

    lim_{n→∞} F_{Xₙ}(b) = F_X(b)  and  lim_{n→∞} F_{Xₙ}(a) = F_X(a)

Hence

    lim_{n→∞} P(a < Xₙ ≤ b) = lim_{n→∞} [F_{Xₙ}(b) − F_{Xₙ}(a)]
        = F_X(b) − F_X(a) = P(a < X ≤ b)

In other words, if n is large, P(a < Xₙ ≤ b) can be approximated by P(a < X ≤ b).

Example 2.1. Suppose for each n, Xₙ is N(0, 1/n). Show that Xₙ converges in distribution.

Solution. The D.F. of Xₙ is given by

    F_{Xₙ}(u) = ∫₋∞^u √(n/(2π)) e^{−nx²/2} dx = Φ(u√n)

Therefore,

    lim_{n→∞} F_{Xₙ}(u) = Φ(−∞) = 0    if u < 0
                        = Φ(0) = 1/2   if u = 0
                        = Φ(∞) = 1     if u > 0

Hence, if we consider a random variable X whose D.F. is given by

    F_X(u) = 0,  if u < 0
           = 1,  if u ≥ 0

then Xₙ →ᵈ X. (Notice that F_{Xₙ}(0) = 1/2 for every n and F_X(0) = 1, so that lim F_{Xₙ}(0) ≠ F_X(0). But 0 is not a continuity point of F_X, and for convergence in distribution we do not need convergence at the discontinuity points.)

Example 2.2. Discuss the convergence in distribution of the sequences of random variables Xₙ, n ≥ 1, in the following cases:
(a) For every integer n ≥ 1, the pdf of Xₙ is given by

    f_{Xₙ}(x) = nx^{n−1},  0 < x < 1
              = 0,          elsewhere

474 / Basic Probability Theory and Applications

(b) For every integer n ≥ 1, the D.F. of Xₙ is given by

    F_{Xₙ}(x) = 0,  x < n
              = 1,  x ≥ n

Solution
(a) The D.F. of Xₙ can be easily obtained as

    F_{Xₙ}(u) = 0,   u ≤ 0
              = uⁿ,  0 < u < 1
              = 1,   u ≥ 1

As n goes to infinity,

    lim_{n→∞} F_{Xₙ}(u) = 0,  u < 1
                        = 1,  u ≥ 1

Hence Xₙ →ᵈ X, where the D.F. of X is

    F_X(u) = 0,  u < 1
           = 1,  u ≥ 1

(b) In this case, lim_{n→∞} F_{Xₙ}(u) = 0 for every real number u. But there is no distribution function which, at its continuity points, will agree with a function which is identically zero. Hence Xₙ does not converge in distribution. ■

The reader is cautioned that, if Xₙ →ᵈ X, there is nothing implied about the convergence of the sequence Xₙ, n ≥ 1, to the random variable X per se. To see this, suppose X is a standard normal variable. Define a sequence Xₙ, n ≥ 1, by

    Xₙ = (−1)ⁿX

Thus, for any sample point s ∈ S, Xₙ(s) = (−1)ⁿX(s). Now, since X is N(0, 1), it follows that for every integer n ≥ 1, Xₙ is N(0, 1). (Recall that if X is N(μ, σ²), then aX + b is N(aμ + b, a²σ²). Here a = (−1)ⁿ and b = 0.) Therefore, it follows that Xₙ →ᵈ X. However, for every s ∈ S with X(s) ≠ 0, the sequence of real numbers (−1)ⁿX(s), n ≥ 1, diverges. Hence, P({s | Xₙ(s) diverges}) = P(Xₙ diverges) = 1.

In conclusion, although Xₙ →ᵈ X, the sequence of random variables itself diverges.

Poisson approximation to the binomial

In Chapter 5, we enunciated the postulates under which the random variable representing the number of changes during a given interval of length t has the Poisson distribution. We shall now establish that under certain conditions the binomial probabilities b(k; n, p) can be approximated by the Poisson probabilities if the number of trials n is large. Specifically, with p = λ/n,

    lim_{n→∞} b(k; n, λ/n) = e^{−λ}λᵏ/k!,  k = 0, 1, ...

Let us first find the limit for k = 0. We have

    b(0; n, λ/n) = (1 − λ/n)ⁿ → e^{−λ}  as n → ∞

Next, taking the ratio of successive probabilities,

    b(k; n, p)/b(k − 1; n, p) = ((n − k + 1)/k)(p/(1 − p))

and with p = λ/n this ratio tends to λ/k as n → ∞. Therefore

    lim_{n→∞} b(k; n, p) = (λ/k) lim_{n→∞} b(k − 1; n, p)

Proceeding recursively,

    lim_{n→∞} b(k; n, p) = (λ/k)(λ/(k − 1)) ... (λ/1) lim_{n→∞} b(0; n, p) = (λᵏ/k!) e^{−λ}

Thus we see that the binomial probabilities approach those of a Poisson random variable with parameter λ = np. (Note that if X is B(n, p), then E(X) = np = λ and Var(X) = np(1 − p) → λ, and λ is indeed the mean and the variance of the Poisson random variable.)

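The limit just derived is easy to watch numerically; a sketch with λ = 2 and k = 2 (arbitrary choices):

```python
from math import comb, exp, factorial

lam, k = 2.0, 2
for n in (10, 100, 1000):
    p = lam / n
    b = comb(n, k) * p**k * (1 - p)**(n - k)   # b(k; n, lam/n)
    print(n, round(b, 4))                      # approaches the Poisson value

print("limit:", round(exp(-lam) * lam**k / factorial(k), 4))   # e^-2 * 2^2/2!
```

Already at n = 1000 the binomial probability agrees with the Poisson limit to three decimal places.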
Answers to Odd-Numbered Problems

[The scanned answer-key pages (covering Chapters 4, 5, and 7) are illegible and their values are not recoverable.]
Binomial Probabilities

[Appendix table of binomial probabilities b(k; n, p); the tabulated values are illegible in the scan.]