
Probability and Statistics Problems


1 Axioms of probability

1. The sample space of an experiment is the set of all possible outcomes, denoted by S.

2. An event E is any subset of S.

3. E ∪ F (read as E union F ) is the set of all outcomes belonging to E or to F or both.

4. E ∪ F = F ∪ E.

5. E ∩ F (read as E intersection F ) is the set of all outcomes belonging to both E and F .

6. E ∩ F = F ∩ E.

7. E c (read as E complement) is the set of all outcomes in the sample space S that are not in E.

8. De Morgan’s laws:

(E ∪ F )c = E c ∩ F c (1)

and

(E ∩ F )c = E c ∪ F c . (2)

9. Two events E and F are mutually exclusive if E and F have no outcomes in common.

10. If an experiment can result in any one of n different, equally likely outcomes, and if exactly m of these outcomes correspond to event E, then the probability of event E is

P (E) = m/n. (3)

11. The odds ratio of an event E is:

P (E)/(1 − P (E)). (4)

12. Axiom 1: the probability assigned to each event takes a value between 0 and 1 (inclusive).

13. Axiom 2: the probability of the entire sample space S is 1.

14. Axiom 3: if E and F are mutually exclusive events then

P (E ∪ F ) = P (E) + P (F ). (5)

The axioms imply the additive law

P (E ∪ F ) = P (E) + P (F ) − P (E ∩ F ), (6)

complementary law 1

P (E c ) = 1 − P (E), (7)

and complementary law 2

P (E c ∩ F ) = P (F ) − P (E ∩ F ), (8)

P (E ∩ F c ) = P (E) − P (E ∩ F ). (9)
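The laws above can be checked by brute-force enumeration over a small sample space. A minimal sketch (the sample space and events below are invented for illustration, not taken from the notes):

```python
# Illustrative check of De Morgan's laws (1)-(2) and the additive law (6)
# over a small finite sample space of equally likely outcomes.
S = set(range(1, 11))               # sample space: outcomes 1..10
E = {n for n in S if n % 2 == 0}    # event: outcome is even
F = {n for n in S if n > 6}         # event: outcome exceeds 6

# De Morgan's laws, equations (1) and (2)
assert S - (E | F) == (S - E) & (S - F)
assert S - (E & F) == (S - E) | (S - F)

def P(event):
    """Equally likely outcomes: P(E) = m/n, as in equation (3)."""
    return len(event) / len(S)

# Additive law, equation (6)
assert abs(P(E | F) - (P(E) + P(F) - P(E & F))) < 1e-12
print(P(E | F))
```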


1.1 Problems

1. The number 1, 2, 3, and 4 are written separately on four slips of paper. The slips are then

put into a hat and stirred. A blindfolded person draws two slips from the hat, one after the

other, without replacement. Describe a sample space for the experiment.

2. A card is drawn at random from an ordinary bridge deck. Find the probability and the odds

that the card is an “honor” (that is, an ace, king, queen, jack, or ten).

3. For a chronic disease, there are five standard ameliorative treatments: a, b, c, d and e. A

doctor has resources for conducting a comparative study of three of these treatments. If he

chooses the three treatments for study at random from the five, what is the probability that

(i) treatment a will be chosen,

(ii) treatments a and b will be chosen,

(iii) treatments a or c will be chosen,

(iv) treatments a and b or b and c will be chosen?

4. The integers 1, 2, 3, . . . , 20 are written on slips of paper which are placed in a bowl and

thoroughly mixed. A slip is drawn from the bowl at random. What is the probability that

the number on the slip is either prime or divisible by 3?

5. Let E and F be events in a sample space S, such that P (E) = 0.4, P (F ) = 0.3 and P (E∩F ) =

0.2. Find the following:

(i) P (E ∪ F );

(ii) P (E c );

(iii) P (F c );

(iv) P (E c ∩ F );

(v) P (E c ∩ F c ).

6. A fair coin is tossed three times. Find the probability of obtaining:

(i) no heads;

(ii) at least one head.

1.2 Solutions

1. A sample space for the experiment, recording the two draws in order, is:

S = {(1, 2), (1, 3), (1, 4), (2, 1), (2, 3), (2, 4), (3, 1), (3, 2), (3, 4), (4, 1), (4, 2), (4, 3)}.

2. The probability that the card is an honor is: (5 × 4)/52 = 5/13. The odds are: (5/13)/(1 − 5/13) = 5/8.

3. The sample space of possible three-treatment choices is:

S = {abc, abd, abe, acd, ace, ade, bcd, bce, bde, cde}.

Then,


(i) 6/10 = 3/5;

(ii) 3/10;

(iii) 9/10;

(iv) 5/10 = 1/2.

4. Let E be the event that the number is prime and F be the event that it is divisible by 3. There are eight primes (2, 3, 5, 7, 11, 13, 17, 19) and six multiples of 3 among 1, . . . , 20, with only 3 in common. Then,

P (E) = 8/20,

P (F ) = 6/20,

P (E ∩ F ) = 1/20.

By the additive law, P (E ∪ F ) = 8/20 + 6/20 − 1/20 = 13/20.

5. (i) by the additive law, P (E ∪ F ) = P (E) + P (F ) − P (E ∩ F ) = 0.4 + 0.3 − 0.2 = 0.5;

(ii) by complementary law 1, P (E c ) = 1 − 0.4 = 0.6;

(iii) by complementary law 1, P (F c ) = 1 − 0.3 = 0.7;

(iv) by complementary law 2, P (E c ∩ F ) = 0.3 − 0.2 = 0.1;

(v) by De Morgan's law and complementary law 1, P (E c ∩ F c ) = P ((E ∪ F )c ) = 1 − P (E ∪ F ) = 1 − 0.5 = 0.5.

6. (i) Each of the 2^3 = 8 equally likely outcomes of three tosses has probability 1/8, so the probability of no heads is P (T T T ) = 1/8.

(ii) By complementary law 1, the probability of at least one head is 1 minus the probability of no heads, which is 1 − (1/8) = 7/8.

2 Conditional probability

1. The conditional probability of E given F is defined by

P (E | F ) = P (E ∩ F )/P (F ). (10)

The conditional probability of F given E is denoted by P (F | E) and is defined by

P (F | E) = P (E ∩ F )/P (E). (11)


2. Multiplicative laws:

P (E ∩ F ) = P (E)P (F | E) (12)

and

P (E ∩ F ) = P (F )P (E | F ). (13)

3. A useful identity (the law of total probability):

P (E) = P (E | F )P (F ) + P (E | F c )P (F c ). (14)

4. Bayes' theorem:

P (F | E) = P (E | F )P (F ) / [P (E | F )P (F ) + P (E | F c )P (F c )]. (15)

5. Two events E and F are independent if and only if any one of the following is true:

(i) P (E ∩ F ) = P (E)P (F );

(ii) P (E | F ) = P (E);

(iii) P (F | E) = P (F ).

6. Let E and F be independent events. Then E and F c are independent events, as are E c and

F , and E c and F c .
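A small numeric sketch of the multiplicative law (12), the total-probability identity (14) and Bayes' theorem (15); the three probabilities below are invented for illustration:

```python
# Illustrative Bayes computation; the input probabilities are made up.
p_F = 0.3           # P(F)
p_E_given_F = 0.8   # P(E | F)
p_E_given_Fc = 0.2  # P(E | F^c)

# Equation (14): P(E) = P(E|F)P(F) + P(E|F^c)P(F^c)
p_E = p_E_given_F * p_F + p_E_given_Fc * (1 - p_F)

# Equation (15): Bayes' theorem
p_F_given_E = p_E_given_F * p_F / p_E
print(round(p_E, 4), round(p_F_given_E, 4))
```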

2.1 Problems

1. Two a's and two b's are arranged in order. All arrangements are equally likely. Given that the last letter, in order, is b, find the probability that the two a's are together.

2. A coin is tossed until a head appears, or until it has been tossed three times. Given that the

head does not occur on the first toss, what is the probability that the coin is tossed three

times?

3. Jimmy likes to go shopping with his mother because he can sometimes get her to buy him a

toy. The probability that she takes him along on her shopping trip this afternoon is 0.4, and

if she does, the probability that she gets a toy is 0.8. What is the probability that she takes

him shopping and buys him a toy?

4. An urn contains 5 black balls and 10 red balls. Two balls are drawn at random, one after

the other, without replacement. Set up a sample space for the possible outcomes of the

experiment, with appropriate probabilities.

5. Two coins are tossed. Show that event “head on first coin” and event “coins fall alike” are

independent.

6. Sixty percent of the students in a school are boys. Eighty percent of the boys and 75% of the

girls have activity tickets for all the school activities. A ticket is found and turned in to the

school’s lost and found department. What is the probability that it belongs to a girl? To a

boy?


2.2 Solutions

1. The sample space of equally likely arrangements is

S = {aabb, abab, abba, baab, baba, bbaa}.

Let E denote the event that the two a's are together and let F denote the event that the last letter, in order, is b. The question is to find the conditional probability P (E | F ). From the sample space,

P (F ) = 3/6 = 1/2

and

P (E ∩ F ) = 2/6 = 1/3.

Thus, by the definition of conditional probability,

P (E | F ) = P (E ∩ F )/P (F ) = 2/3.

2. The sample space for this experiment is:

S = {H, T H, T T H, T T T }.

Let E denote the event that the coin is tossed three times and let F denote the event that the head does not occur on the first toss. The question is to find the conditional probability P (E | F ). Clearly P (F ) = 1/2 and from the sample space P (E ∩ F ) = 1/4. Thus, by the definition of conditional probability, P (E | F ) = 1/2.

3. Let E denote the event that Jimmy's mother takes him along on her shopping trip this afternoon. Let F denote the event that she buys Jimmy a toy. The question is to find P (E ∩ F ). We are given that

P (E) = 0.4

and

P (F | E) = 0.8.

So, by the multiplicative law, the required probability is

P (E ∩ F ) = P (E)P (F | E) = 0.4 × 0.8 = 0.32.

4. The possible outcomes are: (First Red, Second Red), (First Red, Second Black), (First Black, Second Red) and (First Black, Second Black). The corresponding probabilities are:

P (RR) = P (Second Red | First Red) × P (First Red) = (9/14) × (10/15) = 3/7,

P (RB) = P (Second Black | First Red) × P (First Red) = (5/14) × (10/15) = 5/21,

P (BR) = P (Second Red | First Black) × P (First Black) = (10/14) × (5/15) = 5/21,

P (BB) = P (Second Black | First Black) × P (First Black) = (4/14) × (5/15) = 2/21.
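The urn calculation can be verified with exact rational arithmetic; a quick sketch:

```python
# Verify the four urn probabilities (5 black, 10 red, two draws without
# replacement) exactly, and check that they sum to one.
from fractions import Fraction as Fr

p_rr = Fr(10, 15) * Fr(9, 14)   # P(RR)
p_rb = Fr(10, 15) * Fr(5, 14)   # P(RB)
p_br = Fr(5, 15) * Fr(10, 14)   # P(BR)
p_bb = Fr(5, 15) * Fr(4, 14)    # P(BB)

assert p_rr == Fr(3, 7) and p_rb == Fr(5, 21)
assert p_br == Fr(5, 21) and p_bb == Fr(2, 21)
assert p_rr + p_rb + p_br + p_bb == 1   # the four outcomes exhaust S
```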


5. The sample space for this experiment is:

S = {HH, HT, T H, T T }.

Let E denote the event that head on first coin and let F denote the event that coins fall alike.

From the sample space,

P (E) = 1/2,

P (F ) = 1/2

and

P (E ∩ F ) = 1/4.

Since P (E ∩ F ) = 1/4 = (1/2) × (1/2) = P (E)P (F ), the events E and F are independent.

6. Let E be the event that a student has an activity ticket. Let F be the event a student is a

girl. The question is to find P (F | E). We are given that

P (F ) = 0.4,

P (E | F ) = 0.75,

P (F c ) = 1 − P (F ) = 0.6

and

P (E | F c ) = 0.8.

Using Bayes' theorem, we get the probability:

P (F | E) = P (E | F )P (F ) / [P (E | F )P (F ) + P (E | F c )P (F c )]
          = (0.75 × 0.4)/(0.75 × 0.4 + 0.8 × 0.6)
          = 0.3846154.

For the probability that the ticket belongs to a boy, let F instead be the event that a student is a boy. Then

P (F ) = 0.6,

P (E | F ) = 0.8,

P (F c ) = 1 − P (F ) = 0.4

and

P (E | F c ) = 0.75.

Using Bayes' theorem again, we get the probability:

P (F | E) = P (E | F )P (F ) / [P (E | F )P (F ) + P (E | F c )P (F c )]
          = (0.8 × 0.6)/(0.8 × 0.6 + 0.75 × 0.4)
          = 0.6153846.


3 Random variables

1. A random variable is a function that assigns a numerical value to each outcome of a particular experiment. A random variable is denoted by an uppercase letter, such as X, and a corresponding lowercase letter, such as x, is used to denote a possible value of X.

2. A discrete random variable is a random variable with a finite or countably infinite number of

possible values, e.g. number of accidents, number of applicants interviewed, number of power

plants, etc.

3. If the possible values of a random variable contains an interval of real numbers then it is a

continuous random variable, e.g. temperature, breaking strength, failure time, etc.

4. The probability mass function (pmf) of a discrete random variable X is the function p(x) = P (X = x), which satisfies:

(i) p(x) ≥ 0 for all x;

(ii) Σ_{all x} p(x) = 1. (16)

5. The probability density function (pdf) of a continuous random variable X is a non-negative function f (x) ≥ 0 such that

∫_a^b f (x) dx = P (a < X < b), (17)

∫_{−∞}^{∞} f (x) dx = P (−∞ < X < ∞) = 1 (18)

and

∫_a^a f (x) dx = P (X = a) = 0. (19)

6. The cumulative distribution function (cdf) of a discrete random variable X is defined by

F (x) = P (X ≤ x) = Σ_{z ≤ x} p(z). (20)

That of a continuous random variable X is defined by

F (x) = P (X ≤ x) = ∫_{−∞}^x f (y) dy. (21)

Its properties are:

(i) 0 ≤ F (x) ≤ 1;

(ii) If a ≤ b then F (a) ≤ F (b);

(iii) F (−∞) = 0;


(iv) F (∞) = 1;

(v) if X is a continuous random variable F (b) − F (a) = P (a < X < b);

(vi) if X is a continuous random variable then its pdf can be obtained as

f (x) = ∂F (x)/∂x. (22)

7. The expected value of a discrete random variable X is:

E(X) = Σ_{all x} x p(x). (23)

That of a continuous random variable X is defined by

E(X) = ∫_{−∞}^{∞} x f (x) dx. (24)

Its properties are:

(i) E(c) = c, where c is a constant;

(ii) E(cX) = cE(X), where c is a constant;

(iii) E(cX + d) = cE(X) + d, where c and d are constants.

8. Let g be any real-valued function. The expected value of g(X) for a discrete random variable X is:

E(g(X)) = Σ_{all x} g(x) p(x). (25)

That of a continuous random variable X is defined by

E(g(X)) = ∫_{−∞}^{∞} g(x) f (x) dx. (26)

9. The variance of a random variable X (discrete or continuous) is:

V ar(X) = E(X^2) − (E(X))^2. (27)

Its properties are:

(i) V ar(c) = 0, where c is a constant;

(ii) V ar(cX) = c^2 V ar(X), where c is a constant;

(iii) V ar(cX + d) = c^2 V ar(X), where c and d are constants.

10. The standard deviation of a random variable X (discrete or continuous) is:

SD(X) = √V ar(X). (28)

11. Let X be a continuous random variable with pdf fX . Set Y = g(X), where g is a strictly monotone differentiable function. Then the pdf of Y is:

fY (y) = fX (g^{−1}(y)) |dg^{−1}(y)/dy| = fX (g^{−1}(y)) / |g′(g^{−1}(y))|. (29)
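Equations (23), (25), (27) and (28) translate directly into code. A minimal sketch (the helper names and the die example are mine, not from the notes), representing a discrete random variable as a dict mapping each value x to p(x):

```python
# Illustrative helpers for E(g(X)), Var(X) and SD(X) of a discrete
# random variable given as {value: probability}.
import math

def expectation(pmf, g=lambda x: x):
    """E(g(X)) = sum of g(x) * p(x) over all x (equations (23), (25))."""
    return sum(g(x) * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E(X^2) - (E(X))^2 (equation (27))."""
    return expectation(pmf, lambda x: x * x) - expectation(pmf) ** 2

def sd(pmf):
    """SD(X) = sqrt(Var(X)) (equation (28))."""
    return math.sqrt(variance(pmf))

# Example: a fair six-sided die
die = {x: 1 / 6 for x in range(1, 7)}
print(expectation(die), variance(die))  # ≈ 3.5 and ≈ 35/12
```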


3.1 Problems

1. Verify that

F (x) = 0 if x < −3,
      = 1/3 if −3 ≤ x < −1,
      = 2/3 if −1 ≤ x < 0,
      = 1 if x ≥ 0

is a cumulative distribution function and derive the corresponding probability mass function.

2. A random variable X has the probability mass function:

p(x) = 1/3 if x = 0, 1, 2,
     = 0 otherwise.

Derive the corresponding cumulative distribution function.

3. An urn contains 4 balls numbered 1, 2, 3, 4, respectively. Let X denote the number that

occurs if one ball is drawn at random from the urn. What is the probability mass function of

X?

4. Consider the urn defined in question 3. Two balls are drawn from the urn without replace-

ment. Let X be the sum of the two numbers that occur. Derive the probability mass function

of X.

5. The church lottery is going to give away a 3000-dollar car. They sell 10,000 tickets at 1 dollar apiece. If you buy 1 ticket, what is your expected gain? What is your expected gain if you buy 100 tickets? Compute the variance of your gain in these two instances.

6. A random variable X has the probability mass function:

p(x) = 1/4 if x = 2, 4, 8, 16,
     = 0 otherwise.

Calculate the following:

(i) E(X);

(ii) E(X 2 );

(iii) E(1/X);

(iv) E(2X/2 );

(v) V ar(X);

(vi) SD(X).

7. Verify that

F (x) = 0 if x < 0,
      = √x if 0 ≤ x ≤ 1,
      = 1 if x > 1

is a cumulative distribution function and derive the corresponding probability density function.


8. A random variable X has the probability density function:

f (x) = 10 exp(−10x) if x > 0,
      = 0 otherwise.

Derive the corresponding cumulative distribution function.

9. A random variable X has the probability density function:

f (x) = 2(1 − x) if 0 < x < 1,
      = 0 otherwise.

Calculate the following:

(i) E(X);

(ii) E(X 2 );

(iii) E((X + 10)2 );

(iv) E(1/(1 − X));

(v) V ar(X);

(vi) SD(X).

10. Let X be a continuous random variable with pdf fX (x) = exp(−x) for x > 0. Derive the pdf of Y = exp(−X).

3.2 Solutions

1. Check the four conditions (i)-(iv) for F to be a cdf. It is useful to draw F as a function of

x beforehand. From the plot conditions (i) and (ii) would be obvious. Condition (iii) follows

since F (x) = 0 for all x < −3. Condition (iv) follows since F (x) = 1 for all x ≥ 0. For the

probability mass function note that F ‘jumps’ by the amount 1/3 at x = 0, −1, −3. Thus,

p(x) = 1/3 if x = 0, −1, −3,
     = 0 otherwise.

2. The cumulative distribution function is:

F (x) = 0 if x < 0,
      = 1/3 if 0 ≤ x < 1,
      = 2/3 if 1 ≤ x < 2,
      = 1 if x ≥ 2.

3. Each ball is equally likely to be drawn, so the probability mass function is:

p(x) = 1/4 if x = 1, 2, 3, 4,
     = 0 otherwise.


4. The sample space of possible outcomes and the corresponding values for X are given in the

table below.

Outcome X

1,2 3

1,3 4

1,4 5

2,1 3

2,3 5

2,4 6

3,1 4

3,2 5

3,4 7

4,1 5

4,2 6

4,3 7

Thus the probability mass function of X is:

x    p(x)
3    2/12 = 1/6
4    2/12 = 1/6
5    4/12 = 1/3
6    2/12 = 1/6
7    2/12 = 1/6

5. Let X denote the gain. Then X = 3000 − 1 = 2999 with probability 1/10000 and X = 0 − 1 = −1 with probability 9999/10000. Thus, by definition of expectation, the expected gain is:

E(X) = 2999 × (1/10000) + (−1) × (9999/10000) = −7/10

and

E(X^2) = 2999^2 × (1/10000) + (−1)^2 × (9999/10000) = 4502/5.

Hence, by the definition of the variance, the variance of the gain is:

V ar(X) = E(X^2) − (E(X))^2 = 4502/5 − 49/100 = 89991/100.

If you buy 100 tickets then the expected gain is:

E(X) = 2900 × (100/10000) + (−100) × (9900/10000) = −70

and

E(X^2) = 2900^2 × (100/10000) + (−100)^2 × (9900/10000) = 94000.

Hence, the corresponding variance is:

V ar(X) = E(X^2) − (E(X))^2 = 94000 − 4900 = 89100.

6. (i) by the definition of expectation,

E(X) = 2 × (1/4) + 4 × (1/4) + 8 × (1/4) + 16 × (1/4) = 15/2;

(ii) by the definition of expectation,

E(X^2) = 2^2 × (1/4) + 4^2 × (1/4) + 8^2 × (1/4) + 16^2 × (1/4) = 85;

(iii) by the definition of expectation,

E(1/X) = (1/2) × (1/4) + (1/4) × (1/4) + (1/8) × (1/4) + (1/16) × (1/4) = 15/64;

(iv) by the definition of expectation,

E(2^{X/2}) = 2 × (1/4) + 2^2 × (1/4) + 2^4 × (1/4) + 2^8 × (1/4) = 139/2;

(v) by the definition of variance,

V ar(X) = E(X^2) − (E(X))^2 = 85 − 225/4 = 115/4;

(vi) by the definition of standard deviation, SD(X) = √(115/4) = √115/2.

7. Check the four conditions (i)-(iv). It is useful to draw F as a function of x beforehand. From

the plot conditions (i) and (ii) would be obvious. Condition (iii) follows since F (x) = 0 for all

x < 0. Condition (iv) follows since F (x) = 1 for all x > 1. To obtain the probability density

function simply differentiate:

f (x) = ∂√x/∂x = 1/(2√x) if 0 < x ≤ 1,
      = 0 otherwise.

8. Integrating the pdf, for x > 0,

F (x) = ∫_0^x 10 exp(−10y) dy = [− exp(−10y)]_0^x = 1 − exp(−10x).

Thus,

F (x) = 1 − exp(−10x) if x > 0,
      = 0 otherwise.

" #1

1 1 x2 x3 1 1 1

Z Z

2

E(X) = 2 x(1 − x)dx = 2 x−x dx = 2 − =2 − = ;

0 0 2 3 0

2 3 3

" #1

1 1 x3 x4 1 1 1

Z Z

2 2 2 3

E X =2 x (1 − x)dx = 2 x −x dx = 2 − =2 − = ;

0 0 3 4 0

3 4 6

13

(iii)

1 20 641

E (X + 10)2 = E X 2 + 20X + 100 = E(X 2 )+ 20E(X)+ 100 = + + 100 = ;

6 3 6

Z 1 1−x

Z 1

E (1/(1 − X)) = 2 dx = 2 dx = 2 [x]10 = 2;

0 1−x 0

1 1 1

V ar(X) = E(X 2 ) − (E(X))2 =

− = ;

6 9 18

p √

(vi) by the definition of standard deviation, SD(X) = 1/18 = 1/(3 2).

10. We are given that fX (x) = exp(−x) and g(x) = exp(−x). The inverse function of g is g^{−1}(y) = − log y and

dg^{−1}(y)/dy = −d log y/dy = −1/y.

We get the probability density function of Y as:

fY (y) = fX (− log y) |−1/y| = exp(log y) × (1/y) = y × (1/y) = 1

for 0 < y < 1, so Y is uniform on (0, 1).
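The change-of-variables result can be checked numerically by evaluating the right-hand side of (29) at a few points; a short sketch:

```python
# Check that for f_X(x) = exp(-x), x > 0, and Y = exp(-X), the formula
# f_Y(y) = f_X(g^{-1}(y)) |dg^{-1}(y)/dy| yields f_Y(y) = 1 on (0, 1).
from math import exp, log

def f_X(x):
    return exp(-x) if x > 0 else 0.0

def f_Y(y):
    # g^{-1}(y) = -log(y), dg^{-1}(y)/dy = -1/y
    return f_X(-log(y)) * abs(-1 / y)

for y in (0.1, 0.25, 0.5, 0.9):
    assert abs(f_Y(y) - 1.0) < 1e-12  # Y is uniform on (0, 1)
```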

4 Binomial distribution

This is a discrete distribution. It was derived by James Bernoulli in his treatise Ars Conjectandi

published in 1713. The distribution has applications in quality control.

4.1 Definition

Suppose a fixed number, m, of trials is performed such that:

(i) all trials are statistically independent (in the sense that knowing the outcome of any particular

one of them does not change one’s assessment of chance related to any others);

(ii) each trial results in only one of two possible outcomes, labeled as “success” and “failure”;

(iii) and, the probability of success on each trial, denoted by p, remains constant.

Then the random variable X that equals the number of trials that result in a success is said to

have the binomial distribution with parameters m and p.


Probability mass function. The pmf of the random variable X is:

p(x) = C(m, x) p^x (1 − p)^{m−x}, x = 0, 1, . . . , m, (30)

where

C(m, x) = m!/(x!(m − x)!) (31)

is the binomial coefficient.
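The pmf (30) transcribes directly into a few lines of code; a minimal sketch using only the standard library (the function name is mine):

```python
# Illustrative implementation of the binomial pmf, equation (30).
from math import comb

def binom_pmf(x, m, p):
    """P(X = x) for X ~ Binomial(m, p)."""
    return comb(m, x) * p**x * (1 - p)**(m - x)

# Sanity check: the pmf sums to one over x = 0, 1, ..., m
total = sum(binom_pmf(x, 10, 0.3) for x in range(11))
print(round(total, 12))
```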

4.2 Properties

(i) the cumulative distribution function is:

F (x) = Σ_{i=0}^{x} C(m, i) p^i (1 − p)^{m−i}. (32)

It can also be written in terms of the incomplete beta function ratio as

F (x) = 1 − [1/B(x + 1, m − x)] ∫_0^p t^x (1 − t)^{m−x−1} dt = 1 − I_p(x + 1, m − x), (33)

where

I_p(a, b) = [1/B(a, b)] ∫_0^p t^{a−1} (1 − t)^{b−1} dt; (34)

(ii) the mean is E(X) = mp; (35)

(iii) the variance is V ar(X) = mp(1 − p); (36)

(iv) the third central moment is

E[{X − E(X)}^3] = mp(1 − p)(1 − 2p); (37)

(v) the fourth central moment is

E[{X − E(X)}^4] = 3 {mp(1 − p)}^2 + mp(1 − p){1 − 6p(1 − p)}; (38)

(vi) the mean deviation is

E{|X − E(X)|} = 2m C(m − 1, [mp]) p^{[mp]+1} (1 − p)^{m−[mp]}, where [mp] denotes the integer part of mp; (39)

(vii) the coefficient of variation is:

CV (X) = √[(1 − p)/(mp)]; (40)

(viii) the coefficient of skewness is:

γ1 (X) = (1 − 2p)/√[mp(1 − p)]; (41)

(ix) the coefficient of kurtosis is:

γ2 (X) = 3 + [1 − 6p(1 − p)]/[mp(1 − p)]. (42)

4.3 Estimation

Suppose you have a dataset x1 , x2 , . . . , xn from a binomial distribution with the parameter m known and p unknown. Then an estimate for p is:

p̂ = (Σ_{i=1}^n x_i)/(nm). (43)

An exact 95% confidence interval for the true value of p is (pL , pU ), where pL and pU are the solutions of

I_{pL}(Σ_{i=1}^n x_i , nm − Σ_{i=1}^n x_i + 1) = 0.025 (44)

and

I_{pU}(Σ_{i=1}^n x_i + 1, nm − Σ_{i=1}^n x_i) = 0.975, (45)

respectively. For large n, an approximate 95% confidence interval for the true value of p is given by either:

( [p̂ + 1.96^2/(2mn) − √(1.96^2 p̂(1 − p̂)/(mn) + 1.96^4/(4(mn)^2))] / [1 + 1.96^2/(mn)],
  [p̂ + 1.96^2/(2mn) + √(1.96^2 p̂(1 − p̂)/(mn) + 1.96^4/(4(mn)^2))] / [1 + 1.96^2/(mn)] ) (46)

or

( p̂ − 1.96 √[p̂(1 − p̂)/(mn)], p̂ + 1.96 √[p̂(1 − p̂)/(mn)] ). (47)

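The point estimate (43) and the simpler approximate interval (47) can be sketched as follows; the function name and the data values are invented for illustration:

```python
# Illustrative estimate p-hat and approximate 95% interval, per (43), (47).
from math import sqrt

def binom_estimate(xs, m, z=1.96):
    """Return (p_hat, lower, upper) for data x1..xn, each ~ Binomial(m, p)."""
    n = len(xs)
    p_hat = sum(xs) / (n * m)                      # equation (43)
    half = z * sqrt(p_hat * (1 - p_hat) / (m * n)) # half-width of (47)
    return p_hat, p_hat - half, p_hat + half

p_hat, lo, hi = binom_estimate([3, 5, 4, 2, 6], m=10)
print(round(p_hat, 2), round(lo, 3), round(hi, 3))
```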

4.4 Generalizations of the binomial distribution

A doubly truncated binomial distribution is given by

p(x) = C(m, x) p^x (1 − p)^{m−x} / [Σ_{j=r1}^{m−r2} C(m, j) p^j (1 − p)^{m−j}], x = r1 , . . . , m − r2 . (48)

A zero truncated binomial distribution is given by

p(x) = C(m, x) p^x (1 − p)^{m−x} / {1 − (1 − p)^m}, x = 1, . . . , m. (49)

A mixture of binomial distributions is given by

p(x) = C(m, x) Σ_{j=1}^{∞} w_j p_j^x (1 − p_j)^{m−x}, (50)

where the weights satisfy w_j ≥ 0 and Σ_{j=1}^{∞} w_j = 1. A compound binomial distribution is given by

p(x) = C(m, x) ∫_0^1 g(p) p^x (1 − p)^{m−x} dp, (51)

where g(p) ≥ 0 and ∫_0^1 g(p) dp = 1.

4.5 Problems

1. In the past, two building contractors, A and B, have competed for 50 contracts. A won 20

and B won 30 of these contracts. The contractors have both been asked to tender for three

new contracts. On the basis of their past performance, what is the probability that

(i) Contractor A will win all three contracts;

(ii) Contractor B will win at least one contract;

(iii) Contractor A will win exactly two contracts?

2. A food-packaging apparatus underfills 10% of the containers. Find the probability that for

any particular 5 containers the number of underfilled will be:

(i) exactly 3;

(ii) exactly 2;

(iii) zero;

(iv) at least 1.

3. An engineer has designed a modified welding robot. The robot will be considered good

enough to manufacture if it misses only 1% of its assigned welds. And it will be judged a

poor performer if it misses 5% of its welds. A test is performed involving 100 welds. The new

design will be accepted if the number of missed welds X is 2 or less and rejected otherwise.

(i) What is the probability that a good design will be rejected?

(ii) What is the probability that a poor design will be accepted?

4. A manufacturing process has 100 customer orders to fill. Each order requires one component

that is purchased from a supplier. However, typically, 2% of the components are identified as

defective, and the components can be assumed to be independent.

(i) If the manufacturer stocks 100 components, what is the probability that the 100 orders

can be filled without reordering components?

(ii) If the manufacturer stocks 102 components, what is the probability that the 100 orders

can be filled without reordering components?

(iii) If the manufacturer stocks 105 components, what is the probability that the 100 orders

can be filled without reordering components?


4.6 Solutions

1. Let X denote the number of contracts won by A. Clearly X is a binomial random variable

with m = 3 and p = 20/50 = 0.4.

(i) by binomial pmf,

P (X = 3) = C(3, 3) (0.4)^3 (0.6)^0 = (0.4)^3 = 0.064;

(ii) Contractor B wins at least one contract exactly when A wins at most two, so the required probability is

P (X ≤ 2) = 1 − P (X = 3) = 1 − (0.4)^3 = 0.936;

(iii) by binomial pmf,

P (X = 2) = C(3, 2) (0.4)^2 (0.6)^1 = 3 × (0.4)^2 × 0.6 = 0.288.

2. Let X denote the number of underfilled containers. Clearly X is a binomial random variable

with m = 5 and p = 0.1.

(i) by binomial pmf,

P (X = 3) = C(5, 3) (0.1)^3 (0.9)^2 = 10 × (0.1)^3 × (0.9)^2 = 0.0081;

(ii) by binomial pmf,

P (X = 2) = C(5, 2) (0.1)^2 (0.9)^3 = 10 × (0.1)^2 × (0.9)^3 = 0.0729;

(iii) by binomial pmf,

P (X = 0) = C(5, 0) (0.1)^0 (0.9)^5 = (0.9)^5 = 0.590;

(iv) by complementary law 1,

P (X ≥ 1) = 1 − P (X = 0) = 1 − (0.9)^5 = 0.410.

3. Clearly X is a binomial random variable with the following values for the parameters.

(i) m = 100 and p = 0.01. A good design will be rejected if X > 2. So, by binomial pmf,

the corresponding probability is:

P (X > 2) = 1 − P (X ≤ 2)
= 1 − P (X = 0) − P (X = 1) − P (X = 2)
= 1 − C(100, 0) (0.01)^0 (0.99)^100 − C(100, 1) (0.01)^1 (0.99)^99 − C(100, 2) (0.01)^2 (0.99)^98
= 1 − (0.99)^100 − 100 × 0.01 × (0.99)^99 − 4950 × (0.01)^2 × (0.99)^98
= 0.0793732;

(ii) m = 100 and p = 0.05. A poor design will be accepted if X ≤ 2. So, by binomial pmf,

the corresponding probability is:

P (X ≤ 2) = P (X = 0) + P (X = 1) + P (X = 2)
= C(100, 0) (0.05)^0 (0.95)^100 + C(100, 1) (0.05)^1 (0.95)^99 + C(100, 2) (0.05)^2 (0.95)^98
= (0.95)^100 + 100 × 0.05 × (0.95)^99 + 4950 × (0.05)^2 × (0.95)^98
= 0.118263.
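The two error probabilities of this acceptance rule can be re-derived numerically; a short sketch (the helper name is mine):

```python
# Illustrative check of the welding-robot probabilities: accept if X <= 2
# out of m = 100 welds, with p = 0.01 (good design) or p = 0.05 (poor).
from math import comb

def binom_cdf(k, m, p):
    """P(X <= k) for X ~ Binomial(m, p)."""
    return sum(comb(m, x) * p**x * (1 - p)**(m - x) for x in range(k + 1))

reject_good = 1 - binom_cdf(2, 100, 0.01)  # good design rejected
accept_poor = binom_cdf(2, 100, 0.05)      # poor design accepted
print(round(reject_good, 7), round(accept_poor, 6))
```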

4. Let X denote the number of defective components. Clearly X is a binomial random variable

with the following values for the parameters.

(i) m = 100 and p = 0.02. Orders can be filled without reordering components if there are

no defective components. So, by binomial pmf, the corresponding probability is:

P (X = 0) = C(100, 0) (0.02)^0 (0.98)^100 = (0.98)^100 = 0.1326196;


(ii) m = 102 and p = 0.02. Orders can be filled without reordering components if there are

at most two defective components. So, by binomial pmf, the corresponding probability

is:

P (X ≤ 2) = P (X = 0) + P (X = 1) + P (X = 2)
= C(102, 0) (0.02)^0 (0.98)^102 + C(102, 1) (0.02)^1 (0.98)^101 + C(102, 2) (0.02)^2 (0.98)^100
= (0.98)^102 + 102 × 0.02 × (0.98)^101 + 5151 × (0.02)^2 × (0.98)^100
= 0.6657502;

(iii) m = 105 and p = 0.02. Orders can be filled without reordering components if there are

at most five defective components. So, by binomial pmf, the corresponding probability

is:

P (X ≤ 5) = P (X = 0) + P (X = 1) + P (X = 2) + P (X = 3) + P (X = 4) + P (X = 5)
= C(105, 0) (0.02)^0 (0.98)^105 + C(105, 1) (0.02)^1 (0.98)^104 + C(105, 2) (0.02)^2 (0.98)^103
+ C(105, 3) (0.02)^3 (0.98)^102 + C(105, 4) (0.02)^4 (0.98)^101 + C(105, 5) (0.02)^5 (0.98)^100
= (0.98)^105 + 105 × 0.02 × (0.98)^104 + 5460 × (0.02)^2 × (0.98)^103 + 187460 × (0.02)^3 × (0.98)^102
+ 4780230 × (0.02)^4 × (0.98)^101 + 96560646 × (0.02)^5 × (0.98)^100
= 0.9807593.


4.7 Figures

[Figure: the binomial PMF (x versus PMF) for n = 10 with p = 0.1, 0.4, 0.7 and 0.9, one panel per value of p.]

5 Poisson distribution

The Poisson distribution (another discrete distribution) is associated with counts of the number of

occurrences of a relatively rare event over a specified interval of time, volume, length, area, etc.

In 1837 Poisson published the derivation of this distribution which bears his name. An important

application of the distribution is in queueing theory.

Typical examples are the number of accidents at a construction site per month, the number of plating faults per square meter of metal sheet, etc.

5.1 Definition

Given a continuous interval (in time, length, etc), assume discrete events occur randomly through-

out the interval. If the interval can be partitioned into subintervals of small enough length such

that

(i) the probability of more than one occurrence in a subinterval is negligible;

(ii) the probability of one occurrence in a subinterval is the same for all subintervals and propor-

tional to the length of the subinterval;

(iii) and, the occurrence of an event in one subinterval has no effect on the occurrence or non-

occurrence in another non-overlapping subinterval,

If the mean number of occurrences in the interval is λ, the random variable X that equals the

number of occurrences in the interval is said to have the Poisson distribution with parameter λ.

Probability mass function. The pmf of the random variable X is:

p(x) = λ^x exp(−λ)/x!, x = 0, 1, 2, . . . . (52)

5.2 Properties

E(X) = λ; (53)

V ar(X) = λ; (54)
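The pmf (52) and the mean/variance properties (53)-(54) can be checked directly; a minimal sketch using only the standard library (the function name is mine):

```python
# Illustrative Poisson pmf, equation (52), with a numeric check that the
# mean and variance both equal lambda (equations (53) and (54)).
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam)."""
    return lam**x * exp(-lam) / factorial(x)

lam = 4.0
# Partial sums to x = 99; the neglected tail is numerically negligible.
mean = sum(x * poisson_pmf(x, lam) for x in range(100))
var = sum(x * x * poisson_pmf(x, lam) for x in range(100)) - mean**2
print(round(mean, 6), round(var, 6))
```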


5.3 Problems

1. The number of cracks requiring repair in a section of motor way is assumed to follow a Poisson

distribution with a mean of two cracks per mile:

(i) What is the probability that there are no cracks that require repair in 5 miles of motor

way?

(ii) What is the probability that at least one crack requires repair in one-half mile of motor

way?

(iii) What is the probability that there are exactly three cracks that require repair in 2 miles

of motor way?

intensity 6 or more. Calculate the following:

(i) the probability that such earthquakes will occur in the next 3 years;

(ii) the probability that no such earthquake will occur in the next 10 years.

3. Suppose that in one year the number of industrial accidents follows a Poisson distribution

with mean 3. What is the probability that in a given year there will be at least 1 accident?

4. Cars arrive at a particular highway patrol safety checkpoint with the mean rate λ = 100 per

hour. Each car encountered during a 1-hour period is tested for mechanical defects, and the

drivers are informed of needed repairs. Suppose that 10% of all cars in the state need such

repairs.

(i) What is the probability that at least 4 cars will be found in need of repairs?

(ii) To test the effectiveness of the warnings, the highway patrol will trace the defective cars

to determine if the necessary repairs were eventually made. Supposing that only half of

the originally warned operators ever fix their cars, what is the probability that at least

2 cars will have been repaired?

5. The number of surface flaws in plastic panels used in the interior of automobiles has a Poisson

distribution with a mean of 0.05 per square foot of plastic panel. Assume an automobile

interior contains 10 square feet of plastic panel.

(i) What is the probability that there are no surface flaws in an auto’s interior?

(ii) If 10 cars are sold to a rental company, what is the probability that none of the 10 cars

has any surface flaws?

(iii) If 10 cars are sold to a rental company, what is the probability that at most one car has

any surface flaws?


5.4 Solutions

1. Let X denote the number of cracks in the motor way. X is a Poisson random variable with the following values for the parameter λ.

(i) λ = 2 × 5 = 10; by Poisson pmf, the required probability is:

P (X = 0) = 10^0 exp(−10)/0! = exp(−10) = 4.54 × 10^{−5};

(ii) λ = 2 × 1/2 = 1; by Poisson pmf, the required probability is:

P (X ≥ 1) = 1 − P (X = 0) = 1 − 1^0 exp(−1)/0! = 1 − exp(−1) = 0.632;

(iii) λ = 2 × 2 = 4; by Poisson pmf, the required probability is:

P (X = 3) = 4^3 exp(−4)/3! = 64 exp(−4)/6 = 0.195.

2. Let X denote the number of earthquakes in the region. X is a Poisson random variable with the following values for the parameter λ.

(i) λ = 0.12 × 3 = 0.36; by Poisson pmf, the required probability is:

P (X ≥ 1) = 1 − P (X = 0) = 1 − (0.36)^0 exp(−0.36)/0! = 1 − exp(−0.36) = 0.302;

(ii) λ = (12/100) × 10 = 1.2; by Poisson pmf, the required probability is:

P (X = 0) = (1.2)^0 exp(−1.2)/0! = exp(−1.2) = 0.301.

3. Let X denote the number of industrial accidents per year. X is a Poisson random variable with λ = 3. By Poisson pmf, the required probability is:

P (X ≥ 1) = 1 − P (X = 0) = 1 − 3^0 exp(−3)/0! = 1 − exp(−3) = 0.95.


4. (i) Let X denote the number of cars in need of repairs. Clearly X is a Poisson random variable with λ = 100 × 0.1 = 10. By Poisson pmf, the probability that at least 4 cars will be found in need of repair is:

P (X ≥ 4) = 1 − P (X < 4)
= 1 − P (X = 0) − P (X = 1) − P (X = 2) − P (X = 3)
= 1 − 10^0 exp(−10)/0! − 10^1 exp(−10)/1! − 10^2 exp(−10)/2! − 10^3 exp(−10)/3!
= 1 − (61 + 500/3) exp(−10)
= 0.9896639;

(ii) Let X denote the number of cars fixed. Clearly X is a Poisson random variable with λ = 100 × 0.1 × (1/2) = 5. By Poisson pmf, the probability that at least 2 cars will have been repaired is:

P (X ≥ 2) = 1 − P (X < 2)
= 1 − P (X = 0) − P (X = 1)
= 1 − 5^0 exp(−5)/0! − 5^1 exp(−5)/1!
= 1 − 6 exp(−5)
= 0.9595723.

5. Let X denote the number of surface flaws. X is a Poisson random variable with the following parameter values.

(i) λ = 10 × 0.05 = 0.5. So, by Poisson pmf, the probability that there are no flaws in an auto’s interior is:

P (X = 0) = (0.5)^0 exp(−0.5)/0! = exp(−0.5) = 0.6065307;

(ii) λ = 10 × 10 × 0.05 = 5. So, by Poisson pmf, the probability that none of the 10 cars has any flaws is:

P (X = 0) = 5^0 exp(−5)/0! = exp(−5) = 0.006737947;

(iii) λ = 10 × 10 × 0.05 = 5. At most one car has any surface flaws means that either there are no flaws or there is exactly one flaw in one of the 10 cars. So, by Poisson pmf, the corresponding probability is:

P (X = 0) + P (X = 1) = 5^0 exp(−5)/0! + 5^1 exp(−5)/1! = 6 exp(−5) = 0.04042768.
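Part (ii) can also be read through the additivity of independent Poisson counts: the total number of flaws in 10 cars is Poisson with λ = 10 × 0.5 = 5, so the pooled answer must agree with raising the single-car probability to the 10th power. A sketch of that consistency check (variable names are mine):

```python
from math import exp

# Flaws per interior: Poisson with lam = 0.5. Independent Poisson counts
# add, so total flaws over 10 cars is Poisson with lam = 10 * 0.5 = 5.
p_single = exp(-0.5)             # one car flaw-free
p_ten_via_product = p_single ** 10
p_ten_via_sum = exp(-5.0)        # pooled Poisson(5) evaluated at zero

print(round(p_single, 7))           # 0.6065307
print(round(p_ten_via_product, 9))  # 0.006737947
```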


A zero-truncated Poisson distribution is given by

p(x) = [1/(exp(λ) − 1)] λ^x /x!, x = 1, 2, . . . . (55)

A left truncated Poisson distribution is given by

p(x) = [λ^x exp(−λ)/x!] / [1 − exp(−λ) Σ_{j=0}^{r1 − 1} λ^j /j!], x = r1 , r1 + 1, . . . . (56)


A right truncated Poisson distribution is given by

p(x) = (λ^x /x!) / (Σ_{j=0}^{r2} λ^j /j!), x = 0, . . . , r2 . (57)

A doubly truncated Poisson distribution is given by

p(x) = (λ^x /x!) / (Σ_{j=r1}^{r2} λ^j /j!), x = r1 , r1 + 1, . . . , r2 . (58)

A Poisson distribution censored at K (all mass at or above K lumped at K) is given by

p(x) = λ^x exp(−λ)/x!,            if x = 0, 1, . . . , K − 1,
p(x) = Σ_{j=K}^{∞} λ^j exp(−λ)/j!, if x = K. (59)

A mixture of two zero-truncated Poisson distributions is given by

p(x) = [p/(exp(λ1 ) − 1)] λ1^x /x! + [(1 − p)/(exp(λ2 ) − 1)] λ2^x /x!, x = 1, 2, . . . . (60)
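These truncated forms share one pattern: restrict the Poisson pmf to a sub-support and divide by the mass retained there, so each must still sum to 1. A minimal numerical check (function names are mine, not from the text):

```python
from math import exp, factorial

def zero_truncated_pmf(x: int, lam: float) -> float:
    """Eq.-(55)-style pmf: Poisson restricted to x >= 1 and renormalized."""
    return lam ** x / ((exp(lam) - 1.0) * factorial(x))

def right_truncated_pmf(x: int, lam: float, r2: int) -> float:
    """Eq.-(57)-style pmf: Poisson restricted to 0 <= x <= r2 and renormalized."""
    norm = sum(lam ** j / factorial(j) for j in range(r2 + 1))
    return (lam ** x / factorial(x)) / norm

lam = 2.0
# Both pmfs should sum to 1 over their supports (the tail beyond x = 99
# is negligible for lam = 2).
print(round(sum(zero_truncated_pmf(x, lam) for x in range(1, 100)), 6))  # 1.0
print(round(sum(right_truncated_pmf(x, lam, 5) for x in range(6)), 6))   # 1.0
```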


5.6 Figures

[Figure: Poisson PMF for λ = 1, 2, 5, and 10; each panel plots the PMF against x = 0, . . . , 15.]

