Probability Distributions 5.1
Introduction
Most of us will have met the idea of a frequency distribution. We collected some data, subdivided it into classes and noted the frequency (or number of items) for each class. Once a frequency distribution is constructed there are two particular statistics that characterise the distribution: the mean value and the variance.
Because most engineering phenomena are subject to random influences, their output is random in nature: more precisely, their output is a random variable. In order to model such phenomena, we use a probability distribution. Using the theoretical probability model we predict a corresponding frequency distribution which can then be compared with observation.
In this block we examine discrete probability distributions, where the values of the random variable may be written as a list. As with frequency distributions, we find that a probability distribution can also be characterised by its mean and variance.
Learning outcomes: find the mean and variance of a discrete probability distribution, and attempt every guided exercise and most of the other exercises.
1. Discrete Probability Distribution
Consider the experiment of throwing two coins. With the usual notation the sample space for
this experiment is:
S = {HH, HT, T H, T T }.
A random variable is a rule which assigns a number to each member of S and is denoted by a
capital letter. We can choose this rule as we please. For example,
(i) X: number of heads which appear.
(ii) Y : has the value 1 if both throws are the same and 0 otherwise.
and so on.
With each value of the random variable we can associate a probability. To see how this is done
we again consider the random variable X introduced above. This random variable has only
three values: X = 0 (no heads appear), X = 1 (one head appears), X = 2 (both throws give a
head).
The probability that X has the value zero is written P(X = 0) and:

P(X = 0) = P({TT} occurs) = 1/4

Similarly

P(X = 1) = P({HT} ∪ {TH}) = 1/2

P(X = 2) = P({HH}) = 1/4.
We can record these values of X and the associated probabilities in tabular or in graphical form.
See Figure 1.
x          0     1     2
P(X = x)   1/4   1/2   1/4

[Bar chart of P(X = x) against x: heights 1/4, 1/2, 1/4 at x = 0, 1, 2]

Figure 1
Either of these forms is referred to as the probability distribution of X.
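The two-coin distribution can be checked by simulation. A minimal sketch in Python (the trial count and seed are arbitrary choices, not from the text); the relative frequencies should settle close to 1/4, 1/2 and 1/4:

```python
import random
from collections import Counter

random.seed(1)
trials = 100_000

# Each trial: throw two fair coins, let X = number of heads.
counts = Counter(random.randint(0, 1) + random.randint(0, 1) for _ in range(trials))

for x, p in [(0, 1/4), (1, 1/2), (2, 1/4)]:
    print(f"X = {x}: relative frequency {counts[x] / trials:.3f}, theory {p}")
```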
For a fair die, with X the score shown, the probability distribution is:

x          1     2     3     4     5     6
P(X = x)   1/6   1/6   1/6   1/6   1/6   1/6

[Bar chart of P(X = x) against x: height 1/6 at each of x = 1, ..., 6]

Figure 2
Key Point
Let X be a random variable associated with an experiment. Let the values of X be denoted by x1, x2, ..., xn and let P(X = xi) be the probability that X takes the value xi. We have two necessary conditions for a valid probability distribution:

P(X = xi) ≥ 0 for all xi

Σ_{i=1}^{n} P(X = xi) = 1
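The two conditions in the Key Point translate directly into a small validity check. A sketch (the function name and tolerance are our own choices):

```python
def is_valid_distribution(probs, tol=1e-9):
    """Return True if probs satisfies both Key Point conditions:
    every P(X = xi) >= 0 and the probabilities sum to 1."""
    return all(p >= 0 for p in probs) and abs(sum(probs) - 1.0) <= tol

print(is_valid_distribution([1/4, 1/2, 1/4]))    # the two-coin X: True
print(is_valid_distribution([0.2, 0.5]))         # sums to 0.7: False
print(is_valid_distribution([0.5, -0.1, 0.6]))   # negative entry: False
```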
Which of the following represents a valid probability distribution?

(i) [Bar chart of P(X = x) against x = 1, 2, 3, 4, with bar heights 1/5 and 2/5]

(ii) xi       1     2     3     4     5
     P(xi)   0.3   0     0.4   0.1   0.4
Answer
Number of heads   0    1    2    3
Frequency         11   37   39   13

The probability of obtaining no heads in three throws is 1/8, and in 100 throws we would expect on that basis 100 × 1/8 = 12.5 occurrences. The probability of obtaining one head is 3/8, and in 100 throws we would expect 100 × 3/8 = 37.5. Similarly we can calculate the expected number of times of obtaining two heads and of obtaining three heads. We can therefore complete the table as follows:

Number of heads      0      1      2      3
Observed frequency   11     37     39     13
Expected frequency   12.5   37.5   37.5   12.5

We see that there is quite good agreement between theory and experiment.
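The expected frequencies above follow from the binomial probabilities C(3, k)/2³ scaled by the 100 throws. A short check:

```python
from math import comb

# Expected frequencies for the number of heads in three fair coin throws,
# repeated 100 times, alongside the observed frequencies from the table.
n_throws = 100
observed = {0: 11, 1: 37, 2: 39, 3: 13}

for k in range(4):
    p = comb(3, k) / 8          # P(k heads) = C(3, k) / 2^3
    expected = n_throws * p
    print(k, observed[k], expected)
```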
A second way of comparing a theoretical probability distribution to an experimental frequency
distribution is to compare their means and variances.
Consider performing an experiment in which distinct observations x1, x2, ..., xs with frequencies f1, f2, ..., fs are made. If n = Σ_{i=1}^{s} fi is the total number of observations, then the quantity fi/n is called the relative frequency of the observation with value xi. Relative frequencies are akin to probabilities: informally, we would say that the chance of observing xk is fk/n. This observation leads to the following Key Point:
Key Point

The expectation of a random variable
Let X be a random variable with values x1, x2, ..., xn and let Pi = P(X = xi). The expectation (or mean) of X, written E(X), is defined by

E(X) = Σ_{i=1}^{n} Pi xi

For example, for a fair die with X the score obtained:

xi   1     2     3     4     5     6
Pi   1/6   1/6   1/6   1/6   1/6   1/6

Here

E(X) = Σ_{i=1}^{6} Pi xi = 1 × 1/6 + 2 × 1/6 + ... + 6 × 1/6 = 21/6 = 3.5
At first sight this might appear to be a strange result since no single throw of a die can produce
3.5. However, if we carry out the experiment a large number of times and average the score
obtained this average should be close to 3.5 if the die is fair; also, the agreement between
experimental average and the theoretical expectation value improves as the number of throws
increases.
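The expectation of the die score can be computed exactly with rational arithmetic, confirming the value 21/6 = 3.5:

```python
from fractions import Fraction

# E(X) for a fair die: sum of Pi * xi over the six faces, kept exact.
values = range(1, 7)
mean = sum(Fraction(1, 6) * x for x in values)
print(mean)         # 7/2
print(float(mean))  # 3.5
```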
The variance σ² of a frequency distribution with observations xi, frequencies fi and mean x̄ is

σ² = (1/n) Σ_{i=1}^{n} fi (xi − x̄)²

As a rough guide: the further away the data values are from the mean, the larger will be the variance.
The variance is often written in an alternative form, obtained by expanding the square (xi − x̄)² and simplifying:

σ² = (1/n) Σ_{i=1}^{n} fi xi² − x̄²
This is often quoted in words:
The variance is equal to the mean of the squares minus the square of the mean.
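Both forms of the variance can be checked numerically. A sketch using the observed coin-throwing frequencies from the earlier answer (the variable names are our own):

```python
# Verify "mean of squares minus square of mean" on the observed
# coin data (frequencies 11, 37, 39, 13 for 0-3 heads).
xs = [0, 1, 2, 3]
fs = [11, 37, 39, 13]
n = sum(fs)

mean = sum(f * x for f, x in zip(fs, xs)) / n
var_definition = sum(f * (x - mean) ** 2 for f, x in zip(fs, xs)) / n
var_shortcut = sum(f * x * x for f, x in zip(fs, xs)) / n - mean ** 2

print(mean, var_definition, var_shortcut)
```

The two computed variances agree to within floating-point rounding, illustrating that the shortcut form is an algebraic identity, not an approximation.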
We now extend the concept of variance to a random variable.
Key Point
The variance of a random variable
Let X be a random variable with values x1, x2, ..., xn and let Pi = P(X = xi). The variance of X, which is written V(X), is defined by

V(X) = Σ_{i=1}^{n} Pi (xi − μ)²

where μ = E(X). We note that V(X) can be written in the alternative form

V(X) = Σ_{i=1}^{n} Pi xi² − [E(X)]²
2. A hand-held calculator has a clock cycle time of 100 nanoseconds; these are positions numbered 0, 1, ..., 99. Assume a flag is set during a particular cycle at a random position. Thus, if X is the position number at which the flag is set,

P(X = k) = 1/100,   k = 0, 1, 2, ..., 99.

Evaluate the average position number E(X), and σ, the standard deviation.
(Hint: the sum of the first k integers is k(k + 1)/2 and the sum of their squares is k(k + 1)(2k + 1)/6.)
y          0     1
P(Y = y)   1/2   1/2

[Bar chart of P(Y = y): bars of height 1/2 at y = 0 and y = 1]
x          0     1     2     3
P(X = x)   1/8   3/8   3/8   1/8

[Bar chart of P(X = x): heights 1/8, 3/8, 3/8, 1/8 at x = 0, 1, 2, 3]
(ii) In this case the sum of the probabilities is 1, but not all P (xi ) are non-negative
E(X) = (1/8) × 0 + (3/8) × 1 + (3/8) × 2 + (1/8) × 3 = 12/8 = 1.5
E(X) = (1/100){0 + 1 + 2 + · · · + 99} = (1/100) × 99(99 + 1)/2 = 49.5

σ² = mean of squares − square of mean

σ² = (1/100)[1² + 2² + · · · + 99²] − (49.5)²

   = (1/100) × 99(100)(199)/6 − 49.5² = 3283.5 − 2450.25 = 833.25

so that σ = √833.25 ≈ 28.87.
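The clock-flag answer can be verified directly, without the summation formulas, by summing over all 100 equally likely positions:

```python
# The clock-flag exercise: X uniform on positions 0..99, each with P = 1/100.
xs = range(100)
p = 1 / 100

mean = sum(p * x for x in xs)
variance = sum(p * x * x for x in xs) - mean ** 2   # mean of squares - square of mean
std_dev = variance ** 0.5

print(mean, variance, round(std_dev, 2))
```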
16/25 × 1/2 = 8/25

8/25 × 1/2 = 4/25

1/25 × 1/2 = 1/50
Engineering Mathematics: Open Learning Unit Level 1 16
5.1: Probability
x          0       3       5      10
P(X = x)   25/50   16/50   8/50   1/50

E(X) = (0 × 25 + 3 × 16 + 5 × 8 + 10 × 1)/50 = (48 + 40 + 10)/50 = 1.96
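The final expectation can be confirmed with exact fractions, avoiding any rounding:

```python
from fractions import Fraction

# E(X) for the tabulated distribution: values 0, 3, 5, 10 with
# probabilities 25/50, 16/50, 8/50, 1/50.
xs = [0, 3, 5, 10]
probs = [Fraction(25, 50), Fraction(16, 50), Fraction(8, 50), Fraction(1, 50)]

mean = sum(p * x for p, x in zip(probs, xs))
print(float(mean))  # 1.96
```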