

Chapter VIII: Extra



Section 1: Principles of probability
Rules of probability
Combinatorics
Distribution functions





Section 1: Principles of probability

Events or outcomes = generic terms describing the possible results of an experiment
One possible conformation of a molecule
The result of rolling a 6-sided die

Let us consider outcomes that fall in categories A, B, or C

Definition of probability
If N is the total number of outcomes and n_A is the number of outcomes that fall in
category A, then the probability of outcome A is

(8.1.1)   p_A = n_A / N


A deterministic (certain) event has p_A = 1 and an impossible event has p_A = 0

In general: 0 \le p_A \le 1

Example 8.1.1: if A happens 20% of the time, p_A = 20/100 = 0.2

Example 8.1.2: rolling a 6-sided die, the probability of getting a 6 is p_6 = 1/6
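As a quick illustration of Eq. (8.1.1) as a long-run frequency, here is a minimal Python sketch (not part of the original notes) that estimates p_6 by simulating die rolls:

```python
# Estimate p_6 = n_6/N by simulation, illustrating Eq. (8.1.1).
import random

N = 100_000                      # total number of outcomes (rolls)
n_six = sum(1 for _ in range(N) if random.randint(1, 6) == 6)
print(n_six / N)                 # ~0.167, close to the exact 1/6
```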

Relationships among events:
1. Mutually exclusive: if A happens, then B does not, and vice versa
2. Collectively exhaustive: the set of events A_1, A_2, \ldots, A_t covers all possibilities
3. Independent: the outcome of A does not depend on the outcome of B, and vice versa
4. Multiplicity: the total number of ways in which different outcomes can possibly
occur; e.g., for events A, B, and C occurring n_A, n_B, and n_C times, the multiplicity is

(8.1.2)   W = n_A n_B n_C



Rules of probability

Addition rule:

Consider events A, B, C, \ldots, E that are mutually exclusive

Probability of observing A or B or C or \ldots or E (union A \cup B \cup \cdots \cup E):

(8.1.3)   p(A \cup B \cup \cdots \cup E) = (n_A + n_B + \cdots + n_E)/N = p_A + p_B + \cdots + p_E

where the n_i are the statistical weights

If the set is exhaustive and exclusive: p(A \cup B \cup \cdots \cup E) = (n_A + n_B + \cdots + n_E)/N = N/N = 1

Definition of exhaustive:

(8.1.4)   \sum_{i=1}^{t} p_i = 1



Multiplication rule
If A, B, C, \ldots, E are independent, the probability of observing A and B and C and \ldots and E (intersection A \cap B \cap \cdots \cap E) is

(8.1.5)   p(A \cap B \cap \cdots \cap E) = (n_A/N)(n_B/N)\cdots(n_E/N) = p_A p_B \cdots p_E

If p_A is the probability that A happens and the set is exhaustive, then 1 - p_A is the
probability that A will not happen

For independent events A and B:

Probability that A and B happen: p(AB) = p_A p_B

Probability that A happens but not B:
p(A, not B) = p_A (1 - p_B) = p_A - p_A p_B = p(A) - p(A \cap B)

Probability that neither A nor B happens:
p(not A, not B) = (1 - p_A)(1 - p_B) = 1 - p_A - p_B + p_A p_B = 1 - p(A \cup B)

Probability that something will happen:
p(something happens) = 1 - p(not A, not B) = 1 - (1 - p_A)(1 - p_B) = p_A + p_B - p_A p_B = p(A \cup B)
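To make these complement identities concrete, here is a minimal Python sketch; the values pA = 0.3 and pB = 0.5 are illustrative assumptions, not from the notes:

```python
# Check the complement rules for two independent events A and B.
pA, pB = 0.3, 0.5                          # assumed example probabilities

p_A_and_B = pA * pB                        # p(AB)
p_A_not_B = pA * (1 - pB)                  # A happens but not B
p_neither = (1 - pA) * (1 - pB)            # neither A nor B happens
p_something = 1 - p_neither                # at least one happens

# Consistency with the general addition rule, Eq. (8.1.7) below:
assert abs(p_something - (pA + pB - p_A_and_B)) < 1e-12
print(p_A_and_B, p_A_not_B, p_neither, p_something)
```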

Composite events

Reformulation: any question can be framed in terms of AND and OR conditions

Example 8.1.3: what is the probability of getting a 1 on the first roll OR a 4 on the second?

Not the same as (1 on the first AND 4 on the second)

There are 36 composite events, of which 11 are successful, so p(1 first OR 4 second) = 11/36

In terms of AND terms:
p(1 first OR 4 second) = p(1 first AND anything but 4 second)
                       + p(anything but 1 first AND 4 second)
                       + p(1 first AND 4 second)
                       = (1/6)(5/6) + (5/6)(1/6) + (1/6)(1/6) = 11/36
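The 36 composite events are few enough to enumerate directly; a minimal Python sketch (not part of the original notes):

```python
# Brute-force check of Example 8.1.3 over all 36 two-roll outcomes.
from itertools import product

hits = sum(1 for r1, r2 in product(range(1, 7), repeat=2)
           if r1 == 1 or r2 == 4)
print(hits, hits / 36)           # 11 and 11/36 ~ 0.306
```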




Correlated events = conditional probabilities

Correlated events: the outcome of A depends on the outcome of B

Conditional probability: p(B|A) = probability that B occurs given that A has occurred

Joint probability: p(A and B) = p(A \cap B) = p(AB), both events occur

General multiplication rule (Bayes rule), independence not required:
(8.1.6)   p(AB) = p(B|A) p(A) = p(A|B) p(B)
where p(A) is the a priori probability (prior) and p(B|A) is the a posteriori probability

General addition rule
(8.1.7)   p(A \cup B) = p(A) + p(B) - p(A \cap B)

When A and B are mutually exclusive: p(A \cap B) = 0
When A and B are independent: p(A \cap B) = p_A p_B

Degree of correlation = how much one outcome depends on another
(8.1.8)   g = p(B|A)/p_B = p(AB)/(p_A p_B)

g = 1 for independent events
g > 1 for positively correlated events
g < 1 for negatively correlated events

Ex. attractive and repulsive forces between molecules in liquids can cause correlations
among positions or orientations, influencing the entropy of the liquid
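A minimal sketch of Eq. (8.1.8); the probabilities below are illustrative assumptions, not from the notes:

```python
# Degree of correlation g = p(AB)/(pA*pB), Eq. (8.1.8).
pA, pB = 0.4, 0.5
p_AB = 0.3                  # assumed joint probability (> pA*pB here)

g = p_AB / (pA * pB)
print(g)                    # 1.5 > 1: positively correlated events
```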

Example 8.1.4: 1 R ball + 2 G balls in a barrel; the probabilities over 3 draws depend
on whether or not you put the balls back in the barrel, as shown in the sketch below
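A minimal Python sketch of this example; the specific sequence checked (R, then G, then G) is an illustrative assumption:

```python
# Probability of drawing R, G, G from a barrel with 1 R and 2 G balls,
# with replacement (independent draws) and without (conditional draws).
from fractions import Fraction as F

p_with = F(1, 3) * F(2, 3) * F(2, 3)      # with replacement
p_without = F(1, 3) * F(2, 2) * F(1, 1)   # without replacement
print(p_with, p_without)                  # 4/27 vs 1/3
```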




Example 8.1.5: Gambling equation

Suppose we know the a priori probabilities p_A, p_B, \ldots, p_E that horses A, B, \ldots, E will win a race

A race is a sequence of events: one horse arrives first, then another second, etc.

What is the probability that A will arrive second if C was first?

The conditional probability can be estimated by removing the area of C and calculating the fraction that A occupies in the remaining area:

(8.1.9)   p(A second | C first) = p(A) / [p(A) + p(B) + p(D) + p(E)] = p(A) / [1 - p(C)]

If p_i is the probability for horse i to be first, the probability for j to be second is
(8.1.10)   p(j|i) = p_j / (1 - p_i)

Joint probability that i is first, j second, and k third:
(8.1.11)   p(k|ij) p(j|i) p_i = p_i \cdot \frac{p_j}{1 - p_i} \cdot \frac{p_k}{1 - p_i - p_j}


This is a useful formula for computing the probability of drawing the queen of hearts in a card
game once you have seen the seven of clubs and the ace of spades

Also useful in describing the statistical thermodynamics of liquid crystals and of ligand
binding to DNA
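A minimal Python sketch of Eqs. (8.1.10) and (8.1.11); the win probabilities assigned to horses A through E are illustrative assumptions, not from the notes:

```python
# Gambling equation: conditional and joint finishing probabilities.
p = {'A': 0.30, 'B': 0.25, 'C': 0.20, 'D': 0.15, 'E': 0.10}

def p_second_given_first(j, i):
    """Eq. (8.1.10): p(j second | i first) = p_j / (1 - p_i)."""
    return p[j] / (1 - p[i])

def p_top_three(i, j, k):
    """Eq. (8.1.11): p_i * p_j/(1 - p_i) * p_k/(1 - p_i - p_j)."""
    return p[i] * p[j] / (1 - p[i]) * p[k] / (1 - p[i] - p[j])

print(p_second_given_first('A', 'C'))   # 0.30/0.80 = 0.375
print(p_top_three('C', 'A', 'B'))       # joint probability C, A, B
```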



Combinatorics

How to count events

Basic to understanding entropy and the concepts of order and disorder = the number of
ways a system can be configured

Concerned with the composition, not the sequence, of events

What is the probability of observing 3 H and 1 T in a specific order (e.g., HHHT)?

p = (1/2)^3 (1/2)^1 = (1/2)^4 = 1/16

Not the same as the probability of 3 H and 1 T in any order: HHHT, HHTH, HTHH, and THHH are all valid

p = 4 (1/2)^4 = 4/16 = 1/4, much more probable

How many permutations, or different sequences, of w, x, y, and z are possible?

In general, for N distinguishable objects, the number of different permutations is
(8.1.12)   W = N! = N(N-1)(N-2)\cdots(3)(2)(1)

Therefore, for 4 objects: W = 4! = 24
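This count can be confirmed directly; a minimal Python sketch (not part of the original notes):

```python
# Count the permutations of w, x, y, z explicitly and via Eq. (8.1.12).
from itertools import permutations
from math import factorial

print(len(list(permutations('wxyz'))))   # 24 distinct sequences
print(factorial(4))                      # 4! = 24
```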


Example 8.1.6: consider a barrel with the 26 letters of the alphabet; what is the probability of
drawing the 26 letters in alphabetical order?

If one puts each letter back in the barrel after each draw:
p = (1/26)^{26}

Without replacement:
p = \frac{1}{26 \cdot 25 \cdot 24 \cdots 2 \cdot 1} = \frac{1}{N!}
where N! is the number of permutations, or different sequences, in which the letters could be drawn

Factorial notation: N! = N(N-1)(N-2)\cdots(3)(2)(1), with 0! = 1





Example 8.1.7: counting sequences of distinguishable and indistinguishable objects

How many different arrangements of the three letters A, H, and A are there?

Distinguishable (A_1, A_2, and H):   W = 3! = 6

Indistinguishable (A, H, and A):   W = N!/n_A! = 3!/2! = 3



Generalizing to t categories of indistinguishable objects with multiplicities n_1, n_2, \ldots, n_t:

(8.1.13)   W = \frac{N!}{n_1! n_2! \cdots n_t!}


When only two categories are present (t = 2):

(8.1.14)   W(n, N) = \frac{N!}{n!(N-n)!} = \binom{N}{n}


Example 8.1.8: counting sequences of coin flips and die rolls

Flip a coin 117 times; how many different sequences have 36 heads? It is impossible to
write out all the sequences, but from Eq. (8.1.14):

W(36, 117) = \frac{117!}{36!\, 81!} = 1.84 \times 10^{30}

Roll a die 15 times; how many sequences have three 1s, one 2, one 3, five 4s, two 5s, and
three 6s? From Eq. (8.1.13):

W = \frac{15!}{3!\, 1!\, 1!\, 5!\, 2!\, 3!} = 1.51 \times 10^{8}
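Both counts can be verified with Python's exact integer arithmetic (a minimal sketch, not part of the original notes):

```python
# Verify the counts of Example 8.1.8 using Eqs. (8.1.13)-(8.1.14).
from math import comb, factorial

W_coins = comb(117, 36)                  # 117!/(36! 81!)
print(f"{W_coins:.3e}")                  # ~1.84e+30

W_dice = factorial(15) // (factorial(3) * factorial(1) * factorial(1)
                           * factorial(5) * factorial(2) * factorial(3))
print(f"{W_dice:.3e}")                   # ~1.51e+08
```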



Example 8.1.9: probability of a royal flush in poker

Royal flush = ace, king, queen, jack, and ten, all of the same suit (any one of the four suits)

How many 5-card hands are possible?

W(5, 52) = \frac{52 \cdot 51 \cdot 50 \cdot 49 \cdot 48}{5!} = \frac{52!}{5!(52-5)!}

The 5! term in the denominator corrects for the permutations of the 5 cards within a hand

Probability of a royal flush:

p = \frac{4}{W(5, 52)} = 1.5 \times 10^{-6}

extremely rare, but not impossible
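A minimal Python check of this example (not part of the original notes):

```python
# Number of 5-card hands and the royal-flush probability.
from math import comb

hands = comb(52, 5)
print(hands)                 # 2,598,960 possible hands
print(4 / hands)             # ~1.5e-06, one royal flush per suit
```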



Example 8.1.10: Bose-Einstein statistics

This counting is needed for bosons: indistinguishable particles to which the Pauli principle
does not apply (any number of particles can occupy the same energy level)

How many ways can n indistinguishable particles be put into M boxes, with any number
of particles per box?

Think of the system as a linear array of the n particles interspersed with M - 1 indistinguishable,
movable walls that partition the system into M boxes

There are M - 1 + n objects in all (walls + particles); the particles are distinguishable
from the walls

Number of arrangements:
(8.1.15)   W(n, M) = \frac{(M - 1 + n)!}{n!(M - 1)!}
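Eq. (8.1.15) can be checked against a brute-force count for small systems; a minimal Python sketch (the values n = 3, M = 4 are illustrative):

```python
# Bose-Einstein counting: W(n, M) = (M-1+n)!/(n!(M-1)!), Eq. (8.1.15).
from math import comb
from itertools import product

def W(n, M):
    return comb(M - 1 + n, n)

def brute_force(n, M):
    # count occupation vectors (n_1, ..., n_M) that sum to n
    return sum(1 for occ in product(range(n + 1), repeat=M)
               if sum(occ) == n)

print(W(3, 4), brute_force(3, 4))   # both give 20
```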





Distribution functions

A collection of probabilities = a distribution function

Consider t outcomes i = 1, 2, 3, \ldots, t, mutually exclusive and exhaustive:
(8.1.16)   \sum_{i=1}^{t} p_i = 1

In statistical physics the order of the outcomes usually has a meaning, and i corresponds
to the value of some physical quantity

If the outcomes are continuous variables, then the probability is p(x) dx, where p(x) is the
probability density:

(8.1.17)   \int p(x)\, dx = 1


Normalization: for a function \rho(x) defined on the range x = a to x = b, find the value \rho_0 such that
(8.1.18)   \rho_0 = \int_a^b \rho(x)\, dx

To form a proper probability distribution function:
(8.1.19)   p(x) = \frac{\rho(x)}{\rho_0} = \frac{\rho(x)}{\int_a^b \rho(x)\, dx}


Useful distributions:

Binomial distribution

Describes processes in which each independent elementary event has two mutually
exclusive outcomes

(8.1.20)   P(n, N) = \frac{N!}{n!(N-n)!}\, p^n (1-p)^{N-n}



Pascal's triangle = a simple way to obtain the combinatoric terms in the binomial distribution

These are the coefficients in the expansion of (x + y)^N

Example 8.1.11: distribution of coin flips

Distribution function for the probability p(n_H, N) of observing n_H heads in N = 4 coin
flips, with p = 0.5 (unbiased coin)

Most probable number of heads = 2
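The full distribution is easy to tabulate; a minimal Python sketch of Eq. (8.1.20) for this example:

```python
# Binomial distribution P(n, N) for N = 4 unbiased coin flips.
from math import comb

N, p = 4, 0.5
for n in range(N + 1):
    P = comb(N, n) * p**n * (1 - p)**(N - n)
    print(n, P)          # 1/16, 4/16, 6/16, 4/16, 1/16; peak at n = 2
```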

Generalization = the multinomial probability distribution

(8.1.21)   P(n_1, n_2, \ldots, n_t; N) = \frac{N!}{n_1! n_2! \cdots n_t!}\, p_1^{n_1} p_2^{n_2} \cdots p_t^{n_t}

where \sum_{i=1}^{t} n_i = N

A probability distribution function contains all the information that can be known about a
probabilistic system

In general this function is not known

What is available from experiments = averages, or nth moments, of a probability
distribution function, \langle x^n \rangle:

(8.1.22)   \langle x^n \rangle = \int_a^b x^n p(x)\, dx = \frac{\int_a^b x^n \rho(x)\, dx}{\int_a^b \rho(x)\, dx}


Average = first moment

(8.1.23)   \langle i \rangle = \sum_{i=1}^{t} i\, p(i)

Or, for a continuous function:
(8.1.24)   \langle x \rangle = \int_a^b x\, p(x)\, dx



The mean of a function f(i) over t discrete values:
(8.1.25)   \langle f(i) \rangle = \sum_{i=1}^{t} f(i)\, p(i)

Over a continuous variable:
(8.1.26)   \langle f(x) \rangle = \int_a^b f(x)\, p(x)\, dx = \frac{\int_a^b f(x) \rho(x)\, dx}{\int_a^b \rho(x)\, dx}


Example 8.1.12: average

The average of the set of numbers [3, 3, 2, 2, 2, 1, 1]:

\langle i \rangle = \sum_{i=1}^{3} i\, p(i) = 1\, p(1) + 2\, p(2) + 3\, p(3) = 1 \cdot \frac{2}{7} + 2 \cdot \frac{3}{7} + 3 \cdot \frac{2}{7} = \frac{14}{7} = 2



General properties of averages:
(8.1.27)   \langle a f(x) \rangle = a \langle f(x) \rangle, where a is a constant

And
(8.1.28)   \langle f(x) + g(x) \rangle = \langle f(x) \rangle + \langle g(x) \rangle



Variance \sigma^2: uses the second moment and is a measure of the width of a distribution

If we put a = \langle x \rangle, then
(8.1.29)   \sigma^2 = \langle (x - a)^2 \rangle = \langle x^2 - 2ax + a^2 \rangle = \langle x^2 \rangle - 2a \langle x \rangle + a^2 = \langle x^2 \rangle - \langle x \rangle^2


Example 8.1.13: mean and variance of coin flips

\langle n_H \rangle = \sum_{n_H=0}^{4} n_H\, p(n_H, N)
 = 0\left(\tfrac{1}{16}\right) + 1\left(\tfrac{4}{16}\right) + 2\left(\tfrac{6}{16}\right) + 3\left(\tfrac{4}{16}\right) + 4\left(\tfrac{1}{16}\right) = 2

\langle n_H^2 \rangle = \sum_{n_H=0}^{4} n_H^2\, p(n_H, N)
 = 0\left(\tfrac{1}{16}\right) + 1\left(\tfrac{4}{16}\right) + 4\left(\tfrac{6}{16}\right) + 9\left(\tfrac{4}{16}\right) + 16\left(\tfrac{1}{16}\right) = 5

The variance: \sigma^2 = \langle n_H^2 \rangle - \langle n_H \rangle^2 = 5 - 2^2 = 1
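These moments follow directly from the distribution of Example 8.1.11; a minimal Python sketch (not part of the original notes):

```python
# Mean, second moment, and variance of the N = 4 coin-flip distribution.
from math import comb

N, p = 4, 0.5
pmf = {n: comb(N, n) * p**n * (1 - p)**(N - n) for n in range(N + 1)}

mean = sum(n * P for n, P in pmf.items())          # <n_H> = 2
second = sum(n**2 * P for n, P in pmf.items())     # <n_H^2> = 5
print(mean, second, second - mean**2)              # variance = 1
```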

Example 8.1.14: average and variance of a continuous function

Suppose a flat probability distribution p(x) = 1/a on the interval 0 \le x \le a

\langle x \rangle = \int_0^a x\, p(x)\, dx = \frac{1}{a} \int_0^a x\, dx = \frac{1}{a} \left[ \frac{x^2}{2} \right]_0^a = \frac{a}{2}

And

\langle x^2 \rangle = \int_0^a x^2\, p(x)\, dx = \frac{1}{a} \int_0^a x^2\, dx = \frac{1}{a} \left[ \frac{x^3}{3} \right]_0^a = \frac{a^2}{3}

So the variance: \sigma^2 = \langle x^2 \rangle - \langle x \rangle^2 = \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12}
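The integrals can be checked numerically; a minimal Python sketch using a midpoint rule, with the assumed value a = 2:

```python
# Numerical check of <x>, <x^2>, and the variance for p(x) = 1/a on [0, a].
a, steps = 2.0, 1_000_000
dx = a / steps

m1 = m2 = 0.0
for i in range(steps):
    x = (i + 0.5) * dx                 # midpoint of each subinterval
    m1 += x * (1 / a) * dx             # <x>   -> a/2   = 1.0
    m2 += x * x * (1 / a) * dx         # <x^2> -> a^2/3 ~ 1.333
print(m1, m2, m2 - m1**2)              # variance -> a^2/12 ~ 0.333
```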
