
Addition Rules

P(A \cup B) = P(A) + P(B) - P(A \cap B)
P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C)

Conditional Probability (A given B)

P(A|B) = \frac{P(A \cap B)}{P(B)}
P(A^c|B) = 1 - P(A|B)
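Illustrative example (numbers chosen arbitrarily): if P(A) = 0.5, P(B) = 0.4 and P(A \cap B) = 0.2, then P(A \cup B) = 0.5 + 0.4 - 0.2 = 0.7, P(A|B) = 0.2 / 0.4 = 0.5, and P(A^c|B) = 1 - 0.5 = 0.5.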

Multiplication Rule
P(A \cap B) = P(B|A) P(A) = P(A|B) P(B)

Total Probability

P(A) = P(A \cap B) + P(A \cap B^c) = P(A|B) P(B) + P(A|B^c) P(B^c)
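Illustrative example (arbitrary numbers): with P(B) = 0.3, P(A|B) = 0.9 and P(A|B^c) = 0.2, total probability gives P(A) = 0.9(0.3) + 0.2(0.7) = 0.27 + 0.14 = 0.41.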


Independent Events if:
P(A \cap B) = P(A) P(B)
Bayes Rule

P(B|A) = \frac{P(A|B) P(B)}{P(A|B) P(B) + P(A|B^c) P(B^c)}

Counting Techniques (set size n, subset size r)

Ordered with replacement: n^r
Ordered without replacement (permutations): {}_nP_r = \frac{n!}{(n-r)!}
Unordered (a.k.a. binomial coefficient): {}_nC_r = \frac{n!}{(n-r)! \, r!}

Format of Question:
P = \frac{\binom{\text{num defective}}{\text{num looking for}} \cdot \binom{\text{total} - \text{num defective}}{\text{num selected} - \text{num looking for}}}{\binom{\text{total}}{\text{num selected}}}

Examples:
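Illustrative examples (arbitrary numbers): continuing the total-probability numbers above, Bayes gives P(B|A) = 0.27 / 0.41 \approx 0.66. Counting with n = 5, r = 2: ordered with replacement 5^2 = 25; {}_5P_2 = 5!/3! = 20; {}_5C_2 = 10. For the question format, with 10 items total of which 3 are defective, selecting 4 and looking for exactly 1 defective: P = \binom{3}{1}\binom{7}{3} / \binom{10}{4} = (3)(35)/210 = 0.5.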

Discrete Random Variables


\mu = E(X) = \sum_x x \, f(x)
E(X^2) = \sum_x x^2 \, f(x)
\sigma^2 = var(X) = E(X^2) - \mu^2
\sigma = sd(X) = \sqrt{var(X)}

Continuous Random Variables

\mu = E(X) = \int_{-\infty}^{\infty} x \, f(x) \, dx
E(X^2) = \int_{-\infty}^{\infty} x^2 \, f(x) \, dx
\sigma^2 = var(X) = E(X^2) - \mu^2

Linear Transformation (a + bX)

\mu_{a+bX} = E(a + bX) = a + b \, E(X)
var(a + bX) = b^2 \, var(X)
sd(a + bX) = |b| \, sd(X)

Binomial Distribution

P(X = x) = \binom{n}{x} p^x (1 - p)^{n-x}
\mu = E(X) = np
\sigma^2 = var(X) = np(1 - p)

Negative Binomial

P(X = x) = \binom{x-1}{r-1} (1 - p)^{x-r} p^r
\mu = E(X) = \frac{r}{p}
\sigma^2 = var(X) = \frac{r(1 - p)}{p^2}

Geometric Distribution

P(X = x) = (1 - p)^{x-1} p
\mu = E(X) = \frac{1}{p}
\sigma^2 = var(X) = \frac{1 - p}{p^2}

Discrete Joint Distribution

f_{xy} = P(X = x, Y = y) = P(X = x \text{ and } Y = y)

Covariance

\sigma_{xy} = E[(X - \mu_x)(Y - \mu_y)] = E[XY] - \mu_x \mu_y

Correlation

\rho_{xy} = \frac{\sigma_{xy}}{\sigma_x \sigma_y}

Examples:
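Illustrative examples (arbitrary numbers): Binomial with n = 10, p = 0.2: P(X = 2) = \binom{10}{2}(0.2)^2(0.8)^8 \approx 0.302, \mu = np = 2, \sigma^2 = np(1-p) = 1.6. Geometric with p = 0.2: P(X = 3) = (0.8)^2(0.2) = 0.128, \mu = 1/0.2 = 5, \sigma^2 = 0.8/0.04 = 20.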

Poisson Approximation

P(X = x) = \frac{e^{-\lambda} \lambda^x}{x!}
Var(X) = E(X) = \lambda = np

Poisson Process

P(X = x) = \frac{e^{-\lambda t} (\lambda t)^x}{x!}
\lambda = rate of change, i.e. 2 every 3 mins gives \lambda = 2/3 per min

Exponential Distribution - Time between Poisson events

P.M.F. f(x) = \lambda e^{-\lambda x} if x \ge 0; otherwise 0
C.D.F. F(x) = 1 - e^{-\lambda x} for x > 0
\mu = E(X) = \frac{1}{\lambda}
\sigma^2 = var(X) = \frac{1}{\lambda^2}

Memoryless Property
P(X > t + s \,|\, X > s) = P(X > t) = e^{-\lambda t}

Erlang Distribution

N(x) = Poisson(\lambda x)
P(X \le x) = 1 - \sum_{k=0}^{r-1} \frac{e^{-\lambda x} (\lambda x)^k}{k!}
\mu = E(X) = \frac{r}{\lambda}
\sigma^2 = var(X) = \frac{r}{\lambda^2}
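Illustrative example (arbitrary numbers): for a Poisson count with \lambda = 2, P(X = 0) = e^{-2} \approx 0.135 and P(X = 1) = 2e^{-2} \approx 0.271. For the exponential waiting time with the same rate \lambda = 2 per minute, \mu = 1/2 minute and P(X > 1) = e^{-2} \approx 0.135.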

Uniform Distribution

f(x) = \frac{1}{b - a}, \quad a < x < b
F(x) = \frac{x - a}{b - a}, \quad a < x < b (0 if x \le a, 1 if x \ge b)
\mu = E(X) = \frac{a + b}{2}
\sigma^2 = var(X) = \frac{(b - a)^2}{12}
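Illustrative example (arbitrary numbers): Uniform on (a, b) = (2, 6): f(x) = 1/4, F(3) = (3 - 2)/4 = 0.25, \mu = 4, \sigma^2 = 16/12 \approx 1.33.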
Estimators of Datasets

Mean: \bar{X} = \frac{\sum_{i=1}^{n} x_i}{n}
Variance: S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2
Standard Deviation: S = \sqrt{S^2}
Population p: \hat{P} = \frac{\text{# of successes}}{n}
Range: r = max(x_i) - min(x_i) = y_n - y_1

Standard Error
\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}}
\sigma_{\hat{P}} = \sqrt{\frac{p(1 - p)}{n}}

Examples:
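Illustrative example (arbitrary data): for the sample {2, 4, 6, 8}: \bar{X} = 5, S^2 = (9 + 1 + 1 + 9)/3 = 20/3 \approx 6.67, S \approx 2.58, range r = 8 - 2 = 6.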

Quartiles (sort dataset!)


Q_1 = y_m + 0.25 (y_{m+1} - y_m)
Q_3 = y_m + 0.75 (y_{m+1} - y_m)
IQR = Q_3 - Q_1

Outliers: x > Q_3 + 1.5 \, IQR or x < Q_1 - 1.5 \, IQR
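Illustrative example (arbitrary data, assuming m is the whole-number part of the quartile position): for the sorted sample 1, 3, 5, 7, 9, 11, 13, 15 (n = 8), Q_1 = 3 + 0.25(5 - 3) = 3.5, Q_3 = 11 + 0.75(13 - 11) = 12.5, IQR = 9; outliers would lie above 12.5 + 13.5 = 26 or below 3.5 - 13.5 = -10.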

Standard Normal (Z score table)


P(X < x): Z = \frac{x - \mu}{\sigma}
P(a \le X \le b) = \Phi(b) - \Phi(a)

Central Limit Theorem (on dataset)

if n > 30: Z = \frac{\bar{X} - \mu}{\sigma / \sqrt{n}}
if n < 30: t = \frac{\bar{X} - \mu}{S / \sqrt{n}}
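Illustrative example (arbitrary numbers): with \mu = 100 and \sigma = 15, P(X < 115) uses Z = (115 - 100)/15 = 1, so P(X < 115) = \Phi(1) \approx 0.8413, and P(85 \le X \le 115) = \Phi(1) - \Phi(-1) \approx 0.683.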
Confidence Interval
If \sigma known: CI = \bar{X} \pm Z_{\alpha/2} \cdot \frac{\sigma}{\sqrt{n}}
If \sigma unknown and n < 40: CI = \bar{X} \pm t_{\alpha/2;(n-1)} \cdot \frac{S}{\sqrt{n}}
Else use: CI = \bar{X} \pm Z_{\alpha/2} \cdot \frac{S}{\sqrt{n}}
\frac{\alpha}{2} = \frac{1 - conf}{2}
Sample size for a given error E: n = \left( \frac{Z_{\alpha/2} \cdot \sigma}{E} \right)^2

Independent Samples
Z = \frac{(\bar{X} - \bar{Y}) - (\mu_1 - \mu_2)}{\sqrt{\sigma_1^2/n_1 + \sigma_2^2/n_2}}

Examples:
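Illustrative example (arbitrary numbers): with \bar{X} = 50, \sigma = 10, n = 25 and 95% confidence (Z_{0.025} = 1.96), CI = 50 \pm 1.96(10/5) = 50 \pm 3.92 = (46.08, 53.92). For a target error E = 2, n = (1.96 \cdot 10 / 2)^2 = 96.04, so take n = 97.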

Hypothesis Testing Alternatives:


Right Sided alternative:
H_0: \mu = \mu_0
H_1: \mu > \mu_0
Left Sided alternative:
H_0: \mu = \mu_0
H_1: \mu < \mu_0
Two Sided alternative:
H_0: \mu = \mu_0
H_1: \mu \ne \mu_0

Hypothesis Testing Statistic:


Normal pop or n \ge 30 and \sigma is known:
Z_0 = \frac{\bar{X} - \mu_0}{\sigma / \sqrt{n}}
n \ge 40 and \sigma unknown:
Z_0 = \frac{\bar{X} - \mu_0}{S / \sqrt{n}}
Pop normal and \sigma unknown:
T_0 = \frac{\bar{X} - \mu_0}{S / \sqrt{n}}
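Illustrative example (arbitrary numbers): testing H_0: \mu = 50 against H_1: \mu > 50 with \bar{X} = 52, \sigma = 10, n = 25 gives Z_0 = (52 - 50)/(10/5) = 1.0.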
P-Value calculations:
Right Sided alternative:
p-value = 1 - \Phi(Z_0)
Left Sided alternative:
p-value = \Phi(Z_0)
Two Sided alternative:
p-value = 2 (1 - \Phi(|Z_0|))

Reject H_0 if p-value < \alpha
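Continuing the illustrative Z_0 = 1.0 from above, the right-sided p-value = 1 - \Phi(1.0) \approx 0.159 > \alpha = 0.05, so H_0 is not rejected.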


Critical Region calculations:
Right Sided alternative:
z_0 > z_\alpha
Left Sided alternative:
z_0 < -z_\alpha
Two Sided alternative:
z_0 < -z_{\alpha/2} or z_0 > z_{\alpha/2}
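For the same illustrative Z_0 = 1.0 and \alpha = 0.05, the right-sided critical value is z_{0.05} = 1.645; since 1.0 < 1.645, the critical-region approach also fails to reject H_0.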

Hypothesis testing for Proportion


Z_0 = \frac{\hat{P} - p_0}{\sqrt{p_0 (1 - p_0) / n}}
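Illustrative example (arbitrary numbers): 60 successes in n = 100 trials against p_0 = 0.5 gives \hat{P} = 0.6 and Z_0 = 0.1 / \sqrt{0.25/100} = 2.0, so the two-sided p-value = 2(1 - \Phi(2)) \approx 0.046.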

Linear Regression

\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
           = \frac{n \sum_{i=1}^{n} x_i y_i - (\sum_{i=1}^{n} x_i)(\sum_{i=1}^{n} y_i)}{n \sum_{i=1}^{n} x_i^2 - (\sum_{i=1}^{n} x_i)^2}
           = \frac{s_{xy}}{s_x^2}
Examples:
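Illustrative example (arbitrary data): for the points (1, 2), (2, 4), (3, 5), (4, 7): \bar{x} = 2.5, \bar{y} = 4.5, \sum x_i y_i = 53, \sum x_i = 10, \sum y_i = 18, \sum x_i^2 = 30, so \hat{\beta}_1 = (4 \cdot 53 - 10 \cdot 18)/(4 \cdot 30 - 10^2) = 32/20 = 1.6 and \hat{\beta}_0 = 4.5 - 1.6(2.5) = 0.5, giving \hat{y} = 0.5 + 1.6x.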
