
Uniform Distribution: a random variable that takes any integer value in an interval with equal likelihood.
Example: choose an integer uniformly between 0 and 10
Parameters: integers a, b (lower and upper bound of interval)
Properties:
P(X = k) = 1/(b - a + 1) if k ∈ [a, b], 0 otherwise
E[X] = (a + b)/2
Var(X) = (b - a)(b - a + 2)/12
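These formulas are easy to sanity-check by direct enumeration; a minimal Python sketch using the a = 0, b = 10 example from the notes:

```python
# Discrete uniform on {a, ..., b}: verify E[X] and Var(X) by enumeration.
a, b = 0, 10
support = range(a, b + 1)
p = 1 / (b - a + 1)                       # P(X = k) for each k in [a, b]

mean = sum(k * p for k in support)        # E[X] = sum of k * P(X = k)
var = sum((k - mean) ** 2 * p for k in support)

assert abs(mean - (a + b) / 2) < 1e-9                 # E[X] = 5.0
assert abs(var - (b - a) * (b - a + 2) / 12) < 1e-9   # Var(X) = 10.0
```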
Bernoulli Distribution: value 1 with probability p, 0 with probability 1 - p
Example: coin toss (p = 1/2 for fair coin)
Parameters: p
Properties:
E[X] = p
Var(X) = p(1 - p)
Binomial Distribution: sum of n independent Bernoulli trials, each with parameter p
Example: number of heads in 10 independent coin tosses
Parameters: n, p
Properties:
P(X = k) = C(n, k) p^k (1 - p)^(n-k)
E[X] = np
Var(X) = np(1 - p)
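A quick check of the PMF and mean, using the 10-coin-toss example (n = 10, p = 0.5):

```python
from math import comb

# Binomial PMF: probabilities sum to 1 and the mean equals np.
n, p = 10, 0.5
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12
assert abs(sum(k * q for k, q in enumerate(pmf)) - n * p) < 1e-12   # E[X] = 5
```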

Counting events:
|A ∪ B| = |A| + |B| - |A ∩ B|
|A ∪ B ∪ C| = |A| + |B| + |C| - |A ∩ B| - |A ∩ C| - |B ∩ C| + |A ∩ B ∩ C|
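Inclusion-exclusion can be checked on concrete sets (the sets below are illustrative, not from the notes):

```python
# Verify inclusion-exclusion for two and three sets.
A = {1, 2, 3, 4}
B = {3, 4, 5}
C = {4, 5, 6}

assert len(A | B) == len(A) + len(B) - len(A & B)
assert len(A | B | C) == (len(A) + len(B) + len(C)
                          - len(A & B) - len(A & C) - len(B & C)
                          + len(A & B & C))
```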

Conditional probability:
P(A|B) = P(A ∩ B) / P(B)

P(A) = P(A ∩ B) + P(A ∩ Bᶜ) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)

P(A|B) = P(B|A)P(A) / P(B) = P(B|A)P(A) / (P(B|A)P(A) + P(B|Aᶜ)P(Aᶜ))

P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A)
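A worked Bayes' rule computation may help; the numbers here are illustrative assumptions, not from the notes:

```python
# Hypothetical test: P(pos|cond) = 0.95, P(pos|no cond) = 0.02, P(cond) = 0.01.
p_cond = 0.01
p_pos_given_cond = 0.95
p_pos_given_not = 0.02

# Total probability: P(pos) = P(pos|cond)P(cond) + P(pos|cond^c)P(cond^c)
p_pos = p_pos_given_cond * p_cond + p_pos_given_not * (1 - p_cond)

# Bayes' rule: P(cond|pos) = P(pos|cond)P(cond) / P(pos)
p_cond_given_pos = p_pos_given_cond * p_cond / p_pos

assert abs(p_pos - 0.0293) < 1e-12
assert 0.32 < p_cond_given_pos < 0.33   # posterior is ~0.324 despite the 95% hit rate
```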

Independence: two events A and B are independent iff
P(A ∩ B) = P(A)P(B)

Poisson Distribution: number of events that occur in a unit of time, if those events occur independently at an average rate λ per unit of time
Example: # of cars at a traffic light in 1 minute, # of deaths in 1 year by car
Parameters: λ
Properties:
P(X = k) = λ^k e^(-λ) / k!
E[X] = λ
Var(X) = λ
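A truncated-sum check that the Poisson PMF sums to 1 and has mean λ (λ = 4 is an arbitrary illustrative choice):

```python
from math import exp, factorial

# Poisson(lam): PMF sums to ~1 and the mean matches lam (truncated at k = 100;
# the tail beyond that is negligibly small for lam = 4).
lam = 4.0
pmf = [lam ** k * exp(-lam) / factorial(k) for k in range(101)]

assert abs(sum(pmf) - 1.0) < 1e-9
assert abs(sum(k * p for k, p in enumerate(pmf)) - lam) < 1e-9
```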
Geometric Distribution: number of independent Bernoulli trials with parameter p until and including the first success
Example: # of coins flipped until first head
Parameters: p
Properties:
P(X = k) = (1 - p)^(k-1) p
E[X] = 1/p
Var(X) = (1 - p)/p²
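The PMF and mean can be verified with a truncated sum (p = 0.25 is an illustrative choice):

```python
# Geometric(p): the PMF sums to ~1 and sum_k k (1-p)^(k-1) p approaches 1/p.
p = 0.25
ks = range(1, 2000)   # truncation error is negligible for p = 0.25

total = sum((1 - p) ** (k - 1) * p for k in ks)
mean = sum(k * (1 - p) ** (k - 1) * p for k in ks)

assert abs(total - 1.0) < 1e-6
assert abs(mean - 1 / p) < 1e-6   # E[X] = 4
```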

A, B independent ⟺ A, Bᶜ independent ⟺ Aᶜ, Bᶜ independent

Closed form of a common summation:
Σ_{i≥0} x^i = 1/(1 - x) when |x| < 1
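The partial sums converge quickly, which is easy to see numerically:

```python
# Partial sums of sum_{i>=0} x^i approach 1/(1 - x) for |x| < 1.
x = 0.5
partial = sum(x ** i for i in range(200))
assert abs(partial - 1 / (1 - x)) < 1e-12   # limit is 2.0
```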

Expectation:
E[X] = Σ_{x∈X} x · P(x)
E[f(X)] = Σ_{x∈X} f(x) · P(x)

Linearity of Expectation:
E[aX + b] = aE[X] + b
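Linearity can be checked by enumeration; a fair die and the constants a = 3, b = 7 are illustrative choices:

```python
# E[aX + b] = aE[X] + b, checked on a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6
a_, b_ = 3, 7

ex = sum(x * p for x in outcomes)                  # E[X] = 3.5
e_ax_b = sum((a_ * x + b_) * p for x in outcomes)  # E[3X + 7]

assert abs(e_ax_b - (a_ * ex + b_)) < 1e-9         # both equal 17.5
```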

Hypergeometric Distribution: number of successes in n draws (without replacement) from N items that contain K successes in total
Example: Urn has 10 magenta and 10 cyan balls. Probability of drawing 2 magentas in 4 draws
Parameters: n, N, K
Properties:
P(X = k) = C(K, k) · C(N - K, n - k) / C(N, n)
E[X] = nK/N
Var(X) = n · K(N - K)(N - n) / (N²(N - 1))
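The urn example above (N = 20, K = 10 magenta, n = 4 draws, k = 2 magentas) can be computed directly from the PMF:

```python
from math import comb

# Hypergeometric PMF for the urn example in the notes.
N, K, n, k = 20, 10, 4, 2
p = comb(K, k) * comb(N - K, n - k) / comb(N, n)   # C(10,2)^2 / C(20,4)

# The PMF sums to 1 over all feasible k:
total = sum(comb(K, j) * comb(N - K, n - j) / comb(N, n) for j in range(n + 1))

assert 0.41 < p < 0.42            # ~0.418
assert abs(total - 1.0) < 1e-12
```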

E[X + Y] = E[X] + E[Y]

If X and Y are independent r.v., then
E[X · Y] = E[X] · E[Y]

Variance: the variance of a random variable X with mean E[X] = μ is
Var(X) = E[(X - μ)²]
Var(X) = E[X²] - (E[X])²
Var(aX + b) = a²Var(X)

If X and Y are independent r.v., then
Var(X + Y) = Var(X) + Var(Y)

The Binomial Theorem:
(x + y)^n = Σ_{k=0}^{n} C(n, k) x^k y^(n-k)

Special case when x = 1 and y = 1:
Σ_{k=0}^{n} C(n, k) = 2^n
which is the same as the number of subsets of a set of size n.
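Both identities are easy to confirm numerically (n = 7, x = 2, y = 3 are arbitrary illustrative values):

```python
from math import comb

# Binomial theorem: expand (x + y)^n term by term and compare.
n, x, y = 7, 2.0, 3.0
expansion = sum(comb(n, k) * x ** k * y ** (n - k) for k in range(n + 1))
assert abs(expansion - (x + y) ** n) < 1e-6   # 5^7 = 78125

# Special case x = y = 1: the coefficients sum to 2^n.
assert sum(comb(n, k) for k in range(n + 1)) == 2 ** n
```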

Standard Deviation:
σ = √Var(X)

Continuous Uniform Distribution: a random variable that takes any real value in an interval with equal likelihood
Example: choose a real number (with infinite precision) between 0 and 10
Parameters: a, b
Properties:
pdf: f(x) = 1/(b - a) if x ∈ [a, b], 0 otherwise
E[X] = (a + b)/2
Var(X) = (b - a)²/12
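A Monte Carlo sketch of the mean formula, using the (0, 10) example (the sample size and seed are arbitrary):

```python
import random

# Sample mean of Uniform(0, 10) draws should be close to (a + b)/2 = 5.
random.seed(0)
a, b = 0.0, 10.0
samples = [random.uniform(a, b) for _ in range(100_000)]
mean = sum(samples) / len(samples)

assert abs(mean - (a + b) / 2) < 0.1   # standard error is ~0.009 here
```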

Maximum Likelihood Estimators: Given independent samples x1, x2, ..., xn from a distribution f(x|θ), estimate θ (θ are the params. of the dist.). In general, what value of θ maximizes
L(x1, x2, ..., xn|θ) = Π_{i=1}^{n} f(xi|θ)
Approach:
1. Work with the log-likelihood: ln L(x1, x2, ..., xn|θ) = Σ_{i=1}^{n} ln f(xi|θ)
2. Set its derivative to zero and solve: ∂/∂θ ln L(x1, x2, ..., xn|θ) = 0 ⟹ θ̂ = ?
3. Then take the derivative again to verify it is a max: ∂²/∂θ² ln L(x1, x2, ..., xn|θ) < 0 ?

An estimator θ̂ of θ is unbiased iff E[θ̂] = θ
Bernoulli: p̂ = (1/n) Σ_{i=1}^{n} xi
Poisson: λ̂ = (1/n) Σ_{i=1}^{n} xi
Normal: μ̂ = (1/n) Σ_{i=1}^{n} xi
        σ̂² = (1/n) Σ_{i=1}^{n} (xi - x̄)² (biased)
Geometric: p̂ = n / Σ_{i=1}^{n} xi
Exponential: λ̂ = n / Σ_{i=1}^{n} xi
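A small sketch of the Bernoulli case: the sample mean maximizes the log-likelihood (the toy data below is illustrative):

```python
from math import log

# For Bernoulli samples, the MLE p-hat is the sample mean; it maximizes
# ln L(p) = sum_i [x_i ln p + (1 - x_i) ln(1 - p)].
xs = [1, 0, 1, 1, 0, 1, 0, 1]      # toy data
p_hat = sum(xs) / len(xs)          # MLE: 5/8

def log_lik(p):
    return sum(x * log(p) + (1 - x) * log(1 - p) for x in xs)

# p-hat beats nearby candidate values of p:
assert all(log_lik(p_hat) >= log_lik(p) for p in [0.3, 0.5, 0.7, 0.9])
assert p_hat == 5 / 8
```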

Weak Law of Large Numbers: for any ε > 0, as n → ∞
P(|Mn - μ| > ε) → 0
Strong Law of Large Numbers:
P( lim_{n→∞} (X1 + X2 + ... + Xn)/n = μ ) = 1

Rules for continuous random variables:
pdf: f(x) = d/dx F(x)
CDF: F(a) = P(X ≤ a) = ∫_{-∞}^{a} f(x) dx
E[X] = ∫_{-∞}^{∞} x f(x) dx
P(a ≤ X ≤ b) = P(a < X < b) = F(b) - F(a) = ∫_{a}^{b} f(x) dx
P(a - ε/2 ≤ X ≤ a + ε/2) = F(a + ε/2) - F(a - ε/2) ≈ ε f(a)
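A numeric sketch of P(a ≤ X ≤ b) = F(b) - F(a) for the simplest case, Uniform(0, 1), where f(x) = 1 and F(x) = x on [0, 1]:

```python
# Midpoint-rule integration of the Uniform(0,1) pdf over [0.2, 0.7].
def f(x):                    # pdf of Uniform(0, 1)
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

a, b, steps = 0.2, 0.7, 10_000
dx = (b - a) / steps
integral = sum(f(a + (i + 0.5) * dx) * dx for i in range(steps))

assert abs(integral - (b - a)) < 1e-9   # F(0.7) - F(0.2) = 0.5
```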

Markov's Inequality: If X is a nonnegative r.v., then for any a > 0,
P(X ≥ a) ≤ E[X]/a
or P(X ≥ k E[X]) ≤ 1/k
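An empirical check of the bound (exponential samples are an illustrative choice of nonnegative r.v.; the bound holds exactly for the empirical distribution):

```python
import random

# Markov's inequality on nonnegative samples: P(X >= a) <= E[X]/a.
random.seed(1)
xs = [random.expovariate(1.0) for _ in range(50_000)]
mean = sum(xs) / len(xs)

for a in [1.0, 2.0, 5.0]:
    frac = sum(x >= a for x in xs) / len(xs)
    assert frac <= mean / a   # empirical tail never exceeds the Markov bound
```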

Central Limit Theorem: i.i.d. r.v. X1, X2, ..., Xn with E[Xi] = μ, Var(Xi) = σ².
As n → ∞,
(X1 + X2 + ... + Xn - nμ) / (σ√n) → N(0, 1)
or
Mn = (1/n) Σ_{i=1}^{n} Xi → N(μ, σ²/n)
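A simulation sketch: standardized sums of Uniform(0, 1) draws (μ = 0.5, σ² = 1/12) behave like a standard normal (sample sizes and seed are arbitrary):

```python
import random

# CLT: standardized sums of n uniforms are approximately N(0, 1).
random.seed(2)
n, trials = 50, 20_000
mu, sigma = 0.5, (1 / 12) ** 0.5

def standardized_sum():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * n ** 0.5)

zs = [standardized_sum() for _ in range(trials)]
mean_z = sum(zs) / trials
frac_within_1 = sum(abs(z) <= 1 for z in zs) / trials

assert abs(mean_z) < 0.05            # mean near 0
assert 0.64 < frac_within_1 < 0.72   # ~68% within one standard deviation
```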

Chebyshev's Inequality: Suppose we know Var(Y). If Y is a r.v. with
E[Y] = μ and Var(Y) = σ², then for any a > 0,
P(|Y - μ| ≥ a) ≤ σ²/a²
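An empirical check (Gaussian samples are an illustrative choice; the inequality holds exactly for the empirical distribution with its own mean and variance):

```python
import random

# Chebyshev: the fraction of samples with |Y - mu| >= a never exceeds var/a^2.
random.seed(4)
ys = [random.gauss(0.0, 2.0) for _ in range(50_000)]
m = sum(ys) / len(ys)
var = sum((y - m) ** 2 for y in ys) / len(ys)

for a in [1.0, 2.0, 4.0]:
    frac = sum(abs(y - m) >= a for y in ys) / len(ys)
    assert frac <= var / a ** 2
```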


Normal Distribution: classic bell curve
Example: quantum harmonic oscillator ground state (exact); human heights, binomial random variables (approximate)
Parameters: μ, σ²
Properties:
pdf: f(x) = 1/√(2πσ²) · e^(-(x-μ)²/(2σ²))
E[X] = μ
Var(X) = σ²
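A numeric sanity check that this pdf integrates to 1 (standard normal, μ = 0, σ = 1):

```python
from math import exp, pi, sqrt

# Midpoint-rule integration of the N(0, 1) pdf; mass beyond [-8, 8] is negligible.
mu, sigma = 0.0, 1.0

def pdf(x):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / sqrt(2 * pi * sigma ** 2)

lo, hi, steps = -8.0, 8.0, 4_000
dx = (hi - lo) / steps
area = sum(pdf(lo + (i + 0.5) * dx) * dx for i in range(steps))

assert abs(area - 1.0) < 1e-6
```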

Exponential Distribution: time until next event in a Poisson process (cont. analog to Geometric, cont. reciprocal analog to Poisson)
Example: how long until the next soldier is killed by horse kick?
Parameters: λ, the average number of events per unit time
Properties:
pdf: f(x) = λe^(-λx) for x ≥ 0
E[X] = 1/λ
Var(X) = 1/λ²


Chernoff bounds: Suppose X ~ Bin(n, p) and E[X] = μ = np. Then for any 0 < δ < 1,
1. P(X ≥ (1 + δ)μ) ≤ e^(-μδ²/3)
2. P(X ≤ (1 - δ)μ) ≤ e^(-μδ²/2)
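A simulation comparing bound 1 with the empirical upper tail (n, p, δ, and the seed are illustrative choices; the bound is typically quite loose):

```python
import random
from math import exp

# Chernoff upper-tail bound vs. the empirical tail of X ~ Bin(100, 0.5).
random.seed(3)
n, p, d = 100, 0.5, 0.3
mu = n * p                      # 50; threshold (1 + d)*mu = 65
trials = 20_000

count = 0
for _ in range(trials):
    x = sum(random.random() < p for _ in range(n))
    count += x >= (1 + d) * mu

empirical = count / trials
bound = exp(-mu * d * d / 3)    # e^(-1.5) ~ 0.223

assert empirical <= bound
```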

