
Probability Theory:

The Basic Assumptions:


1. A random experiment has been defined, and a set $S$ (sample space) of all possible outcomes has been identified.
2. A class of subsets of $S$, called events, has been specified.
3. Each event $A$ is assigned a number, the probability of event $A$, denoted $P(A)$.
The Basic Axioms of Probability:
1. $0 \le P(A) \le 1$
Probabilities are non-negative and do not exceed unity.
2. $P(S) = 1$
The probability of the sample space is 1, indicating that once the random experiment has been conducted, something must happen.
3. If $A$ and $B$ are two events in $S$ that cannot occur simultaneously (mutually exclusive or disjoint), then $P(A \text{ or } B) = P(A) + P(B)$.

Note: The discrete sample spaces considered typically have equally likely outcomes.

Axiom 3 can be generalized to
$$P\!\left(\bigcup_{k=1}^{n} A_k\right) = \sum_{k=1}^{n} P(A_k) \;-\; \sum_{j<k} P(A_j \cap A_k) \;+\; \cdots \;+\; (-1)^{n+1}\, P(A_1 \cap A_2 \cap \cdots \cap A_n)
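
As a quick numerical check of this identity (a minimal sketch assuming a fair six-sided die and three arbitrarily chosen events), Python sets can play the role of events:

```python
from fractions import Fraction

# Minimal sketch (assumed example): one roll of a fair six-sided die.
# Events: A = "even", B = "greater than 3", C = "multiple of 3".
S = {1, 2, 3, 4, 5, 6}
A, B, C = {2, 4, 6}, {4, 5, 6}, {3, 6}

def P(E):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(E), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
print(lhs, rhs, lhs == rhs)   # 5/6 5/6 True
```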
Conditional Probability:
The probability that event $A$ has occurred given that event $B$ has occurred is given by
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
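
A minimal sketch of the definition, assuming one roll of a fair die with $A$ = "the roll is even" and $B$ = "the roll is greater than 3":

```python
from fractions import Fraction

# Minimal sketch (assumed example): one roll of a fair die.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # A: the roll is even
B = {4, 5, 6}   # B: the roll is greater than 3

def P(E):
    return Fraction(len(E), len(S))

# P(A|B) = P(A and B) / P(B)
print(P(A & B) / P(B))   # 2/3
```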

Independence:
When two events $A$ and $B$ are independent, knowledge that one of these events has occurred doesn't give any information regarding whether or not the other event has also occurred. Mathematically,
$$P(A \mid B) = P(A)$$
and
$$P(B \mid A) = P(B)$$
This implies that
$$P(A \cap B) = P(A)\,P(B)$$

This agrees with our intuitive knowledge that to obtain the probability of independent events occurring simultaneously, one multiplies the
probabilities of each of the individual events.
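
A small sketch of this product rule, assuming two fair coin flips where the two events concern different flips and are therefore independent:

```python
from fractions import Fraction
from itertools import product

# Minimal sketch (assumed example): flip a fair coin twice.
S = list(product("HT", repeat=2))          # 4 equally likely outcomes
A = [s for s in S if s[0] == "H"]          # A: first flip is heads
B = [s for s in S if s[1] == "H"]          # B: second flip is heads
A_and_B = [s for s in A if s in B]

def P(E):
    return Fraction(len(E), len(S))

# Independent events satisfy P(A and B) = P(A) P(B).
print(P(A_and_B), P(A) * P(B))             # 1/4 1/4
```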

Combinatorics: choosing $k$ objects out of $n$ objects
Sampling with Ordering and with Replacement:
$$n^k$$
Sampling with Ordering and without Replacement:
$$\frac{n!}{(n-k)!} = n(n-1)(n-2)\cdots(n-k+1)$$
Sampling without Ordering and without Replacement:
$$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$$
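
The three counts can be checked with the standard library (a small sketch assuming $n = 5$ and $k = 3$):

```python
from math import comb, factorial, perm

n, k = 5, 3   # assumed example values

# With ordering and with replacement: n**k
print(n ** k)                                          # 125

# With ordering and without replacement: n!/(n-k)!
print(perm(n, k), factorial(n) // factorial(n - k))    # 60 60

# Without ordering and without replacement: n!/(k!(n-k)!)
print(comb(n, k))                                      # 10
```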

Random Variables:
A random variable $X$ is a function that maps outcomes in the sample space of a random experiment to numbers on the real line. Naturally,
$$P(X = x) = P[\{\zeta : X(\zeta) = x\}] \quad \text{for } \zeta \in S.$$
The cumulative distribution function (cdf) of a random variable $X$ is defined as
$$F_X(x) = P(X \le x)$$
Properties of the cdf:
$0 \le F_X(x) \le 1$

$\lim_{x \to \infty} F_X(x) = 1 \qquad \lim_{x \to -\infty} F_X(x) = 0$

$F_X(x)$ is a non-decreasing function of $x$.

$F_X(x)$ is continuous from the right. That is, $F_X(x_0) = \lim_{x \to x_0^{+}} F_X(x)$.

The probability density function (pdf) of a random variable $X$ with cdf $F_X(x)$ is defined as
$$f_X(x) = \frac{dF_X(x)}{dx}$$

Properties of the pdf:

$f_X(x) \ge 0$

$P(a \le X \le b) = \int_a^b f_X(x)\,dx$

$F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt$

$\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
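
A short numerical sketch of these properties, assuming a standard Gaussian random variable and using scipy for the cdf and pdf:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Sketch (assumed example): X is a standard Gaussian random variable.
X = stats.norm(loc=0, scale=1)

# P(a <= X <= b) is the area under the pdf, which equals F(b) - F(a).
a, b = -1.0, 2.0
area, _ = quad(X.pdf, a, b)
print(area, X.cdf(b) - X.cdf(a))            # both ~0.8186

# The total area under the pdf is 1.
total, _ = quad(X.pdf, -np.inf, np.inf)
print(total)                                 # ~1.0

# The pdf is the derivative of the cdf (checked numerically at x = 0.5).
x, h = 0.5, 1e-6
print((X.cdf(x + h) - X.cdf(x - h)) / (2 * h), X.pdf(x))
```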




Functions of a Random Variable:
For any random variable $X$ and real function $g(x)$ defined for $x \in S_X$, we can define a function of the random variable (which is, itself, a random variable) $Y = g(X)$. The pdf of $Y$ can be found using two different methods.

$$f_Y(y) = \left.\frac{f_X(x)}{|g'(x)|}\right|_{x = g^{-1}(y)}$$
The other method involves first evaluating the cdf of $Y$ analytically.
$$F_Y(y) = P(Y \le y) = P(g(X) \le y)$$
Solving the inequality (if possible) should lead to
$$F_Y(y) = P\!\left(X \le g^{-1}(y)\right) = F_X\!\left(g^{-1}(y)\right)$$
It is not only instructive, but pertinent in some cases to go through the evaluation step by step because the function $Y = g(X)$ may not be a one-to-one function.
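
A sketch of the first (transformation) method, assuming $X$ is standard Gaussian and $g(x) = e^x$, which is one-to-one; the result is compared against scipy's lognormal pdf as a check:

```python
import numpy as np
from scipy import stats

# Sketch (assumed example): X ~ N(0, 1) and Y = g(X) = exp(X).
# g is one-to-one with g^{-1}(y) = ln(y) and g'(x) = exp(x), so
#   f_Y(y) = f_X(ln y) / |exp(ln y)| = f_X(ln y) / y.
y = np.linspace(0.1, 5.0, 50)
f_Y = stats.norm.pdf(np.log(y)) / y

# Y is lognormal with shape parameter s = 1; compare against scipy's pdf.
print(np.allclose(f_Y, stats.lognorm.pdf(y, s=1)))    # True
```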

Discrete Random Variables:
The cdf of a discrete random variable is a "staircase" function, whereas the pdf of a discrete random variable takes on the form of an impulse train with varying impulse strengths. Therefore, it is more convenient to describe discrete RVs in terms of the probability mass function (pmf). The pmf of a discrete random variable $X$ is defined as
$$p_X(x) = P(X = x)$$
Thus, the expectation of some function of a discrete random variable $X$, $g(X)$, is more conveniently calculated as
$$E[g(X)] = \sum_{x \in S_X} g(x)\, p_X(x)$$
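
A small sketch of this sum, assuming $X$ is binomial with $n = 4$, $p = 0.3$ and $g(x) = x^2$:

```python
from math import comb

# Sketch (assumed example): X ~ Binomial(n = 4, p = 0.3) and g(x) = x**2.
n, p = 4, 0.3
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

E_X = sum(x * pmf[x] for x in pmf)        # E[X]   = n*p                  = 1.2
E_gX = sum(x**2 * pmf[x] for x in pmf)    # E[X^2] = n*p*(1-p) + (n*p)^2  = 2.28
print(E_X, E_gX)
```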



Bernoulli: Consider a trial with success probability $p$. $X = 1$ if the trial is successful, $X = 0$ if the trial failed.
    Sample space: $S_X = \{0, 1\}$
    pmf: $p_X(0) = q = 1 - p$, $p_X(1) = p$
    Expectation: $E[X] = p$

Binomial: $X$ is the number of successes in $n$ Bernoulli trials.
    Sample space: $S_X = \{0, 1, \ldots, n\}$
    pmf: $p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}$
    Expectation: $E[X] = np$

Geometric: $X$ is the number of independent Bernoulli trials until the first success is observed.
    Sample space: $S_X = \{1, 2, \ldots\}$
    pmf: $p_X(k) = p(1-p)^{k-1}$
    Expectation: $E[X] = 1/p$

Poisson: $X$ is the number of events that occur in one time unit when the time between events is exponentially distributed with mean $1/\alpha$. The Poisson random variable is an excellent approximation of the binomial random variable with $\alpha = np$ when $n$ is large.
    Sample space: $S_X = \{0, 1, 2, \ldots\}$
    pmf: $p_X(k) = \dfrac{\alpha^k e^{-\alpha}}{k!}$
    Expectation: $E[X] = \alpha$

Uniform: The uniform random variable occurs whenever outcomes are equally likely.
    Sample space: $S_X = \{1, 2, \ldots, n\}$
    pmf: $p_X(k) = 1/n$
    Expectation: $E[X] = \dfrac{n+1}{2}$
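
A sketch comparing the tabulated means (and the Poisson approximation of the binomial) with scipy.stats, for some assumed parameter values:

```python
from scipy import stats

# Assumed parameter values for the check.
p, n, alpha = 0.25, 10, 3.0

print(stats.bernoulli(p).mean(), p)                  # p
print(stats.binom(n, p).mean(), n * p)               # n*p
print(stats.geom(p).mean(), 1 / p)                   # 1/p
print(stats.poisson(alpha).mean(), alpha)            # alpha
print(stats.randint(1, n + 1).mean(), (n + 1) / 2)   # discrete uniform on {1,...,n}

# Poisson approximation of the binomial: alpha = n*p with n large.
n_big, p_small = 1000, 0.003
print(stats.binom.pmf(2, n_big, p_small), stats.poisson.pmf(2, n_big * p_small))
```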


Continuous Random Variables:
Uniform: The uniform random variable is uniformly distributed over the interval $[a, b]$.
    pdf: $f_X(x) = \dfrac{1}{b-a}$ for $a \le x \le b$; $0$ otherwise
    Expectation: $E[X] = \dfrac{a+b}{2}$

Exponential: The exponential random variable is the only continuous random variable with the memoryless (time invariance) property. Time invariance indicates that $P(X > t) = P(X > t + s \mid X > s)$.
    pdf: $f_X(x) = \lambda e^{-\lambda x}\, u(x)$
    Expectation: $E[X] = \dfrac{1}{\lambda}$

Gaussian (Normal): $X \sim N(\mu, \sigma^2)$. Many distributions in nature involving large samples of individuals have Gaussian distributions.
    pdf: $f_X(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
    Expectation: $E[X] = \mu$

Laplacian: $\alpha > 0$.
    pdf: $f_X(x) = \dfrac{\alpha}{2}\, e^{-\alpha |x|}$
    Expectation: $E[X] = 0$
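
A sketch checking the tabulated expectations by numerical integration, along with the memoryless property of the exponential, for assumed parameter values:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Assumed parameter values: a = 1, b = 5, lam = 2, mu = 1.5, sigma = 0.5.
a, b, lam, mu, sigma = 1.0, 5.0, 2.0, 1.5, 0.5

E_unif, _ = quad(lambda x: x / (b - a), a, b)
E_expo, _ = quad(lambda x: x * lam * np.exp(-lam * x), 0, np.inf)
E_gauss, _ = quad(lambda x: x * stats.norm.pdf(x, mu, sigma), -np.inf, np.inf)

print(E_unif, (a + b) / 2)    # 3.0
print(E_expo, 1 / lam)        # 0.5
print(E_gauss, mu)            # 1.5

# Memoryless property of the exponential: P(X > t + s | X > s) = P(X > t).
t, s = 0.7, 1.3
X = stats.expon(scale=1 / lam)
print(X.sf(t + s) / X.sf(s), X.sf(t))   # equal
```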

Gaussian Distribution:
When calculating integrals involving the Gaussian distribution, the substitution
$$z = \frac{x - \mu}{\sigma}$$
is often warranted.
Q-function:

$$Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt = 1 - \Phi(x) = 1 - \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt$$
Clearly, from the symmetry of the Gaussian distribution,
$$Q(0) = \tfrac{1}{2} \qquad Q(-x) = 1 - Q(x)$$

For $0 < x$,
$$Q(x) \approx \left[\frac{1}{\left(1 - \frac{1}{\pi}\right)x + \frac{1}{\pi}\sqrt{x^2 + 2\pi}}\right] \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$
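
A sketch computing $Q(x)$ via the complementary error function, evaluating the approximation given above, and standardizing a general Gaussian (all parameter values are assumed examples):

```python
import numpy as np
from scipy import special, stats

# Q(x) in terms of the complementary error function.
def Q(x):
    return 0.5 * special.erfc(x / np.sqrt(2))

x = 1.5
print(Q(x), 1 - stats.norm.cdf(x))        # both ~0.0668

# The approximation given above, evaluated at x = 1.5.
approx = (1 / ((1 - 1 / np.pi) * x + np.sqrt(x**2 + 2 * np.pi) / np.pi)) \
         * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(approx)                              # ~0.066, close to Q(1.5)

# Standardizing a general Gaussian: if X ~ N(mu, sigma^2),
# then P(X > c) = Q((c - mu) / sigma).
mu, sigma, c = 2.0, 3.0, 5.0
print(Q((c - mu) / sigma), stats.norm(mu, sigma).sf(c))   # equal
```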
