
Mathematical Methods

Expectation & Moments


Lecture 07
Sajid Bashir, PhD
sajid@iqraisb.edu.pk

Outline
Moments of a Single Random Variable
Mean, Median, and Mode
Central Moments, Variance, and Standard Deviation
Conditional Expectation

Chebyshev Inequality
Moments of Two or More Random Variables
Covariance and Correlation Coefficient
Schwarz Inequality
The Case of Three or More Random Variables

Moments of Sums of Random Variables


Characteristic Functions
Generation of Moments
Inversion Formulae
Joint Characteristic Functions

Two considerations arise in applications of expectations and moments:

First: knowledge of the distribution is needed to calculate the moments.
Second: the moments are used to make statements about the behavior of the RV.

Knowledge of the mean and variance, although very useful, is not sufficient to determine the distribution.
Questions such as "What is $P(X \le 8)$?" cannot be answered.

Knowing only the mean and variance, however, it is possible to establish some useful probability bounds!

Chebyshev Inequality
For any k > 0,

$$P\big(|X - m_X| \ge k\sigma_X\big) \le \frac{1}{k^2}$$

In English:
The probability that the outcome of an experiment with random variable X falls k or more standard deviations away from the mean of X, $m_X$, is at most $1/k^2$.
Or:
The portion of the total area under the pdf of X outside of k standard deviations from the mean $m_X$ is at most $1/k^2$.
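As a quick numerical illustration (a sketch only, assuming X is standard normal, so $m_X = 0$ and $\sigma_X = 1$), the empirical tail probability sits well below the bound:

```python
import numpy as np

# Compare empirical tail probabilities with the Chebyshev bound 1/k^2,
# assuming X ~ N(0, 1) so that m_X = 0 and sigma_X = 1.
rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
for k in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(x) >= k)      # P(|X - m_X| >= k*sigma_X)
    print(f"k={k}: empirical={empirical:.4f}, bound={1/k**2:.4f}")
```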

Proof
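For a continuous RV X with density $f_X$, the result follows from bounding the variance integral from below over the tail region $|x - m_X| \ge k\sigma_X$:

$$\sigma_X^2 = \int_{-\infty}^{\infty} (x - m_X)^2 f_X(x)\,dx \;\ge\; \int_{|x - m_X| \ge k\sigma_X} (x - m_X)^2 f_X(x)\,dx \;\ge\; k^2\sigma_X^2 \int_{|x - m_X| \ge k\sigma_X} f_X(x)\,dx = k^2\sigma_X^2\, P\big(|X - m_X| \ge k\sigma_X\big).$$

Dividing both sides by $k^2\sigma_X^2$ gives the inequality; the same argument runs with sums for a discrete X.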


Example: Measurement Tape

For three-foot tape measures with mean length 3 feet (the figures below imply a standard deviation of $\sigma_X = 0.03$ feet), we can write

$$P\big(|X - 3| \ge 0.03k\big) \le \frac{1}{k^2}.$$

If k = 2,

$$P\big(|X - 3| \ge 0.06\big) \le \frac{1}{4}, \qquad\text{or}\qquad P\big(|X - 3| \le 0.06\big) \ge 0.75.$$

The probability of a three-foot tape measure being in error by less than or equal to 0.06 feet is at least 0.75.
Various probability bounds can be found by assigning different values to k.
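A tiny sketch of that last remark, tabulating the bound for several k (using the $\sigma_X = 0.03$ ft figure implied above):

```python
sigma = 0.03                       # feet, implied by the example above
for k in (1.5, 2, 3, 4):
    err = k * sigma                # error size corresponding to k standard deviations
    print(f"P(|X - 3| <= {err:.3f} ft) >= {1 - 1/k**2:.3f}")
```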


Moments of Two or More RVs

The expectation of a real-valued function g(X, Y) of two RVs X and Y is

$$E\{g(X, Y)\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy.$$

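A minimal Monte Carlo sketch of this expectation, assuming for illustration independent U(0,1) variables and g(x, y) = xy, so the exact value is $E\{X\}E\{Y\} = 1/4$:

```python
import numpy as np

# Estimate E{g(X, Y)} by averaging g over samples drawn from f_XY.
rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)       # X ~ U(0,1)
y = rng.uniform(0.0, 1.0, n)       # Y ~ U(0,1), independent of X
print(np.mean(x * y))              # ~ 0.25 = E{X} E{Y}
```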

Joint moments $\alpha_{nm}$ of X and Y:

$$\alpha_{nm} = E\{X^n Y^m\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^m f_{XY}(x, y)\, dx\, dy$$

Joint central moments $\mu_{nm}$:

$$\mu_{nm} = E\{(X - m_X)^n (Y - m_Y)^m\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - m_X)^n (y - m_Y)^m f_{XY}(x, y)\, dx\, dy$$

Individual Means:

As in the single-RV case,

the means of X and Y are $m_X = \alpha_{10}$ and $m_Y = \alpha_{01}$, respectively.




Covariance and Correlation Coefficient

Covariance
The first and simplest joint central moment, $\mu_{11} = E\{(X - m_X)(Y - m_Y)\}$, is the covariance; it gives a measure of interdependence between X and Y.
It is related to the moments $\alpha_{nm}$ by

$$\mathrm{cov}(X, Y) = \mu_{11} = \alpha_{11} - \alpha_{10}\alpha_{01} = E\{XY\} - m_X m_Y.$$

Correlation coefficient ($|\rho| \le 1$):

$$\rho = \frac{\mu_{11}}{\sigma_X \sigma_Y}$$

If the correlation coefficient vanishes, the RVs are said to be uncorrelated.

If X and Y are independent, then $E\{XY\} = E\{X\}E\{Y\}$, so $\mu_{11} = 0$ and $\rho = 0$: independent RVs are always uncorrelated (the converse does not hold).
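A short numpy sketch (illustrative data, not from the lecture): estimating $\mu_{11}$ and $\rho$ for a linearly related pair:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)     # linear dependence plus noise
cov = np.cov(x, y)[0, 1]                   # sample estimate of mu_11
rho = np.corrcoef(x, y)[0, 1]              # sample correlation coefficient
print(cov, rho)                            # cov ~ 2, rho ~ 2/sqrt(5) ~ 0.894
```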



Proof ($|\rho| \le 1$):

Consider the function $g(t, u) = E\{[t(X - m_X) + u(Y - m_Y)]^2\} = t^2\sigma_X^2 + 2tu\,\mu_{11} + u^2\sigma_Y^2$.

Since the expectation of a nonnegative function of X and Y is nonnegative, $g(t, u)$ is a nonnegative quadratic form in t and u, and hence its discriminant satisfies

$$\mu_{11}^2 \le \sigma_X^2\,\sigma_Y^2, \qquad\text{i.e.}\qquad |\rho| \le 1.$$

Importance in analysis:
$\rho$ is a measure of the linear interdependence between RVs.
Its value measures the accuracy with which one RV can be approximated by a linear function of the other.

Approximating X by a linear function of Y, aY + b:

Choose a and b so as to minimize the mean-square error

$$e = E\{[X - (aY + b)]^2\}.$$

Taking partial derivatives with respect to a and b and setting them to zero, the minimum is attained when

$$a = \rho\,\frac{\sigma_X}{\sigma_Y}, \qquad b = m_X - a\,m_Y.$$

The minimum mean-square error is $\sigma_X^2(1 - \rho^2)$.

The fit, in the mean-square sense,

is exact when $|\rho| = 1$ and worst when $\rho = 0$.
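A simulation sketch of this result (assumed illustrative data): the coefficients $a = \rho\sigma_X/\sigma_Y$ and $b = m_X - a m_Y$ achieve a mean-square error matching $\sigma_X^2(1 - \rho^2)$:

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(size=200_000)
x = 1.5 * y + rng.normal(size=200_000)     # X linearly related to Y plus noise
rho = np.corrcoef(x, y)[0, 1]
a = rho * x.std() / y.std()                # a = rho * sigma_X / sigma_Y
b = x.mean() - a * y.mean()                # b = m_X - a * m_Y
mse = np.mean((x - (a * y + b)) ** 2)
print(mse, x.var() * (1 - rho ** 2))       # the two agree
```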


Correlation coefficient
Measures only the linear interdependence between two RVs.
It is not a general measure of interdependence.

$\rho = +1$ (perfectly positively correlated): the values they assume fall on a straight line with positive slope;
$\rho = -1$ (perfectly negatively correlated): the values form a straight line with negative slope.

The value of $|\rho|$ decreases as scatter about these lines increases.

Example

Determine $\rho$ for RVs X and Y

when X takes the values ±1 and ±2, each with probability 1/4, and Y = X².


Y assumes the values 1 and 4, each with probability 1/2.

Means and joint moment $\alpha_{11}$:

$$m_X = \tfrac14(-1 + 1 - 2 + 2) = 0, \qquad m_Y = \tfrac12(1) + \tfrac12(4) = \tfrac52,$$
$$\alpha_{11} = E\{XY\} = E\{X^3\} = \tfrac14(-1 + 1 - 8 + 8) = 0.$$

Hence $\mu_{11} = \alpha_{11} - m_X m_Y = 0$ and $\rho = 0$.


X and Y are uncorrelated but they are completely dependent on each other in
a nonlinear way.
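A one-screen check of this example by simulation (a sketch, sampling X uniformly from {±1, ±2}):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.choice([-2, -1, 1, 2], size=1_000_000)   # each value with probability 1/4
y = x ** 2                                        # Y is fully determined by X
print(np.corrcoef(x, y)[0, 1])                    # ~ 0: uncorrelated yet dependent
```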



Useful Inequalities

Schwarz inequality:

$$[E\{XY\}]^2 \le E\{X^2\}\,E\{Y^2\}$$

Extension to central moments:

$$\mu_{11}^2 \le \mu_{20}\,\mu_{02} = \sigma_X^2\,\sigma_Y^2$$


The Case of Three or More Random Variables

The expectation of a function g(X1, X2, ..., Xn) of n RVs X1, X2, ..., Xn is

$$E\{g(X_1, \dots, X_n)\} = \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty} g(x_1, \dots, x_n)\, f_{X_1 \cdots X_n}(x_1, \dots, x_n)\, dx_1 \cdots dx_n.$$


The covariance matrix represents the variances and covariances of the RVs:

$$\Lambda = E\{(\mathbf{X} - \mathbf{m}_X)(\mathbf{X} - \mathbf{m}_X)^T\}$$

$\mathbf{X}$ is a random column vector with the RVs as its components.
$\mathbf{m}_X$ is the vector of means of the RVs.

$\Lambda$ is an $n \times n$ symmetric matrix:

its diagonal elements are the variances, and

its off-diagonal elements are the covariances.
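A numpy sketch (illustrative vector, not from the lecture) showing the $n \times n$ symmetric structure:

```python
import numpy as np

rng = np.random.default_rng(5)
x1 = rng.normal(size=100_000)
x2 = x1 + rng.normal(size=100_000)     # correlated with x1
x3 = rng.normal(size=100_000)          # independent of both
cov = np.cov(np.vstack([x1, x2, x3]))  # 3x3 symmetric covariance matrix
print(np.round(cov, 2))                # diagonal: variances; off-diagonal: covariances
```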


If X1, X2, ..., Xn are mutually independent, then

$$E\{X_1 X_2 \cdots X_n\} = E\{X_1\}\,E\{X_2\}\cdots E\{X_n\}.$$


Moments of Sums of Random Variables

The mean of a sum is the sum of the means:

$$E\{X_1 + X_2 + \cdots + X_n\} = E\{X_1\} + E\{X_2\} + \cdots + E\{X_n\}$$

The mean of a linear combination of RVs is the same linear combination of the means:

$$E\Big\{\sum_{j=1}^{n} a_j X_j\Big\} = \sum_{j=1}^{n} a_j\, E\{X_j\}$$


For mutually independent RVs, the variance of a sum is the sum of the variances:

$$\mathrm{var}(X_1 + \cdots + X_n) = \mathrm{var}(X_1) + \cdots + \mathrm{var}(X_n)$$

For a linear combination of mutually independent RVs:

$$\mathrm{var}\Big(\sum_{j=1}^{n} a_j X_j\Big) = \sum_{j=1}^{n} a_j^2\,\mathrm{var}(X_j)$$
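A quick simulation sketch of the second identity, with assumed coefficients a = (2, −1, 0.5) and independent components of standard deviations (1, 2, 3):

```python
import numpy as np

rng = np.random.default_rng(6)
xs = rng.normal(scale=[1.0, 2.0, 3.0], size=(1_000_000, 3))  # independent columns
a = np.array([2.0, -1.0, 0.5])
y = xs @ a                                       # linear combination sum_j a_j X_j
print(y.var(), np.sum(a**2 * [1.0, 4.0, 9.0]))   # both ~ 10.25
```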


Example

An inspection is carried out on a group of n television picture tubes. If each tube passes inspection with probability p and fails with probability q (p + q = 1), calculate the average number of tubes, out of the n, that pass the inspection.


Introduce a RV $X_j$ to represent the outcome of the jth inspection:

$$X_j = \begin{cases} 1, & \text{if the } j\text{th tube passes,}\\ 0, & \text{if it fails.} \end{cases}$$

Mean: $E\{X_j\} = 1 \cdot p + 0 \cdot q = p$

Variance: $\mathrm{var}(X_j) = E\{X_j^2\} - p^2 = p - p^2 = pq$


Define a RV whose value is the total number of tubes passing inspection:

$$Y = X_1 + X_2 + \cdots + X_n$$

The desired average number is

$$E\{Y\} = np.$$

Assuming independent inspections, the variance is

$$\mathrm{var}(Y) = npq.$$
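A simulation sketch of the tube example (with assumed illustrative values n = 100 and p = 0.9):

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 100, 0.9                            # assumed illustrative values
passes = rng.random((200_000, n)) < p      # each row: one group of n inspections
y = passes.sum(axis=1)                     # tubes passing in each group
print(y.mean(), n * p)                     # ~ np = 90
print(y.var(), n * p * (1 - p))            # ~ npq = 9
```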


Law of Large Numbers

The RV Y/n, where Y = X1 + ⋯ + Xn, can be interpreted as the average of n RVs independently observed from the same distribution.

Law of Large Numbers:

$$P\left(\left|\frac{Y}{n} - m\right| \ge \epsilon\right) \to 0 \quad \text{as } n \to \infty,\ \text{for every } \epsilon > 0.$$

The probability that Y/n differs from the mean m by more than an arbitrarily prescribed $\epsilon > 0$ tends to zero; i.e., as $n \to \infty$, the RV Y/n approaches the true mean in this probabilistic sense.
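A sketch of the law in action, assuming an exponential population with true mean 2.0:

```python
import numpy as np

rng = np.random.default_rng(8)
for n in (10, 1_000, 100_000):
    sample_avg = rng.exponential(scale=2.0, size=n).mean()   # Y/n
    print(n, sample_avg)                                      # approaches 2.0
```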


Example
Let X1, ..., Xn be a set of mutually independent RVs with a common distribution, each having mean m.

Show that, as $n \to \infty$, for every $\epsilon > 0$:

$$P\left(\left|\frac{X_1 + \cdots + X_n}{n} - m\right| \ge \epsilon\right) \to 0$$


Let $\sigma^2$ be the variance of each $X_j$; then $\mathrm{var}(Y/n) = \sigma^2/n$.
According to the Chebyshev inequality, for every k > 0,

$$P\left(\left|\frac{Y}{n} - m\right| \ge \frac{k\sigma}{\sqrt{n}}\right) \le \frac{1}{k^2}.$$

Establishing the proof: setting $k\sigma/\sqrt{n} = \epsilon$, the probability on the left becomes $P(|Y/n - m| \ge \epsilon)$ and the bound becomes $\sigma^2/(\epsilon^2 n)$, which tends to zero as $n \to \infty$.


Example: Statistical Sampling

Suppose that in a group of m families there are $m_j$ families with exactly j children (j = 0, 1, ..., with $m_0 + m_1 + \cdots = m$).
For a family chosen at random, the number of children is a RV that assumes the value r with probability $p_r = m_r/m$.
A sample of n families from this group represents n observed independent RVs X1, ..., Xn with the same distribution.
The quantity (X1 + ⋯ + Xn)/n is the sample average, and according to the law of large numbers, for sufficiently large samples it is likely to be close to

$$m = \sum_{r} r\, p_r,$$

the mean of the population.



Characteristic Functions

The characteristic function $\phi_X(t)$ of a RV X is the expectation

$$\phi_X(t) = E\{e^{jtX}\} = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\, dx,$$

where t is an arbitrary real-valued parameter and $j = \sqrt{-1}$.


$\phi_X(t)$ is generally complex valued.

Properties:

$$\phi_X(0) = 1, \qquad |\phi_X(t)| \le 1, \qquad \phi_X(-t) = \phi_X^*(t)$$



Generation of Moments
Expanding $\phi_X(t)$ as a Maclaurin series:

$$\phi_X(t) = \sum_{n=0}^{\infty} \alpha_n \frac{(jt)^n}{n!}, \qquad \alpha_n = E\{X^n\}$$

The coefficients of the power series give the moments:

$$\alpha_n = \frac{1}{j^n}\,\frac{d^n \phi_X(t)}{dt^n}\bigg|_{t=0}$$


Characteristic function from moments:

$$\phi_X(t) = \sum_{n=0}^{\infty} \frac{\alpha_n\,(jt)^n}{n!}$$

Moments from the characteristic function:

$$\alpha_n = \frac{1}{j^n}\,\phi_X^{(n)}(0)$$
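A symbolic sketch of the second formula, assuming the standard normal characteristic function $\phi_X(t) = e^{-t^2/2}$, whose first four moments are 0, 1, 0, 3:

```python
import sympy as sp

t = sp.symbols('t', real=True)
phi = sp.exp(-t**2 / 2)            # characteristic function of a standard normal RV
for n in range(1, 5):
    # alpha_n = (1/j^n) * d^n phi / dt^n evaluated at t = 0
    alpha_n = sp.simplify(sp.diff(phi, t, n).subs(t, 0) / sp.I**n)
    print(n, alpha_n)              # prints 0, 1, 0, 3
```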


Binomial Distribution
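For X binomial with parameters n and p, the characteristic function follows directly from the definition:

$$\phi_X(t) = E\{e^{jtX}\} = \sum_{k=0}^{n} e^{jtk}\binom{n}{k} p^k q^{n-k} = \big(pe^{jt} + q\big)^n.$$

Differentiating and setting t = 0 recovers the moments, e.g. $\alpha_1 = E\{X\} = np$ and $\mathrm{var}(X) = npq$.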



Exponential Distribution
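For X exponentially distributed with parameter a, so that $f_X(x) = ae^{-ax}$ for $x \ge 0$,

$$\phi_X(t) = \int_0^{\infty} e^{jtx}\, ae^{-ax}\, dx = \frac{a}{a - jt}, \qquad \alpha_n = \frac{1}{j^n}\frac{d^n\phi_X(t)}{dt^n}\bigg|_{t=0} = \frac{n!}{a^n}.$$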


Cumulants of a RV
The cumulants $\kappa_n$ are defined by the power series representation of the logarithm of $\phi_X(t)$:

$$\log \phi_X(t) = \sum_{n=1}^{\infty} \kappa_n \frac{(jt)^n}{n!}$$

$\kappa_1$ is the mean, $\kappa_2$ is the variance, and $\kappa_3$ is the third central moment.

Higher-order $\kappa_n$ are related to moments of the same or lower order, but in a more complex way.

Inversion Formulae

$\phi_X(t)$ is the inverse Fourier transform of $f_X(x)$; the other half of the Fourier transform pair is

$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-jtx}\,\phi_X(t)\, dt.$$

Knowledge of the characteristic function therefore specifies the distribution of X.

It follows from the theory of Fourier transforms that $f_X(x)$ is uniquely determined:

no two distinct density functions can have the same characteristic function.



The characteristic function of a sum of independent RVs equals the product of the characteristic functions of the individual RVs.

With $Y = X_1 + X_2 + \cdots + X_n$ and mutual independence,

$$\phi_Y(t) = E\{e^{jt(X_1 + \cdots + X_n)}\} = \phi_{X_1}(t)\,\phi_{X_2}(t)\cdots\phi_{X_n}(t).$$


Example: Gamma Distribution


Let X1 and X2 be two independent RVs having exponential distributions with
parameter a, and let Y = X1 + X2. Determine the distribution of Y.
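A sketch of the solution via characteristic functions: by the product rule for independent RVs and the exponential result above,

$$\phi_Y(t) = \phi_{X_1}(t)\,\phi_{X_2}(t) = \left(\frac{a}{a - jt}\right)^2,$$

and inverting this transform gives

$$f_Y(y) = a^2 y\, e^{-ay}, \qquad y \ge 0,$$

a gamma density with shape parameter 2 and rate a.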


Example: Brownian Motion


In 1827, Robert Brown, a Scottish botanist, noticed that small particles of matter from plants undergo erratic movements when suspended in fluids.
It was soon discovered that this erratic motion is caused by the impacts of the molecules of the surrounding fluid on the particles.
This phenomenon, which can also be observed in gases, is called Brownian motion.
The explanation of Brownian motion was one of the major successes of statistical mechanics.

Consider a particle taking steps on a straight line.

It moves either one step to the right with probability p, or one step to the left with probability q (p + q = 1).
Steps are always of unit length, positive to the right and negative to the left, and they are taken independently.

Determine the pmf of its position after n steps.


Let $X_i$ be the RV associated with the ith step, and define

$$X_i = \begin{cases} +1, & \text{with probability } p,\\ -1, & \text{with probability } q. \end{cases}$$

The RV $Y = X_1 + X_2 + \cdots + X_n$ gives the position of the particle after n steps; it takes integer values between −n and n.


Characteristic function of each $X_i$:

$$\phi_{X_i}(t) = E\{e^{jtX_i}\} = pe^{jt} + qe^{-jt}$$

Invoking independence:

$$\phi_Y(t) = \big(pe^{jt} + qe^{-jt}\big)^n$$

Rewriting with the binomial theorem:

$$\phi_Y(t) = \sum_{i=0}^{n} \binom{n}{i} p^i q^{n-i}\, e^{jt(2i - n)}$$


Letting k = 2i − n, the coefficient of $e^{jtk}$ gives the pmf:

$$P(Y = k) = \binom{n}{\frac{n+k}{2}}\, p^{(n+k)/2}\, q^{(n-k)/2}, \qquad k = -n, -n+2, \ldots, n.$$


Considerable importance is attached to the symmetric case:

$k \ll n$ and $p = q = 1/2$.

To treat this special case, recall Stirling's formula: for large n,

$$n! \approx (2\pi n)^{1/2}\, n^n e^{-n},$$

which, applied to the pmf above, gives

$$P(Y = k) \approx \left(\frac{2}{\pi n}\right)^{1/2} e^{-k^2/2n}.$$


Gaussian or Normal Distribution

Further simplification: let the length of each step be small.
Assume that r steps, each of length a, occur per unit time (i.e. n = rt).
As n becomes large, the random variable Y approaches a continuous RV with the Gaussian (normal) density

$$f_Y(y, t) = \frac{1}{(4\pi D t)^{1/2}}\, e^{-y^2/4Dt}, \qquad D = \frac{a^2 r}{2},$$

i.e. zero mean and variance $2Dt = a^2 r t$, growing linearly with time.
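A simulation sketch of this limit (symmetric case, unit steps): the walk's position is approximately N(0, n):

```python
import numpy as np

rng = np.random.default_rng(9)
n, trials = 1_000, 200_000
steps = rng.choice([-1, 1], size=(trials, n))   # p = q = 1/2, unit steps
y = steps.sum(axis=1)                            # position after n steps
print(y.mean(), y.var())                         # ~ 0 and ~ n, matching N(0, n)
```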


Einstein also obtained this result, with the diffusion coefficient

$$D = \frac{RT}{Nf},$$

where

R is the universal gas constant,
T is the absolute temperature,
N is Avogadro's number, and
f is the coefficient of friction which, for a liquid or gas at ordinary pressure, can be expressed in terms of its viscosity and the particle size.

In 1926, Perrin, a French physicist, was awarded the Nobel Prize for his success in determining Avogadro's number from experiment.


Thank You
