
Uploaded by Prasanth

Random Variables_Random Process


Topics covered:

- Probability density function
- Functions of a random variable: Y = g(X), Z = g(X, Y)
- Mean
- Variance
- Correlation
- The Central Limit Theorem
- Random processes
- Classification of random processes: WSS, SSS
- Power spectral density
- White noise
- Ergodicity
- Random vectors
- Correlation matrix
- Covariance matrix

Basics of Probability

We begin by defining some terminology relating to random experiments (i.e., experiments whose outcomes are not predictable).

Outcome

The end result of an experiment. For example, if the experiment consists of throwing a die, the outcome would be any one of the six faces, F1, …, F6.

Random experiment

An experiment whose outcomes are not known in advance (e.g., tossing a coin, throwing a die, measuring the noise voltage at the terminals of a resistor, etc.).

Random event

A random event is an outcome or set of outcomes of a random experiment that share a common attribute. For example, in the experiment of throwing a die, an event could be 'the face F1' or 'the even-indexed faces' (F2, F4, F6). We denote events by upper-case letters such as A, B or A1, A2.

Sample space

The sample space of a random experiment is a mathematical abstraction used to represent all possible outcomes of the experiment. We denote the sample space by S.

Mutually exclusive (disjoint) events

Two events A and B are said to be mutually exclusive if they have no common elements (or outcomes). Hence, if A and B are mutually exclusive, they cannot occur together.

Probability of an Event

The probability of an event has been defined in several ways. Two of the most popular definitions are: (i) the relative frequency definition, and (ii) the classical definition.

Def. 2.11: The relative frequency definition

Suppose that a random experiment is repeated n times. If the event A occurs n_A times, then the probability of A, denoted by P(A), is defined as

P(A) = lim_{n → ∞} n_A / n

where the ratio n_A / n represents the fraction of occurrences of event A in n trials.

The classical definition

The probability of A is defined as the ratio of the number of outcomes favourable to A to the total number of outcomes of the random experiment:

P(A) = N_A / N
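The convergence of the relative-frequency ratio n_A / n toward the classical value N_A / N can be illustrated with a short simulation; the die experiment and the helper function below are our own illustrative choices, not part of the text:

```python
import random

random.seed(0)

def relative_frequency(n_trials):
    # Fraction n_A / n of trials in which event A = "die shows face F1" occurs.
    n_A = sum(1 for _ in range(n_trials) if random.randint(1, 6) == 1)
    return n_A / n_trials

# The classical definition gives P(A) = N_A / N = 1/6 exactly; the simulated
# fraction approaches this value as the number of trials grows.
for n in (100, 10_000, 100_000):
    print(n, relative_frequency(n))
```

With a fixed seed the printed fractions settle near 1/6, matching the limit in Def. 2.11.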

Conditional Probability

P(B|A) represents the probability of event B occurring, given that event A has occurred:

P(B|A) = P(A|B) P(B) / P(A),   P(A) ≠ 0

Bayes' Theorem

Let A_1, A_2, …, A_n be a set of mutually exclusive and exhaustive events. Then, for any event B with P(B) > 0,

P(A_j | B) = P(B | A_j) P(A_j) / Σ_{i=1}^{n} P(B | A_i) P(A_i)
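Bayes' theorem can be sketched numerically. The two sources A1, A2 and their defect rates below are hypothetical numbers chosen purely for illustration:

```python
# Two machines A1, A2 produce 60% and 40% of all parts, with defect
# rates 1% and 5% (hypothetical figures). Given a defective part
# (event B), find P(A_j | B).
priors = {"A1": 0.60, "A2": 0.40}          # P(A_j)
likelihoods = {"A1": 0.01, "A2": 0.05}     # P(B | A_j)

# Total probability: P(B) = sum_i P(B | A_i) P(A_i)
p_B = sum(likelihoods[a] * priors[a] for a in priors)

# Bayes' theorem: P(A_j | B) = P(B | A_j) P(A_j) / P(B)
posteriors = {a: likelihoods[a] * priors[a] / p_B for a in priors}
print(posteriors)
```

The posteriors sum to one, as they must for a mutually exclusive, exhaustive partition.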

Independent events

If P(B|A) = P(B), the occurrence of event A does not affect the occurrence of event B, and the events are said to be independent. Thus, for independent events,

P(AB) = P(A) P(B)
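The product rule P(AB) = P(A) P(B) can be checked exactly for the die experiment; the particular events A and B below are our own illustrative choices:

```python
from fractions import Fraction

# One throw of a fair die; sample space is the faces 1..6.
S = range(1, 7)
A = {s for s in S if s % 2 == 0}    # "even face": {2, 4, 6}
B = {1, 2}                          # "face index at most 2"

P = lambda E: Fraction(len(E), 6)   # classical definition: N_A / N

# P(A & B) = 1/6 equals P(A) * P(B) = (1/2)(1/3), so A and B are independent.
print(P(A & B), P(A) * P(B))
```

Using exact rationals avoids floating-point noise when testing the equality.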

Random variable

A random variable X is a rule that assigns a real number X(s) to every outcome s of a random experiment, subject to the following conditions:

1. The set {X ≤ x} is an event for every real number x.
2. The probabilities of the events {X = ∞} and {X = −∞} equal zero.

The mapping also transforms events: an event A in the sample space may map, for example, onto a segment [a1, a2] of the real line.

The elements of the set S that are contained in the event {X ≤ x} change as the number x changes. The probability P{X ≤ x} of this event is therefore a number that depends on x. This number is denoted by F_X(x) and is called the probability distribution function of the random variable X:

F_X(x) = P{X ≤ x}

It is defined for every x from −∞ to +∞.

The derivative of the probability distribution function F_X(x) is called the probability density function f_X(x) of the random variable X:

f_X(x) = dF_X(x) / dx

This definition is given for a continuous random variable; it can be related to the discrete case as well.
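The relation f_X(x) = dF_X(x)/dx can be verified numerically; the exponential distribution below is an illustrative choice, not one taken from the text:

```python
import math

# Exponential random variable with rate lam:
# distribution function F_X(x) = 1 - exp(-lam x),
# density               f_X(x) = lam exp(-lam x).
lam = 2.0
F = lambda x: 1.0 - math.exp(-lam * x)
f = lambda x: lam * math.exp(-lam * x)

# A central-difference approximation of dF/dx should match f.
x, h = 0.7, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)
print(deriv, f(x))
```

The two printed numbers agree to several decimal places, consistent with the density being the derivative of the distribution function.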

Statistical Averages

The PDF of a random variable provides a complete statistical characterization of the variable. However, we might be in a situation where the PDF is not available, but we are able to estimate (with reasonable accuracy) certain statistical averages of the random variable. Some of these averages do provide a simple and fairly adequate (though incomplete) description of the random variable. We now define a few of these averages and explore their significance.

The mean value (also called the expected value, mathematical expectation or simply expectation) of a random variable X is defined as

m_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx

Variance

Specifying the variance essentially constrains the effective width of the density function. The second central moment of a random variable is called the variance, and its (positive) square root is called the standard deviation:

σ_X² = E[(X − m_X)²]
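The mean and variance integrals can be approximated by a Riemann sum; the uniform density on [2, 6] below is an illustrative choice:

```python
# Uniform density f_X(x) = 1/(b - a) on [a, b]; approximate
# E[X] = integral of x f(x) dx and Var(X) = E[(X - m_X)^2]
# by a midpoint Riemann sum.
a, b, n = 2.0, 6.0, 100_000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]
f = 1.0 / (b - a)

mean = sum(x * f * dx for x in xs)
var = sum((x - mean) ** 2 * f * dx for x in xs)
print(mean, var)  # close to (a + b)/2 = 4 and (b - a)^2 / 12
```

The sums reproduce the known closed forms for the uniform distribution, illustrating the defining integrals.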

Covariance

The covariance of two random variables X and Y is λ_XY = E[(X − m_X)(Y − m_Y)].

Correlation coefficient

The covariance normalized by the product of the standard deviations is called the correlation coefficient, and we denote it by ρ_XY = λ_XY / (σ_X σ_Y). The correlation coefficient is a measure of dependency between the variables.

Suppose X and Y are independent. If the experiment is repeated many times and, on a given trial, X = x1, then Y would be sometimes positive with respect to its mean m_Y and sometimes negative with respect to m_Y. In the course of many trials of the experiment with the outcome X = x1, the sum of the numbers (y − m_Y) would be very small, and the quantity

(sum) / (number of trials)

tends to zero as the number of trials keeps increasing.

On the other hand, let X and Y be dependent. Suppose, for example, the outcome y is conditioned on the outcome x in such a manner that there is a greater possibility of (y − m_Y) being of the same sign as (x − m_X). Then we expect ρ_XY to be positive. Similarly, if the probability of (y − m_Y) and (x − m_X) being of opposite signs is quite large, then we expect ρ_XY to be negative. Taking the extreme case of this dependency, let X and Y be so conditioned that, given X = x1, then Y = x1. Then ρ_XY = 1. That is, for X and Y independent we have ρ_XY = 0, and for the totally dependent (y = x) case, ρ_XY = 1. If the variables are neither independent nor totally dependent, then ρ will have a magnitude between 0 and 1.

Two random variables X and Y are said to be orthogonal if E[XY] = 0.

If λ_XY (or ρ_XY) = 0, the random variables X and Y are said to be uncorrelated. That is, if X and Y are uncorrelated, then E[XY] = E[X] E[Y]. When the random variables are independent, they are uncorrelated. However, the fact that they are uncorrelated does not ensure that they are independent.
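A short simulation illustrates the two extremes discussed above: ρ_XY near 0 for independent variables and ρ_XY = 1 for the totally dependent case y = x. The sample-correlation helper is our own sketch:

```python
import math
import random

random.seed(1)

def corr(xs, ys):
    # Sample correlation coefficient rho_XY = lambda_XY / (sigma_X sigma_Y).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

xs = [random.gauss(0, 1) for _ in range(50_000)]
ys_indep = [random.gauss(0, 1) for _ in range(50_000)]  # independent of X

print(corr(xs, ys_indep))  # near 0: independent variables
print(corr(xs, xs))        # 1: totally dependent case y = x
```

Note this only demonstrates the forward direction (independence implies ρ_XY ≈ 0); as stated above, the converse does not hold in general.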

Random process

Consider a sample space S (pertaining to some random experiment) with sample points s1, s2, …, sn, …. To every sj ∈ S, let us assign a real-valued function of time x(sj, t), which we denote by xj(t). This situation is illustrated in Fig. 3.1, which shows a sample space with four waveforms, labeled xj(t), j = 1, 2, 3, 4. Such an ensemble (collection) of time functions together with a probability measure is called a Random Process (RP).
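A toy ensemble along these lines can be sketched as follows; the cosine waveforms and random amplitudes are hypothetical choices, not the waveforms of Fig. 3.1:

```python
import math
import random

random.seed(42)

def sample_function():
    # Each sample point s_j selects a random amplitude A_j; the assigned
    # waveform is x_j(t) = A_j cos(2 pi t). The collection of all such
    # waveforms with the probability measure on A forms a random process.
    A = random.uniform(-1.0, 1.0)
    return lambda t: A * math.cos(2 * math.pi * t)

# Four member functions, like x_1(t), ..., x_4(t) in Fig. 3.1.
ensemble = [sample_function() for _ in range(4)]

t = 0.1
print([x(t) for x in ensemble])  # one value per member waveform at time t
```

Fixing t and reading across the ensemble gives a random variable; fixing the sample point gives a deterministic time function.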
