CHAPTER 4  Multiple Random Variables

4.1 Vector Random Variables
4.2 Pairs of Random Variables
4.3 Independence of Two Random Variables
4.4 Conditional Probability and Conditional Expectation
4.5 Multiple Random Variables
4.6 Functions of Several Random Variables
4.7 Expected Value of Functions of Random Variables
4.8 Jointly Gaussian Random Variables

4.1 VECTOR RANDOM VARIABLES

A vector random variable X is a function that assigns a vector of real numbers to each outcome ζ in S, the sample space of the random experiment.

EXAMPLE 4.1

Let a random experiment consist of selecting a student's name from an urn. Let ζ denote the outcome of this experiment, and define the following three functions:

H(ζ) = height of student ζ in inches,
W(ζ) = weight of student ζ in pounds,
A(ζ) = age of student ζ in years.

(H(ζ), W(ζ), A(ζ)) is then a vector random variable.

Events and Probabilities

EXAMPLE 4.4

Consider the two-dimensional random variable X = (X, Y). Find the regions of the plane corresponding to the events

A = \{ X + Y \le 10 \},
B = \{ \min(X, Y) \le 5 \},   and
C = \{ X^2 + Y^2 \le 100 \}.

The regions corresponding to events A and C are straightforward to find and are shown in Fig. 4.1.

For the n-dimensional random variable X = (X_1, \dots, X_n), we are particularly interested in events that have the product form

A = \{ X_1 \text{ in } A_1 \} \cap \{ X_2 \text{ in } A_2 \} \cap \dots \cap \{ X_n \text{ in } A_n \},   (4.1)

where A_k is a one-dimensional event (i.e., a subset of the real line) that involves X_k only. A fundamental problem in modeling a system with a vector random variable X = (X_1, \dots, X_n) involves specifying the probability of product-form events:

P[A] = P[\{ X_1 \text{ in } A_1 \} \cap \dots \cap \{ X_n \text{ in } A_n \}] = P[X_1 \text{ in } A_1, \dots, X_n \text{ in } A_n].   (4.2)

In principle, the probability in Eq. (4.2) is obtained by finding the probability of the equivalent event in the underlying sample space:

P[A] = P[\{ ζ \text{ in } S \text{ such that } X(ζ) \text{ in } A \}].

EXAMPLE 4.5

None of the events in Example 4.4 is of product form. Event B is the union of two product-form events:

B = \{ X \le 5 \text{ and } Y < \infty \} \cup \{ X > 5 \text{ and } Y \le 5 \}.

The probability of a non-product-form event B is found as follows: first, B is approximated by the union of disjoint product-form events, say, B_1, B_2, \dots, B_n; the probability of B is then approximated by

P[B] \approx P\left[ \bigcup_k B_k \right] = \sum_k P[B_k].

The approximation becomes exact in the limit as the B_k's become arbitrarily fine.

Independence

The one-dimensional random variables X and Y are "independent" if, whenever A_1 is any event that involves X only and A_2 is any event that involves Y only,

P[X \text{ in } A_1, Y \text{ in } A_2] = P[X \text{ in } A_1] \, P[Y \text{ in } A_2].

In the general case of n random variables, we say that the random variables X_1, X_2, \dots, X_n are independent if

P[X_1 \text{ in } A_1, \dots, X_n \text{ in } A_n] = P[X_1 \text{ in } A_1] \cdots P[X_n \text{ in } A_n],   (4.3)

where A_k is an event that involves X_k only.

4.2 PAIRS OF RANDOM VARIABLES

Pairs of Discrete Random Variables

Let the vector random variable X = (X, Y) assume values from some countable set S = \{ (x_j, y_k) : j = 1, 2, \dots, \ k = 1, 2, \dots \}. The joint probability mass function of X specifies the probabilities of the product-form event \{ X = x_j \} \cap \{ Y = y_k \}:

p_{X,Y}(x_j, y_k) = P[\{ X = x_j \} \cap \{ Y = y_k \}] = P[X = x_j, Y = y_k],   j = 1, 2, \dots, \ k = 1, 2, \dots   (4.4)

The probability of any event A is the sum of the pmf over the outcomes in A:

P[X \text{ in } A] = \sum_{(x_j, y_k) \text{ in } A} p_{X,Y}(x_j, y_k).   (4.5)

The fact that the probability of the sample space S is 1 gives

\sum_{j=1}^{\infty} \sum_{k=1}^{\infty} p_{X,Y}(x_j, y_k) = 1.   (4.6)

The marginal probability mass functions are obtained by summing the joint pmf over the other variable:

p_X(x_j) = P[X = x_j] = P[X = x_j, Y = \text{anything}]
         = P[X = x_j, Y = y_1] + P[X = x_j, Y = y_2] + \cdots
         = \sum_{k=1}^{\infty} p_{X,Y}(x_j, y_k),   (4.7a)

and similarly,

p_Y(y_k) = P[Y = y_k] = \sum_{j=1}^{\infty} p_{X,Y}(x_j, y_k).   (4.7b)
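Eqs. (4.6)-(4.7b) amount to checking that a pmf table sums to one and taking its row and column sums. A minimal Python sketch (the joint pmf values below are invented for illustration, not taken from the text):

    import numpy as np

    # Hypothetical joint pmf p_{X,Y}(x_j, y_k) on a 3 x 2 grid (rows = x_j, cols = y_k).
    p_xy = np.array([[0.10, 0.20],
                     [0.30, 0.15],
                     [0.15, 0.10]])

    assert np.isclose(p_xy.sum(), 1.0)   # Eq. (4.6): total probability is 1

    p_x = p_xy.sum(axis=1)   # Eq. (4.7a): sum over y_k for each x_j
    p_y = p_xy.sum(axis=0)   # Eq. (4.7b): sum over x_j for each y_k

    print(p_x)   # [0.3  0.45 0.25]
    print(p_y)   # [0.55 0.45]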

EXAMPLE 4.7

The number of bytes N in a message has a geometric distribution with parameter 1 − p and range S_N = \{0, 1, 2, \dots\}. Suppose that messages are broken into packets of maximum length M bytes. Let Q be the number of full packets in a message and let R be the number of bytes left over. Find the joint pmf and the marginal pmf's of Q and R.

Q takes values in S_Q = \{0, 1, 2, \dots\} and R takes values in S_R = \{0, 1, \dots, M − 1\}. The probability of the elementary event \{(q, r)\} is given by

P[Q = q, R = r] = P[N = qM + r] = (1 − p) p^{qM + r}.

The marginal pmf of Q is

P[Q = q] = P[N \text{ in } \{qM, qM + 1, \dots, qM + (M − 1)\}]
         = \sum_{k=0}^{M−1} (1 − p) p^{qM + k}
         = p^{qM} (1 − p^M),   q = 0, 1, 2, \dots

The marginal pmf of R is

P[R = r] = P[N \text{ in } \{r, M + r, 2M + r, \dots\}]
         = \sum_{q=0}^{\infty} (1 − p) p^{qM + r}
         = \frac{(1 − p) p^r}{1 − p^M},   r = 0, 1, \dots, M − 1.
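These pmf's are easy to sanity-check by simulation: draw N, split it into packets, and compare empirical frequencies with the formulas. A minimal sketch assuming the illustrative values p = 0.9 and M = 10 (not from the text):

    import numpy as np

    rng = np.random.default_rng(0)
    p, M, n = 0.9, 10, 200_000

    # N geometric on {0, 1, 2, ...}: P[N = k] = (1 - p) p^k.
    N = rng.geometric(1 - p, size=n) - 1
    Q, R = N // M, N % M

    for q in range(3):   # empirical vs. analytical P[Q = q]
        print((Q == q).mean(), p**(q * M) * (1 - p**M))
    for r in range(3):   # empirical vs. analytical P[R = r]
        print((R == r).mean(), (1 - p) * p**r / (1 - p**M))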

The Joint cdf of X and Y

The joint cumulative distribution function of X and Y is defined as the probability of the product-form event \{ X \le x_1 \} \cap \{ Y \le y_1 \}:

F_{X,Y}(x_1, y_1) = P[X \le x_1, Y \le y_1].   (4.8)

The joint cdf is nondecreasing in the "northeast" direction:

(i) F_{X,Y}(x_1, y_1) \le F_{X,Y}(x_2, y_2)   if x_1 \le x_2 and y_1 \le y_2.

It is impossible for either X or Y to assume a value less than −\infty; therefore

(ii) F_{X,Y}(x_1, −\infty) = F_{X,Y}(−\infty, y_1) = 0.

It is certain that X and Y will assume values less than infinity; therefore

(iii) F_{X,Y}(\infty, \infty) = 1.

If we let one of the variables approach infinity while keeping the other fixed, we obtain the marginal cumulative distribution functions:

(iv) F_X(x) = P[X \le x] = P[X \le x, Y < \infty] = F_{X,Y}(x, \infty)

and

F_Y(y) = P[Y \le y] = F_{X,Y}(\infty, y).

Recall that the cdf of a single random variable is continuous from the right. It can be shown that the joint cdf is continuous from the "north" and from the "east":

(v) \lim_{x \to a^+} F_{X,Y}(x, y) = F_{X,Y}(a, y)

and

\lim_{y \to b^+} F_{X,Y}(x, y) = F_{X,Y}(x, b).

EXAMPLE 4.8

The joint cdf for the vector of random variables X = (X, Y) is given by

F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}),   x \ge 0, y \ge 0,

and 0 elsewhere. Find the marginal cdf's.

The marginal cdf's are obtained by letting one of the variables approach infinity:

F_X(x) = \lim_{y \to \infty} F_{X,Y}(x, y) = 1 − e^{−αx},   x \ge 0,

F_Y(y) = \lim_{x \to \infty} F_{X,Y}(x, y) = 1 − e^{−βy},   y \ge 0.

The cdf can be used to find the probability of events that can be expressed as unions and intersections of semi-infinite rectangles. Consider the strip defined by \{ x_1 < X \le x_2 \text{ and } Y \le y_1 \}, denoted by the region B in Fig. 4.6(a). By the third axiom of probability,

F_{X,Y}(x_2, y_1) = F_{X,Y}(x_1, y_1) + P[x_1 < X \le x_2, Y \le y_1].

The probability of the semi-infinite strip is therefore

P[x_1 < X \le x_2, Y \le y_1] = F_{X,Y}(x_2, y_1) − F_{X,Y}(x_1, y_1).

Consider next the rectangle \{ x_1 < X \le x_2, y_1 < Y \le y_2 \}, denoted by the region A in Fig. 4.6(b):

F_{X,Y}(x_2, y_2) = P[x_1 < X \le x_2, y_1 < Y \le y_2] + F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_2) − F_{X,Y}(x_1, y_1).

The probability of the rectangle is thus

(vi) P[x_1 < X \le x_2, y_1 < Y \le y_2] = F_{X,Y}(x_2, y_2) − F_{X,Y}(x_2, y_1) − F_{X,Y}(x_1, y_2) + F_{X,Y}(x_1, y_1).

EXAMPLE 4.9

Find the probability of the events A = \{ X \le 1, Y \le 1 \}, B = \{ X > x, Y > y \} where x > 0 and y > 0, and D = \{ 1 < X \le 2, 2 < Y \le 5 \} for the joint cdf of Example 4.8.

The probability of A is given directly by the cdf:

P[A] = P[X \le 1, Y \le 1] = F_{X,Y}(1, 1) = (1 − e^{−α})(1 − e^{−β}).

The probability of B requires more work. Consider B^c:

B^c = (\{ X > x \} \cap \{ Y > y \})^c = \{ X \le x \} \cup \{ Y \le y \}.

The probability of the union of two events gives

P[B^c] = P[X \le x] + P[Y \le y] − P[X \le x, Y \le y]
       = (1 − e^{−αx}) + (1 − e^{−βy}) − (1 − e^{−αx})(1 − e^{−βy})
       = 1 − e^{−αx} e^{−βy}.

The probability of B is then

P[B] = 1 − P[B^c] = e^{−αx} e^{−βy}.

The probability of event D is found by applying property (vi) of the joint cdf:

P[1 < X \le 2, 2 < Y \le 5] = F_{X,Y}(2, 5) − F_{X,Y}(2, 2) − F_{X,Y}(1, 5) + F_{X,Y}(1, 2)
                            = (e^{−α} − e^{−2α})(e^{−2β} − e^{−5β}).
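Because the cdf of Example 4.8 factors, X and Y can be simulated as independent exponential random variables, which gives a quick numerical check of the rectangle probability P[D]. A minimal sketch assuming the illustrative values α = 1 and β = 2 (not from the text):

    import numpy as np

    rng = np.random.default_rng(1)
    a, b, n = 1.0, 2.0, 1_000_000

    X = rng.exponential(1 / a, n)   # rate a
    Y = rng.exponential(1 / b, n)   # rate b

    est = np.mean((1 < X) & (X <= 2) & (2 < Y) & (Y <= 5))
    exact = (np.exp(-a) - np.exp(-2 * a)) * (np.exp(-2 * b) - np.exp(-5 * b))
    print(est, exact)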

The Joint pdf of Two Jointly Continuous Random Variables

We say that the random variables X and Y are jointly continuous if the probabilities of events involving (X, Y) can be expressed as an integral of a pdf. That is, there is a nonnegative function f_{X,Y}(x, y), called the joint probability density function, defined on the real plane such that for every event A that is a subset of the plane,

P[X \text{ in } A] = \int\!\!\int_A f_{X,Y}(x', y') \, dx' \, dy',   (4.9)

as shown in Fig. 4.7. When A is the entire plane, the integral must equal one:

1 = \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} f_{X,Y}(x', y') \, dx' \, dy'.   (4.10)
The joint cdf is obtained from the joint pdf of jointly continuous random variables by integrating over the semi-infinite rectangle defined by the point (x, y):

F_{X,Y}(x, y) = \int_{−\infty}^{x} \int_{−\infty}^{y} f_{X,Y}(x', y') \, dy' \, dx'.   (4.11)

It follows that if X and Y are jointly continuous random variables, then the pdf can be obtained from the cdf by differentiation:

f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x \, \partial y}.   (4.12)

The probability of a rectangular region is obtained by letting A = \{ (x, y) : a_1 < x \le b_1 \text{ and } a_2 < y \le b_2 \} in Eq. (4.9):

P[a_1 < X \le b_1, a_2 < Y \le b_2] = \int_{a_1}^{b_1} \int_{a_2}^{b_2} f_{X,Y}(x', y') \, dy' \, dx'.   (4.13)

For an infinitesimal rectangle,

P[x < X \le x + dx, y < Y \le y + dy] = \int_{x}^{x+dx} \int_{y}^{y+dy} f_{X,Y}(x', y') \, dy' \, dx' \approx f_{X,Y}(x, y) \, dx \, dy.   (4.14)

The marginal pdf's f_X(x) and f_Y(y) are obtained by taking the derivative of the corresponding marginal cdf's, F_X(x) = F_{X,Y}(x, \infty) and F_Y(y) = F_{X,Y}(\infty, y):

f_X(x) = \frac{d}{dx} \int_{−\infty}^{x} \left[ \int_{−\infty}^{\infty} f_{X,Y}(x', y') \, dy' \right] dx' = \int_{−\infty}^{\infty} f_{X,Y}(x, y') \, dy'.   (4.15a)

Similarly,

f_Y(y) = \int_{−\infty}^{\infty} f_{X,Y}(x', y) \, dx'.   (4.15b)

EXAMPLE 4.10 Jointly Uniform Random Variables

A randomly selected point (X, Y) in the unit square has the uniform joint pdf given by

f_{X,Y}(x, y) = 1   for 0 \le x \le 1 and 0 \le y \le 1,

and 0 elsewhere. Find the joint cdf.

There are five cases in this problem, corresponding to the five regions shown in Fig. 4.9.

1. If x < 0 or y < 0, the pdf is zero and Eq. (4.11) implies

F_{X,Y}(x, y) = 0.

2. If (x, y) is inside the unit square,

F_{X,Y}(x, y) = \int_{0}^{x} \int_{0}^{y} 1 \, dy' \, dx' = xy.

3. If 0 \le x \le 1 and y > 1,

F_{X,Y}(x, y) = \int_{0}^{x} \int_{0}^{1} 1 \, dy' \, dx' = x.

4. Similarly, if x > 1 and 0 \le y \le 1,

F_{X,Y}(x, y) = y.

5. Finally, if x > 1 and y > 1,

F_{X,Y}(x, y) = \int_{0}^{1} \int_{0}^{1} 1 \, dy' \, dx' = 1.

EXAMPLE 4.11

Find the normalization constant c and the marginal pdf's for the following joint pdf:

f_{X,Y}(x, y) = c e^{−x} e^{−y},   0 \le y \le x < \infty,

and 0 elsewhere.

The constant c is found from the normalization condition specified by Eq. (4.10):

1 = \int_{0}^{\infty} \int_{0}^{x} c e^{−x} e^{−y} \, dy \, dx = \int_{0}^{\infty} c e^{−x} (1 − e^{−x}) \, dx = \frac{c}{2}.

Therefore c = 2. The marginal pdf's are found by evaluating Eqs. (4.15a) and (4.15b):

f_X(x) = \int_{0}^{x} 2 e^{−x} e^{−y} \, dy = 2 e^{−x} (1 − e^{−x}),   0 \le x < \infty,

and

f_Y(y) = \int_{y}^{\infty} 2 e^{−x} e^{−y} \, dx = 2 e^{−2y},   0 \le y < \infty.

EXAMPLE 4.13 Jointly Gaussian Random Variables

The joint pdf of X and Y, shown in Fig. 4.11, is

f_{X,Y}(x, y) = \frac{1}{2π\sqrt{1 − ρ^2}} \exp\left\{ −\frac{x^2 − 2ρxy + y^2}{2(1 − ρ^2)} \right\},   −\infty < x, y < \infty.

We say that X and Y are jointly Gaussian. Find the marginal pdf's.

The marginal pdf of X is found by integrating f_{X,Y}(x, y) over y:

f_X(x) = \frac{e^{−x^2/2(1 − ρ^2)}}{2π\sqrt{1 − ρ^2}} \int_{−\infty}^{\infty} \exp\left\{ −\frac{y^2 − 2ρxy}{2(1 − ρ^2)} \right\} dy.

We complete the square in the exponent by adding and subtracting ρ^2 x^2, that is, y^2 − 2ρxy + ρ^2 x^2 − ρ^2 x^2 = (y − ρx)^2 − ρ^2 x^2. Therefore

f_X(x) = \frac{e^{−x^2/2(1 − ρ^2)}}{2π\sqrt{1 − ρ^2}} \int_{−\infty}^{\infty} e^{−[(y − ρx)^2 − ρ^2 x^2]/2(1 − ρ^2)} \, dy
       = \frac{e^{−x^2/2}}{\sqrt{2π}} \int_{−\infty}^{\infty} \frac{e^{−(y − ρx)^2/2(1 − ρ^2)}}{\sqrt{2π(1 − ρ^2)}} \, dy
       = \frac{e^{−x^2/2}}{\sqrt{2π}},

since the last integrand is a Gaussian pdf in y and integrates to one. Thus X is a zero-mean, unit-variance Gaussian random variable.

Random Variables That Differ in Type

EXAMPLE 4.14 A Communication Channel with Discrete Input and Continuous Output

Let X be the input, Y the output, and N the noise of a communication channel, where

Y = X + N,   P[X = +1] = P[X = −1] = 1/2,   N \sim U(−2, 2).

Find P[X = +1, Y \le 0].

The joint probability factors through the conditional probability:

P[X = k, Y \le y] = P[Y \le y \mid X = k] \, P[X = k].

When the input X = +1, the output Y is uniformly distributed in the interval [−1, 3]; therefore

P[Y \le y \mid X = +1] = \frac{y + 1}{4}   for −1 \le y \le 3.

Thus

P[X = +1, Y \le 0] = P[Y \le 0 \mid X = +1] \, P[X = +1] = \frac{1}{4} \cdot \frac{1}{2} = \frac{1}{8}.

4.3 INDEPENDENCE OF TWO RANDOM VARIABLES

X and Y are independent random variables if any event A_1 defined in terms of X is independent of any event A_2 defined in terms of Y:

P[X \text{ in } A_1, Y \text{ in } A_2] = P[X \text{ in } A_1] \, P[Y \text{ in } A_2].   (4.17)

Suppose that X and Y are a pair of discrete random variables. If we let A_1 = \{ X = x_j \} and A_2 = \{ Y = y_k \}, then the independence of X and Y implies that

p_{X,Y}(x_j, y_k) = P[X = x_j, Y = y_k] = P[X = x_j] \, P[Y = y_k] = p_X(x_j) \, p_Y(y_k)   for all x_j and y_k.   (4.18)

Therefore, if X and Y are independent discrete random variables, the joint pmf is equal to the product of the marginal pmf's.

Conversely, let A = A_1 \cap A_2 be a product-form event as above; then

P[A] = \sum_{x_j \text{ in } A_1} \sum_{y_k \text{ in } A_2} p_{X,Y}(x_j, y_k)
     = \sum_{x_j \text{ in } A_1} \sum_{y_k \text{ in } A_2} p_X(x_j) \, p_Y(y_k)
     = P[A_1] \, P[A_2].   (4.19)

We can therefore say: the discrete random variables X and Y are independent if and only if the joint pmf is equal to the product of the marginal pmf's for all x_j, y_k.

EXAMPLE 4.16

Are Q and R in Example 4.7 independent? From Example 4.7 we have

P[Q = q] \, P[R = r] = p^{Mq}(1 − p^M) \cdot \frac{(1 − p) p^r}{1 − p^M} = (1 − p) p^{Mq + r} = P[Q = q, R = r]

for all q = 0, 1, \dots and r = 0, 1, \dots, M − 1. Therefore Q and R are independent.

It can be shown that the random variables X and Y are independent if and only if their joint cdf is equal to the product of the marginal cdf's:

F_{X,Y}(x, y) = F_X(x) \, F_Y(y)   for all x and y.   (4.20)

Similarly, if X and Y are jointly continuous, then X and Y are independent if and only if their joint pdf is equal to the product of the marginal pdf's:

f_{X,Y}(x, y) = f_X(x) \, f_Y(y)   for all x and y.   (4.21)

EXAMPLE 4.18

Are the random variables X and Y in Example 4.13 independent? The product of the marginal pdf's of X and Y in Example 4.13 is

f_X(x) \, f_Y(y) = \frac{1}{2π} e^{−(x^2 + y^2)/2},   −\infty < x, y < \infty,

which equals the joint pdf only when ρ = 0. Thus the jointly Gaussian random variables X and Y are independent if and only if ρ = 0.

EXAMPLE 4.19

Are the random variables X and Y in Example 4.8 independent? If we multiply the marginal cdf's found in Example 4.8, we find

F_X(x) \, F_Y(y) = (1 − e^{−αx})(1 − e^{−βy}) = F_{X,Y}(x, y)   for all x and y,

so X and Y are independent.

If X and Y are independent random variables, then the random variables defined by any pair of functions g(X) and h(Y) are also independent. To show this:

1. Consider the one-dimensional events A and B.
2. Let A' be the set of all values of x such that if x is in A', then g(x) is in A.
3. Similarly, let B' be the set of all values of y such that if y is in B', then h(y) is in B. Then

P[g(X) \text{ in } A, h(Y) \text{ in } B] = P[X \text{ in } A', Y \text{ in } B']
                                        = P[X \text{ in } A'] \, P[Y \text{ in } B']
                                        = P[g(X) \text{ in } A] \, P[h(Y) \text{ in } B].

4.4 CONDITIONAL PROBABILITY AND CONDITIONAL EXPECTATION

Conditional Probability

From Section 2.4, the probability of an event concerning Y given X = x is

P[Y \text{ in } A \mid X = x] = \frac{P[Y \text{ in } A, X = x]}{P[X = x]}   for P[X = x] > 0.   (4.22)

If X is discrete, then Eq. (4.22) can be used to obtain the conditional cdf of Y given X = x_k:

F_Y(y \mid x_k) = \frac{P[Y \le y, X = x_k]}{P[X = x_k]},   for P[X = x_k] > 0.   (4.23)

The conditional pdf of Y given X = x_k, if the derivative exists, is given by

f_Y(y \mid x_k) = \frac{d}{dy} F_Y(y \mid x_k).   (4.24)

The probability of an event A given X = x_k is obtained by integrating the conditional pdf:

P[Y \text{ in } A \mid X = x_k] = \int_{y \text{ in } A} f_Y(y \mid x_k) \, dy.   (4.25)

Note that if X and Y are independent, then P[Y \le y, X = x_k] = P[Y \le y] \, P[X = x_k], so

F_Y(y \mid x_k) = F_Y(y)   and   f_Y(y \mid x_k) = f_Y(y).

If X and Y are discrete, then

p_Y(y_j \mid x_k) = P[Y = y_j \mid X = x_k] = \frac{P[X = x_k, Y = y_j]}{P[X = x_k]} = \frac{p_{X,Y}(x_k, y_j)}{p_X(x_k)}   (4.26)

for x_k such that P[X = x_k] > 0. We define p_Y(y_j \mid x_k) = 0 for x_k such that P[X = x_k] = 0. The probability of any event A given X = x_k is found by summing over the outcomes in A:

P[Y \text{ in } A \mid X = x_k] = \sum_{y_j \text{ in } A} p_Y(y_j \mid x_k).   (4.27)

Note that if X and Y are independent, then

p_Y(y_j \mid x_k) = \frac{P[X = x_k] \, P[Y = y_j]}{P[X = x_k]} = P[Y = y_j] = p_Y(y_j).

EXAMPLE 4.20

Let X be the input and Y the output of the communication channel discussed in Example 4.14. Find the probability that Y is negative given that X is +1.

If X = +1, then Y is uniformly distributed in the interval [−1, 3]; that is,

f_Y(y \mid +1) = \frac{1}{4}   for −1 \le y \le 3,

and 0 elsewhere.

Thus

P[Y < 0 \mid X = +1] = \int_{−1}^{0} \frac{1}{4} \, dy = \frac{1}{4}.

If X is a continuous random variable, then P[X = x] = 0, so Eq. (4.22) is undefined. We define the conditional cdf of Y given X = x by the following limiting procedure:

F_Y(y \mid x) = \lim_{h \to 0} F_Y(y \mid x < X \le x + h).   (4.28)

The conditional cdf on the right side of Eq. (4.28) is

F_Y(y \mid x < X \le x + h) = \frac{P[Y \le y, x < X \le x + h]}{P[x < X \le x + h]}
                           = \frac{\int_{−\infty}^{y} \int_{x}^{x+h} f_{X,Y}(x', y') \, dx' \, dy'}{\int_{x}^{x+h} f_X(x') \, dx'}.

As we let h approach zero, each integral over x' is approximately the integrand times h:

F_Y(y \mid x < X \le x + h) \approx \frac{\left[ \int_{−\infty}^{y} f_{X,Y}(x, y') \, dy' \right] h}{f_X(x) \, h},   (4.29)

so

F_Y(y \mid x) = \frac{\int_{−\infty}^{y} f_{X,Y}(x, y') \, dy'}{f_X(x)}.   (4.30)

The conditional pdf of Y given X = x is obtained by differentiating with respect to y:

f_Y(y \mid x) = \frac{d}{dy} F_Y(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}.   (4.31)

Note that if X and Y are independent, then f_{X,Y}(x, y) = f_X(x) \, f_Y(y), so

f_Y(y \mid x) = f_Y(y)   and   F_Y(y \mid x) = F_Y(y).

EXAMPLE 4.21

Let X and Y be the random variables introduced in Example 4.11, with joint pdf

f_{X,Y}(x, y) = 2 e^{−x} e^{−y},   0 \le y \le x < \infty,

and 0 elsewhere. Find f_X(x \mid y) and f_Y(y \mid x).

Using the marginal pdf's found in Example 4.11, f_X(x) = 2 e^{−x}(1 − e^{−x}) for x \ge 0 and f_Y(y) = 2 e^{−2y} for y \ge 0, we obtain

f_X(x \mid y) = \frac{2 e^{−x} e^{−y}}{2 e^{−2y}} = e^{−(x − y)}   for x \ge y,

and

f_Y(y \mid x) = \frac{2 e^{−x} e^{−y}}{2 e^{−x}(1 − e^{−x})} = \frac{e^{−y}}{1 − e^{−x}}   for 0 \le y \le x.

If we multiply Eq. (4.26) by P[X = x_k], then

P[X = x_k, Y = y_j] = P[Y = y_j \mid X = x_k] \, P[X = x_k].   (4.32)

Suppose we are interested in the probability that Y is in A:

P[Y \text{ in } A] = \sum_{\text{all } x_k} \sum_{y_j \text{ in } A} p_{X,Y}(x_k, y_j)
                  = \sum_{\text{all } x_k} \sum_{y_j \text{ in } A} p_Y(y_j \mid x_k) \, p_X(x_k)
                  = \sum_{\text{all } x_k} P[Y \text{ in } A \mid X = x_k] \, p_X(x_k).   (4.33)

If X and Y are continuous, we multiply Eq. (4.31) by f_X(x):

f_{X,Y}(x, y) = f_Y(y \mid x) \, f_X(x).   (4.34)

Replacing sums with integrals and pmf's with pdf's,

P[Y \text{ in } A] = \int_{−\infty}^{\infty} P[Y \text{ in } A \mid X = x] \, f_X(x) \, dx.   (4.35)

EXAMPLE 4.22 Number of Defects in a Region; Random Splitting of Poisson Counts

The total number of defects X on a chip is a Poisson random variable with mean α. Suppose that each defect has probability p of falling in a specific region R and that the location of each defect is independent of the locations of all other defects. Find the pmf of the number of defects Y that fall in the region R.

From Eq. (4.33),

P[Y = j] = \sum_{k=0}^{\infty} P[Y = j \mid X = k] \, P[X = k].

Given the total number of defects X = k, the number of defects that fall in the region R is a binomial random variable with parameters k and p:

P[Y = j \mid X = k] = \binom{k}{j} p^j (1 − p)^{k − j},   0 \le j \le k,

and 0 for j > k. Noting that k \ge j,

P[Y = j] = \sum_{k=j}^{\infty} \frac{k!}{j!(k − j)!} p^j (1 − p)^{k − j} \frac{α^k}{k!} e^{−α}
         = \frac{(αp)^j}{j!} e^{−α} \sum_{m=0}^{\infty} \frac{[α(1 − p)]^m}{m!}
         = \frac{(αp)^j}{j!} e^{−α} e^{α(1 − p)}
         = \frac{(αp)^j}{j!} e^{−αp}.

Thus Y is a Poisson random variable with mean αp.
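The random-splitting result is easy to verify numerically: generate a Poisson count, thin it with a binomial draw, and compare the empirical pmf of Y with Poisson(αp). A minimal sketch with the illustrative values α = 4 and p = 0.3 (not from the text):

    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(2)
    alpha, p, n = 4.0, 0.3, 100_000

    X = rng.poisson(alpha, size=n)   # total defects per chip
    Y = rng.binomial(X, p)           # defects that fall in region R (binomial thinning)

    for j in range(5):
        exact = (alpha * p)**j * exp(-alpha * p) / factorial(j)
        print(j, (Y == j).mean(), exact)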

EXAMPLE 4.23 Number of Arrivals During a Customer's Service Time

The number of customers that arrive at a service station during a time t is a Poisson random variable with parameter βt. The time required to service each customer is an exponential random variable with parameter α. Find the pmf for the number of customers N that arrive during the service time T of a specific customer. Assume that the customer arrivals are independent of the customer service time.

By the continuous version of Eq. (4.33),

P[N = k] = \int_{0}^{\infty} P[N = k \mid T = t] \, f_T(t) \, dt
         = \int_{0}^{\infty} \frac{(βt)^k}{k!} e^{−βt} \, α e^{−αt} \, dt
         = \frac{α β^k}{k!} \int_{0}^{\infty} t^k e^{−(α + β)t} \, dt.

Let r = (α + β)t; then

P[N = k] = \frac{α β^k}{k! (α + β)^{k+1}} \int_{0}^{\infty} r^k e^{−r} \, dr
         = \frac{α β^k}{(α + β)^{k+1}}
         = \frac{α}{α + β} \left( \frac{β}{α + β} \right)^k,   k = 0, 1, 2, \dots,

where we have used the fact that the last integral is a gamma function and is equal to k!. Thus N has a geometric pmf.

Conditional Expectation

The conditional expectation of Y given X = x is defined by

E[Y \mid x] = \int_{−\infty}^{\infty} y \, f_Y(y \mid x) \, dy.   (4.36a)

If X and Y are both discrete random variables, we have

E[Y \mid x_k] = \sum_{y_j} y_j \, p_Y(y_j \mid x_k).   (4.36b)

We now show that

E[Y] = E\big[ E[Y \mid X] \big],   (4.37)

where the right-hand side is

E\big[ E[Y \mid X] \big] = \int_{−\infty}^{\infty} E[Y \mid x] \, f_X(x) \, dx   (X continuous)

and

E\big[ E[Y \mid X] \big] = \sum_{x_k} E[Y \mid x_k] \, p_X(x_k)   (X discrete).

We prove Eq. (4.37) for the case where X and Y are jointly continuous random variables:

E\big[ E[Y \mid X] \big] = \int_{−\infty}^{\infty} E[Y \mid x] \, f_X(x) \, dx
                        = \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} y \, f_Y(y \mid x) \, dy \, f_X(x) \, dx
                        = \int_{−\infty}^{\infty} y \left[ \int_{−\infty}^{\infty} f_{X,Y}(x, y) \, dx \right] dy
                        = \int_{−\infty}^{\infty} y \, f_Y(y) \, dy = E[Y].

The above result also holds for the expected value of a function of Y:

E[h(Y)] = E\big[ E[h(Y) \mid X] \big].

In particular, the kth moment of Y is given by

E[Y^k] = E\big[ E[Y^k \mid X] \big].

EXAMPLE 4.25 Average Number of Defects in a Region

Find the mean of Y in Example 4.22 using conditional expectation.

E[Y] = \sum_{k=0}^{\infty} E[Y \mid X = k] \, P[X = k] = \sum_{k=0}^{\infty} kp \, P[X = k] = p \, E[X] = pα,

where we used E[Y \mid X = k] = kp for the binomial pmf.

EXAMPLE 4.26 Average Number of Arrivals in a Service Time

Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer in Example 4.23.

We will need the first two conditional moments of N given T = t, which are those of a Poisson random variable with parameter βt:

E[N \mid T = t] = βt,   E[N^2 \mid T = t] = βt + (βt)^2.

The first two moments of N are

E[N] = \int_{0}^{\infty} E[N \mid T = t] \, f_T(t) \, dt = \int_{0}^{\infty} βt \, f_T(t) \, dt = β E[T],

E[N^2] = \int_{0}^{\infty} (βt + β^2 t^2) \, f_T(t) \, dt = β E[T] + β^2 E[T^2].

The variance of N is then

VAR[N] = E[N^2] − E[N]^2 = β E[T] + β^2 E[T^2] − β^2 E[T]^2 = β E[T] + β^2 \, VAR[T].

If T is exponential with parameter α, then E[T] = 1/α and VAR[T] = 1/α^2, so

E[N] = \frac{β}{α}   and   VAR[N] = \frac{β}{α} + \frac{β^2}{α^2}.
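Both formulas can be confirmed by simulating the two-stage experiment of Example 4.23: draw a service time, then draw a Poisson count with rate proportional to it. A minimal sketch with the illustrative values α = 2 and β = 5 (not from the text):

    import numpy as np

    rng = np.random.default_rng(3)
    alpha, beta, n = 2.0, 5.0, 500_000

    T = rng.exponential(1 / alpha, size=n)   # service time, Exp(alpha)
    N = rng.poisson(beta * T)                # arrivals during T

    print(N.mean(), beta / alpha)                      # E[N]
    print(N.var(), beta / alpha + (beta / alpha)**2)   # VAR[N]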

4.5 MULTIPLE RANDOM VARIABLES

Joint Distributions

The joint cumulative distribution function of X_1, X_2, \dots, X_n is defined as the probability of an n-dimensional semi-infinite rectangle associated with the point (x_1, \dots, x_n):

F_{X_1,\dots,X_n}(x_1, x_2, \dots, x_n) = P[X_1 \le x_1, X_2 \le x_2, \dots, X_n \le x_n].   (4.38)

The joint cdf is defined for discrete, continuous, and mixed-type random variables.

EXAMPLE 4.27

Let the event A be defined as follows:

A = \{ \max(X_1, X_2, X_3) \le 5 \}.

Find the probability of A.

The maximum of three numbers is less than or equal to 5 if and only if each of the three numbers is less than or equal to 5; therefore

P[A] = P[X_1 \le 5, X_2 \le 5, X_3 \le 5] = F_{X_1,X_2,X_3}(5, 5, 5).

The joint probability mass function of n discrete random variables is defined by

p_{X_1,\dots,X_n}(x_1, x_2, \dots, x_n) = P[X_1 = x_1, X_2 = x_2, \dots, X_n = x_n].   (4.39)

The probability of any n-dimensional event A is found by summing the pmf over the points in the event:

P[(X_1, \dots, X_n) \text{ in } A] = \sum_{\mathbf{x} \text{ in } A} p_{X_1,\dots,X_n}(x_1, x_2, \dots, x_n).   (4.40)

The one-dimensional pmf of X_j is found by adding the joint pmf over all variables other than x_j:

p_{X_j}(x_j) = P[X_j = x_j] = \sum_{x_1} \cdots \sum_{x_{j−1}} \sum_{x_{j+1}} \cdots \sum_{x_n} p_{X_1,\dots,X_n}(x_1, x_2, \dots, x_n).   (4.41)

The marginal pmf for X_1, \dots, X_{n−1} is given by

p_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1}) = \sum_{x_n} p_{X_1,\dots,X_n}(x_1, \dots, x_n).   (4.42)

A family of conditional pmf's is obtained from the joint pmf by conditioning on different subsets of the random variables; for example,

p_{X_n}(x_n \mid x_1, \dots, x_{n−1}) = \frac{p_{X_1,\dots,X_n}(x_1, \dots, x_n)}{p_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1})}   (4.43a)

if p_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1}) > 0. Repeated applications of Eq. (4.43a) yield

p_{X_1,\dots,X_n}(x_1, \dots, x_n) = p_{X_n}(x_n \mid x_1, \dots, x_{n−1}) \cdots p_{X_2}(x_2 \mid x_1) \, p_{X_1}(x_1).   (4.43b)

EXAMPLE 4.28

A computer system receives messages over three communications lines. Let X_j be the number of messages received on line j in one hour. Suppose that the joint pmf of X_1, X_2, and X_3 is given by

p_{X_1,X_2,X_3}(x_1, x_2, x_3) = (1 − a_1)(1 − a_2)(1 − a_3) \, a_1^{x_1} a_2^{x_2} a_3^{x_3},   x_1 \ge 0, x_2 \ge 0, x_3 \ge 0,

where 0 < a_i < 1. Find p(x_1, x_2) and p(x_1).

Summing over x_3 gives

p_{X_1,X_2}(x_1, x_2) = \sum_{x_3=0}^{\infty} (1 − a_1)(1 − a_2)(1 − a_3) \, a_1^{x_1} a_2^{x_2} a_3^{x_3} = (1 − a_1)(1 − a_2) \, a_1^{x_1} a_2^{x_2},

since the geometric series over x_3 sums to 1/(1 − a_3). Summing next over x_2,

p_{X_1}(x_1) = \sum_{x_2=0}^{\infty} (1 − a_1)(1 − a_2) \, a_1^{x_1} a_2^{x_2} = (1 − a_1) \, a_1^{x_1}.

If X_1, X_2, \dots, X_n are jointly continuous random variables, then the probability of any n-dimensional event A is

P[(X_1, \dots, X_n) \text{ in } A] = \int \cdots \int_A f_{X_1,\dots,X_n}(x_1', \dots, x_n') \, dx_1' \cdots dx_n',   (4.44)

where f_{X_1,\dots,X_n}(x_1, \dots, x_n) is the joint probability density function. The joint cdf of X is obtained from the joint pdf by integration:

F_{X_1,\dots,X_n}(x_1, x_2, \dots, x_n) = \int_{−\infty}^{x_1} \cdots \int_{−\infty}^{x_n} f_{X_1,\dots,X_n}(x_1', \dots, x_n') \, dx_n' \cdots dx_1'.   (4.45)

The joint pdf (if the derivative exists) is given by

f_{X_1,\dots,X_n}(x_1, \dots, x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_{X_1,\dots,X_n}(x_1, \dots, x_n).   (4.46)

The marginal pdf for a subset of the random variables is obtained by integrating the other variables out. The marginal of X_1 is

f_{X_1}(x_1) = \int_{−\infty}^{\infty} \cdots \int_{−\infty}^{\infty} f_{X_1,\dots,X_n}(x_1, x_2', \dots, x_n') \, dx_2' \cdots dx_n'.   (4.47)

The marginal pdf for X_1, \dots, X_{n−1} is given by

f_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1}) = \int_{−\infty}^{\infty} f_{X_1,\dots,X_n}(x_1, \dots, x_{n−1}, x_n') \, dx_n'.   (4.48)

The pdf of X_n given the values of X_1, \dots, X_{n−1} is given by

f_{X_n}(x_n \mid x_1, \dots, x_{n−1}) = \frac{f_{X_1,\dots,X_n}(x_1, \dots, x_n)}{f_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1})}   (4.49a)

if f_{X_1,\dots,X_{n−1}}(x_1, \dots, x_{n−1}) > 0. Repeated applications of Eq. (4.49a) yield

f_{X_1,\dots,X_n}(x_1, \dots, x_n) = f_{X_n}(x_n \mid x_1, \dots, x_{n−1}) \cdots f_{X_2}(x_2 \mid x_1) \, f_{X_1}(x_1).   (4.49b)

EXAMPLE 4.29

The random variables X_1, X_2, and X_3 have the joint Gaussian pdf

f_{X_1,X_2,X_3}(x_1, x_2, x_3) = \frac{\exp\left\{ −\frac{x_1^2 − 2ρx_1 x_2 + x_2^2}{2(1 − ρ^2)} − \frac{x_3^2}{2} \right\}}{(2π)^{3/2} (1 − ρ^2)^{1/2}},

with ρ = 1/2. Find the marginal pdf of X_1 and X_3.

Integrating over x_2,

f_{X_1,X_3}(x_1, x_3) = \frac{e^{−x_3^2/2}}{(2π)^{3/2}(1 − ρ^2)^{1/2}} \int_{−\infty}^{\infty} e^{−(x_1^2 − 2ρx_1 x_2 + x_2^2)/2(1 − ρ^2)} \, dx_2.

The integral over x_2 was carried out in Example 4.13 and equals \sqrt{2π(1 − ρ^2)} \, e^{−x_1^2/2}, so

f_{X_1,X_3}(x_1, x_3) = \frac{e^{−x_1^2/2}}{\sqrt{2π}} \cdot \frac{e^{−x_3^2/2}}{\sqrt{2π}},

that is, X_1 and X_3 are independent zero-mean, unit-variance Gaussian random variables.

Independence

X_1, \dots, X_n are independent if

P[X_1 \text{ in } A_1, \dots, X_n \text{ in } A_n] = P[X_1 \text{ in } A_1] \cdots P[X_n \text{ in } A_n]

for any one-dimensional events A_1, \dots, A_n. It can be shown that X_1, \dots, X_n are independent if and only if

F_{X_1,\dots,X_n}(x_1, \dots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n)   (4.50)

for all x_1, \dots, x_n. If the random variables are discrete, this is equivalent to

p_{X_1,\dots,X_n}(x_1, \dots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n)   for all x_1, \dots, x_n,

and if the random variables are jointly continuous, to

f_{X_1,\dots,X_n}(x_1, \dots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)   for all x_1, \dots, x_n.

4.6 FUNCTIONS OF SEVERAL RANDOM VARIABLES

One Function of Several Random Variables

Let the random variable Z be defined as a function of several random variables:

Z = g(X_1, X_2, \dots, X_n).   (4.51)

The cdf of Z is found by first finding the equivalent event of \{ Z \le z \}, that is, the set R_z = \{ \mathbf{x} = (x_1, \dots, x_n) \text{ such that } g(\mathbf{x}) \le z \}; then

F_Z(z) = P[\mathbf{X} \text{ in } R_z] = \int \cdots \int_{\mathbf{x} \text{ in } R_z} f_{X_1,\dots,X_n}(x_1', \dots, x_n') \, dx_1' \cdots dx_n'.   (4.52)

EXAMPLE 4.31 Sum of Two Random Variables

Let Z = X + Y. Find F_Z(z) and f_Z(z) in terms of the joint pdf of X and Y.

The cdf of Z is found by integrating the joint pdf over the region of the plane where x + y \le z:

F_Z(z) = \int_{−\infty}^{\infty} \int_{−\infty}^{z − x'} f_{X,Y}(x', y') \, dy' \, dx'.

The pdf of Z is

f_Z(z) = \frac{d}{dz} F_Z(z) = \int_{−\infty}^{\infty} f_{X,Y}(x', z − x') \, dx'.   (4.53)

Thus the pdf for the sum of two random variables is given by a superposition integral. If X and Y are independent random variables, then by Eq. (4.21) the pdf is given by the convolution integral of the marginal pdf's of X and Y:

f_Z(z) = \int_{−\infty}^{\infty} f_X(x') \, f_Y(z − x') \, dx'.   (4.54)
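Equation (4.54) can be evaluated numerically by discretizing the pdf's and forming a Riemann-sum convolution. The sketch below (an illustration, not from the text) convolves two unit-mean exponential pdf's; the exact sum pdf is f_Z(z) = z e^{−z}:

    import numpy as np

    dx = 0.01
    x = np.arange(0, 20, dx)
    fX = np.exp(-x)   # Exp(1) pdf on a grid
    fY = np.exp(-x)

    fZ = np.convolve(fX, fY)[: x.size] * dx   # Eq. (4.54) as a discrete convolution

    z = 2.0
    i = int(z / dx)
    print(fZ[i], z * np.exp(-z))   # numerical vs. exact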

EXAMPLE 4.32 Sum of Nonindependent Gaussian Random Variables

Find the pdf of the sum Z = X + Y of two zero-mean, unit-variance Gaussian random variables with correlation coefficient ρ = −1/2.

From Eq. (4.53),

f_Z(z) = \int_{−\infty}^{\infty} f_{X,Y}(x', z − x') \, dx'
       = \frac{1}{2π\sqrt{3/4}} \int_{−\infty}^{\infty} \exp\left\{ −\frac{x'^2 + x'(z − x') + (z − x')^2}{2(3/4)} \right\} dx'.

After completing the square of the argument in the exponent, we obtain

f_Z(z) = \frac{e^{−z^2/2}}{\sqrt{2π}}.

Let Z = g(X, Y), and suppose we are given that Y = y; then Z = g(X, y) is a function of the single random variable X, and we can find the pdf of Z given Y = y, f_Z(z \mid Y = y). The pdf of Z is then found from

f_Z(z) = \int_{−\infty}^{\infty} f_Z(z \mid y') \, f_Y(y') \, dy'.

EXAMPLE 4.34

Let Z = X / Y. Find the pdf of Z if X and Y are independent and both exponentially distributed with mean one.

Assume Y = y; then Z = X / y is a scaled version of X, so

f_Z(z \mid y) = |y| \, f_X(yz \mid y).

The pdf of Z is therefore

f_Z(z) = \int_{−\infty}^{\infty} |y'| \, f_X(y'z \mid y') \, f_Y(y') \, dy' = \int_{−\infty}^{\infty} |y'| \, f_{X,Y}(y'z, y') \, dy'.

For the present example, X and Y are independent Exp(1) random variables, so

f_Z(z) = \int_{0}^{\infty} y' e^{−y'z} e^{−y'} \, dy' = \frac{1}{(1 + z)^2},   z \ge 0.

Transformations of Random Vectors

Let X_1, \dots, X_n be random variables associated with some experiment, and let the random variables Z_1, \dots, Z_n be defined by n functions of X = (X_1, \dots, X_n):

Z_1 = g_1(\mathbf{X}),   Z_2 = g_2(\mathbf{X}),   \dots,   Z_n = g_n(\mathbf{X}).

The joint cdf of Z_1, \dots, Z_n at the point z = (z_1, \dots, z_n) is equal to the probability of the region of \mathbf{x} where g_k(\mathbf{x}) \le z_k for k = 1, \dots, n:

F_{Z_1,\dots,Z_n}(z_1, \dots, z_n) = P[g_1(\mathbf{X}) \le z_1, \dots, g_n(\mathbf{X}) \le z_n].   (4.55a)

If X_1, \dots, X_n have a joint pdf, then

F_{Z_1,\dots,Z_n}(z_1, \dots, z_n) = \int \cdots \int_{\mathbf{x}' : g_k(\mathbf{x}') \le z_k} f_{X_1,\dots,X_n}(x_1', \dots, x_n') \, dx_1' \cdots dx_n'.   (4.55b)

EXAMPLE 4.35

Let the random variables W and Z be defined by

W = \min(X, Y)   and   Z = \max(X, Y).

Find the joint cdf of W and Z in terms of the joint cdf of X and Y.

F_{W,Z}(w, z) = P[\min(X, Y) \le w, \max(X, Y) \le z].

If z > w, the above probability is the probability of the semi-infinite rectangle defined by the point (z, z) minus the square region denoted by A.

For z > w,

F_{W,Z}(w, z) = F_{X,Y}(z, z) − P[A]
             = F_{X,Y}(z, z) − [F_{X,Y}(z, z) − F_{X,Y}(z, w) − F_{X,Y}(w, z) + F_{X,Y}(w, w)]
             = F_{X,Y}(z, w) + F_{X,Y}(w, z) − F_{X,Y}(w, w).

If z < w, then \{ \max(X, Y) \le z \} already implies \{ \min(X, Y) \le w \}, so

F_{W,Z}(w, z) = F_{X,Y}(z, z).

pdf of Linear Transformations

We consider first the linear transformation of two random variables:

V = aX + bY
W = cX + eY,

or

\begin{pmatrix} V \\ W \end{pmatrix} = \begin{pmatrix} a & b \\ c & e \end{pmatrix} \begin{pmatrix} X \\ Y \end{pmatrix}.

Denote the above matrix by A. We will assume A has an inverse, so each point (v, w) has a unique corresponding point (x, y) obtained from

\begin{pmatrix} x \\ y \end{pmatrix} = A^{−1} \begin{pmatrix} v \\ w \end{pmatrix}.   (4.56)

In Fig. 4.15, the infinitesimal rectangle and the parallelogram are equivalent events, so their probabilities must be equal. Thus

f_{X,Y}(x, y) \, dx \, dy = f_{V,W}(v, w) \, dP,

where dP is the area of the parallelogram. The joint pdf of V and W is thus given by

f_{V,W}(v, w) = \frac{f_{X,Y}(x, y)}{|dP / dx\,dy|},   (4.57)

where x and y are related to (v, w) by Eq. (4.56). It can be shown that dP = |ae − bc| \, dx \, dy, so the "stretch factor" is

\frac{dP}{dx \, dy} = |ae − bc| = |A|,

where |A| is the determinant of A.

Let the n-dimensional vector Z be

\mathbf{Z} = A \mathbf{X},

where A is an invertible n \times n matrix. The joint pdf of Z is then

f_{\mathbf{Z}}(z_1, \dots, z_n) = \frac{f_{\mathbf{X}}(x_1, \dots, x_n)}{|A|} \Big|_{\mathbf{x} = A^{−1}\mathbf{z}} = \frac{f_{\mathbf{X}}(A^{−1}\mathbf{z})}{|A|}.   (4.58)

EXAMPLE 4.36 Linear Transformation of Jointly Gaussian Random Variables

Let X and Y be the jointly Gaussian random variables introduced in Example 4.13. Let V and W be obtained from (X, Y) by

\begin{pmatrix} V \\ W \end{pmatrix} = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ −1 & 1 \end{pmatrix} \begin{pmatrix} X \\ Y \end{pmatrix}.

Find the joint pdf of V and W.

The determinant is |A| = 1, and the inverse transformation is

\begin{pmatrix} X \\ Y \end{pmatrix} = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & −1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} V \\ W \end{pmatrix},

so

X = \frac{V − W}{\sqrt{2}}   and   Y = \frac{V + W}{\sqrt{2}}.

From Eq. (4.58), f_{V,W}(v, w) = f_{X,Y}((v − w)/\sqrt{2}, (v + w)/\sqrt{2}), where

f_{X,Y}(x, y) = \frac{1}{2π\sqrt{1 − ρ^2}} \, e^{−(x^2 − 2ρxy + y^2)/2(1 − ρ^2)}.

By substituting for x and y, the argument of the exponent becomes

−\frac{\left( \frac{v − w}{\sqrt{2}} \right)^2 − 2ρ \frac{(v − w)(v + w)}{2} + \left( \frac{v + w}{\sqrt{2}} \right)^2}{2(1 − ρ^2)} = −\frac{v^2}{2(1 + ρ)} − \frac{w^2}{2(1 − ρ)}.

Thus

f_{V,W}(v, w) = \frac{1}{2π(1 − ρ^2)^{1/2}} \, e^{−v^2/2(1+ρ)} \, e^{−w^2/2(1−ρ)},

so V and W are independent Gaussian random variables with variances 1 + ρ and 1 − ρ, respectively.
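A short simulation confirms that this rotation produces independent components with the stated variances. A minimal sketch with the illustrative value ρ = 0.5 (not from the text):

    import numpy as np

    rng = np.random.default_rng(4)
    rho, n = 0.5, 200_000

    # (X, Y) as in Example 4.13: zero mean, unit variance, correlation rho.
    X = rng.standard_normal(n)
    Y = rho * X + np.sqrt(1 - rho**2) * rng.standard_normal(n)

    V = (X + Y) / np.sqrt(2)
    W = (Y - X) / np.sqrt(2)

    print(V.var(), 1 + rho)           # ~1.5
    print(W.var(), 1 - rho)           # ~0.5
    print(np.corrcoef(V, W)[0, 1])    # ~0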

pdf of General Transformations

Let the random variables V and W be defined by two nonlinear functions of X and Y:

V = g_1(X, Y)   and   W = g_2(X, Y).   (4.59)

Assume that the functions v(x, y) and w(x, y) are invertible, with inverse

x = h_1(v, w)   and   y = h_2(v, w).

In Fig. 4.17(b), we make the first-order approximation

g_k(x + dx, y) \approx g_k(x, y) + \frac{\partial g_k(x, y)}{\partial x} \, dx,   k = 1, 2,

and similarly for the y variable. The probabilities of the infinitesimal rectangle and of the parallelogram it maps onto are approximately equal; therefore

f_{X,Y}(x, y) \, dx \, dy \approx f_{V,W}(v, w) \, dP,

where dP is the area of the parallelogram, so that

f_{V,W}(v, w) = \frac{f_{X,Y}(h_1(v, w), h_2(v, w))}{|dP / dx\,dy|}.   (4.60)

The "stretch factor" at the point (v, w) is given by the determinant of a matrix of partial derivatives:

J(x, y) = \det \begin{pmatrix} \partial v/\partial x & \partial v/\partial y \\ \partial w/\partial x & \partial w/\partial y \end{pmatrix}.

The determinant J(x, y) is called the Jacobian of the transformation.

The Jacobian of the inverse transformation is given by

J(v, w) = \det \begin{pmatrix} \partial x/\partial v & \partial x/\partial w \\ \partial y/\partial v & \partial y/\partial w \end{pmatrix}.

It can be shown that

|J(v, w)| = \frac{1}{|J(x, y)|}.

We therefore conclude that the joint pdf of V and W can be found using either of the following expressions:

f_{V,W}(v, w) = \frac{f_{X,Y}(h_1(v, w), h_2(v, w))}{|J(x, y)|}   (4.61a)
             = f_{X,Y}(h_1(v, w), h_2(v, w)) \, |J(v, w)|.   (4.61b)

EXAMPLE 4.37 Radius and Angle of Independent Gaussian Random Variables

Let X and Y be zero-mean, unit-variance independent Gaussian random variables. Find the joint pdf of V and W defined by

V = (X^2 + Y^2)^{1/2}   and   W = \angle(X, Y),

where \angle(x, y) denotes the angle in the range (0, 2π) that is defined by the point (x, y). The inverse transformation is given by

x = v \cos w   and   y = v \sin w.

The Jacobian is given by

J(v, w) = \det \begin{pmatrix} \cos w & −v \sin w \\ \sin w & v \cos w \end{pmatrix} = v \cos^2 w + v \sin^2 w = v.

Thus, using Eq. (4.61b),

f_{V,W}(v, w) = v \cdot \frac{1}{2π} e^{−((v \cos w)^2 + (v \sin w)^2)/2} = \frac{v \, e^{−v^2/2}}{2π},   v \ge 0, \ 0 \le w < 2π.

The marginal pdf of V is the Rayleigh pdf,

f_V(v) = v \, e^{−v^2/2},   v \ge 0,

and the marginal pdf of W is the uniform pdf,

f_W(w) = \frac{1}{2π},   0 \le w < 2π.

Since f_{V,W}(v, w) = f_V(v) \, f_W(w), we conclude that the radius V and the angle W are independent random variables.
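The radius/angle factorization can be observed numerically by transforming samples of (X, Y). A minimal sketch (sample size and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(5)
    n = 200_000

    X = rng.standard_normal(n)
    Y = rng.standard_normal(n)

    V = np.hypot(X, Y)                   # radius
    W = np.arctan2(Y, X) % (2 * np.pi)   # angle folded into [0, 2*pi)

    print(V.mean(), np.sqrt(np.pi / 2))  # Rayleigh mean
    print(W.mean(), np.pi)               # uniform on (0, 2*pi) has mean pi
    print(np.corrcoef(V, W)[0, 1])       # ~0, consistent with independence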

EXAMPLE 4.38 Student's t-distribution

Let X be a zero-mean, unit-variance Gaussian random variable and let Y be a chi-square random variable with n degrees of freedom. Assume that X and Y are independent. Find the pdf of

V = \frac{X}{\sqrt{Y/n}}.

Define the auxiliary variable W = Y. The inverse transformation is

X = V \sqrt{W/n}   and   Y = W.

Since X and Y are independent,

f_{X,Y}(x, y) = \frac{e^{−x^2/2}}{\sqrt{2π}} \cdot \frac{y^{n/2 − 1} e^{−y/2}}{2^{n/2} Γ(n/2)},   y \ge 0.

The Jacobian of the inverse transformation is

J(v, w) = \det \begin{pmatrix} \sqrt{w/n} & v/(2\sqrt{wn}) \\ 0 & 1 \end{pmatrix} = \sqrt{w/n}.

Thus

f_{V,W}(v, w) = \sqrt{w/n} \, \frac{e^{−v^2 w/2n}}{\sqrt{2π}} \cdot \frac{w^{n/2 − 1} e^{−w/2}}{2^{n/2} Γ(n/2)},   w \ge 0.

The pdf of V is obtained by integrating out w:

f_V(v) = \frac{1}{\sqrt{2πn} \, 2^{n/2} Γ(n/2)} \int_{0}^{\infty} w^{(n−1)/2} \, e^{−(w/2)(1 + v^2/n)} \, dw.

Let w' = (w/2)(1 + v^2/n); then

f_V(v) = \frac{2^{(n+1)/2}}{\sqrt{2πn} \, 2^{n/2} Γ(n/2) \, (1 + v^2/n)^{(n+1)/2}} \int_{0}^{\infty} (w')^{(n−1)/2} \, e^{−w'} \, dw'.

The last integral is the gamma function Γ((n+1)/2) (equal to k! when (n+1)/2 = k + 1 is an integer). We finally obtain the Student's t-distribution:

f_V(v) = \frac{Γ((n+1)/2)}{\sqrt{πn} \, Γ(n/2)} \left( 1 + \frac{v^2}{n} \right)^{−(n+1)/2}.

Problem

Let X and Y be independent zero-mean, unit-variance Gaussian random variables, so that

f_{X,Y}(x, y) = \frac{1}{2π} \exp\left\{ −\frac{x^2 + y^2}{2} \right\},

and define

V = 3X + 5Y,   W = 2X + Y,   i.e.,   A = \begin{pmatrix} 3 & 5 \\ 2 & 1 \end{pmatrix},   |A| = |3 − 10| = 7.

Using the 2 \times 2 inverse formula

\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{−1} = \frac{1}{ad − bc} \begin{pmatrix} d & −b \\ −c & a \end{pmatrix},

the inverse transformation is

\begin{pmatrix} x \\ y \end{pmatrix} = \frac{1}{−7} \begin{pmatrix} 1 & −5 \\ −2 & 3 \end{pmatrix} \begin{pmatrix} v \\ w \end{pmatrix} = \frac{1}{7} \begin{pmatrix} −v + 5w \\ 2v − 3w \end{pmatrix}.

By Eq. (4.58),

f_{V,W}(v, w) = \frac{1}{2π \cdot 7} \exp\left\{ −\frac{1}{2} \left[ \left( \frac{−v + 5w}{7} \right)^2 + \left( \frac{2v − 3w}{7} \right)^2 \right] \right\}.

Consider the problem of finding the joint pdf for n functions of n random variables X = (X_1, \dots, X_n):

Z_1 = g_1(\mathbf{X}),   Z_2 = g_2(\mathbf{X}),   \dots,   Z_n = g_n(\mathbf{X}).

We assume as before that the set of equations

z_1 = g_1(\mathbf{x}),   z_2 = g_2(\mathbf{x}),   \dots,   z_n = g_n(\mathbf{x})   (4.62)

has a unique solution given by

x_1 = h_1(\mathbf{z}),   x_2 = h_2(\mathbf{z}),   \dots,   x_n = h_n(\mathbf{z}).

The joint pdf of Z is then given by

f_{Z_1,\dots,Z_n}(z_1, \dots, z_n) = \frac{f_{X_1,\dots,X_n}(h_1(\mathbf{z}), \dots, h_n(\mathbf{z}))}{|J(x_1, \dots, x_n)|}   (4.63a)
                                  = f_{X_1,\dots,X_n}(h_1(\mathbf{z}), \dots, h_n(\mathbf{z})) \, |J(z_1, \dots, z_n)|,   (4.63b)

where J(x_1, \dots, x_n) and J(z_1, \dots, z_n) are the determinants of the transformation and of the inverse transformation, respectively:

J(x_1, \dots, x_n) = \det \begin{pmatrix} \partial g_1/\partial x_1 & \cdots & \partial g_1/\partial x_n \\ \vdots & & \vdots \\ \partial g_n/\partial x_1 & \cdots & \partial g_n/\partial x_n \end{pmatrix}

and

J(z_1, \dots, z_n) = \det \begin{pmatrix} \partial h_1/\partial z_1 & \cdots & \partial h_1/\partial z_n \\ \vdots & & \vdots \\ \partial h_n/\partial z_1 & \cdots & \partial h_n/\partial z_n \end{pmatrix}.

4.7 EXPECTED VALUE OF FUNCTIONS OF RANDOM VARIABLES

The expected value of Z = g(X, Y) can be found using the following expressions:

E[Z] = \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} g(x, y) \, f_{X,Y}(x, y) \, dx \, dy   (X, Y jointly continuous),

E[Z] = \sum_{i} \sum_{n} g(x_i, y_n) \, p_{X,Y}(x_i, y_n)   (X, Y discrete).   (4.64)

EXAMPLE 4.39 Sum of Random Variables

Let Z = X + Y. Find E[Z].

E[Z] = E[X + Y] = \int \int (x' + y') \, f_{X,Y}(x', y') \, dx' \, dy'
     = \int x' \left[ \int f_{X,Y}(x', y') \, dy' \right] dx' + \int y' \left[ \int f_{X,Y}(x', y') \, dx' \right] dy'
     = \int x' f_X(x') \, dx' + \int y' f_Y(y') \, dy'
     = E[X] + E[Y].   (4.65)

Thus the expected value of a sum of n random variables is equal to the sum of the expected values:

E[X_1 + X_2 + \cdots + X_n] = E[X_1] + \cdots + E[X_n],   (4.66)

whether or not the random variables are independent.

In general, if X_1, \dots, X_n are independent random variables, then

E[g_1(X_1) \, g_2(X_2) \cdots g_n(X_n)] = E[g_1(X_1)] \, E[g_2(X_2)] \cdots E[g_n(X_n)].   (4.67)

The Correlation and Covariance of Two Random Variables

The jkth joint moment of X and Y is defined by

E[X^j Y^k] = \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} x^j y^k \, f_{X,Y}(x, y) \, dx \, dy   (X, Y jointly continuous),

E[X^j Y^k] = \sum_{i} \sum_{n} x_i^j y_n^k \, p_{X,Y}(x_i, y_n)   (X, Y discrete).   (4.68)

If j = 0, we obtain the moments of Y; if k = 0, we obtain the moments of X. When j = 1 and k = 1, E[XY] is called the correlation of X and Y. If E[XY] = 0, we say that X and Y are orthogonal.

The jkth central moment of X and Y is defined as the joint moment of the centered random variables X − E[X] and Y − E[Y]:

E[(X − E[X])^j (Y − E[Y])^k].

Note that j = 2, k = 0 gives VAR(X) and j = 0, k = 2 gives VAR(Y). The case j = k = 1 is defined as the covariance of X and Y:

COV(X, Y) = E[(X − E[X])(Y − E[Y])].   (4.69)

Expanding the product gives

COV(X, Y) = E[XY − X E[Y] − Y E[X] + E[X] E[Y]]
          = E[XY] − 2 E[X] E[Y] + E[X] E[Y]
          = E[XY] − E[X] E[Y].   (4.70)

EXAMPLE 4.41 Covariance of Independent Random Variables

Let X and Y be independent random variables. Find their covariance.

COV(X, Y) = E[(X − E[X])(Y − E[Y])] = E[X − E[X]] \, E[Y − E[Y]] = 0,

since the expectation of a product of independent random variables factors, and each factor has mean zero. Therefore pairs of independent random variables have covariance zero.

The correlation coefficient of X and Y is defined by

ρ_{X,Y} = \frac{COV(X, Y)}{σ_X σ_Y} = \frac{E[XY] − E[X] E[Y]}{σ_X σ_Y},   (4.71)

where σ_X = \sqrt{VAR(X)} and σ_Y = \sqrt{VAR(Y)} are the standard deviations of X and Y, respectively.

The correlation coefficient is a number that is at most 1 in magnitude:

−1 \le ρ_{X,Y} \le 1.   (4.72)

Proof: Since the expected value of a nonnegative random variable is nonnegative,

0 \le E\left[ \left( \frac{X − E[X]}{σ_X} \pm \frac{Y − E[Y]}{σ_Y} \right)^2 \right] = 1 \pm 2ρ_{X,Y} + 1 = 2(1 \pm ρ_{X,Y}),

which implies −1 \le ρ_{X,Y} \le 1.

The extreme values of ρ_{X,Y} are achieved when X and Y are related linearly, Y = aX + b: ρ_{X,Y} = 1 if a > 0 and ρ_{X,Y} = −1 if a < 0. X and Y are said to be uncorrelated if ρ_{X,Y} = 0. If X and Y are independent, then X and Y are uncorrelated. In Example 4.18, we saw that if X and Y are jointly Gaussian and ρ_{X,Y} = 0, then X and Y are independent Gaussian random variables.

EXAMPLE 4.42 Uncorrelated but Dependent Random Variables

Let Θ be uniformly distributed in the interval (0, 2π), and let

X = \cos Θ   and   Y = \sin Θ.

The point (X, Y) then corresponds to the point on the unit circle specified by the angle Θ, as shown in Fig. 4.18, so X and Y are dependent: knowing X, for instance, determines Y up to sign.

We now show that X and Y are nevertheless uncorrelated:

E[XY] = E[\sin Θ \cos Θ] = \frac{1}{2π} \int_{0}^{2π} \sin θ \cos θ \, dθ = \frac{1}{4π} \int_{0}^{2π} \sin 2θ \, dθ = 0.

Since E[X] = E[Y] = 0, Eq. (4.70) gives COV(X, Y) = 0.
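A short simulation illustrates the distinction: the sample correlation of X and Y is essentially zero, while functions of them (here X² and Y², which satisfy X² + Y² = 1) expose the dependence. A sketch with arbitrary seed and sample size:

    import numpy as np

    rng = np.random.default_rng(6)
    theta = rng.uniform(0, 2 * np.pi, 200_000)
    X, Y = np.cos(theta), np.sin(theta)

    print(np.corrcoef(X, Y)[0, 1])         # ~0: uncorrelated
    print(np.corrcoef(X**2, Y**2)[0, 1])   # -1: perfectly dependent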

*Joint Characteristic Function

The joint characteristic function of n random variables is defined as

Φ_{X_1,\dots,X_n}(w_1, w_2, \dots, w_n) = E\left[ e^{j(w_1 X_1 + w_2 X_2 + \cdots + w_n X_n)} \right].   (4.73a)

For a pair of random variables,

Φ_{X,Y}(w_1, w_2) = E\left[ e^{j(w_1 X + w_2 Y)} \right].   (4.73b)

If X and Y are jointly continuous random variables, then

Φ_{X,Y}(w_1, w_2) = \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} f_{X,Y}(x, y) \, e^{j(w_1 x + w_2 y)} \, dx \, dy.   (4.73c)

The inversion formula for the Fourier transform implies that the joint pdf is given by

f_{X,Y}(x, y) = \frac{1}{4π^2} \int_{−\infty}^{\infty} \int_{−\infty}^{\infty} Φ_{X,Y}(w_1, w_2) \, e^{−j(w_1 x + w_2 y)} \, dw_1 \, dw_2.   (4.74)

The marginal characteristic functions can be obtained from the joint characteristic function:

Φ_X(w) = Φ_{X,Y}(w, 0),   Φ_Y(w) = Φ_{X,Y}(0, w).   (4.75)

If X and Y are independent random variables, then

Φ_{X,Y}(w_1, w_2) = E\left[ e^{j w_1 X} e^{j w_2 Y} \right] = E\left[ e^{j w_1 X} \right] E\left[ e^{j w_2 Y} \right] = Φ_X(w_1) \, Φ_Y(w_2).   (4.76)

The characteristic function of the sum Z = aX + bY can be obtained from the joint characteristic function of X and Y as follows:

Φ_Z(w) = E\left[ e^{jw(aX + bY)} \right] = Φ_{X,Y}(aw, bw).   (4.77a)

If X and Y are independent random variables, the characteristic function of Z = aX + bY is then

Φ_Z(w) = Φ_{X,Y}(aw, bw) = Φ_X(aw) \, Φ_Y(bw).   (4.77b)

The joint moments of X and Y can be obtained by taking derivatives of the joint characteristic function. Expanding the exponentials in power series,

Φ_{X,Y}(w_1, w_2) = E\left[ \sum_{i=0}^{\infty} \frac{(j w_1 X)^i}{i!} \sum_{k=0}^{\infty} \frac{(j w_2 Y)^k}{k!} \right]
                  = \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} E[X^i Y^k] \, \frac{(j w_1)^i}{i!} \frac{(j w_2)^k}{k!},

so the joint moments appear as coefficients and can be recovered by differentiation:

E[X^i Y^k] = \frac{1}{j^{i+k}} \left. \frac{\partial^{i+k}}{\partial w_1^i \, \partial w_2^k} Φ_{X,Y}(w_1, w_2) \right|_{w_1 = 0, w_2 = 0}.   (4.78)

EXAMPLE 4.44

Suppose U and V are independent zero-mean, unit-variance Gaussian random variables, and let

X = U + V   and   Y = U + 2V.

Find the joint characteristic function of X and Y, and find E[XY].

The joint characteristic function of X and Y is

Φ_{X,Y}(w_1, w_2) = E\left[ e^{j(w_1 X + w_2 Y)} \right] = E\left[ e^{j(w_1(U+V) + w_2(U+2V))} \right] = E\left[ e^{j(w_1 + w_2)U + j(w_1 + 2w_2)V} \right].

Since U and V are independent random variables, and Φ_U(w) = Φ_V(w) = e^{−w^2/2},

Φ_{X,Y}(w_1, w_2) = E\left[ e^{j(w_1 + w_2)U} \right] E\left[ e^{j(w_1 + 2w_2)V} \right]
                  = e^{−(w_1 + w_2)^2/2} \, e^{−(w_1 + 2w_2)^2/2}
                  = e^{−(2w_1^2 + 6w_1 w_2 + 5w_2^2)/2}.

The correlation E[XY] is found from Eq. (4.78) with i = 1 and k = 1:

E[XY] = \frac{1}{j^2} \left. \frac{\partial^2}{\partial w_1 \partial w_2} e^{−(2w_1^2 + 6w_1 w_2 + 5w_2^2)/2} \right|_{w_1 = 0, w_2 = 0}
      = \frac{1}{j^2} \left[ (2w_1 + 3w_2)(3w_1 + 5w_2) − 3 \right] e^{−(\cdot)/2} \Big|_{w_1 = w_2 = 0}
      = \frac{−3}{j^2} = 3,

consistent with the direct computation E[XY] = E[(U + V)(U + 2V)] = E[U^2] + 2E[V^2] = 3.
4.8 JOINTLY GAUSSIAN RANDOM VARIABLES

The random variables X and Y are said to be jointly Gaussian if their joint pdf has the form

f_{X,Y}(x, y) = \frac{\exp\left\{ −\frac{1}{2(1 − ρ_{X,Y}^2)} \left[ \left( \frac{x − m_1}{σ_1} \right)^2 − 2ρ_{X,Y} \left( \frac{x − m_1}{σ_1} \right)\left( \frac{y − m_2}{σ_2} \right) + \left( \frac{y − m_2}{σ_2} \right)^2 \right] \right\}}{2πσ_1 σ_2 \sqrt{1 − ρ_{X,Y}^2}}   (4.79)

for −\infty < x < \infty and −\infty < y < \infty.

The pdf is constant for values x and y for which the argument of the exponent is constant:

\left( \frac{x − m_1}{σ_1} \right)^2 − 2ρ_{X,Y} \left( \frac{x − m_1}{σ_1} \right)\left( \frac{y − m_2}{σ_2} \right) + \left( \frac{y − m_2}{σ_2} \right)^2 = \text{constant},

that is, on ellipses in the (x, y) plane.
When ρ_{X,Y} = 0, X and Y are independent; when ρ_{X,Y} \ne 0, the major axis of the ellipse is oriented along the angle

θ = \frac{1}{2} \arctan\left( \frac{2 ρ_{X,Y} σ_1 σ_2}{σ_1^2 − σ_2^2} \right).   (4.80)

Note that the angle is 45° when the variances are equal.

The marginal pdf of X is found by integrating f_{X,Y}(x, y) over all y:

f_X(x) = \frac{e^{−(x − m_1)^2/2σ_1^2}}{\sqrt{2π} \, σ_1},   (4.81)

that is, X is a Gaussian random variable with mean m_1 and variance σ_1^2.

The conditional pdf of X given Y = y is

f_X(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{\exp\left\{ −\frac{1}{2(1 − ρ_{X,Y}^2)σ_1^2} \left[ x − ρ_{X,Y} \frac{σ_1}{σ_2}(y − m_2) − m_1 \right]^2 \right\}}{\sqrt{2π(1 − ρ_{X,Y}^2)} \, σ_1},   (4.82)

that is, X given Y = y is Gaussian with conditional mean m_1 + ρ_{X,Y}(σ_1/σ_2)(y − m_2) and variance (1 − ρ_{X,Y}^2)σ_1^2.

We now show that the ρ_{X,Y} in Eq. (4.79) is indeed the correlation coefficient between X and Y. The covariance between X and Y is defined by

COV(X, Y) = E[(X − m_1)(Y − m_2)] = E\big[ E[(X − m_1)(Y − m_2) \mid Y] \big].

Now the conditional expectation of (X − m_1)(Y − m_2) given Y = y is

E[(X − m_1)(Y − m_2) \mid Y = y] = (y − m_2) \, E[X − m_1 \mid Y = y]
                                = (y − m_2) \left( E[X \mid Y = y] − m_1 \right)
                                = (y − m_2) \, ρ_{X,Y} \frac{σ_1}{σ_2} (y − m_2),

where we have used the fact that the conditional mean of X given Y = y is m_1 + ρ_{X,Y}(σ_1/σ_2)(y − m_2). Therefore

E[(X − m_1)(Y − m_2) \mid Y] = ρ_{X,Y} \frac{σ_1}{σ_2} (Y − m_2)^2,

and

COV(X, Y) = E\big[ E[(X − m_1)(Y − m_2) \mid Y] \big] = ρ_{X,Y} \frac{σ_1}{σ_2} E[(Y − m_2)^2] = ρ_{X,Y} σ_1 σ_2.

EXAMPLE 4.45

The amount of yearly rainfall in city 1 and in city 2 is modeled by a pair of jointly Gaussian random variables, X and Y, with pdf given by Eq. (4.79). Find the most likely value of X given that we know Y = y.

The conditional pdf of X given Y = y is given by Eq. (4.82), which is maximized at the conditional mean

E[X \mid Y = y] = m_1 + ρ_{X,Y} \frac{σ_1}{σ_2} (y − m_2).
n Jointly Gaussian Random Variables

The random variables X_1, X_2, \dots, X_n are said to be jointly Gaussian if their joint pdf is given by

f_{\mathbf{X}}(x_1, x_2, \dots, x_n) = \frac{\exp\left\{ −\frac{1}{2} (\mathbf{x} − \mathbf{m})^T K^{−1} (\mathbf{x} − \mathbf{m}) \right\}}{(2π)^{n/2} |K|^{1/2}},   (4.83)

where x and m are column vectors defined by

\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix},   \mathbf{m} = \begin{pmatrix} m_1 \\ m_2 \\ \vdots \\ m_n \end{pmatrix} = \begin{pmatrix} E[X_1] \\ E[X_2] \\ \vdots \\ E[X_n] \end{pmatrix},

and K is the covariance matrix defined by

K = \begin{pmatrix} VAR(X_1) & COV(X_1, X_2) & \cdots & COV(X_1, X_n) \\ COV(X_2, X_1) & VAR(X_2) & \cdots & COV(X_2, X_n) \\ \vdots & \vdots & & \vdots \\ COV(X_n, X_1) & COV(X_n, X_2) & \cdots & VAR(X_n) \end{pmatrix}.   (4.84)
Equation (4.83) shows that the pdf of jointly Gaussian random variables is completely specified by the individual means and variances and the pairwise covariances.
EXAMPLE 4.46

Verify that the two-dimensional Gaussian pdf given in Eq. (4.79) has the form of Eq. (4.83).

The covariance matrix for the two-dimensional case is given by

K = \begin{pmatrix} σ_1^2 & ρ_{X,Y} σ_1 σ_2 \\ ρ_{X,Y} σ_1 σ_2 & σ_2^2 \end{pmatrix},

since COV(X, Y) = ρ_{X,Y} σ_1 σ_2. The inverse of the covariance matrix is

K^{−1} = \frac{1}{σ_1^2 σ_2^2 (1 − ρ_{X,Y}^2)} \begin{pmatrix} σ_2^2 & −ρ_{X,Y} σ_1 σ_2 \\ −ρ_{X,Y} σ_1 σ_2 & σ_1^2 \end{pmatrix}.

The term in the exponent is therefore

−\frac{1}{2} (\mathbf{x} − \mathbf{m})^T K^{−1} (\mathbf{x} − \mathbf{m})
  = −\frac{σ_2^2 (x − m_1)^2 − 2ρ_{X,Y} σ_1 σ_2 (x − m_1)(y − m_2) + σ_1^2 (y − m_2)^2}{2 σ_1^2 σ_2^2 (1 − ρ_{X,Y}^2)}
  = −\frac{1}{2(1 − ρ_{X,Y}^2)} \left[ \left( \frac{x − m_1}{σ_1} \right)^2 − 2ρ_{X,Y} \left( \frac{x − m_1}{σ_1} \right)\left( \frac{y − m_2}{σ_2} \right) + \left( \frac{y − m_2}{σ_2} \right)^2 \right],

which matches Eq. (4.79). Also |K|^{1/2} = σ_1 σ_2 \sqrt{1 − ρ_{X,Y}^2}, so the normalization constants agree.
EXAMPLE 4.48 Independence of Uncorrelated Jointly Gaussian Random Variables

Suppose X_1, X_2, \dots, X_n are jointly Gaussian random variables with COV(X_i, X_j) = 0 for i \ne j. Show that X_1, X_2, \dots, X_n are independent random variables.

With all off-diagonal covariances zero, the covariance matrix is diagonal:

K = \operatorname{diag}(VAR(X_i)) = \operatorname{diag}(σ_i^2).

Therefore

K^{−1} = \operatorname{diag}(1/σ_i^2)

and

(\mathbf{x} − \mathbf{m})^T K^{−1} (\mathbf{x} − \mathbf{m}) = \sum_{i=1}^{n} \left( \frac{x_i − m_i}{σ_i} \right)^2.

Thus, from Eq. (4.83), with |K|^{1/2} = σ_1 σ_2 \cdots σ_n,

f_{\mathbf{X}}(\mathbf{x}) = \frac{\exp\left\{ −\frac{1}{2} \sum_{i=1}^{n} \left( \frac{x_i − m_i}{σ_i} \right)^2 \right\}}{(2π)^{n/2} σ_1 \cdots σ_n}
                           = \prod_{i=1}^{n} \frac{e^{−(x_i − m_i)^2/2σ_i^2}}{\sqrt{2π} \, σ_i}
                           = \prod_{i=1}^{n} f_{X_i}(x_i),

so the joint pdf factors into the product of the marginal pdf's and the X_i are independent.

Remarks on the two-dimensional Gaussian pdf (n = 2):

1. With r = (x − m_1)/σ_1 and s = (y − m_2)/σ_2, Eq. (4.79) can be written compactly as

f_{X,Y}(x, y) = \frac{1}{2πσ_1 σ_2 \sqrt{1 − ρ^2}} \exp\left\{ −\frac{r^2 − 2ρrs + s^2}{2(1 − ρ^2)} \right\}.

2. If ρ = 0, the pdf factors:

f_{X,Y}(x, y) = \frac{1}{2πσ_1 σ_2} \exp\left\{ −\frac{1}{2} \left[ \frac{(x − m_1)^2}{σ_1^2} + \frac{(y − m_2)^2}{σ_2^2} \right] \right\}.

3. Equivalently, when ρ = 0 the covariance matrix and its inverse are diagonal:

K = \begin{pmatrix} σ_1^2 & 0 \\ 0 & σ_2^2 \end{pmatrix},   K^{−1} = \begin{pmatrix} 1/σ_1^2 & 0 \\ 0 & 1/σ_2^2 \end{pmatrix}.

4. If σ_1^2 = 2, σ_2^2 = 1, m_1 = 0, and m_2 = 0, then

f_{X,Y}(x, y) = \frac{1}{2π\sqrt{2}} \exp\left\{ −\frac{x^2}{4} − \frac{y^2}{2} \right\}.
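Since Eq. (4.83) is specified entirely by m and K, sampling a jointly Gaussian vector and re-estimating the mean vector and covariance matrix recovers the model parameters. A minimal sketch using the values of remark 4 (σ₁² = 2, σ₂² = 1, ρ = 0):

    import numpy as np

    rng = np.random.default_rng(7)

    m = np.array([0.0, 0.0])
    K = np.array([[2.0, 0.0],
                  [0.0, 1.0]])

    Z = rng.multivariate_normal(m, K, size=100_000)

    print(Z.mean(axis=0))   # ~ m
    print(np.cov(Z.T))      # ~ K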