
Lecture 25

Agenda

1. Moment Generating Function

2. Uniqueness of Moment Generating Function

Moment Generating Function

Let us recall that the moment generating function M_X of a random variable X is defined by

\[
M_X(t) = E(e^{tX}) =
\begin{cases}
\displaystyle\sum_{x \in \operatorname{Range}(X)} e^{tx}\, P(X = x) & \text{if } X \text{ is discrete,}\\[10pt]
\displaystyle\int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx & \text{if } X \text{ is continuous.}
\end{cases}
\]

It is named so because the moments of the random variable can be obtained by differentiating it an appropriate number of times:

\[
\left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0} = E(X^k)
\]

for all k ≥ 1.
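As a quick illustration of this, here is a minimal sketch (assuming SymPy is available; the two-point random variable is a made-up example, not one from the lecture) that builds an mgf from the discrete definition and differentiates it at t = 0 to recover the first two moments.

```python
# Sketch: differentiate an mgf at t = 0 to get moments.
# Hypothetical example: X takes the value 1 with probability 2/5 and 3 with probability 3/5.
import sympy as sp

t = sp.symbols('t')

# M_X(t) = sum over Range(X) of e^{tx} P(X = x)
M = sp.Rational(2, 5) * sp.exp(1 * t) + sp.Rational(3, 5) * sp.exp(3 * t)

# k-th derivative at t = 0 gives E(X^k)
EX = sp.diff(M, t, 1).subs(t, 0)    # E(X)   = 2/5 * 1 + 3/5 * 3 = 11/5
EX2 = sp.diff(M, t, 2).subs(t, 0)   # E(X^2) = 2/5 * 1 + 3/5 * 9 = 29/5

print(EX, EX2)   # 11/5 29/5
```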

Mgf of Uniform

Let X ∼ Uniform(a, b) and t ≠ 0. Then,

\[
\begin{aligned}
E(e^{tX}) &= \int_a^b e^{tx}\,\frac{1}{b-a}\,dx\\
&= \frac{1}{b-a}\int_a^b e^{tx}\,dx\\
&= \frac{1}{b-a}\left[\frac{e^{tx}}{t}\right]_a^b\\
&= \frac{1}{b-a}\cdot\frac{e^{tb}-e^{ta}}{t}\\
&= \frac{e^{tb}-e^{ta}}{t(b-a)}
\end{aligned}
\]
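Before differentiating this, here is a short symbolic check of the formula, a sketch assuming SymPy is available.

```python
# Sketch: verify the Uniform(a, b) mgf by integrating e^{tx} against the density.
import sympy as sp

x, a, b = sp.symbols('x a b', real=True)
t = sp.symbols('t', positive=True)   # any t != 0 works; positive avoids a Piecewise answer

M = sp.integrate(sp.exp(t * x) / (b - a), (x, a, b))
# Difference with the closed form derived above; should simplify to 0.
print(sp.simplify(M - (sp.exp(t * b) - sp.exp(t * a)) / (t * (b - a))))
```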



Differentiating the mgf with respect to t,

\[
\frac{d}{dt} M_X(t) = \frac{t\,(b e^{tb} - a e^{ta}) - (e^{tb} - e^{ta})}{t^2 (b - a)}
\]

Now note that if you go ahead and put t = 0 to get E(X), you end up in trouble: both the numerator and the denominator are 0 at t = 0. So we take the limit as t → 0.

\[
\begin{aligned}
\lim_{t\to 0}\frac{t\,(b e^{tb} - a e^{ta}) - (e^{tb} - e^{ta})}{t^2 (b-a)}
&= \lim_{t\to 0}\frac{(b e^{tb} - a e^{ta}) + t\,(b^2 e^{tb} - a^2 e^{ta}) - (b e^{tb} - a e^{ta})}{2t (b-a)} && \text{[applying L'Hospital's rule]}\\
&= \lim_{t\to 0}\frac{t\,(b^2 e^{tb} - a^2 e^{ta})}{2t (b-a)}\\
&= \lim_{t\to 0}\frac{b^2 e^{tb} - a^2 e^{ta}}{2 (b-a)}\\
&= \frac{b^2 e^{0\times b} - a^2 e^{0\times a}}{2 (b-a)}\\
&= \frac{b^2 - a^2}{2 (b-a)}\\
&= \frac{b + a}{2}
\end{aligned}
\]

This should be E(X), and we know that's true because we already know E(X) = (a + b)/2.
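If you would rather not apply L'Hospital's rule by hand, the same limit can be checked symbolically; the snippet below is a sketch assuming SymPy is available.

```python
# Sketch: take the t -> 0 limit of the derivative of the Uniform(a, b) mgf.
import sympy as sp

t = sp.symbols('t')
a, b = sp.symbols('a b', real=True)

M = (sp.exp(t * b) - sp.exp(t * a)) / (t * (b - a))   # Uniform(a, b) mgf for t != 0
print(sp.limit(sp.diff(M, t), t, 0))                  # expect a/2 + b/2, i.e. E(X)
```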



Mgf of standard normal

Let Z ∼ N(0, 1) and t ∈ R. Then,

\[
\begin{aligned}
M_Z(t) &= E(e^{tZ})\\
&= \int_{-\infty}^{\infty} e^{tz}\,\frac{1}{\sqrt{2\pi}}\,e^{-\frac{z^2}{2}}\,dz\\
&= e^{\frac{t^2}{2}}\int_{-\infty}^{\infty} e^{tz}\,e^{-\frac{t^2}{2}}\,\frac{1}{\sqrt{2\pi}}\,e^{-\frac{z^2}{2}}\,dz\\
&= e^{\frac{t^2}{2}}\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{(z-t)^2}{2}}\,dz\\
&= e^{\frac{t^2}{2}}\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\,e^{-\frac{y^2}{2}}\,dy && \text{[by putting } y = z - t\text{]}\\
&= e^{\frac{t^2}{2}}\times 1\\
&= e^{\frac{t^2}{2}}
\end{aligned}
\]
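As a sanity check of this Gaussian integral, here is a sketch assuming SymPy is available.

```python
# Sketch: compute E(e^{tZ}) for Z ~ N(0, 1) by direct integration.
import sympy as sp

t, z = sp.symbols('t z', real=True)

M = sp.integrate(sp.exp(t * z) * sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi),
                 (z, -sp.oo, sp.oo))
print(sp.simplify(M))   # expect exp(t**2/2)
```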

Thus we have the mgf of the standard normal; now let's get the mgf of a general normal.

Mgf of normal

Let X ∼ N(µ, σ²). Before doing the calculation, we note that if we put Z = (X − µ)/σ, then Z ∼ N(0, 1) and we can write X = µ + σZ.

For t ∈ R,

\[
\begin{aligned}
M_X(t) &= E(e^{tX})\\
&= E(e^{t(\mu + \sigma Z)})\\
&= E(e^{t\mu}\times e^{t\sigma Z})\\
&= e^{t\mu}\times E(e^{t\sigma Z})\\
&= e^{t\mu}\times e^{\frac{t^2\sigma^2}{2}} && \text{[since } E(e^{t\sigma Z}) = M_Z(t\sigma) = e^{(t\sigma)^2/2}\text{]}\\
&= \exp\!\left(t\mu + \frac{t^2\sigma^2}{2}\right)
\end{aligned}
\]
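The substitution argument above is easy to mirror symbolically; the following is a sketch assuming SymPy is available, reusing the standard normal mgf M_Z(s) = e^{s²/2} derived above.

```python
# Sketch: mgf of X = mu + sigma*Z via e^{t*mu} * M_Z(t*sigma).
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M_Z = lambda s: sp.exp(s**2 / 2)        # standard normal mgf derived above
M_X = sp.exp(t * mu) * M_Z(t * sigma)   # mgf of mu + sigma*Z

# Difference with exp(t*mu + t**2*sigma**2/2); should simplify to 0.
print(sp.simplify(M_X - sp.exp(t * mu + t**2 * sigma**2 / 2)))
```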



Mgf of Poisson

Let X ∼ Poisson(λ) for some λ > 0. Take some t ∈ R,

\[
\begin{aligned}
M_X(t) &= E(e^{tX})\\
&= \sum_{x=0}^{\infty} e^{tx}\,\frac{e^{-\lambda}\lambda^x}{x!}\\
&= e^{-\lambda}\sum_{x=0}^{\infty}\frac{e^{tx}\lambda^x}{x!}\\
&= e^{-\lambda}\sum_{x=0}^{\infty}\frac{(e^t\lambda)^x}{x!}\\
&= e^{-\lambda}\exp(e^t\lambda)\\
&= e^{e^t\lambda - \lambda}\\
&= e^{\lambda(e^t - 1)}
\end{aligned}
\]
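The series manipulation above can be checked by summing the series symbolically; here is a sketch assuming SymPy is available.

```python
# Sketch: sum the Poisson series for E(e^{tX}) symbolically.
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)
x = sp.symbols('x', integer=True, nonnegative=True)

M = sp.exp(-lam) * sp.summation((lam * sp.exp(t))**x / sp.factorial(x), (x, 0, sp.oo))
print(sp.simplify(M))   # expect exp(lambda*(exp(t) - 1))
```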

Uniqueness of Moment Generating Function

Theorem 1. If two random variables X and Y have the same distribution, then they have the same moment generating function, i.e.

\[
M_X(t) = M_Y(t)
\]

for all t ∈ R.

The above theorem is easy to prove, but the nice thing about the mgf is that the result also goes the other way round, and that's what makes the mgf such a powerful tool. We won't prove it, but we will use it.

Theorem 2. If two random variables X and Y have the same moment generating function, i.e.

\[
M_X(t) = M_Y(t)
\]

for all t ∈ R, then they have the same distribution, i.e.

\[
F_X(t) = F_Y(t)
\]

for all t ∈ R.



We recall that the mgf need not be defined (finite) everywhere, so how can we say

\[
M_X(t) = M_Y(t)
\]

for all t ∈ R? By the above equality we mean: whenever M_X(t) is defined and finite, then M_Y(t) is also defined and finite and M_X(t) = M_Y(t); and whenever E(e^{tX}) = ∞, then E(e^{tY}) = ∞ as well.

Now, for example, if I tell you that for a random variable X its mgf is M_X(t) = e^{2.5(e^t − 1)} for all t ∈ R, you can go ahead and say X ∼ Poisson(2.5). Or if I tell you that for a random variable Y, M_Y(t) = exp(3.5t + 2t²) for all t ∈ R, then you can say Y ∼ N(3.5, 4).
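Identifying a distribution from its mgf amounts to matching the given expression against a known mgf formula; the following sketch (assuming SymPy is available) does that parameter matching for the two examples above.

```python
# Sketch: match given mgfs against the Poisson and normal mgf formulas from this lecture.
import sympy as sp

t = sp.symbols('t', real=True)

poisson_mgf = lambda lam: sp.exp(lam * (sp.exp(t) - 1))
normal_mgf = lambda mu, var: sp.exp(mu * t + var * t**2 / 2)

M_X = sp.exp(sp.Rational(5, 2) * (sp.exp(t) - 1))   # the given mgf of X
M_Y = sp.exp(sp.Rational(7, 2) * t + 2 * t**2)      # the given mgf of Y

print(sp.simplify(M_X - poisson_mgf(sp.Rational(5, 2))))    # 0  =>  X ~ Poisson(2.5)
print(sp.simplify(M_Y - normal_mgf(sp.Rational(7, 2), 4)))   # 0  =>  Y ~ N(3.5, 4)
```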

Homework:

1. In class we did the mgf for Uniform, Poisson and Normal. Find the mgf for Bernoulli(p), Geometric(p) and Gamma(α, β).

2. Suppose for a random variable X,

\[
M_X(t) = 0.3\,e^{4t} + 0.4\,e^{-3t} + 0.3\,e^{7.9t}.
\]

Find the distribution of X.
