Limits and the Law of Large Numbers
Lecture XIII

Transcript of Limits and the Law of Large Numbers, Lecture XIII.

Almost Sure Convergence

Let ω represent the entire random sequence {Z_t}. As discussed last time, our interest typically centers on the averages of this sequence:

  b_n(ω) = (1/n) Σ_{t=1}^n Z_t.

Definition 2.9: Let {b_n(ω)} be a sequence of real-valued random variables. We say that b_n(ω) converges almost surely to b, written

  b_n(ω) → b a.s.,

if and only if there exists a real number b such that

  P[ω : b_n(ω) → b] = 1.

The probability measure P describes the distribution of ω and determines the joint distribution function for the entire sequence {Z_t}.

Other common terminology is that b_n(ω) converges to b with probability 1 (w.p.1) or that b_n(ω) is strongly consistent for b.

Example 2.10: Let

  Z̄_n = (1/n) Σ_{t=1}^n Z_t,

where {Z_t} is a sequence of independently and identically distributed (i.i.d.) random variables with E(Z_t) = μ < ∞. Then

  Z̄_n → μ a.s.

by the Kolmogorov strong law of large numbers (Theorem 3.1).
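As a quick numerical sketch of this strong law (the exponential distribution, its mean, and the sample size are illustrative choices, not part of the lecture), the running average of i.i.d. draws settles down at the population mean:

```python
# Monte Carlo illustration of the Kolmogorov strong law of large numbers:
# the running sample average of i.i.d. draws settles down to E(Z_t).
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0                                   # population mean of Exponential(scale=2)
z = rng.exponential(scale=mu, size=100_000)

# Running averages Zbar_n = (1/n) * sum_{t=1}^n Z_t for n = 1, ..., 100000
zbar = np.cumsum(z) / np.arange(1, z.size + 1)

print(zbar[99], zbar[9_999], zbar[-1])     # drifts toward mu = 2.0
```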

Proposition 2.11: Given g: R^k → R^l (k, l < ∞) and any sequence {b_n} such that

  b_n → b a.s.,

where b_n and b are k x 1 vectors, if g is continuous at b, then

  g(b_n) → g(b) a.s.

Theorem 2.12: Suppose

(i) y = Xβ₀ + ε;
(ii) X'ε/n → 0 a.s.;
(iii) X'X/n → M a.s., where M is finite and positive definite.

Then β̂_n exists a.s. for all n sufficiently large, and β̂_n → β₀ a.s.

Proof: Since X'X/n → M a.s., it follows from Proposition 2.11 that det(X'X/n) → det(M) a.s. Because M is positive definite by (iii), det(M) > 0. It follows that det(X'X/n) > 0 a.s. for all n sufficiently large, so (X'X/n)⁻¹ exists a.s. for all n sufficiently large. Hence

  β̂_n = (X'X/n)⁻¹ (X'y/n).

In addition,

  β̂_n = β₀ + (X'X/n)⁻¹ (X'ε/n).

It follows from Proposition 2.11 that

  β̂_n → β₀ + M⁻¹ · 0 = β₀ a.s.
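Theorem 2.12 can be illustrated by simulation (a sketch; the coefficient values, the regressor design, and the uniform errors are hypothetical choices, and normality of the errors is not required for consistency):

```python
# Simulation sketch of Theorem 2.12: with X'e/n -> 0 and X'X/n -> M a.s.,
# the OLS estimator beta_hat_n converges almost surely to beta_0.
import numpy as np

rng = np.random.default_rng(1)
beta0 = np.array([1.0, -0.5])            # hypothetical true coefficients

def beta_hat(n):
    """OLS estimate from a sample of size n: (X'X)^{-1} X'y."""
    x = np.column_stack([np.ones(n), rng.normal(size=n)])
    e = rng.uniform(-1.0, 1.0, size=n)   # non-normal, mean-zero errors
    y = x @ beta0 + e
    return np.linalg.solve(x.T @ x, x.T @ y)

estimates = {n: beta_hat(n) for n in (100, 10_000, 1_000_000)}
for n, b in estimates.items():
    print(n, b)                          # estimates tighten around beta0
```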

Convergence in Probability

A weaker stochastic convergence concept is that of convergence in probability.

Definition 2.23: Let {b_n(ω)} be a sequence of real-valued random variables. If there exists a real number b such that for every ε > 0,

  P[ω : |b_n(ω) - b| < ε] → 1

as n → ∞, then b_n(ω) converges in probability to b.

Almost sure convergence takes into account the joint distribution of the entire sequence {Z_t}, but with convergence in probability we only need to be concerned with the joint distribution of those elements that appear in b_n(ω).

Convergence in probability is also referred to as weak consistency.

Theorem 2.24: Let {b_n(ω)} be a sequence of random variables. If

  b_n → b a.s., then b_n → b in probability.

If b_n converges in probability to b, then there exists a subsequence {b_{n_j}} such that

  b_{n_j} → b a.s.

Convergence in the rth Mean

Definition 2.37: Let {b_n(ω)} be a sequence of real-valued random variables. If there exists a real number b such that

  E|b_n(ω) - b|^r → 0

as n → ∞ for some r > 0, then b_n(ω) converges in the rth mean to b, written as

  b_n → b r.m.

Proposition 2.38: (Jensen's Inequality) Let g: R¹ → R¹ be a convex function on an interval B ⊂ R¹ and let Z be a random variable such that P[Z ∈ B] = 1. Then g(E(Z)) ≤ E(g(Z)). If g is concave on B, then g(E(Z)) ≥ E(g(Z)).
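A Monte Carlo sketch of Jensen's inequality, using the convex function g(z) = exp(z) and a standard normal Z (both illustrative choices, not from the lecture):

```python
# Monte Carlo illustration of Jensen's inequality for a convex g:
# g(E(Z)) <= E(g(Z)), here with g(z) = exp(z) and standard normal Z.
import numpy as np

rng = np.random.default_rng(6)
z = rng.standard_normal(1_000_000)

g_of_mean = np.exp(z.mean())        # g(E(Z)), approximately exp(0) = 1
mean_of_g = np.mean(np.exp(z))      # E(g(Z)), approximately exp(1/2) ~ 1.649

print(g_of_mean, mean_of_g)
assert g_of_mean <= mean_of_g       # Jensen: convexity pushes the mean up
```

The gap between the two quantities (here roughly exp(1/2) versus 1) is exactly what Jensen's inequality predicts for the lognormal mean.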

Proposition 2.41: (Generalized Chebyshev Inequality) Let Z be a random variable such that E|Z|^r < ∞, r > 0. Then for every ε > 0,

  P[|Z| ≥ ε] ≤ E|Z|^r / ε^r.
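The inequality can be checked against simulated draws (a sketch; the standard normal and the particular r and ε values are arbitrary choices). Note that the bound also holds exactly under the empirical distribution, since 1{|z| ≥ ε} ≤ |z|^r / ε^r pointwise:

```python
# Monte Carlo check of the generalized Chebyshev inequality:
# P[|Z| >= eps] <= E|Z|^r / eps^r for any r > 0 with E|Z|^r finite.
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(1_000_000)

for r in (1.0, 2.0, 4.0):
    for eps in (0.5, 1.0, 2.0):
        lhs = np.mean(np.abs(z) >= eps)            # empirical P[|Z| >= eps]
        rhs = np.mean(np.abs(z) ** r) / eps ** r   # empirical E|Z|^r / eps^r
        assert lhs <= rhs + 1e-9                   # bound holds in the sample
        print(r, eps, round(float(lhs), 4), round(float(rhs), 4))
```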

Theorem 2.42: If b_n(ω) → b in the rth mean for some r > 0, then b_n(ω) → b in probability. This follows by applying the generalized Chebyshev inequality to b_n(ω) - b.

Laws of Large Numbers

Proposition 3.0: Given restrictions on the dependence, heterogeneity, and moments of a sequence of random variables {Z_t},

  Z̄_n - μ̄_n → 0 a.s.,

where

  Z̄_n = (1/n) Σ_{t=1}^n Z_t and μ̄_n = (1/n) Σ_{t=1}^n E(Z_t).

Independent and Identically Distributed Observations

Theorem 3.1: (Kolmogorov) Let {Z_t} be a sequence of i.i.d. random variables. Then

  Z̄_n → μ a.s.

if and only if E|Z_t| < ∞ and E(Z_t) = μ.

This result is consistent with Theorem 6.2.1 (Khinchine): Let {X_i} be independent and identically distributed (i.i.d.) with E[X_i] = μ. Then

  X̄_n → μ in probability.

Proposition 3.4: (Hölder's Inequality) If p > 1 and 1/p + 1/q = 1, and if E|Y|^p < ∞ and E|Z|^q < ∞, then

  E|YZ| ≤ [E|Y|^p]^{1/p} [E|Z|^q]^{1/q}.

If p = q = 2, we have the Cauchy-Schwarz inequality:

  E|YZ| ≤ [E(Y²)]^{1/2} [E(Z²)]^{1/2}.
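Both inequalities can be verified on simulated data (a sketch; the distributions of Y and Z and the exponents tried are arbitrary choices). Like Chebyshev's inequality, Hölder's inequality holds exactly under the empirical distribution, so the sample check never fails:

```python
# Monte Carlo check of Holder's inequality,
# E|YZ| <= (E|Y|^p)^{1/p} (E|Z|^q)^{1/q},
# including the Cauchy-Schwarz case p = q = 2.
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(size=500_000)
z = 0.5 * y + rng.uniform(-1.0, 1.0, size=500_000)   # correlated with y

for p in (1.5, 2.0, 3.0):
    q = p / (p - 1.0)                # conjugate exponent: 1/p + 1/q = 1
    lhs = np.mean(np.abs(y * z))
    rhs = np.mean(np.abs(y) ** p) ** (1 / p) * np.mean(np.abs(z) ** q) ** (1 / q)
    assert lhs <= rhs + 1e-9         # inequality holds in the sample
    print(p, q, round(float(lhs), 4), round(float(rhs), 4))
```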

Asymptotic Normality

Under the traditional assumptions of the linear model (fixed regressors and normally distributed error terms), β̂_n is distributed multivariate normal with

  E(β̂_n) = β₀ and V(β̂_n) = σ² (X'X)⁻¹

for any sample size n.

However, when the sample size becomes large, the distribution of β̂_n is approximately normal under some general conditions.

Definition 4.1: Let {b_n} be a sequence of random finite-dimensional vectors with joint distribution functions {F_n}. If F_n(z) → F(z) as n → ∞ for every continuity point z, where F is the distribution function of a random variable Z, then b_n converges in distribution to the random variable Z, denoted

  b_n →d Z.

Other ways of stating this concept are that b_n converges in law to Z,

  b_n →L Z,

or that b_n is asymptotically distributed as F,

  b_n ~A F.

In this case, F is called the limiting distribution of b_n.

Example 4.3: Let {Z_t} be an i.i.d. sequence of random variables with mean μ and variance σ² < ∞. Define

  b_n = (Z̄_n - E(Z̄_n)) / [V(Z̄_n)]^{1/2} = n^{1/2} (Z̄_n - μ) / σ.

Then by the Lindeberg-Levy central limit theorem (Theorem 6.2.2),

  b_n ~A N(0, 1).
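A simulation sketch of this example (the Exponential(1) distribution and the sample sizes are illustrative choices; the gamma identity is used only to avoid storing a large matrix):

```python
# Simulation sketch of Example 4.3: standardized sample means of skewed
# i.i.d. draws look approximately standard normal for large n.
import numpy as np

rng = np.random.default_rng(4)
n, reps = 1_000, 200_000
mu, sigma = 1.0, 1.0                       # mean and std of Exponential(1)

# The sum of n i.i.d. Exponential(1) draws is Gamma(n, 1), so we can sample
# many replications of Zbar_n without building a (reps x n) matrix.
zbar = rng.gamma(shape=n, scale=1.0, size=reps) / n
b_n = np.sqrt(n) * (zbar - mu) / sigma     # b_n = n^{1/2} (Zbar_n - mu) / sigma

print(b_n.mean(), b_n.std())               # near 0 and 1
print(np.mean(b_n <= 1.96))                # near Phi(1.96) ~ 0.975
```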

Theorem 6.2.2: (Lindeberg-Levy) Let {X_i} be i.i.d. with E[X_i] = μ and V(X_i) = σ². Then

  Z_n = n^{1/2} (X̄_n - μ) / σ →d N(0, 1).

Definition 4.8: Let Z be a k x 1 random vector with distribution function F. The characteristic function of Z is defined as

  f(λ) = E[exp(iλ'Z)],

where i² = -1 and λ is a k x 1 real vector.

Example 4.10: Let Z ~ N(μ, σ²). Then

  f(λ) = exp(iμλ - λ²σ²/2).

The proof follows from the derivation of the moment generating function in Lecture VII.
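This closed form can be checked by Monte Carlo (a sketch; the values of μ, σ, and the λ grid are hypothetical choices):

```python
# Monte Carlo check that the characteristic function of Z ~ N(mu, sigma^2)
# is f(lam) = exp(i*mu*lam - lam^2 * sigma^2 / 2).
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 0.7, 1.3                       # hypothetical parameters
z = rng.normal(loc=mu, scale=sigma, size=1_000_000)

for lam in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * lam * z))            # E[exp(i*lam*Z)]
    closed_form = np.exp(1j * mu * lam - lam**2 * sigma**2 / 2)
    print(lam, abs(empirical - closed_form))             # small Monte Carlo error
```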

Specifically, note the similarity between the definition of the moment generating function and the characteristic function:

  M_X(t) = E[exp(tX)]    f(λ) = E[exp(iλZ)]

Theorem 4.11: (Uniqueness Theorem) Two distribution functions are identical if and only if their characteristic functions are identical.

Note that we have a similar theorem for moment generating functions.

Proof of Lindeberg-Levy:

– First define f(λ) as the characteristic function of Z_t - μ and let f_n(λ) be the characteristic function of

  b_n = n^{1/2} (Z̄_n - μ) / σ = n^{-1/2} Σ_{t=1}^n (Z_t - μ) / σ.

– By the structure of the characteristic function we have

  f_n(λ) = [f(λ / (σ n^{1/2}))]^n,

  ln f_n(λ) = n ln f(λ / (σ n^{1/2})).

– Taking a second-order Taylor series expansion of f(λ) around λ = 0 gives

  f(λ) = 1 - σ²λ²/2 + o(λ²).

Thus,

  ln f_n(λ) = n ln(1 - λ²/(2n) + o(λ²/n)) → -λ²/2 as n → ∞.

Thus the characteristic function of the standardized sample average approaches exp(-λ²/2), the characteristic function of the standard normal, and so by the Uniqueness Theorem b_n →d N(0, 1).
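The limit in this proof can be traced numerically (a sketch; Uniform(-1, 1) draws are an arbitrary mean-zero choice, whose known characteristic function is sin(λ)/λ):

```python
# Numerical sketch of the Lindeberg-Levy argument: for Uniform(-1, 1)
# (mean 0, sigma^2 = 1/3), f_n(lam) = [f(lam/(sigma*sqrt(n)))]^n should
# approach exp(-lam^2 / 2), the characteristic function of N(0, 1).
import numpy as np

sigma = np.sqrt(1.0 / 3.0)                  # std of Uniform(-1, 1)

def f(lam):
    """Characteristic function of Uniform(-1, 1): sin(lam)/lam."""
    return np.sinc(lam / np.pi)             # np.sinc(x) = sin(pi*x)/(pi*x)

for n in (1, 10, 100, 10_000):
    fn = f(1.0 / (sigma * np.sqrt(n))) ** n    # f_n(1), i.e. lam = 1
    print(n, fn, np.exp(-0.5))                 # fn approaches exp(-1/2)
```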