Matrix Econometrics I
7/29/2019 Matrix Econometrics I
1 Useful mathematics, notation and definitions
Fact 1.1. If we multiply a column vector $x$ by a matrix $A_{m\times n}$, this defines a function $A: \mathbb{R}^n \to \mathbb{R}^m$.
Notation 1. $A = [a^{(1)} \cdots a^{(n)}]$, $a^{(i)} \in \mathbb{R}^m$, refers to columns $1$ through $n$ of matrix $A$.
Notation 2. $A = \begin{bmatrix} a_1' \\ \vdots \\ a_m' \end{bmatrix}$, $a_i \in \mathbb{R}^n$, refers to the rows of $A$.
Definition 1.1. $C(A) = \langle a^{(1)}, \ldots, a^{(n)} \rangle$ refers to the space spanned by the columns of $A$.
Proposition 1.1. $D = AB$ and $B$ nonsingular $\implies \operatorname{rank}(D) = \operatorname{rank}(A)$.
Proposition 1.2. Let $A_{m\times n}$; then $\operatorname{rank}(A) \le \min\{m, n\}$.
Proposition 1.3. $D = AB \implies \operatorname{rank}(D) \le \min\{\operatorname{rank}(A), \operatorname{rank}(B)\}$.
Proposition 1.4. Let $A_{m\times n}$; then $\operatorname{rank}(A) = \operatorname{rank}(AA') = \operatorname{rank}(A'A)$.
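The rank propositions above can be checked numerically. Below is a minimal sketch using NumPy; the matrices $A$ (rank 2, since its third column is the sum of the first two) and the nonsingular $B$ are hypothetical examples, not taken from the notes.

```python
import numpy as np

# Hypothetical 4x3 matrix of rank 2 (third column = first column + second column).
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [2., 3., 5.],
              [1., 1., 2.]])

# Hypothetical nonsingular 3x3 matrix B (determinant = 6).
B = np.array([[2., 1., 0.],
              [0., 1., 0.],
              [1., 0., 3.]])

rank = np.linalg.matrix_rank

# Proposition 1.1: post-multiplying by a nonsingular B preserves rank.
assert rank(A @ B) == rank(A)

# Proposition 1.2: rank(A) <= min(m, n).
assert rank(A) <= min(A.shape)

# Proposition 1.3: rank(AB) <= min(rank(A), rank(B)).
assert rank(A @ B) <= min(rank(A), rank(B))

# Proposition 1.4: rank(A) = rank(AA') = rank(A'A).
assert rank(A) == rank(A @ A.T) == rank(A.T @ A)
```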
1.1 Some Vector Calculus
For $S(\beta) = \min_{\beta} (Y - X\beta)'(Y - X\beta)$, where $Y$ is $n \times 1$, $X$ is $n \times k$ and $\beta$ is $k \times 1$:
Fact 1.2. Derivative w.r.t. a column vector:
$$\frac{\partial S(\beta)}{\partial \beta} = \begin{pmatrix} \frac{\partial S(\beta)}{\partial \beta_1} \\ \vdots \\ \frac{\partial S(\beta)}{\partial \beta_k} \end{pmatrix}$$
Fact 1.3. Derivative w.r.t. a row vector:
$$\frac{\partial S(\beta)}{\partial \beta'} = \left( \frac{\partial S(\beta)}{\partial \beta_1}, \ldots, \frac{\partial S(\beta)}{\partial \beta_k} \right)$$
Fact 1.4. Let $\phi(\cdot): \mathbb{R}^k \to \mathbb{R}^m$, so that $\phi(\beta) = \begin{pmatrix} \phi_1(\beta) \\ \phi_2(\beta) \\ \vdots \\ \phi_m(\beta) \end{pmatrix}$ is a vector-valued mapping; then
$$\frac{\partial \phi(\beta)}{\partial \beta'} = \begin{pmatrix} \frac{\partial \phi_1(\beta)}{\partial \beta'} \\ \vdots \\ \frac{\partial \phi_m(\beta)}{\partial \beta'} \end{pmatrix}$$
where each element is a row vector as in Fact 1.3, and $\frac{\partial \phi(\beta)}{\partial \beta'}$ is an $m \times k$ matrix.¹
¹The symbol $'$ denotes transpose.
Fact 1.5. Let $\phi(\cdot): \mathbb{R}^k \to \mathbb{R}^m$, $\phi(\beta) = \begin{pmatrix} \phi_1(\beta) \\ \phi_2(\beta) \\ \vdots \\ \phi_m(\beta) \end{pmatrix}$, be a vector-valued mapping; then
$$\frac{\partial \phi'(\beta)}{\partial \beta} = \left( \frac{\partial \phi_1(\beta)}{\partial \beta}, \ldots, \frac{\partial \phi_m(\beta)}{\partial \beta} \right)$$
where each element is a column vector as in Fact 1.2, and $\frac{\partial \phi'(\beta)}{\partial \beta}$ is a $k \times m$ matrix.
Fact 1.6. Let $y = Ax$ where $A$ does not depend on $x$; then $\frac{\partial y}{\partial x'} = A$, where $\frac{\partial y}{\partial x'}$ is an $m \times n$ matrix.
Fact 1.7. Let $y = Ax$ where $A$ does not depend on $x$; then $\frac{\partial y}{\partial x} = A'$, where $\frac{\partial y}{\partial x}$ is an $n \times m$ matrix.
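Facts 1.6 and 1.7 can be verified with finite differences: the numerical Jacobian of $y = Ax$ should reproduce $A$ itself. A minimal sketch, where the $3 \times 2$ matrix $A$ is a hypothetical example:

```python
import numpy as np

# Hypothetical 3x2 matrix A; y = Ax is a linear map from R^2 to R^3.
A = np.array([[1., 2.],
              [0., 1.],
              [3., -1.]])

def y(x):
    return A @ x

# Numerical Jacobian dy/dx' built column by column via central differences.
x0 = np.array([0.5, -1.0])
h = 1e-6
J = np.column_stack([(y(x0 + h * e) - y(x0 - h * e)) / (2 * h)
                     for e in np.eye(2)])

# Fact 1.6: dy/dx' = A (m x n); Fact 1.7: dy/dx = A' (n x m).
assert np.allclose(J, A, atol=1e-6)
assert np.allclose(J.T, A.T, atol=1e-6)
```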
Proposition 1.5. Suppose $y = Ax(\beta)$, where $\beta = \begin{pmatrix} \beta_1 \\ \vdots \\ \beta_r \end{pmatrix}$; then, if $A$ does not depend on $\beta$,
$$\frac{\partial y(\beta)}{\partial \beta'} = \frac{\partial y}{\partial x'} \frac{\partial x}{\partial \beta'} = A \frac{\partial x(\beta)}{\partial \beta'}$$
and the resulting matrix is $m \times r$.
Note 1. $A \frac{\partial x'(\beta)}{\partial \beta}$ is not well defined. To see this:
$$\frac{\partial x'}{\partial \beta} = \begin{pmatrix} \frac{\partial x_1}{\partial \beta_1} & \frac{\partial x_2}{\partial \beta_1} & \frac{\partial x_3}{\partial \beta_1} & \cdots & \frac{\partial x_n}{\partial \beta_1} \\ \frac{\partial x_1}{\partial \beta_2} & \frac{\partial x_2}{\partial \beta_2} & \frac{\partial x_3}{\partial \beta_2} & \cdots & \frac{\partial x_n}{\partial \beta_2} \\ \vdots & \vdots & \vdots & & \vdots \\ \frac{\partial x_1}{\partial \beta_r} & \frac{\partial x_2}{\partial \beta_r} & \frac{\partial x_3}{\partial \beta_r} & \cdots & \frac{\partial x_n}{\partial \beta_r} \end{pmatrix}_{r \times n},$$
but $A$ is $m \times n$, so the product does not conform!
Fact 1.8. $y = x'Bx = (x'Bx)' = x'B'x = y'$, since $y$ is a scalar.
Proposition 1.6. If Fact 1.8 holds, then we can always find a symmetric matrix $A$ such that $x'Ax = x'Bx$.
Proof. Let $A = \frac{1}{2}(B + B')$, a symmetric matrix; then
$$y = \frac{1}{2}\left(x'Bx + x'B'x\right) = \frac{1}{2}\left[x'(B + B')x\right] = \frac{1}{2}\left(2x'Ax\right) = x'Ax.$$
Proposition 1.7. If $A$ is symmetric and $y = x'Ax$, then $\frac{\partial y}{\partial x} = 2Ax$ and $\frac{\partial y}{\partial x'} = 2x'A$.
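Propositions 1.6 and 1.7 can also be checked numerically: symmetrizing $B$ leaves the quadratic form unchanged, and the finite-difference gradient of $x'Ax$ matches $2Ax$. A sketch with a hypothetical non-symmetric $B$:

```python
import numpy as np

# Hypothetical non-symmetric B and its symmetric part A = (B + B')/2.
B = np.array([[2., 1.],
              [3., 4.]])
A = 0.5 * (B + B.T)

def quad(x, M):
    return x @ M @ x  # scalar x'Mx

x0 = np.array([1.0, -2.0])

# Proposition 1.6: x'Bx = x'Ax for every x.
assert np.isclose(quad(x0, B), quad(x0, A))

# Proposition 1.7: for symmetric A, the gradient of x'Ax is 2Ax.
h = 1e-6
grad = np.array([(quad(x0 + h * e, A) - quad(x0 - h * e, A)) / (2 * h)
                 for e in np.eye(2)])
assert np.allclose(grad, 2 * A @ x0, atol=1e-6)
```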
[Figure 1: Geometry of Least Squares. $y$ is projected onto the plane spanned by $x_1$ and $x_2$; the residual $y - X\beta = e$ is orthogonal to that plane.]
1.2 Geometry of Least Squares
$y = x_1\beta_1 + \cdots + x_k\beta_k$ implies that $y$ lives in the linear space generated by $\langle x_1, \ldots, x_k \rangle \subseteq \mathbb{R}^n$; however, when we add $\varepsilon$, we account for all aspects of $y$ that do not live in $\langle x_1, \ldots, x_k \rangle$.
Definition 1.2. We call $P$ a projection matrix if:
1. $P: \mathbb{R}^n \to \mathbb{R}^n$.
2. $P = P'$.
3. $P^2 = P$.
Fact 1.9. In our particular case, $P = X(X'X)^{-1}X'$.
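The defining properties of $P$, and the orthogonality of the residual to $C(X)$, can be verified numerically. A minimal sketch, where the design matrix $X$ and the response $y$ are hypothetical random draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical full-column-rank design matrix X (n = 6, k = 2) and response y.
X = rng.standard_normal((6, 2))
y = rng.standard_normal(6)

# Fact 1.9: P = X (X'X)^{-1} X' projects onto the column space C(X).
P = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(P, P.T)    # symmetric (Definition 1.2, item 2)
assert np.allclose(P @ P, P)  # idempotent (Definition 1.2, item 3)

# The residual e = y - Py is orthogonal to every column of X.
e = y - P @ y
assert np.allclose(X.T @ e, np.zeros(2), atol=1e-10)
```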