
Page 1:

Linear Algebra Review 1

CS479/679 Pattern Recognition
Dr. George Bebis

Page 2:

n-dimensional Vector

• An n-dimensional vector v is denoted as follows:

v = (x1, x2, ..., xn)T, a column vector with components x1, x2, ..., xn

• The transpose vT is denoted as follows:

vT = (x1, x2, ..., xn), the corresponding row vector

Page 3:

Inner (or dot) product

• Given vT = (x1, x2, ..., xn) and wT = (y1, y2, ..., yn), their dot product is defined as follows:

v · w = vTw = x1y1 + x2y2 + ... + xnyn

(scalar)
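As a quick numerical check of the definition, here is a small sketch in numpy (numpy is not part of the slides; the vectors are made up for illustration):

import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 2.0])

# Dot product: sum of elementwise products, x1*y1 + x2*y2 + x3*y3
print(np.dot(v, w))   # 1*4 + 2*(-1) + 3*2 = 8.0
print(v @ w)          # same scalar, using the @ operator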

Page 4:

Orthogonal / Orthonormal vectors

• A set of vectors x1, x2, ..., xn is orthogonal if

xiTxj = 0 for all i ≠ j

• A set of vectors x1, x2, ..., xn is orthonormal if

xiTxj = 0 for i ≠ j and xiTxi = 1 for every i

(i.e., the vectors are mutually orthogonal and each has unit length)
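A small numpy sketch (the specific vectors are illustrative, not from the slide) that verifies a set of three vectors is orthonormal by checking all pairwise dot products:

import numpy as np

s = 1.0 / np.sqrt(2.0)
x1 = np.array([s,  s, 0.0])
x2 = np.array([s, -s, 0.0])
x3 = np.array([0.0, 0.0, 1.0])

for i, xi in enumerate([x1, x2, x3]):
    for j, xj in enumerate([x1, x2, x3]):
        expected = 1.0 if i == j else 0.0   # 1 on the diagonal, 0 off it
        assert np.isclose(np.dot(xi, xj), expected)
print("x1, x2, x3 form an orthonormal set")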

Page 5:

Linear combinations

• A vector v is a linear combination of the vectors v1, ..., vk if:

v = c1v1 + c2v2 + ... + ckvk

where c1, ..., ck are constants.

• Example: every vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)
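The example can be checked numerically; the numpy sketch below (illustrative values) reconstructs a vector from i, j, and k using its components as the coefficients:

import numpy as np

i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])
k = np.array([0.0, 0.0, 1.0])

v = np.array([2.0, -5.0, 3.0])

# v is the linear combination 2*i + (-5)*j + 3*k
assert np.allclose(v, 2.0 * i + (-5.0) * j + 3.0 * k)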

Page 6:

Space spanning

• A set of vectors S = (v1, v2, ..., vk) spans a space W if every vector in W can be written as a linear combination of the vectors in S

- The unit vectors i, j, and k span R3


Page 7:

Linear dependence

• A set of vectors v1, ..., vk is linearly dependent if at least one of them is a linear combination of the others:

vj = Σ(i ≠ j) ci vi

(i.e., vj does not appear on the right side)

Page 8:

Linear independence

• A set of vectors v1, ..., vk is linearly independent if no vector can be represented as a linear combination of the remaining vectors, i.e.:

c1v1 + c2v2 + ... + ckvk = 0 only if c1 = c2 = ... = ck = 0

Example:
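A practical test, not spelled out on the slide, is to stack the vectors as columns and compare the matrix rank to the number of vectors; a numpy sketch with made-up vectors:

import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2.0 * v2            # deliberately a linear combination of v1 and v2

# Rank equals the number of columns only when the columns are independent
print(np.linalg.matrix_rank(np.column_stack([v1, v2])))       # 2 -> independent
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))   # 2 -> dependent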

Page 9:

Vector basis

• A set of vectors (v1, ..., vk) forms a basis in some vector space W if:

(1) (v1, ..., vk) are linearly independent

(2) (v1, ..., vk) span W

• Standard bases: (1, 0) and (0, 1) for R2; i, j, k for R3; and e1, e2, ..., en for Rn, where ei has a 1 in position i and 0s elsewhere

Page 10:

Matrix Operations

• Matrix addition/subtraction - the matrices must be of the same size.

• Matrix multiplication

Condition: n = q

(m x n) times (q x p) gives an (m x p) result, i.e., the number of columns of the first matrix must equal the number of rows of the second.
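The dimension condition is easy to see in code; in the numpy sketch below the shapes are arbitrary examples:

import numpy as np

A = np.ones((2, 3))   # m x n = 2 x 3
B = np.ones((3, 4))   # q x p = 3 x 4, so n = q = 3 and the product is defined

C = A @ B
print(C.shape)        # (2, 4), i.e., m x p

# Reversing the order gives (3 x 4) times (2 x 3): inner dimensions 4 and 2
# do not match, so numpy raises an error
try:
    B @ A
except ValueError as e:
    print("product undefined:", e)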

Page 11:

Identity Matrix

• The identity matrix I is a square matrix with 1s on the diagonal and 0s elsewhere; for any matrix A of compatible size, AI = A and IA = A.

Page 12:

Matrix Transpose

• The transpose AT of a matrix A is obtained by interchanging its rows and columns, i.e., (AT)ij = Aji; a useful property is (AB)T = BTAT.

Page 13:

Symmetric Matrices

• A matrix A is symmetric if A = AT, i.e., Aij = Aji for all i and j.

Example: the 2 x 2 matrix with rows (1, 2) and (2, 5) satisfies A = AT.

Page 14:

Determinants

2 x 2: det(A) = a11a22 - a12a21

3 x 3: det(A) = a11(a22a33 - a23a32) - a12(a21a33 - a23a31) + a13(a21a32 - a22a31)

n x n: det(A) = Σj (-1)^(1+j) a1j Mj (cofactor expansion along the first row, where Mj is the determinant of the sub-matrix obtained by deleting row 1 and column j)

Properties: det(AB) = det(A)det(B), det(AT) = det(A), and det(A) ≠ 0 if and only if A is invertible
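A short numpy sketch (the matrices are chosen only for illustration) that evaluates small determinants and checks the product property:

import numpy as np

A2 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
A3 = np.array([[2.0, 0.0, 1.0],
               [1.0, 3.0, 2.0],
               [0.0, 1.0, 1.0]])

print(np.linalg.det(A2))   # 1*4 - 2*3 = -2 (up to floating-point error)
print(np.linalg.det(A3))   # 3

B2 = np.array([[0.0, 1.0],
               [2.0, 5.0]])
# Property: det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A2 @ B2),
                  np.linalg.det(A2) * np.linalg.det(B2))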

Page 15:

Matrix Inverse

• The inverse A-1 of a matrix A has the property: AA-1=A-1A=I

• A-1 exists only if det(A) ≠ 0

• Terminology:
- Singular matrix: A-1 does not exist
- Ill-conditioned matrix: A is “close” to being singular

Page 16:

Matrix Inverse (cont’d)

• Properties of the inverse:

(A-1)-1 = A, (AB)-1 = B-1A-1, (AT)-1 = (A-1)T, det(A-1) = 1/det(A)
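These properties are easy to verify numerically; the numpy sketch below uses arbitrary invertible matrices and also shows the singular case:

import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
B = np.array([[1.0, 2.0],
              [3.0, 5.0]])

A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))     # A A^-1 = I
assert np.allclose(A_inv @ A, np.eye(2))     # A^-1 A = I

# (AB)^-1 = B^-1 A^-1
assert np.allclose(np.linalg.inv(A @ B),
                   np.linalg.inv(B) @ np.linalg.inv(A))

# A singular matrix (det = 0) has no inverse; numpy raises LinAlgError
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S is singular, no inverse exists")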

Page 17:

Matrix trace

• The trace tr(A) of a square matrix A is the sum of its diagonal elements.

Properties: tr(A + B) = tr(A) + tr(B), tr(AB) = tr(BA), tr(A) = tr(AT)

Page 18:

Rank of matrix

• Equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant.

Example: a matrix containing a 3 x 3 sub-matrix with non-zero determinant, and no larger such sub-matrix, has rank 3.

Page 19:

Rank of matrix (cont’d)

• Alternative definition: the maximum number of linearly independent columns (or rows) of A.

Example: a 4 x 4 matrix whose columns are not all linearly independent has rank less than 4, i.e., the rank is not 4! (a numerical stand-in is sketched below)
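As a concrete stand-in for the slide's example (the matrix below is made up), here is a 4 x 4 matrix whose rank is 3 rather than 4, because its last column is the sum of the first two:

import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [3.0, 1.0, 0.0, 4.0],
              [2.0, 2.0, 5.0, 4.0]])

# Column 4 = column 1 + column 2, so at most 3 columns are independent
print(np.linalg.matrix_rank(A))   # 3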

Page 20:

Rank of matrix (cont’d)

Page 21:

Eigenvalues and Eigenvectors

• The vector v is an eigenvector of matrix A and λ is an eigenvalue of A if:

Av = λv (assume non-zero v)

i.e., the linear transformation implied by A cannot change the direction of an eigenvector v, only its magnitude.

Page 22:

Computing λ and v

• To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial:

det(A - λI) = 0

Example:
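A worked numeric example (using my own 2 x 2 matrix, not necessarily the one on the slide), computed with numpy:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial: det(A - lambda*I) = (2 - lambda)^2 - 1 = 0,
# so the eigenvalues are 1 and 3
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                       # [3. 1.] (order may vary)

# Each column of 'eigenvectors' satisfies A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)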

Page 23:

Properties

• Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n)

• Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv)

• Suppose λ1, λ2, ..., λn are the eigenvalues of A; then:

det(A) = λ1λ2 ... λn and tr(A) = λ1 + λ2 + ... + λn

Page 24:

Matrix diagonalization

• Given an n x n matrix A, find P such that: P-1AP=Λ where Λ is diagonal

• Take P = [v1 v2 . . . vn], where v1, v2, ..., vn are the eigenvectors of A; then:

AP = [Av1 Av2 ... Avn] = [λ1v1 λ2v2 ... λnvn] = PΛ, so P-1AP = Λ = diag(λ1, λ2, ..., λn)

Page 25:

Matrix diagonalization (cont’d)

Example:
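A small numpy sketch of the diagonalization procedure, using an illustrative 2 x 2 matrix with distinct eigenvalues (so that P-1 exists); it also checks the decomposition A = PΛP-1 used on a later slide:

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, P = np.linalg.eig(A)     # columns of P are the eigenvectors of A

Lambda = np.linalg.inv(P) @ A @ P     # P^-1 A P
assert np.allclose(Lambda, np.diag(eigenvalues))                     # diagonal matrix of eigenvalues
assert np.allclose(A, P @ np.diag(eigenvalues) @ np.linalg.inv(P))   # A = P Lambda P^-1
print(np.round(Lambda, 6))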

Page 26:

Are all n x n matrices diagonalizable (i.e., does P-1AP = Λ always exist)?

• Only if P-1 exists (i.e., P must have n linearly independent eigenvectors, that is, rank(P) = n)

• If A is diagonalizable, then the corresponding eigenvectors v1, v2, ..., vn form a basis in Rn

Page 27:

Matrix decomposition

• Let us assume that A is diagonalizable; then A can be decomposed as follows:

A = PΛP-1

Page 28:

Special case: symmetric matrices

• The eigenvalues of a symmetric matrix are real and its eigenvectors are orthogonal.

• The eigenvector matrix P can be chosen to be orthogonal, so that P-1 = PT

• The decomposition then becomes:

A = PDPT = λ1v1v1T + λ2v2v2T + ... + λnvnvnT
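The symmetric case can be checked with numpy's eigh routine, which is specialized for symmetric matrices (the matrix below is an arbitrary symmetric example):

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)                    # A is symmetric

# eigh returns real eigenvalues and orthonormal eigenvectors (columns of P)
eigenvalues, P = np.linalg.eigh(A)

assert np.allclose(P.T @ P, np.eye(3))        # P^-1 = P^T (P is orthogonal)
D = np.diag(eigenvalues)
assert np.allclose(A, P @ D @ P.T)            # A = P D P^T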