Linear Algebra and Matrices

Linear Algebra and Matrices. Methods for Dummies, 20th October 2010. Melaine Boly, Christian Lambert


Transcript of Linear Algebra and Matrices

Page 1: Linear Algebra and Matrices

Linear Algebra and Matrices

Methods for Dummies, 20th October 2010

Melaine Boly, Christian Lambert

Page 2: Linear Algebra and Matrices

Overview

• Definitions: scalars, vectors and matrices
• Vector and matrix calculations
• Identity, inverse matrices & determinants
• Eigenvectors & dot products
• Relevance to SPM and terminology

Linear Algebra & Matrices, MfD 2010

Page 3: Linear Algebra and Matrices

Part I: Matrix Basics


Page 4: Linear Algebra and Matrices

Scalar
• A quantity (variable), described by a single real number

e.g. the intensity of each voxel in an MRI scan

Page 5: Linear Algebra and Matrices

Vector
• Not a physics vector (magnitude, direction), i.e. a column of numbers

VECTOR = [x1; x2; …; xn]  (an n x 1 column)

Page 6: Linear Algebra and Matrices

Matrices
• Rectangular display of vectors in rows and columns
• Can inform about the same vector intensity at different times, or different voxels at the same time
• A vector is just an n x 1 matrix

X = [x11 x12 x13; x21 x22 x23; x31 x32 x33]

Page 7: Linear Algebra and Matrices

Matrices

Matrix location/size is defined as rows x columns (R x C).
d_ij : ith row, jth column

Square (3 x 3):
D = [d11 d12 d13; d21 d22 d23; d31 d32 d33]
e.g. A = [1 4 7; 2 5 8; 3 6 9]

Rectangular (3 x 2):
A = [1 4; 2 5; 3 6]

3 dimensional (3 x 3 x 5) matrices also exist.

Page 8: Linear Algebra and Matrices

Matrices in MATLAB

Matrix X: type X = [1 4 7; 2 5 8; 3 6 9]  (";" marks the end of a row)

Reference matrix values as X(row, column). Note: ":" refers to all of a row or column, and "," is the divider between rows and columns.

Description | Type into MATLAB | Meaning
2nd element of 3rd column | X(2,3) | 8
3rd row | X(3,:) | [3 6 9]
Elements 1 & 2 of column 2 | X([1 2], 2) | [4; 5]

Special types of matrix:
All zeros, size 3x1 | zeros(3,1) | [0; 0; 0]
All ones, size 2x2 | ones(2,2) | [1 1; 1 1]

Page 9: Linear Algebra and Matrices

Transposition: columns become rows, and rows become columns.

b = [1; 1; 2], bT = [1 1 2]    d = [3; 4; 9], dT = [3 4 9]

A = [1 2 3; 5 4 1; 6 7 4], AT = [1 5 6; 2 4 7; 3 1 4]
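Not part of the original slides: the transposition rule (entry (i, j) moves to (j, i)) can be sketched in a few lines of plain Python, using a made-up helper name, with the 3x3 example matrix above.

```python
# Transpose: entry (i, j) of A becomes entry (j, i) of A^T.
def transpose(A):
    rows, cols = len(A), len(A[0])
    return [[A[i][j] for i in range(rows)] for j in range(cols)]

A = [[1, 2, 3],
     [5, 4, 1],
     [6, 7, 4]]
print(transpose(A))   # [[1, 5, 6], [2, 4, 7], [3, 1, 4]]
```

In MATLAB the same thing is simply `A'` (or `A.'`).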

Page 10: Linear Algebra and Matrices

Matrix Calculations

Addition
– Commutative: A + B = B + A
– Associative: (A + B) + C = A + (B + C)

A + B = [1 1; 1 1] + [1 1; 1 1] = [2 2; 2 2]

Subtraction
– By adding a negative matrix: A − B = A + (−B)

A − B = [3 4; 5 6] − [1 0; 3 1] = [3 4; 5 6] + [−1 0; −3 −1] = [2 4; 2 5]

Page 11: Linear Algebra and Matrices

Scalar multiplication

Scalar * matrix = scalar multiplication: each element of the matrix is multiplied by the scalar.

Page 12: Linear Algebra and Matrices

Matrix Multiplication
“When A is an m x n matrix and B is a k x l matrix, AB is only possible if n = k. The result will be an m x l matrix.”

(m x n) x (k x l) → defined only when n = k

Simply put, you can ONLY perform A*B if:
Number of columns in A = Number of rows in B
The result is an m x l matrix.

Linear Algebra & Matrices, MfD 2010

Hint: if you see this message in MATLAB:
??? Error using ==> mtimes
Inner matrix dimensions must agree
then the number of columns in A is not equal to the number of rows in B.
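Not from the slides: the dimension rule and the row-times-column sum can be sketched in plain Python (MATLAB does all of this with `C = A * B`). The helper name is made up for the example.

```python
# Matrix product C = A * B, defined only when cols(A) == rows(B).
def matmul(A, B):
    m, n = len(A), len(A[0])
    k, l = len(B), len(B[0])
    if n != k:
        raise ValueError("Inner matrix dimensions must agree")
    # C[i][j] = sum of (ith row of A) * (jth column of B)
    return [[sum(A[i][p] * B[p][j] for p in range(n)) for j in range(l)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]           # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]            # 3 x 2
print(matmul(A, B))       # 2 x 2 result: [[58, 64], [139, 154]]
```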

Page 13: Linear Algebra and Matrices

Matrix multiplication
• Multiplication method: sum over the products of respective rows and columns.

MATLAB does all this for you! Simply type: C = A * B

Hint: you can work out the size of the output (here 2x2). In MATLAB, pre-allocating a matrix of this size (e.g. C = zeros(2,2)) can make loop-based calculations quicker.

Page 14: Linear Algebra and Matrices

Matrix multiplication
• Matrix multiplication is NOT commutative, i.e. the order matters!
– AB ≠ BA

• Matrix multiplication IS associative
– A(BC) = (AB)C

• Matrix multiplication IS distributive
– A(B+C) = AB + AC
– (A+B)C = AC + BC

Page 15: Linear Algebra and Matrices

Identity matrix
A special matrix which plays a similar role to the number 1 in ordinary multiplication.

For any n x n matrix A, we have A In = In A = A.
For any n x m matrix A, we have In A = A and A Im = A (so 2 possible identity matrices).

If the answer is always A, why use an identity matrix? We can't divide matrices, so to solve many problems we have to use the inverse; the identity is important in these types of calculations.

Page 16: Linear Algebra and Matrices

Identity matrix

Worked example: A I3 = A for a 3x3 matrix

[1 2 3]   [1 0 0]   [1+0+0  0+2+0  0+0+3]   [1 2 3]
[4 5 6] X [0 1 0] = [4+0+0  0+5+0  0+0+6] = [4 5 6]
[7 8 9]   [0 0 1]   [7+0+0  0+8+0  0+0+9]   [7 8 9]

• In MATLAB: eye(n) produces an n x n identity matrix (eye(r,c) gives an r x c matrix with ones on the main diagonal).
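Not from the slides: a tiny plain-Python check that multiplying by the identity leaves A unchanged. The helper names are made up for the example.

```python
# Verify A @ I3 == I3 @ A == A for a 3x3 matrix.
def matmul(A, B):
    return [[sum(A[i][p] * B[p][j] for p in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    # ones on the main diagonal, zeros elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
assert matmul(A, identity(3)) == A
assert matmul(identity(3), A) == A
```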

Page 17: Linear Algebra and Matrices

Part II: More Advanced Matrix Techniques

Page 18: Linear Algebra and Matrices

Vector components & orthonormal base

• A given vector (a, b) can be summarized by its components, but only in a particular base (set of axes); the vector itself can be independent of the choice of this particular base.

Example: a and b are the components of v in the given base (the axes chosen for expressing the coordinates in vector space).

[Figure: vector v with component a along the x axis and component b along the y axis]

Orthonormal base: a set of vectors chosen to express the components of the others, perpendicular to each other and all with norm (length) = 1.

Page 19: Linear Algebra and Matrices

Linear combination & dimensionality

Vector space: the space defined by a set of vectors. The vector space defined by some vectors contains them and all the vectors that can be obtained by multiplying these vectors by real numbers and then adding them (linear combinations).

A matrix A (m x n) can itself be decomposed into as many vectors as it has columns (or rows). When decomposed, each column of the matrix can be represented by a vector. The ensemble of n column vectors defines a vector space proper to matrix A. Similarly, A can be viewed as a matrix representation of this ensemble of vectors, expressing their components in a given base.

Page 20: Linear Algebra and Matrices

Linear dependency and rank

If one can find a linear relationship between the rows or columns of a matrix, then the rank of the matrix (the number of dimensions of its vector space) will not be equal to its number of columns/rows: the matrix is said to be rank-deficient.

Example: X = [1 2; 2 4], with columns x1 = [1; 2] and x2 = [2; 4].

[Figure: x1 and x2 plotted in the (x, y) plane]

When plotting the vectors, we see that x1 and x2 are superimposed. On closer inspection, we can express one as a linear combination of the other: x2 = 2 x1.

The rank of the matrix is therefore 1. Correspondingly, the vector space it defines has only one dimension.

Page 21: Linear Algebra and Matrices

Linear dependency and rank

• The rank of a matrix corresponds to the dimensionality of the vector space defined by that matrix. It is the number of vectors defined by the matrix that are linearly independent from each other.

• Linearly independent vectors are vectors that each define one more dimension in space, compared to the space defined by the other vectors. They cannot be expressed as a linear combination of the others.

Note: linearly independent vectors are not necessarily orthogonal (perpendicular). Example: take 3 linearly independent vectors x1, x2 and x3. Vectors x1 and x2 define a plane (x, y), and vector x3 has an additional non-zero component along the z axis; but x3 is not perpendicular to x1 or x2.

[Figure: x1 and x2 in the (x, y) plane, x3 pointing out of the plane]
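As an illustration (not from the slides), the rank of a small matrix can be computed by Gaussian elimination in plain Python; for the rank-deficient example X = [1 2; 2 4] it returns 1. The helper name is made up for the example.

```python
# Rank = number of non-zero pivot rows left after Gaussian elimination.
def rank(M, eps=1e-9):
    A = [row[:] for row in M]           # work on a copy
    rows, cols = len(A), len(A[0])
    r = 0                               # current pivot row
    for c in range(cols):
        # find a row at or below r with a non-zero entry in column c
        pivot = next((i for i in range(r, rows) if abs(A[i][c]) > eps), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        # eliminate column c from all rows below the pivot
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            for j in range(c, cols):
                A[i][j] -= f * A[r][j]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))   # 1  (x2 = 2*x1: rank-deficient)
print(rank([[1, 0], [0, 1]]))   # 2  (full rank)
```

In MATLAB, `rank(X)` does the same job (via the SVD).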

Page 22: Linear Algebra and Matrices

Eigenvalues and eigenvectors

The vectors from matrix X (the eigenvectors of A) can be represented as a set of orthogonal (perpendicular) vectors (this holds when A is symmetric), representing the different dimensions of the original matrix A. The amplitude of matrix A along these different dimensions is given by the eigenvalues corresponding to the different eigenvectors of A (the vectors composing X). Note: if a matrix is rank-deficient, at least one of its eigenvalues is zero.

In Principal Component Analysis (PCA), the matrix is decomposed into eigenvectors and eigenvalues AND the matrix is rotated to a new coordinate system such that the greatest variance by any projection of the data comes to lie on the first coordinate (called the first principal component), the second greatest variance on the second coordinate, and so on.

For A: u1, u2 = eigenvectors; k1, k2 = eigenvalues (A u = k u).
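Not from the slides: for a 2x2 matrix, the eigenvalues can be found by hand as the roots of the characteristic polynomial k² − trace(A)·k + det(A) = 0. A small sketch with made-up example matrices, including the rank-deficient case where one eigenvalue is zero:

```python
import math

# Eigenvalues of a 2x2 matrix [[a, b], [c, d]]:
# roots of k^2 - (a + d) k + (ad - bc) = 0.
def eig2x2(a, b, c, d):
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # real for symmetric matrices
    return (tr + disc) / 2, (tr - disc) / 2

print(eig2x2(2, 1, 1, 2))   # (3.0, 1.0): symmetric example [[2, 1], [1, 2]]
print(eig2x2(1, 2, 2, 4))   # (5.0, 0.0): rank-deficient, one eigenvalue is 0
```

In MATLAB, `[X, K] = eig(A)` returns eigenvectors and eigenvalues for any size of matrix.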

Page 23: Linear Algebra and Matrices

Vector Products

Two vectors: x = [x1; x2; x3], y = [y1; y2; y3]

Inner product xTy is a scalar:
(1 x n)(n x 1)
xTy = x1y1 + x2y2 + x3y3 = Σ (i = 1 to 3) xi yi

Outer product xyT is a matrix:
(n x 1)(1 x n)
xyT = [x1y1 x1y2 x1y3; x2y1 x2y2 x2y3; x3y1 x3y2 x3y3]
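Not from the slides: the scalar-vs-matrix distinction above in plain Python, with made-up helper names.

```python
# Inner product x^T y (a scalar) and outer product x y^T (a matrix).
def inner(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def outer(x, y):
    return [[xi * yj for yj in y] for xi in x]

x = [1, 2, 3]
y = [4, 5, 6]
print(inner(x, y))   # 32 = 1*4 + 2*5 + 3*6
print(outer(x, y))   # [[4, 5, 6], [8, 10, 12], [12, 15, 18]]
```

In MATLAB, with column vectors, these are `x' * y` and `x * y'`.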

Page 24: Linear Algebra and Matrices

Scalar product of vectors

Calculating the scalar product of two vectors is equivalent to projecting one vector onto the other. One can indeed show that x1 · x2 = ‖x1‖ ‖x2‖ cos θ, where θ is the angle separating the two vectors when they have the same origin.

Correspondingly, if two vectors are orthogonal, their scalar product is zero: the projection of one onto the other is zero.

[Figure: the projection of x1 onto x2 has length ‖x1‖ cos θ]
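Not from the slides: rearranging x1 · x2 = ‖x1‖ ‖x2‖ cos θ gives the angle between two vectors. A plain-Python sketch with a made-up helper name:

```python
import math

# Angle between two vectors, from x1 . x2 = |x1| |x2| cos(theta).
def angle(x, y):
    dot = sum(a * b for a, b in zip(x, y))
    norm = lambda v: math.sqrt(sum(a * a for a in v))
    return math.acos(dot / (norm(x) * norm(y)))

print(angle([1, 0], [0, 1]))   # 1.5707963267948966 (pi/2: orthogonal, dot = 0)
print(angle([1, 1], [2, 2]))   # ~0 (same direction)
```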

Page 25: Linear Algebra and Matrices

Determinants

The determinant of a matrix is a scalar number representing certain intrinsic properties of this matrix. It is written det A or |A|.

Its definition is an indispensable detour before tackling the operation corresponding to matrix division: the calculation of the inverse.

Page 26: Linear Algebra and Matrices

Determinants

For a 1x1 matrix: det(A) = a11

For a 2x2 matrix: det(A) = a11 a22 − a12 a21

For a 3x3 matrix:
| a11 a12 a13 |
| a21 a22 a23 | = a11a22a33 + a12a23a31 + a13a21a32 − a11a23a32 − a12a21a33 − a13a22a31
| a31 a32 a33 |
= a11(a22a33 − a23a32) − a12(a21a33 − a23a31) + a13(a21a32 − a22a31)

The determinant of a matrix can be calculated by multiplying each element of one of its rows by the determinant of the sub-matrix formed by the elements that remain when one removes the row and column containing this element, giving the product the sign (−1)^(i+j).

Page 27: Linear Algebra and Matrices

Determinants
• Determinants can only be found for square matrices.
• For a 2x2 matrix A = [a b; c d], det(A) = ad − bc. Let's have a closer look at that:

det(A) = | a b |
         | c d | = ad − bc

The determinant gives an idea of the 'volume' occupied by the matrix in vector space. A matrix A has an inverse matrix A⁻¹ if and only if det(A) ≠ 0.

• In MATLAB: det(A)
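Not from the slides: the 2x2 formula and the first-row cofactor expansion for a 3x3 matrix in plain Python, with made-up helper names.

```python
# det of a 2x2 matrix [[a, b], [c, d]] is ad - bc;
# a 3x3 det expands along the first row with alternating signs.
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

def det3(m):
    return (m[0][0] * det2([[m[1][1], m[1][2]], [m[2][1], m[2][2]]])
          - m[0][1] * det2([[m[1][0], m[1][2]], [m[2][0], m[2][2]]])
          + m[0][2] * det2([[m[1][0], m[1][1]], [m[2][0], m[2][1]]]))

print(det2([[1, 2], [2, 4]]))                     # 0: rank-deficient, singular
print(det3([[1, 0, 0], [0, 2, 0], [0, 0, 3]]))    # 6: product of the diagonal
```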

Page 28: Linear Algebra and Matrices

Determinants

The determinant of a matrix is zero if and only if there exists a linear relationship between the rows or the columns of the matrix, i.e. if the matrix is rank-deficient. Correspondingly, one can define the rank of a matrix A as the size of the largest square sub-matrix of A that has a non-zero determinant.

Example: X = [1 2; 2 4], with columns x1 = [1; 2] and x2 = [2; 4].

[Figure: x1 and x2 plotted in the (x, y) plane]

Here x1 and x2 are superimposed in space, because one can be expressed as a linear combination of the other: x2 = 2 x1.

The determinant of the matrix X is thus zero.

The largest square sub-matrix with a non-zero determinant is 1x1 => the rank of the matrix is 1.

Page 29: Linear Algebra and Matrices

Determinants

• In a vector space of n dimensions, there can be no more than n linearly independent vectors.

• If 3 vectors (2x1) x'1, x'2, x'3 are represented by a matrix X':

X' = [1 2 3; 2 2 1], i.e. x'1 = (1, 2), x'2 = (2, 2), x'3 = (3, 1)

[Figure: the three vectors plotted in the (x, y) plane]

Here x'3 can be expressed as a linear combination of x'1 and x'2, so the three columns are linearly dependent: X' is rank-deficient, and every 3x3 determinant one could form from these vectors would be zero.

The largest square sub-matrix of X' with a non-zero determinant is 2x2 => the rank of the matrix is 2.

Page 30: Linear Algebra and Matrices

Determinants

The notions of determinant, rank, and linear dependency are closely linked.

Take a set of vectors x1, x2, …, xn, all with the same number of elements: these vectors are linearly dependent if one can find a set of scalars c1, c2, …, cn, not all zero, such that:

c1 x1 + c2 x2 + … + cn xn = 0

A set of vectors is linearly dependent if one of them can be expressed as a linear combination of the others. They define in space a smaller number of dimensions than the total number of vectors in the set. The resulting matrix is rank-deficient, and its determinant is zero.

Similarly, if all the elements of a row or column are zero, the determinant of the matrix is zero.

If a matrix has two rows or columns that are equal, its determinant is also zero.

Page 31: Linear Algebra and Matrices

Matrix inverse
• Definition. A matrix A is called nonsingular or invertible if there exists a matrix B such that: AB = BA = In.

• Notation. A common notation for the inverse of a matrix A is A⁻¹. So: A A⁻¹ = A⁻¹ A = In.

• The inverse matrix is unique when it exists. If A is invertible, then AT is also invertible, and (AT)⁻¹ = (A⁻¹)T.

Worked example:
[1 1; −1 2] X [2/3 −1/3; 1/3 1/3] = [2/3 + 1/3, −1/3 + 1/3; −2/3 + 2/3, 1/3 + 2/3] = [1 0; 0 1]

• In MATLAB: A⁻¹ = inv(A). Matrix division: A/B = A*B⁻¹.

Page 32: Linear Algebra and Matrices

Matrix inverse
• For a 2x2 square matrix A = [a b; c d]:

• The inverse matrix is: A⁻¹ = (1 / (ad − bc)) [d −b; −c a]

For a matrix to be invertible, its determinant has to be non-zero (it has to be square and of full rank). A matrix that is not invertible is said to be singular. Reciprocally, a matrix that is invertible is said to be non-singular.
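Not from the slides: the 2x2 inverse formula in plain Python (with a made-up helper name), using the matrix from the worked example, whose determinant is 1·2 − 1·(−1) = 3.

```python
# Inverse of a 2x2 matrix [[a, b], [c, d]] via the 1/(ad - bc) formula.
def inv2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: det(A) = 0")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 1],
     [-1, 2]]        # det = 3
print(inv2(A))       # ~ [[2/3, -1/3], [1/3, 1/3]]
```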

Page 33: Linear Algebra and Matrices

Pseudoinverse

In SPM, design matrices are not square (more rows than columns, especially for fMRI).

Such a system is said to be overdetermined: in general no exact solution exists.

SPM uses a mathematical trick called the pseudoinverse, an approximation used in overdetermined systems: the solution is constrained to be the one that minimises the squared error (and, when several solutions achieve this, the one with minimum norm).
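A sketch (with made-up numbers and helper name, not SPM code) of what the pseudoinverse achieves for an overdetermined system: solving the normal equations (XᵀX) β = XᵀY for a 2-column design matrix, in plain Python.

```python
# Least-squares solution of an overdetermined Y = X.beta
# via the normal equations (X^T X) beta = X^T Y, for a 2-column X.
def lstsq2(X, Y):
    # build the 2x2 matrix X^T X and the 2-vector X^T Y
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
    xty = [sum(r[i] * y for r, y in zip(X, Y)) for i in range(2)]
    det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
    # invert the 2x2 and multiply: beta = (X^T X)^-1 X^T Y
    b0 = (xtx[1][1] * xty[0] - xtx[0][1] * xty[1]) / det
    b1 = (-xtx[1][0] * xty[0] + xtx[0][0] * xty[1]) / det
    return b0, b1

# 4 observations, 2 regressors (intercept + slope): Y = 1 + 2*t exactly.
X = [[1, 0], [1, 1], [1, 2], [1, 3]]
Y = [1, 3, 5, 7]
print(lstsq2(X, Y))   # (1.0, 2.0)
```

In MATLAB, `pinv(X) * Y` (or `X \ Y`) gives the same least-squares estimate, and also handles rank-deficient X.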

Page 34: Linear Algebra and Matrices


Part III: How are matrices relevant to fMRI data?

Page 35: Linear Algebra and Matrices

[Figure: the SPM analysis pipeline]

Image time-series → Realignment → Smoothing → General Linear Model (Design matrix → Parameter estimates) → Statistical Parametric Map → Statistical Inference (RFT), p < 0.05

Normalisation (using an Anatomical reference) and the Spatial filter feed into this pipeline.

Page 36: Linear Algebra and Matrices

Voxel-wise time series analysis

[Figure: BOLD signal over time for a single voxel time series]

Model specification → Parameter estimation → Hypothesis → Statistic → SPM

Page 37: Linear Algebra and Matrices

How are matrices relevant to fMRI data?

GLM equation:

Y = X . β + ε

Y = data vector; X = design matrix; β = parameters; ε = error vector

Page 38: Linear Algebra and Matrices

How are matrices relevant to fMRI data?

Y = data vector (response variable), e.g. the BOLD signal at a particular voxel.

A single voxel is sampled at successive time points. Each voxel is considered an independent observation.

[Figure: after preprocessing, the intensity of one voxel over time gives the time series Y]

Y = X . β + ε

Page 39: Linear Algebra and Matrices

How are matrices relevant to fMRI data?

X = design matrix; β = parameters

Explanatory variables:
– These are assumed to be measured without error.
– May be continuous;
– May be dummy, indicating levels of an experimental factor.

Y = X . β + ε

Solve the equation for β: it tells us how much of the BOLD signal is explained by X.

Page 40: Linear Algebra and Matrices

In Practice

• Estimate MAGNITUDE of signal changes
• MR INTENSITY levels for each voxel at various time points
• Relationships between the experiment and voxel changes are established
• Calculation and notation require linear algebra and matrix manipulations

Page 41: Linear Algebra and Matrices

Summary

• SPM builds up data as a matrix.
• Manipulation of matrices enables unknown values to be calculated.

Y = X . β + ε
Observed = Predictors * Parameters + Error
BOLD = Design Matrix * Betas + Error

Page 42: Linear Algebra and Matrices

References
• SPM course: http://www.fil.ion.ucl.ac.uk/spm/course/
• Web guides:
  http://mathworld.wolfram.com/LinearAlgebra.html
  http://www.maths.surrey.ac.uk/explore/emmaspages/option1.html
  http://www.inf.ed.ac.uk/teaching/courses/fmcs1/ (Formal Modelling in Cognitive Science course)
  http://www.wikipedia.org
• Previous MfD slides

Page 43: Linear Algebra and Matrices

ANY QUESTIONS ?