Overview of SPM

Transcript of Overview of SPM

Page 1: Overview of SPM

Overview of SPM

[Figure: overview of the SPM analysis pipeline. Components: image time-series, realignment, smoothing (kernel), normalisation (template), general linear model (design matrix), parameter estimates, statistical parametric map (SPM), Gaussian field theory, statistical inference at p < 0.05.]

Page 2: Overview of SPM

The General Linear Model (GLM)

With many thanks for slides & images to: FIL Methods group, Virginia Flanagin and Klaas Enno Stephan

Frederike Petzschner

Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering, University of Zurich & ETH Zurich

Page 3: Overview of SPM

Imagine a very simple experiment…

[Figure: alternating rest and listening blocks over time]

• One session
• 7 cycles of rest and listening
• Blocks of 6 scans with a 7 s TR

Page 4: Overview of SPM

Imagine a very simple experiment…

[Figure: single voxel time series over time, showing what we know (the experimental design) and what we measure (the BOLD signal)]

Question: Is there a change in the BOLD response between listening and rest?


Page 6: Overview of SPM

[Schematic: linear model → effects estimate & error estimate → statistic]

You need a model of your data…

Question: Is there a change in the BOLD response between listening and rest?

Page 7: Overview of SPM

Explain your data… as a combination of experimental manipulation, confounds and errors.

Single voxel regression model:

$y = x_1 \beta_1 + x_2 \beta_2 + e$

[Figure: the BOLD signal over time decomposed into the regressors $x_1$, $x_2$ and the error $e$]

Page 8: Overview of SPM

Explain your data… as a combination of experimental manipulation, confounds and errors.

Single voxel regression model, in matrix form:

$y = X\beta + e$

[Figure: the same decomposition of the BOLD signal over time, with the design matrix $X$ containing the regressors $x_1$ and $x_2$]

Page 9: Overview of SPM

The black and white version in SPM: the model written with its matrix dimensions.

$y = X\beta + e$

where $y$ is the data vector ($n \times 1$), $X$ the design matrix ($n \times p$), $\beta$ the parameter vector ($p \times 1$) and $e$ the error vector ($n \times 1$).

n: number of scans
p: number of regressors

[Figure: the data, design matrix and error displayed as grey-scale images]

Page 10: Overview of SPM

The design matrix embodies all available knowledge about experimentally controlled factors and potential confounds. (Talk: Experimental Design, Wed 9:45 – 10:45)

Model assumptions

[Figure: design matrix and error shown as images]

You want to estimate your parameters such that you minimize the sum of squared errors:

$\sum_{t=1}^{n} e_t^2$

This can be done using ordinary least squares (OLS) estimation, assuming i.i.d. errors.
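As a minimal sketch (not from the slides), an OLS fit of such a single-voxel model could look like this, assuming the block layout of the example experiment (7 cycles of rest/listening, 6 scans per block) and a simulated time series:

```python
import numpy as np

# Toy version of the example experiment: 7 cycles of rest and listening,
# 6 scans per block (this exact layout is an assumption for illustration).
block = np.r_[np.zeros(6), np.ones(6)]          # rest = 0, listening = 1
x1 = np.tile(block, 7)                          # task regressor, n = 84 scans
n = x1.size
X = np.column_stack([x1, np.ones(n)])           # design matrix: task + constant

rng = np.random.default_rng(0)
y = X @ np.array([2.0, 100.0]) + rng.normal(0.0, 1.0, n)   # simulated voxel time series

# OLS: minimize the sum of squared errors ||y - X beta||^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                                  # close to the true values [2.0, 100.0]
```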

Page 11: Overview of SPM

The GLM assumes identically and independently distributed (i.i.d.) errors.

i.i.d. = the error covariance is a scalar multiple of the identity matrix:

$\mathrm{Cov}(e) = \sigma^2 I$

Examples (for two time points $t_1$, $t_2$):

$\mathrm{Cov}(e) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ (i.i.d.), $\quad \mathrm{Cov}(e) = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix}$ (non-identity), $\quad \mathrm{Cov}(e) = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (non-independence)

Page 12: Overview of SPM

How to fit the model and estimate the parameters?

"Option 1": by hand.

$y = X\beta + e$

[Figure: the data vector, the design matrix with parameters $\beta_1$, $\beta_2$, and the error vector]

Page 13: Overview of SPM

How to fit the model and estimate the parameters?

$y = X\beta + e$

Data predicted by our model: $\hat{y} = X\hat{\beta}$. Error between predicted and actual data: $e = y - X\hat{\beta}$. The goal is to determine the betas such that we minimize the quadratic error.

OLS (Ordinary Least Squares)

Page 14: Overview of SPM

OLS (Ordinary Least Squares): the goal is to minimize the quadratic error between the data and the model.


Page 16: Overview of SPM

OLS (Ordinary Least Squares): the goal is to minimize the quadratic error between the data and the model. Expanding the squared error, the cross term is a scalar, and the transpose of a scalar is a scalar.


Page 18: Overview of SPM

OLS (Ordinary Least Squares): the goal is to minimize the quadratic error between the data and the model. The cross term in the squared error is a scalar, and the transpose of a scalar is a scalar. You find the extremum of a function by taking its derivative and setting it to zero.

Page 19: Overview of SPM

OLS (Ordinary Least Squares): the goal is to minimize the quadratic error between the data and the model. The cross term in the squared error is a scalar, and the transpose of a scalar is a scalar. You find the extremum of a function by taking its derivative and setting it to zero.

SOLUTION: the OLS estimate of the parameters is $\hat{\beta} = (X^T X)^{-1} X^T y$.
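For reference, the standard OLS derivation behind these steps is:

$e^T e = (y - X\beta)^T (y - X\beta) = y^T y - 2\beta^T X^T y + \beta^T X^T X \beta$

(the cross term $\beta^T X^T y$ is the scalar whose transpose $y^T X \beta$ equals itself)

$\frac{\partial (e^T e)}{\partial \beta} = -2 X^T y + 2 X^T X \beta = 0 \quad\Rightarrow\quad \hat{\beta} = (X^T X)^{-1} X^T y$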

Page 20: Overview of SPM

A geometric perspective on the GLM

[Figure: the data vector $y$ projected onto the design space defined by the columns of $X$ ($x_1$, $x_2$); the residual $e$ is orthogonal to that space]

$\hat{y} = X\hat{\beta}$

OLS estimates: $\hat{\beta} = (X^T X)^{-1} X^T y$
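A small numeric check of this picture (illustrative only, with arbitrary simulated data): the OLS fit is the orthogonal projection of $y$ onto the design space, so the residual is orthogonal to every column of $X$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
X = np.column_stack([rng.normal(size=n), np.ones(n)])   # two regressors: x1 and a constant
y = rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)    # (X^T X)^{-1} X^T y
y_hat = X @ beta_hat                            # projection of y onto the design space
e = y - y_hat                                   # residual

print(np.round(X.T @ e, 10))                    # ~ [0, 0]: residual orthogonal to x1 and x2
```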

Page 21: Overview of SPM

Correlated and orthogonal regressors

[Figure: design space defined by X, spanned by $x_1$ and $x_2$; $x_2^*$ is the component of $x_2$ orthogonal to $x_1$]

Correlated regressors = explained variance is shared between the regressors.

$y = x_1\beta_1 + x_2\beta_2 + e$

$y = x_1\beta_1^* + x_2^*\beta_2^* + e$

When $x_2$ is orthogonalized with regard to $x_1$, only the parameter estimate for $x_1$ changes, not that for $x_2$!
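A small simulation of this point (illustrative, with assumed numbers): orthogonalizing $x_2$ against $x_1$ leaves the estimate for $x_2$ unchanged, while the estimate for $x_1$ absorbs the shared variance.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(scale=0.5, size=n)        # x2 correlated with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.3, size=n)

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

b = ols(np.column_stack([x1, x2]), y)

# Orthogonalize x2 with respect to x1 (remove the part of x2 explained by x1)
x2_star = x2 - x1 * (x1 @ x2) / (x1 @ x1)
b_star = ols(np.column_stack([x1, x2_star]), y)

print(b)        # e.g. [~1.0, ~1.0]
print(b_star)   # beta1* changes (absorbs the shared variance); beta2* equals b[1]
```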

Page 22: Overview of SPM

[Schematic: linear model → effects estimate & error estimate → statistic]

We are nearly there…

[Figure: the model $y = X\beta + e$ as images]

Page 23: Overview of SPM

What are the problems?

1. BOLD responses have a delayed and dispersed form.

2. The BOLD signal includes substantial amounts of low-frequency noise.

3. The data are serially correlated (temporally autocorrelated); this violates the assumptions of the noise model in the GLM.

Page 24: Overview of SPM

Problem 1: Shape of the BOLD response

The response of a linear time-invariant (LTI) system is the convolution of the input with the system's response to an impulse (delta function):

$(f \otimes g)(t) = \int_0^t f(\tau)\, g(t-\tau)\, d\tau$

Page 25: Overview of SPM

Solution: Convolution model of the BOLD response

expected BOLD response = input function ⊗ impulse response function (HRF)

$(f \otimes g)(t) = \int_0^t f(\tau)\, g(t-\tau)\, d\tau$

[Figure: blue = data; green = predicted response, taking the HRF into account (convolved with the HRF); red = predicted response, NOT taking the HRF into account]
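A minimal sketch of this convolution model (the double-gamma HRF parameters below are commonly used illustrative values, assumed here, not necessarily SPM's exact canonical HRF):

```python
import numpy as np
from scipy.stats import gamma

# Work at 1 s resolution, then sample at the scan times (TR = 7 s, as in the example).
TR, n_scans = 7, 84
dt = 1.0
t = np.arange(0, n_scans * TR, dt)

# Boxcar input: alternating rest/listening blocks of 6 scans (42 s each)
box = (np.floor(t / (6 * TR)) % 2).astype(float)

# Double-gamma HRF (illustrative, assumed parameter values)
t_hrf = np.arange(0, 32, dt)
hrf = gamma.pdf(t_hrf, 6) - gamma.pdf(t_hrf, 16) / 6
hrf /= hrf.sum()

# Expected BOLD response = input function convolved with the HRF, sampled at each scan
bold = np.convolve(box, hrf)[: t.size]
regressor = bold[:: int(TR / dt)]            # one value per scan
print(regressor.shape)                        # (84,)
```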

Page 26: Overview of SPM

Problem 2: Low frequency noise

[Figure: blue = data; black = mean + low-frequency drift; green = predicted response, taking the low-frequency drift into account; red = predicted response, NOT taking the low-frequency drift into account]

Page 27: Overview of SPM

Problem 2: Low frequency noise

[Figure: blue = data; black = mean + low-frequency drift; green = predicted response, taking the low-frequency drift into account; red = predicted response, NOT taking the low-frequency drift into account]

Linear model

Page 28: Overview of SPM

Solution 2: High-pass filtering

[Figure: a discrete cosine transform (DCT) basis set of low-frequency regressors]
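A minimal sketch of building such a DCT drift basis (the 128 s cutoff is a commonly used value; the exact construction here is illustrative):

```python
import numpy as np

def dct_drift_basis(n_scans, tr, cutoff=128.0):
    """Low-frequency DCT regressors with periods longer than the cutoff (in seconds)."""
    order = int(np.floor(2 * n_scans * tr / cutoff))      # number of slow basis functions
    t = np.arange(n_scans)
    basis = [np.cos(np.pi * k * (2 * t + 1) / (2 * n_scans)) for k in range(1, order + 1)]
    return np.column_stack(basis) if basis else np.empty((n_scans, 0))

X_drift = dct_drift_basis(n_scans=84, tr=7.0, cutoff=128.0)
print(X_drift.shape)   # e.g. (84, 9): add these columns to the design matrix
```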

Page 29: Overview of SPM

Problem 3: Serial correlations

[Figure: $n \times n$ error covariance matrices (n = number of scans), illustrating i.i.d., non-identity and non-independence for time points $t_1$, $t_2$]

Page 30: Overview of SPM

Problem 3: Serial correlations

1st-order autoregressive process, AR(1):

$e_t = a\, e_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \sigma^2)$

[Figure: the resulting $n \times n$ error covariance matrix (n = number of scans) and its autocovariance function]
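A small simulation of such an AR(1) process (the coefficient a = 0.4 is an arbitrary illustrative choice):

```python
import numpy as np

# Simulate AR(1) noise e_t = a * e_{t-1} + eps_t and inspect its autocorrelation.
rng = np.random.default_rng(3)
n, a, sigma = 84, 0.4, 1.0

e = np.zeros(n)
for t in range(1, n):
    e[t] = a * e[t - 1] + rng.normal(0, sigma)

# Empirical lag-1 autocorrelation is close to a (for long series)
lag1 = np.corrcoef(e[:-1], e[1:])[0, 1]

# Theoretical correlation structure of an AR(1) process: Corr(e_t, e_{t+k}) = a^|k|
V = a ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
print(round(lag1, 2), V[0, :4])
```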

Page 31: Overview of SPM

Problem 3: Serial correlations

• Pre-whitening:
1. Use an enhanced noise model with multiple error covariance components, i.e. $e \sim N(0, \sigma^2 V)$ instead of $e \sim N(0, \sigma^2 I)$.
2. Use the estimated serial correlation to specify a filter matrix $W$ for whitening the data:

$Wy = WX\beta + We$

The whitened error $We$ is i.i.d.

Page 32: Overview of SPM

How do we define W?

• Enhanced noise model: $e \sim N(0, \sigma^2 V)$, and $Wy = WX\beta + We$

• Remember the linear transform of a Gaussian: if $x \sim N(\mu, \sigma^2)$ and $y = ax$, then $y \sim N(a\mu, a^2\sigma^2)$

• Choose W such that the error covariance becomes spherical: $We \sim N(0, \sigma^2 W V W^T)$, so require $W V W^T = I$, i.e. $W = V^{-1/2}$

• Conclusion: W is a simple function of V, so how do we estimate V?
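A minimal sketch of pre-whitening with $W = V^{-1/2}$, assuming an AR(1) form for $V$ (in SPM, $V$ is estimated with ReML rather than fixed like this):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

# Assume an AR(1) error covariance V (fixed here for illustration only)
# and whiten the model with W = V^(-1/2).
n, a = 84, 0.4
V = a ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
W = fractional_matrix_power(V, -0.5).real

rng = np.random.default_rng(4)
X = np.column_stack([np.tile(np.r_[np.zeros(6), np.ones(6)], 7), np.ones(n)])
e = np.linalg.cholesky(V) @ rng.normal(size=n)         # serially correlated noise
y = X @ np.array([2.0, 100.0]) + e

# OLS on the whitened model Wy = WX beta + We
Wy, WX = W @ y, W @ X
beta_hat = np.linalg.solve(WX.T @ WX, WX.T @ Wy)
print(beta_hat)                                         # ~ [2.0, 100.0]

# Check: the whitened error covariance W V W^T is (numerically) the identity
print(np.allclose(W @ V @ W.T, np.eye(n), atol=1e-6))
```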

Page 33: Overview of SPM

Find V: Multiple covariance components

Enhanced noise model: $e \sim N(0, \sigma^2 V)$, with the error covariance $V = \mathrm{Cov}(e)$ expressed as a combination of covariance components $Q_i$ and hyperparameters $\lambda_i$:

$V = \sum_i \lambda_i Q_i$

[Figure: $V = \lambda_1 Q_1 + \lambda_2 Q_2$]

Estimation of the hyperparameters with EM (expectation maximisation) or ReML (restricted maximum likelihood).

Page 34: Overview of SPM

[Schematic: linear model → effects estimate & error estimate → statistic]

We are there…

[Figure: the model $y = X\beta + e$ as images]

• GLM includes all known experimental effects and confounds

• Convolution with a canonical HRF

• High-pass filtering to account for low-frequency drifts

• Estimation of multiple variance components (e.g. to account for serial correlations)

Page 35: Overview of SPM

[Schematic: linear model → effects estimate & error estimate → statistic]

We are there…

[Figure: the model $y = X\beta + e$ as images]

$c^T = [1\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0\; 0]$

Null hypothesis: $\beta_1 = 0$

$t = \dfrac{c^T \hat{\beta}}{\mathrm{Std}(c^T \hat{\beta})}$

Talk: Statistical Inference and design efficiency. (Next talk)
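A minimal sketch of computing such a t-statistic for a contrast (assuming i.i.d. errors for simplicity; with pre-whitening one would use $Wy$ and $WX$ instead):

```python
import numpy as np

# Fit the GLM and compute the t-statistic for a contrast c (first regressor vs. baseline).
rng = np.random.default_rng(5)
n = 84
X = np.column_stack([np.tile(np.r_[np.zeros(6), np.ones(6)], 7), np.ones(n)])
y = X @ np.array([2.0, 100.0]) + rng.normal(size=n)

p = X.shape[1]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)                 # error variance estimate

c = np.array([1.0, 0.0])                             # null hypothesis: beta_1 = 0
var_contrast = sigma2_hat * c @ np.linalg.inv(X.T @ X) @ c
t = (c @ beta_hat) / np.sqrt(var_contrast)
print(round(t, 2))                                   # compare against a t distribution with n - p dof
```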

Page 36: Overview of SPM

We are there…

[Figure: single voxel time series over time]

• Mass-univariate approach: the GLM is applied to > 100,000 voxels
• At a threshold of p < 0.05, more than 5,000 voxels would be significant by chance: a massive problem with multiple comparisons!
• Solution: Gaussian random field theory

Page 37: Overview of SPM

Outlook: further challenges

• correction for multiple comparisons Talk: Multiple Comparisons Wed 8:30 – 9:30

• variability in the HRF across voxels Talk: Experimental Design Wed 9:45 – 10:45

• limitations of frequentist statistics Talk: entire Friday

• GLM ignores interactions among voxels Talk: Multivariate Analysis Thu 12:30 – 13:30

Page 38: Overview of SPM

Thank you!

• Friston, Ashburner, Kiebel, Nichols, Penny (2007) Statistical Parametric Mapping: The Analysis of Functional Brain Images. Elsevier.

• Christensen R (1996) Plane Answers to Complex Questions: The Theory of Linear Models. Springer.

• Friston KJ et al. (1995) Statistical parametric maps in functional imaging: a general linear approach. Human Brain Mapping 2: 189-210.
