Lecture 6 Resolution and Generalized Inverses

Page 1

Lecture 6

Resolution and

Generalized Inverses

Page 2

Syllabus

Lecture 01 Describing Inverse Problems
Lecture 02 Probability and Measurement Error, Part 1
Lecture 03 Probability and Measurement Error, Part 2
Lecture 04 The L2 Norm and Simple Least Squares
Lecture 05 A Priori Information and Weighted Least Squares
Lecture 06 Resolution and Generalized Inverses
Lecture 07 Backus-Gilbert Inverse and the Trade Off of Resolution and Variance
Lecture 08 The Principle of Maximum Likelihood
Lecture 09 Inexact Theories
Lecture 10 Nonuniqueness and Localized Averages
Lecture 11 Vector Spaces and Singular Value Decomposition
Lecture 12 Equality and Inequality Constraints
Lecture 13 L1, L∞ Norm Problems and Linear Programming
Lecture 14 Nonlinear Problems: Grid and Monte Carlo Searches
Lecture 15 Nonlinear Problems: Newton’s Method
Lecture 16 Nonlinear Problems: Simulated Annealing and Bootstrap Confidence Intervals
Lecture 17 Factor Analysis
Lecture 18 Varimax Factors, Empirical Orthogonal Functions
Lecture 19 Backus-Gilbert Theory for Continuous Problems; Radon’s Problem
Lecture 20 Linear Operators and Their Adjoints
Lecture 21 Fréchet Derivatives
Lecture 22 Exemplary Inverse Problems, incl. Filter Design
Lecture 23 Exemplary Inverse Problems, incl. Earthquake Location
Lecture 24 Exemplary Inverse Problems, incl. Vibrational Problems

Page 3

Purpose of the Lecture

Introduce the idea of a Generalized Inverse, the Data and Model Resolution Matrices,

and the Unit Covariance Matrix

Quantify the spread of resolution and the size of the covariance

Use the minimization of the spread of resolution and/or the size of covariance as the guiding principle for solving inverse problems

Page 4

Part 1

The Generalized Inverse,

the Data and Model Resolution Matrices

and the Unit Covariance Matrix

Page 5

all of the solutions

of the form m^est = M d + v

Page 6

m^est = M d + v: let’s focus on this matrix M

Page 7

m^est = G^{-g} d + v: rename the matrix the “generalized inverse” and use the symbol G^{-g}

Page 8

(let’s ignore the vector v for a moment)

The generalized inverse G^{-g} operates on the data to give an estimate of the model parameters:

if d^pre = G m^est then m^est = G^{-g} d^obs

Page 9

Generalized inverse G^{-g}: if d^pre = G m^est then m^est = G^{-g} d^obs

it sort of looks like a matrix inverse, except that it is M⨉N, not square, and G G^{-g} ≠ I and G^{-g} G ≠ I

Page 10

so, actually, the generalized inverse is not a matrix inverse at all

Page 11

d^pre = G m^est and m^est = G^{-g} d^obs

plug one equation into the other:

d^pre = N d^obs with N = G G^{-g}, the “data resolution matrix”
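As an illustrative sketch (not part of the slides), the relation d^pre = N d^obs is easy to verify numerically; a random G and the simple least-squares generalized inverse are assumed here purely as an example.

```python
import numpy as np

# Minimal sketch (assumed example, not from the slides): build N = G G^-g
# for one particular choice of generalized inverse and check d_pre = N d_obs.
np.random.seed(0)
G = np.random.randn(8, 3)              # hypothetical data kernel: 8 data, 3 model parameters
d_obs = np.random.randn(8)             # hypothetical observed data

G_g = np.linalg.inv(G.T @ G) @ G.T     # simple least-squares generalized inverse [G^T G]^-1 G^T
m_est = G_g @ d_obs                    # m_est = G^-g d_obs
d_pre = G @ m_est                      # d_pre = G m_est

N = G @ G_g                            # data resolution matrix
print(np.allclose(d_pre, N @ d_obs))   # True: d_pre = N d_obs
```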

Page 12

Data Resolution Matrix, N

How much does d_i^obs contribute to its own prediction?

d^pre = N d^obs

Page 13

if N = I

d_i^pre = d_i^obs, i.e. d^pre = d^obs

d_i^obs completely controls its own prediction

Page 14

[Figure (A): the matrix equation d^pre = N d^obs, with the elements of N displayed]

The closer N is to I, the more d_i^obs controls its own prediction

Page 15

[Figure: two plots of data d versus z, with z ranging from 0 to 10 and d from 0 to 15]

straight line problem

Page 16

[Figure: the matrix equation d^pre = N d^obs, with the elements of N displayed as a grid indexed by row i and column j]

only the data at the ends control their own prediction
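A small numerical sketch (the grid of z values is an assumed example) reproduces this behavior for the straight-line problem: the diagonal of N is largest for the first and last data points.

```python
import numpy as np

# Minimal sketch (assumed set-up): straight-line fit d_i = m1 + m2*z_i.
# The diagonal of N = G G^-g is largest at the ends of the z range, so only
# the data at the ends strongly control their own prediction.
z = np.linspace(0, 10, 11)
G = np.column_stack([np.ones_like(z), z])   # data kernel for a straight line
G_g = np.linalg.inv(G.T @ G) @ G.T          # least-squares generalized inverse
N = G @ G_g                                 # data resolution matrix

print(np.round(np.diag(N), 2))              # largest values at the first and last points
```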

Page 17

d^obs = G m^true and m^est = G^{-g} d^obs

plug one equation into the other:

m^est = R m^true with R = G^{-g} G, the “model resolution matrix”

Page 18

Model Resolution Matrix, R

How much does m_i^true contribute to its own estimated value?

m^est = R m^true

Page 19

if R = I

m_i^est = m_i^true, i.e. m^est = m^true

m_i^est reflects m_i^true only

Page 20

else, if R ≠ I:

m_i^est = … + R_{i,i-1} m_{i-1}^true + R_{i,i} m_i^true + R_{i,i+1} m_{i+1}^true + …

m_i^est is a weighted average of all the elements of m^true

Page 21

[Figure: the matrix equation m^est = R m^true, with the elements of R displayed]

The closer R is to I, the more m_i^est reflects only m_i^true

Page 22

Discrete version of the Laplace Transform

large c: d is a “shallow” average of m(z)
small c: d is a “deep” average of m(z)

Page 23

[Figure: m(z) is multiplied by e^{−c_hi z} and integrated to give d_hi, and multiplied by e^{−c_lo z} and integrated to give d_lo]

Page 24

[Figure: the matrix equation m^est = R m^true, with the elements of R displayed as a grid indexed by row i and column j]

the shallowest model parameters are “best resolved”
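The sketch below assumes a simple discretization of the exponential kernel (the depth grid, decay constants, and dz factor are illustrative choices, not taken from the slides) and shows that the diagonal of R is concentrated at shallow depths.

```python
import numpy as np

# Minimal sketch (assumed discretization): d_i = sum_j exp(-c_i z_j) m_j dz,
# an under-determined problem solved with the minimum-length generalized
# inverse G^-g = G^T [G G^T]^-1.
z = np.linspace(0.0, 10.0, 101)           # model depths: 101 model parameters
dz = z[1] - z[0]
c = np.array([0.5, 1.0, 2.0, 4.0])        # four decay constants: four data
G = np.exp(-np.outer(c, z)) * dz          # data kernel, G_ij = exp(-c_i z_j) dz

G_g = G.T @ np.linalg.inv(G @ G.T)        # minimum-length generalized inverse
R = G_g @ G                               # model resolution matrix
diag_R = np.diag(R)

# the diagonal of R is concentrated at shallow depths: the shallowest
# model parameters are the best resolved
print(diag_R[:20].mean() > diag_R[-20:].mean())   # True
```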

Page 25

Covariance associated with the Generalized Inverse

the covariance of the estimated model parameters is [cov m] = G^{-g} [cov d] [G^{-g}]^T

the “unit covariance matrix” divides by σ² to remove the effect of the overall magnitude of the measurement error
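A minimal sketch of this covariance propagation, assuming uncorrelated data with uniform variance σ² and the simple least-squares generalized inverse (both assumptions are illustrative):

```python
import numpy as np

# Minimal sketch (assumed set-up): propagate the data covariance through the
# generalized inverse, [cov m] = G^-g [cov d] G^-g^T, and divide by sigma^2
# to obtain the unit covariance matrix.
np.random.seed(1)
G = np.random.randn(20, 3)                 # hypothetical data kernel
sigma2 = 0.5                               # variance of the uncorrelated data

G_g = np.linalg.inv(G.T @ G) @ G.T         # least-squares generalized inverse
cov_d = sigma2 * np.eye(20)                # [cov d] = sigma^2 I
cov_m = G_g @ cov_d @ G_g.T                # covariance of the estimated model parameters
cov_u = cov_m / sigma2                     # unit covariance matrix

print(np.allclose(cov_u, np.linalg.inv(G.T @ G)))   # True for simple least squares
```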

Page 26

unit covariance for the straight line problem: [cov_u m] = [G^T G]^{-1}, whose off-diagonal term is proportional to −Σ_i z_i

the model parameters are uncorrelated when this term is zero, which happens when the data are centered about the origin
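A short numerical check (with an illustrative set of z values): the off-diagonal element of the unit covariance vanishes when the z values are centered about the origin.

```python
import numpy as np

# Sketch: unit covariance [G^T G]^-1 for the straight-line problem.
# Its off-diagonal term is proportional to -sum(z_i), so the two model
# parameters become uncorrelated when the data are centered about the origin.
def unit_covariance(z):
    G = np.column_stack([np.ones_like(z), z])
    return np.linalg.inv(G.T @ G)

z_uncentered = np.linspace(0, 10, 11)
z_centered = z_uncentered - z_uncentered.mean()

print(np.round(unit_covariance(z_uncentered), 3))   # nonzero off-diagonal terms
print(np.round(unit_covariance(z_centered), 3))     # off-diagonal terms are zero
```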

Page 27

Part 2

The spread of resolution and the size of the covariance

Page 28

a resolution matrix has small spread if only its main diagonal has large elements

i.e., it is close to the identity matrix

Page 29

“Dirichlet” Spread Functions
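The extracted slide omits the formula; the sketch below assumes the usual Dirichlet definition, spread(R) = Σ_ij (R_ij − δ_ij)², i.e. the squared Frobenius distance from the identity matrix.

```python
import numpy as np

# Sketch of a Dirichlet spread function (assumed standard definition):
# spread(R) = sum_ij (R_ij - delta_ij)^2 = || R - I ||_F^2, and likewise for N.
def dirichlet_spread(R):
    return np.sum((R - np.eye(R.shape[0])) ** 2)

# illustrative example: an under-determined problem, so R != I and spread(R) > 0
np.random.seed(2)
G = np.random.randn(4, 6)                  # 4 data, 6 model parameters
G_g = G.T @ np.linalg.inv(G @ G.T)         # minimum-length generalized inverse
R = G_g @ G
print(dirichlet_spread(R) > 0)             # True
```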

Page 30

a unit covariance matrix has small size if its diagonal elements are small

error in the data corresponds to only small error in the model parameters

(ignore correlations)
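A matching sketch, taking the size to be the sum of the diagonal elements of the unit covariance matrix (an assumed, conventional choice):

```python
import numpy as np

# Sketch of the "size" of a unit covariance matrix (assumed definition:
# the sum of its diagonal elements, i.e. correlations are ignored).
def covariance_size(cov_u):
    return np.trace(cov_u)

np.random.seed(3)
G = np.random.randn(20, 3)                 # hypothetical data kernel
cov_u = np.linalg.inv(G.T @ G)             # unit covariance for simple least squares
print(covariance_size(cov_u))              # a single non-negative number
```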

Page 31

Page 32

Part 3

minimization of the spread of resolution

and/or the size of covariance

as the guiding principle for creating a generalized inverse

Page 33

over-determined case

note that for simple least squares, G^{-g} = [G^T G]^{-1} G^T, the model resolution is

R = G^{-g} G = [G^T G]^{-1} G^T G = I

always the identity matrix
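A quick numerical confirmation, with an illustrative over-determined G:

```python
import numpy as np

# Illustrative check: for an over-determined problem, the simple least-squares
# generalized inverse gives a model resolution matrix of exactly I.
np.random.seed(4)
G = np.random.randn(15, 4)                 # more data than model parameters
G_g = np.linalg.inv(G.T @ G) @ G.T         # [G^T G]^-1 G^T
R = G_g @ G
print(np.allclose(R, np.eye(4)))           # True
```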

Page 34

suggests that we try to minimize the spread of the data resolution matrix, N

find the G^{-g} that minimizes spread(N)

Page 35

spread of the k-th row of N: spread_k = Σ_i (N_{ki} − δ_{ki})²

now compute its derivative with respect to the elements of G^{-g} and set it to zero

Page 36

first term

Page 37

second term

third term is zero

Page 38

putting it all together: [G^T G] G^{-g} = G^T

which is just simple least squares, G^{-g} = [G^T G]^{-1} G^T

Page 39

the simple least squares solution minimizes the spread of data resolution

and has zero spread of the model resolution

Page 40

under-determined case

note that for the minimum length solution, G^{-g} = G^T [G G^T]^{-1}, the data resolution is

N = G G^{-g} = G G^T [G G^T]^{-1} = I

always the identity matrix
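The corresponding check for an illustrative under-determined G:

```python
import numpy as np

# Illustrative check: for an under-determined problem, the minimum-length
# generalized inverse gives a data resolution matrix of exactly I.
np.random.seed(5)
G = np.random.randn(4, 15)                 # fewer data than model parameters
G_g = G.T @ np.linalg.inv(G @ G.T)         # G^T [G G^T]^-1
N = G @ G_g
print(np.allclose(N, np.eye(4)))           # True
```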

Page 41

suggests that we try to minimize the spread of the model resolution matrix, R

find the G^{-g} that minimizes spread(R)

Page 42

minimization leads to

G^{-g} [G G^T] = G^T

which is just the minimum length solution, G^{-g} = G^T [G G^T]^{-1}

Page 43

the minimum length solution minimizes the spread of model resolution

and has zero spread of the data resolution

Page 44

general case

minimize the weighted combination α₁ spread(N) + α₂ spread(R) + α₃ size(unit covariance)

Page 45

general case

minimization leads to a Sylvester equation, so there is an explicit solution in terms of matrices
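A sketch of how such a Sylvester equation can be solved numerically. The objective and equation written in the comments are a reconstruction consistent with the two special cases on the following slides, not a formula quoted from the slides, and the weights α₁, α₂, α₃ and the matrix G are illustrative.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Sketch (reconstructed form): minimizing
#   alpha1*spread(N) + alpha2*spread(R) + alpha3*size(cov_u)
# over G^-g gives the Sylvester-type equation
#   [alpha1*G^T G + alpha3*I] G^-g + G^-g [alpha2*G G^T] = (alpha1 + alpha2) G^T,
# which scipy can solve explicitly.
np.random.seed(6)
G = np.random.randn(6, 8)                  # illustrative kernel: 6 data, 8 model parameters
a1, a2, a3 = 1.0, 1.0, 0.01                # illustrative weights

A = a1 * (G.T @ G) + a3 * np.eye(8)        # left coefficient (M x M)
B = a2 * (G @ G.T)                         # right coefficient (N x N)
Q = (a1 + a2) * G.T                        # right-hand side (M x N)
G_g = solve_sylvester(A, B, Q)             # generalized inverse (M x N)

print(np.allclose(A @ G_g + G_g @ B, Q))   # True: the equation is satisfied
```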

Page 46

special case #1: weights α₁ = 1, α₂ = 0, α₃ = ε²

[G^T G + ε² I] G^{-g} = G^T

G^{-g} = [G^T G + ε² I]^{-1} G^T

damped least squares
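A small sketch of the damped least squares generalized inverse, with an illustrative G and damping parameter:

```python
import numpy as np

# Sketch: damped least squares generalized inverse, [G^T G + eps^2 I]^-1 G^T.
# With eps^2 > 0 the model resolution R is no longer exactly I, but the
# inverse exists even when G^T G alone is singular.
np.random.seed(7)
G = np.random.randn(15, 4)                 # illustrative kernel
eps2 = 0.1                                 # illustrative damping parameter

G_g = np.linalg.inv(G.T @ G + eps2 * np.eye(4)) @ G.T
R = G_g @ G
print(np.round(np.diag(R), 3))             # close to, but less than, 1
```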

Page 47

special case #2: weights α₁ = 0, α₂ = 1, α₃ = ε²

G^{-g} [G G^T + ε² I] = G^T

G^{-g} = G^T [G G^T + ε² I]^{-1}

damped minimum length
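And the corresponding sketch for damped minimum length, again with an illustrative G and damping parameter:

```python
import numpy as np

# Sketch: damped minimum-length generalized inverse, G^T [G G^T + eps^2 I]^-1.
# With eps^2 > 0 the data resolution N is no longer exactly I, but the size
# of the unit covariance is smaller than for the undamped solution.
np.random.seed(8)
G = np.random.randn(4, 15)                 # illustrative kernel
eps2 = 0.1                                 # illustrative damping parameter

G_g = G.T @ np.linalg.inv(G @ G.T + eps2 * np.eye(4))
N = G @ G_g
print(np.round(np.diag(N), 3))             # close to, but less than, 1
```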

Page 48

so

no new solutions have arisen …

… just a reinterpretation of previously-derived solutions

Page 49

reinterpretation

instead of solving for estimates of the model parameters,

we are solving for estimates of weighted averages of the model parameters,

where the weights are given by the model resolution matrix

Page 50

a criticism of Dirichlet spread() functions, when m represents m(x),

is that they don’t capture the sense of “being localized” very well

Page 51

These two rows of the model resolution matrix have the same spread …

[Figure: two rows R_ij plotted against the column index j]

… but the left case is better “localized”

Page 52

we will take up this issue in the next lecture