August 1, 2000 Juan [email protected]
Statistical Classification of Anatomic Structures
Juan Ruiz-Alzola, PhD
ULPGC University, Spain, and SPL, Harvard Medical School & Brigham and Women's Hospital
Index of Contents
• General Issues and Motivation
• Decision Theory
• Background Removal
• Towards Automatic Segmentation
• Conclusions
• References
General Issues and Motivation
Manual procedure:
• Sequence of slices
• Mental reconstruction
• Fixed viewpoint
• Difficult fusion
• Subjective assessment
• Requires high skill
General Issues and Motivation
Segmentation is needed for tissue/functional analysis and model construction.
These processes can be implicit or explicit. The latter can be human or automatic.
General Issues and Motivation
Analysis:
• Pathology detection
• Therapy evaluation
• Basic research

Model construction:
• Training
• Surgical planning
• Surgical guidance

Both depend on segmentation.
General Issues and Motivation
Automation => explicit segmentation

Human:
• Structure outlining
• Very slow, high cost
• Impossible in real time
• High variability (15 %)

Automatic:
• Intensity-based (statistical)
• Template-based
• Combined

Mixed: operator-controlled
General Issues and Motivation
General Issues and Motivation
Atlas
General Issues and Motivation
Virtual Colonoscopy
General Issues and Motivation
Surgical planning
(Panels: T1w with contrast, T1w with contrast, T2w.)
General Issues and Motivation
Surgical planning
General Issues and Motivation
Surgical guidance
General Issues and Motivation
Magnetic Resonance Therapy at BWH
Decision Theory

Nature chooses a state θ ∈ Θ and generates an observation z in the observation space Z according to the probability law p_z(z/θ). The agent then applies a decision rule ϕ(z) to pick a decision δ_i ∈ ∆ (the decision space), chosen so as to minimize a loss L(θ, δ).

Example: the state of the economy (state of nature) drives the stock market (observations); the possible decisions are buy, sell, ...
Decision Theory
Taxonomy of problems in Decision Theory

Decision space ∆:
• Discrete => hypothesis testing
  – Binary: H0 (null hypothesis) vs. H1 (alternative hypothesis). Example: detection of a signal in noise.
  – Multiple: H1, ..., Hn. Example: classification and pattern recognition.
• Continuous => estimation theory. Examples: PDF learning, noise filtering, signal restoration.
Decision Theory
A first example: tossing two non-fair coins

An agent is confronted with the following problem: one of two non-fair coins is tossed. After seeing the result (head or tail), the agent must decide which coin was actually tossed. The agent has prior access to both coins in order to analyze them before the experiment is done.

Problem model in terms of Decision Theory:
State of nature => the coin (a or b). Is there any preference to pick a coin?
Observation probabilities:
p_z(H/a) = p_a, p_z(T/a) = 1 − p_a
p_z(H/b) = p_b, p_z(T/b) = 1 − p_b
Decision Theory
Let's assume that we know the probability function (the likelihoods) of both coins:

          HEAD              TAIL
Coin a:   p_z(H/a) = 0.2    p_z(T/a) = 0.8
Coin b:   p_z(H/b) = 0.9    p_z(T/b) = 0.1

A common-sense decision rule would be:
H => choose b
T => choose a
This is the Maximum Likelihood rule.
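The rule above fits in a few lines of Python. A minimal sketch using the likelihoods from the table (the function and dictionary names are mine):

```python
# Likelihoods from the table: coin a -> P(H) = 0.2, coin b -> P(H) = 0.9
LIKELIHOOD = {
    "a": {"H": 0.2, "T": 0.8},
    "b": {"H": 0.9, "T": 0.1},
}

def ml_decide(toss):
    """Maximum Likelihood rule: pick the coin under which the
    observed toss ('H' or 'T') is most probable."""
    return max(LIKELIHOOD, key=lambda coin: LIKELIHOOD[coin][toss])
```

With these numbers, `ml_decide("H")` returns `"b"` and `ml_decide("T")` returns `"a"`, matching the common-sense rule.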
Decision Theory
Estimation of the probabilities: model learning or training

Parametric estimation: the functional form of the PDF is known. In our case (H = 0, T = 1) it is a Bernoulli trial:
p_z(z/θ) = p_θ δ[z] + (1 − p_θ) δ[z − 1]

Learning: repeat the experiment in order to estimate the parameters.
Supervised learning: it is known which class generates each observation.
Several estimation approaches are possible. The most intuitive follows the Law of Large Numbers (direct averaging):
p(H/θ) = N_HEADS / N_TOTAL    p(T/θ) = N_TAILS / N_TOTAL

Exercise: apply the Maximum Likelihood principle to obtain the same estimator.
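Direct averaging can be illustrated with a small simulation standing in for the real experiment; the function name, the simulated tosses, and the default toss count are all mine, not from the slides:

```python
import random

def estimate_p_head(p_head_true, n_tosses=100_000, seed=0):
    """Supervised learning by direct averaging: simulate labelled tosses
    of one coin and estimate P(H) as N_HEADS / N_TOTAL."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p_head_true for _ in range(n_tosses))
    return heads / n_tosses
```

By the Law of Large Numbers the estimate converges to the true parameter; for coin a (P(H) = 0.2) the relative error is already small after some tens of thousands of tosses.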
Decision Theory
Binary Classification or Detection

States: Θ = {θ0, θ1}
Observation PDF: p_z(z/θ)
H0: θ0 generates the observation
H1: θ1 generates the observation

A hypothesis is selected after getting an observation, using a decision rule that optimizes some criterion.

Decision rule or discriminant function:
d(z) = 0 if z ∈ Z0 (select H0); 1 if z ∈ Z1 = Z0ᶜ (select H1)

Detection example: H0: only noise; H1: signal and noise.
Decision Theory
The Maximum Likelihood Detector

Choose the hypothesis that makes the observation more probable:
choose H1 if p_z(z/θ1) > p_z(z/θ0), else H0.

The Likelihood Ratio Test:
l(z) = p_z(z/θ1) / p_z(z/θ0)   =>   choose H1 if l(z) > T, else H0
Equivalently, in the log domain: choose H1 if log l(z) > log T.
T = 1 <=> Maximum Likelihood Detector.
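A minimal sketch of the test itself (function names are mine; the likelihood values are supplied by whatever model is in use):

```python
import math

def likelihood_ratio_test(lik_h1, lik_h0, threshold=1.0):
    """Accept H1 when the likelihood ratio l(z) = p(z/th1)/p(z/th0)
    exceeds the threshold T; T = 1 gives the ML detector."""
    return "H1" if lik_h1 / lik_h0 > threshold else "H0"

def log_likelihood_ratio_test(log_lik_h1, log_lik_h0, threshold=1.0):
    """The same test in the log domain: log l(z) vs. log T."""
    return "H1" if log_lik_h1 - log_lik_h0 > math.log(threshold) else "H0"
```

The log form is what is actually used in practice: it avoids underflow when the likelihoods are tiny products over many samples.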
Decision Theory
The Maximum a Posteriori Detector

The states of Nature are governed by an underlying a priori probability law that is incorporated into the decision rule. Choose the hypothesis that maximizes the posterior probability:
choose H1 if P(θ1/z) > P(θ0/z), else H0.

Applying Bayes' theorem, it is easy to formulate this as a Likelihood Ratio Test with T = P(θ0)/P(θ1). It can be shown that this test minimizes the probability of error.
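Since the normalizing p_z(z) is common to both posteriors, the MAP rule reduces to comparing prior-weighted likelihoods. A sketch (names are mine):

```python
def map_decide(lik_h0, lik_h1, prior_h0, prior_h1):
    """MAP detector: accept H1 when P(th1) p(z/th1) > P(th0) p(z/th0),
    i.e. an LRT with threshold T = P(th0)/P(th1)."""
    return "H1" if lik_h1 * prior_h1 > lik_h0 * prior_h0 else "H0"
```

With equal priors this reduces to the ML detector; a strong prior in favour of θ0 can overturn a mild likelihood advantage of θ1.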
Decision Theory
Extension to Multicategory Classifiers

Both the ML and the MAP rules are easily extended to the multicategory case: just select the hypothesis (out of N > 2) that maximizes the likelihood or the posterior, respectively.

The discriminant function partitions the observation space (usually vector-valued). The borders are easily found by equating the ML or MAP scores of neighboring classes.

(Figure: a 2-D feature space (z1, z2) partitioned into regions C1, C2, C3.)
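The multicategory selection is a single argmax. A sketch (names are mine; with uniform priors it is the ML rule, otherwise MAP):

```python
def classify(likelihoods, priors=None):
    """Multicategory decision: select the class maximizing
    prior * likelihood (MAP); uniform priors give the ML rule.
    likelihoods: dict mapping class name -> p(z/class)."""
    if priors is None:
        priors = {c: 1.0 for c in likelihoods}
    return max(likelihoods, key=lambda c: likelihoods[c] * priors[c])
```

The decision borders of the figure are exactly the points z where the two largest scores tie.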
Decision Theory
A very brief introduction to non-parametric classifiers

Very often we do not know the functional form of the PDFs. In these cases we cannot apply a parameter-estimation approach to learn the model, and we cannot apply a likelihood ratio test either. Instead, the k-NN (k nearest neighbors) rule is commonly used as an alternative way of finding discriminant boundaries.

(Figure: feature space (z1, z2) with a supervised training set; green: class 1, red: class 2, blue: unknown sample.)
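The k-NN rule of the figure can be sketched directly: classify the unknown point by majority vote among its k nearest training samples (the data layout and names below are mine):

```python
from collections import Counter

def knn_classify(training_set, query, k=3):
    """k-NN rule: majority vote among the k training samples closest
    (squared Euclidean distance) to the query point.
    training_set: list of ((z1, z2), label) pairs."""
    by_distance = sorted(
        training_set,
        key=lambda s: (s[0][0] - query[0]) ** 2 + (s[0][1] - query[1]) ** 2,
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]
```

No PDF is ever estimated: the training set itself induces the discriminant boundaries, which is why k-NN works when the functional form of the class distributions is unknown.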
Background Removal
Any dataset obtained from a physical sensor is perturbed by noise.

Two fundamental operations:
Filtering: estimate the value of the signal.
Detection: decide whether signal is present.

Signal-in-noise model: I(x) = S(x) + n(x)

Detection of signal: a hypothesis test applied to each voxel:
H0: the voxel contains only noise
H1: the voxel contains signal and noise

A binary mask is constructed, setting to 1 every voxel where H1 is accepted. Morphological operations are applied afterwards to obtain spatial coherence.
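Assuming Gaussian conditional PDFs for the two hypotheses (the model adopted below for background removal), the per-voxel test and mask construction can be sketched as follows; the function and parameter names are mine:

```python
import numpy as np

def detect_foreground(volume, eta_b, sigma_b, eta_f, sigma_f, log_t=0.0):
    """Per-voxel Gaussian log-likelihood ratio test: returns a binary
    mask with 1 where H1 (signal + noise) is accepted.
    log_t = ln(P(B)/P(F)); 0.0 corresponds to equal priors.
    The class parameters (eta, sigma) must be learned beforehand."""
    def log_gauss(z, eta, sigma):
        # Log of the Gaussian PDF up to the common -0.5*ln(2*pi) constant,
        # which cancels in the ratio.
        return -0.5 * ((z - eta) / sigma) ** 2 - np.log(sigma)

    llr = log_gauss(volume, eta_f, sigma_f) - log_gauss(volume, eta_b, sigma_b)
    return (llr > log_t).astype(np.uint8)
```

The whole volume is tested at once through NumPy broadcasting; the resulting mask is then handed to the morphological post-processing described later.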
Background Removal
Dataset from Brigham & Women's open magnet
Slice #15 Intensity histogram
Background Removal
Dataset from Brigham & Women's open magnet
Slice #20 Intensity histogram
Background Removal
Dataset from Brigham & Women's open magnet
Slice #25 Intensity histogram
Background Removal
Comparison of histograms
Slice #15
Slice #20
Slice #25
Whole volume
Background Removal
Histogram equalization to visualize noise
Pointwise transformation Slice #15
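Histogram equalization is a pointwise transformation: each intensity is mapped through the normalized cumulative histogram, which spreads the crowded low-intensity (noise) range over the full scale. A sketch for 8-bit images (implementation details are mine):

```python
import numpy as np

def equalize(image, levels=256):
    """Histogram equalization as a pointwise transformation:
    map each intensity through the normalized cumulative histogram.
    Expects an integer image with values in [0, levels)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / image.size            # normalized cumulative histogram
    lut = np.round((levels - 1) * cdf).astype(np.uint8)  # the pointwise map
    return lut[image]                            # apply the lookup table
```

Applied to the slices here, the transformation makes the faint background noise clearly visible, which is exactly why it is used before judging the detector output.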
Background Removal
Histogram equalization to visualize noise
Pointwise transformation Slice #20
Background Removal
Histogram equalization to visualize noise
Pointwise transformation Slice #25
Background Removal
Estimation of the probabilities: model learning or training

Simple model: the conditional PDFs are Gaussian, so we only have to estimate the mean and the variance of each PDF. Background samples are taken from the image frame (border); foreground samples from the image center.

Sample mean: η̂ = Ê[z/θ] = (1/N) Σᵢ zᵢ,  i = 1, ..., N
Sample variance: σ̂² = V̂ar(z/θ) = (1/(N − 1)) Σᵢ (zᵢ − η̂)²
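The two estimators are one line each; a sketch (the function name is mine):

```python
import numpy as np

def sample_stats(samples):
    """The estimators above: sample mean and the unbiased
    sample variance (N - 1 denominator)."""
    z = np.asarray(samples, dtype=float)
    eta_hat = z.sum() / len(z)
    var_hat = ((z - eta_hat) ** 2).sum() / (len(z) - 1)
    return eta_hat, var_hat
```

Run once on the frame pixels and once on the center pixels, this yields (η_b, σ_b²) and (η_f, σ_f²) for the likelihood ratio test that follows.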
Background Removal
Constructing the likelihood ratio test

p(z/background) = G(z; η_b, σ_b)
p(z/foreground) = G(z; η_f, σ_f)

The log-likelihood ratio test becomes:
ln [ G(z; η_b, σ_b) / G(z; η_f, σ_f) ]  vs.  ln [ P(F) / P(B) ]   =>  choose B if greater, F otherwise.

The decision boundaries are easily found by solving a second-order equation in z. Usually, only one of the solutions is physically feasible.
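Equating the two prior-weighted Gaussian log-likelihoods gives a quadratic a·z² + b·z + c = 0; a sketch of solving it (the coefficient expansion and names are mine, following the standard algebra):

```python
import numpy as np

def decision_boundaries(eta_b, sigma_b, eta_f, sigma_f, p_b=0.5, p_f=0.5):
    """Intensities z where the two weighted Gaussians agree, i.e. the
    real roots of the quadratic obtained from
    ln[P(B) G(z; eta_b, sigma_b)] = ln[P(F) G(z; eta_f, sigma_f)]."""
    a = 0.5 * (1.0 / sigma_f**2 - 1.0 / sigma_b**2)
    b = eta_b / sigma_b**2 - eta_f / sigma_f**2
    c = (0.5 * (eta_f**2 / sigma_f**2 - eta_b**2 / sigma_b**2)
         + np.log((p_b * sigma_f) / (p_f * sigma_b)))
    roots = np.roots([a, b, c])  # np.roots drops a leading zero coefficient
    return sorted(r.real for r in roots if abs(r.imag) < 1e-12)
```

When σ_b = σ_f the quadratic degenerates to a line and there is a single boundary (the prior-shifted midpoint); with unequal variances there are generally two roots, of which only one usually lies in the physically feasible intensity range.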
Background Removal
Binary mask post-processing

Noise makes false detections unavoidable, so post-processing is needed to eliminate them:
1. Form a binary image from the output of the detector.
2. Apply morphological operations enforcing spatial consistency:
   2.1 Connected-components analysis
   2.2 Largest-island selection
   2.3 Hole filling
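Steps 2.1–2.3 can be sketched in pure NumPy/Python (a simple BFS labelling; a real pipeline would use an image-processing library, and all names here are mine):

```python
import numpy as np
from collections import deque

NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-connectivity

def label_components(binary):
    """2.1: label the 4-connected components of a boolean image (BFS)."""
    labels = np.zeros(binary.shape, dtype=int)
    count = 0
    for seed in zip(*np.nonzero(binary)):
        if labels[seed]:
            continue
        count += 1
        labels[seed] = count
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if (0 <= nr < binary.shape[0] and 0 <= nc < binary.shape[1]
                        and binary[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = count
                    queue.append((nr, nc))
    return labels, count

def clean_mask(mask):
    """2.2 + 2.3: keep the largest island, then fill its holes."""
    labels, n = label_components(mask.astype(bool))
    if n == 0:
        return np.zeros_like(mask, dtype=np.uint8)
    sizes = [(labels == i).sum() for i in range(1, n + 1)]
    island = labels == (int(np.argmax(sizes)) + 1)
    # Holes are background components not reachable from the image border.
    bg_labels, _ = label_components(~island)
    border = set(bg_labels[0, :]) | set(bg_labels[-1, :]) \
        | set(bg_labels[:, 0]) | set(bg_labels[:, -1])
    holes = (~island) & ~np.isin(bg_labels, sorted(border))
    return (island | holes).astype(np.uint8)
```

Isolated false detections disappear with the largest-island selection, and dark voxels trapped inside the head are recovered by the hole filling, which is exactly the effect visible in the post-processed slices.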
Background Removal
Slice #15
Detector output Postprocessing
Background Removal
Slice #20
Detector output Postprocessing
Background Removal
Slice #25
Detector output Postprocessing
Towards Automatic Segmentation
Example: gray matter, white matter and lesions. Two channels.
PDw T2w
Images provided by Dr. S. Warfield, SPL, B&WH and Harvard Univ.
Towards Automatic Segmentation
(Panels: all classes; white matter + lesion; white matter; lesion.)
Images provided by Dr. S. Warfield, SPL, B&WH and Harvard Univ.
Feature-space analysis: the class distributions overlap (joint PDw and T2w distribution). There are no clear boundaries => new features must be added, e.g. the incorporation of a priori anatomic knowledge (an atlas).
Towards Automatic Segmentation
Images provided by Dr. S. Warfield, SPL, B&WH and Harvard Univ.
Atlas-moderated statistical classification paradigm
Towards Automatic Segmentation
Segmentation Results
Images provided by Dr. S. Warfield, SPL, B&WH and Harvard Univ.
(Panels: kNN vs. atlas-moderated.)
Conclusions
• Importance of image segmentation in medical applications.
• Direct use of well-known statistical techniques in simple problems such as foreground segmentation.
• Need for more advanced techniques for general tissue segmentation.
• Importance of embedding anatomic knowledge in the classifiers.
• Automatic tissue segmentation remains a hot research topic.
References
• Duda & Hart, Pattern Classification and Scene Analysis, John Wiley & Sons, 1973.
• Surgical Planning Lab web page: http://www.spl.harvard.edu
• Proceedings of the MICCAI conference (1998, 1999, 2000), Lecture Notes in Computer Science, Springer-Verlag.