Topological Learning for Brain Networks

Topological Learning for Brain Networks. Tananun Songdechakraiwut, Moo K. Chung, University of Wisconsin-Madison, www.stat.wisc.edu/~mchung

Transcript of Topological Learning for Brain Networks

Page 1: Topological Learning for Brain Networks

Topological Learning for Brain Networks

Tananun Songdechakraiwut, Moo K. Chung

University of Wisconsin-Madison, www.stat.wisc.edu/~mchung

Page 2: Topological Learning for Brain Networks

Satellite meeting of OHBM 2021

June 17-18, 2021, Seoul, Korea (virtual, Zoom)

Organizers: Vince Calhoun, Moo K. Chung, Yong Jeong, Martin Lindquist, Hea-Jeong Park, Anqi Qiu

http://sites.google.com/view/nbia2021

Page 3: Topological Learning for Brain Networks

Abstract: We present a novel topological learning framework that can integrate networks of different sizes and topology through persistent homology. This is made possible by a topological loss function that enables such challenging tasks. The proposed loss function bypasses the intrinsic computational bottleneck associated with matching networks. The method is applied to a twin brain imaging study to determine whether the functional brain network is genetically heritable. The biggest challenge is in overlaying the functional brain networks obtained from resting-state functional magnetic resonance imaging (fMRI) onto the structural brain network obtained through diffusion tensor imaging (DTI). While the functional network exhibits more cycles, the structural network is tree-like. Simply overlaying or regressing the functional network on top of the structural network would destroy the topology of both networks. We found that the functional brain networks of identical twins are very similar, demonstrating a strong genetic contribution to our thought patterns. This is joint work with PhD student Tananun Songdechakraiwut.

arXiv:2012.00675

Anuj Srivastava: elastic graph matching, arXiv:2007.04793

Steve Marron: persistent homology, Annals of Applied Statistics, 2016

Huang et al. 2020, IEEE Transactions on Medical Imaging

Page 4: Topological Learning for Brain Networks

Acknowledgement

Zhan Luo, Ian Carroll, Gregory Kirk, Nagesh Adluru, Andrew Alexander, Seth Pollack, Hill Goldsmith, Richard Davidson, Alex Smith, Gary Shiu (Univ. of Wisconsin-Madison); Li Shen (Univ. of Pennsylvania); Hernando Ombao, Chee Ming Ting (KAUST, Saudi Arabia); Yuan Wang (University of South Carolina); Dong Soo Lee, Hyekyung Lee (Seoul National University, Korea); Shih-Gu Huang, Anqi Qiu (National University of Singapore); Ilwoo Lyu (Vanderbilt University)

Grants: NIH R01 Brain Initiative EB022856, R01 EB028753, NSF DMS-2010778

Page 5: Topological Learning for Brain Networks

Resting-state functional magnetic resonance imaging (rs-fMRI)

1200 time points and 300,000 voxels per subject, acquired over 14 min 33 s inside a 3T scanner at rest

After motion correction and scrubbing: 400 subjects (124 MZ twins, 70 same-sex DZ twins) × 2 GB = 800 GB of data

Page 6: Topological Learning for Brain Networks

Resting-state fMRI (every 30 seconds)


Page 7: Topological Learning for Brain Networks

Correlation network

Correlation network of 300,000 time series, computed on GPU: a complete graph with many cycles.

Chung et al. 2019 Network Neuroscience
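As a concrete illustration of this step, here is a minimal sketch of building a Pearson correlation network from preprocessed rs-fMRI time series with NumPy; the array shape and toy data are assumptions for illustration, not the study's pipeline.

```python
import numpy as np

def correlation_network(ts):
    """Pearson correlation network: a dense (complete) weighted graph.

    ts : array of shape (n_timepoints, n_voxels), e.g. 1200 x 300000
         after motion correction and scrubbing.
    """
    w = np.corrcoef(ts.T)      # voxels as variables -> (n_voxels, n_voxels)
    np.fill_diagonal(w, 0.0)   # drop self-loops
    return w

# Toy example; the full 300,000-voxel matrix would need ~700 GB of memory.
ts = np.random.randn(1200, 100)
w = correlation_network(ts)
print(w.shape)  # (100, 100)
```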

Page 8: Topological Learning for Brain Networks

Diffusion tensor $D$; Diffusion Tensor Imaging (DTI)

Transition probability:

$$p(x_0 \to x) \propto \exp\left[ -\frac{(x - x_0)^\top D^{-1} (x - x_0)}{4\tau} \right]$$

1 million tracts
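A minimal sketch of evaluating the transition probability above for a given diffusion tensor; the normalization constant is omitted (the slide shows proportionality), and the numbers are purely illustrative.

```python
import numpy as np

def transition_probability(x0, x, D, tau):
    """Unnormalized probability of diffusing from x0 to x in time tau,
    given the 3x3 diffusion tensor D (formula on this slide)."""
    dx = np.asarray(x, float) - np.asarray(x0, float)
    return np.exp(-(dx @ np.linalg.inv(D) @ dx) / (4.0 * tau))

# Toy example: isotropic tensor (hypothetical diffusivity), 1 mm step.
D = np.eye(3) * 1e-3
print(transition_probability([0, 0, 0], [1, 0, 0], D, tau=20.0))
```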

Page 9: Topological Learning for Brain Networks

Epsilon-neighbor network construction: parcellation-free brain network construction

Part I: Fiber tractography

White matter fibers

Part II: ε-neighbor construction

ε-neighbor from point-set topology

Iteratively add one edge at a time

A point $p$ has an ε-neighbor among the existing nodes $q$ if $\min_{q} \| q - p \| \le \epsilon$.

Part III: 3D network graph

Multiscale brain network

Chung et al. 2011 SPIE 7962

Finding: 96% of all nodes are connected to each other, forming a tree-like single connected component
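A minimal sketch of the ε-neighbor idea described above (see also Chung et al. 2011): each tract contributes its two endpoints, an endpoint within ε of an existing node is merged with that node, and an edge joins the two endpoints. The data layout and toy tracts are assumptions for illustration.

```python
import numpy as np

def eps_neighbor_network(fibers, eps):
    """Parcellation-free network from fiber endpoints, built one edge at a time."""
    nodes, edges = [], set()

    def node_index(p):
        # Merge p with its nearest existing node if that node is within eps.
        if nodes:
            d = np.linalg.norm(np.asarray(nodes) - np.asarray(p, float), axis=1)
            j = int(d.argmin())
            if d[j] <= eps:
                return j
        nodes.append(np.asarray(p, float))
        return len(nodes) - 1

    for p, q in fibers:                      # (start, end) of each tract
        i, j = node_index(p), node_index(q)
        if i != j:
            edges.add((min(i, j), max(i, j)))
    return np.array(nodes), edges

# Toy example: three short "tracts" in 3D with eps = 5 mm.
fibers = [([0, 0, 0], [10, 0, 0]),
          ([1, 0, 0], [10, 10, 0]),
          ([10, 1, 0], [0, 10, 0])]
nodes, edges = eps_neighbor_network(fibers, eps=5.0)
print(len(nodes), sorted(edges))
```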

Page 10: Topological Learning for Brain Networks

How big is brain network data?

Connectivity matrix $w_{ij}$ between voxels $v_i$ and $v_j$.

p = 25,972 voxels (3 mm) in the brain → 25,972 × 25,972 ≈ 0.67 billion connections, 5.2 GB memory.

300,000 voxels (1 mm) → 90 billion connections → 700 GB memory (2019, Cambridge University Press).

Page 11: Topological Learning for Brain Networks

Biological data reduction: parcellation-based network construction

MRI + parcellation: partition the brain into 116 regions
DTI → structural connectivity
fMRI → functional connectivity

Parcellation boundaries do not overlap across subjects and modalities.

Page 12: Topological Learning for Brain Networks

Gazillions of parcellations. Why?

Let’s do something different

Page 13: Topological Learning for Brain Networks

Proposal: Deformable shape

Functional network of subject k: $G_k = (V, w_k)$. Structural network template: $P$.

$$\widehat{\Theta}_k = \arg\min_{\Theta} \; L_F(\Theta, G_k) + \lambda \, L_{top}(\Theta, P)$$

$L_F$: goodness-of-fit (Frobenius norm). $L_{top}$: topological loss; $\lambda$ controls the amount of topology. The estimate $\widehat{\Theta}_k$ is the topologically registered network.

Page 14: Topological Learning for Brain Networks

[Figure: toy network with nodes A, B, C, D and its Betti curves β0 and β1 over filtration values 0.1 to 0.4]

Graph filtration

Chung et al. 2019 Network Neuroscience


Lee et al. 2012 IEEE Transactions on Medical Imaging

Monotonicity of Betti curves: over the graph filtration, $\beta_0$ (# connected components) increases and $\beta_1$ (# cycles) decreases monotonically.

ADCB = ADB + DCB → cycles form a vector space
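A minimal sketch of a β0 Betti curve over the graph filtration: at each filtration value ε, keep only edges with weight above ε and count connected components (β0 increases monotonically with ε). The toy weight matrix loosely mimics the slide's example and is not taken from real data.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def betti0_curve(w, thresholds):
    """beta0 (# connected components) of the binary graph keeping edges with weight > eps."""
    curve = []
    for eps in thresholds:
        adj = csr_matrix((w > eps).astype(int))
        n_comp, _ = connected_components(adj, directed=False)
        curve.append(n_comp)
    return np.array(curve)

# Toy 4-node network (nodes A, B, C, D) with illustrative edge weights.
w = np.array([[0.0, 0.5, 0.0, 0.6],
              [0.5, 0.0, 0.3, 0.4],
              [0.0, 0.3, 0.0, 0.2],
              [0.6, 0.4, 0.2, 0.0]])
print(betti0_curve(w, thresholds=[0.1, 0.2, 0.3, 0.4, 0.5, 0.6]))  # [1 1 2 2 3 4]
```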

Page 15: Topological Learning for Brain Networks


Persistence = lifetime (death − birth) of a topological feature

[Figure: graph filtration of the toy network with edge weights 0.2 to 0.6] Edges that destroy cycles vs. edges that create connected components; the lifetime of a cycle and the lifetime of node B are marked by their birth and death values.

Page 16: Topological Learning for Brain Networks


Theorem 1 Barcodes partition the edge set

$E_1$: edges that destroy cycles. $E_0$: edges that create connected components.

$$\#(E_0) = |V| - 1, \qquad \#(E_1) = 1 + \frac{|V|(|V| - 3)}{2}, \qquad \#(E) = \frac{|V|(|V| - 1)}{2}, \qquad E = E_0 \cup E_1$$

Computed via the maximum spanning tree in $O(|E| \log |V|)$ run time.
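A minimal sketch of the partition in Theorem 1 using networkx: the birth set E0 is read off the maximum spanning tree and the remaining edge weights form the death set E1. The toy graph and its weights are illustrative only.

```python
import networkx as nx

def birth_death_sets(G):
    """E0 = weights on the maximum spanning tree (edges creating components),
    E1 = the remaining weights (edges destroying cycles); O(|E| log |V|)."""
    mst = nx.maximum_spanning_tree(G, weight="weight")
    mst_edges = {frozenset(e) for e in mst.edges()}
    births = sorted(d["weight"] for _, _, d in mst.edges(data=True))
    deaths = sorted(d["weight"] for u, v, d in G.edges(data=True)
                    if frozenset((u, v)) not in mst_edges)
    return births, deaths

# Toy complete graph on 4 nodes.
G = nx.Graph()
G.add_weighted_edges_from([("A", "B", 0.5), ("A", "C", 0.1), ("A", "D", 0.6),
                           ("B", "C", 0.3), ("B", "D", 0.4), ("C", "D", 0.2)])
births, deaths = birth_death_sets(G)
print(births)   # |V| - 1 = 3 birth values
print(deaths)   # 1 + |V|(|V| - 3)/2 = 3 death values
```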

Page 17: Topological Learning for Brain Networks

Topological loss

Functional network $\Theta = (V^{\Theta}, w^{\Theta})$; structural network $P = (V^{P}, w^{P})$.

$$L_{top}(\Theta, P) = L_{0D}(\Theta, P) + L_{1D}(\Theta, P)$$

$$L_{0D}(\Theta, P) = \min_{\tau} \sum_{b \in E_0} \big[\, b - \tau(b) \,\big]^2, \qquad
L_{1D}(\Theta, P) = \min_{\tau} \sum_{d \in E_1} \big[\, d - \tau(d) \,\big]^2,$$

where $\tau$ ranges over bijections between the birth (death) values of $\Theta$ and those of $P$.

Page 18: Topological Learning for Brain Networks

Theorem 2 Optimal topological matching

$\tau_0^*$ matches the $i$-th smallest birth value to the $i$-th smallest birth value; $\tau_1^*$ matches the $i$-th smallest death value to the $i$-th smallest death value:

$$L_{0D}(\Theta, P) = \min_{\tau} \sum_{b \in E_0} \big[\, b - \tau(b) \,\big]^2 = \sum_{b \in E_0} \big[\, b - \tau_0^*(b) \,\big]^2$$

$$L_{1D}(\Theta, P) = \min_{\tau} \sum_{d \in E_1} \big[\, d - \tau(d) \,\big]^2 = \sum_{d \in E_1} \big[\, d - \tau_1^*(d) \,\big]^2$$
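Because the optimal bijections pair sorted values, the topological loss reduces to sorting. A minimal sketch, assuming the two networks contribute equal numbers of birth and death values (e.g. after the data augmentation on the next slide); the numbers are illustrative.

```python
import numpy as np

def topological_loss(births_theta, deaths_theta, births_p, deaths_p):
    """L_top = L_0D + L_1D: match i-th smallest birth to i-th smallest birth,
    and i-th smallest death to i-th smallest death."""
    l0 = np.sum((np.sort(births_theta) - np.sort(births_p)) ** 2)
    l1 = np.sum((np.sort(deaths_theta) - np.sort(deaths_p)) ** 2)
    return l0 + l1

# Toy example with three birth and three death values per network.
print(topological_loss([0.3, 0.5, 0.6], [0.1, 0.2, 0.4],
                       [0.2, 0.5, 0.7], [0.1, 0.3, 0.4]))  # ~0.03
```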

Page 19: Topological Learning for Brain Networks

Topological matching via sorting with data augmentation

[Figure: the birth values E0 (edges that create components) and death values E1 (edges that destroy cycles) of two networks are sorted and matched; data augmentation equalizes the number of values when the networks differ in size]

Page 20: Topological Learning for Brain Networks

Topological mean

$$\widehat{\Theta} = \arg\min_{\Theta} \sum_{k=1}^{n} L_{top}(\Theta, G_k)$$

The birth (death) values of $\widehat{\Theta}$ are given by averaging the sorted birth (death) values of all the networks $G_k$: 1. sort the birth/death values, 2. match them, 3. average.
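A minimal sketch of the averaging step, assuming each network contributes the same number of birth (or death) values; toy numbers only.

```python
import numpy as np

def topological_mean(value_sets):
    """Average the sorted birth (or death) values across networks:
    the i-th smallest values are matched and then averaged."""
    sorted_vals = np.sort(np.asarray(value_sets, dtype=float), axis=1)
    return sorted_vals.mean(axis=0)

# Toy example: birth values of three networks.
print(topological_mean([[0.3, 0.5, 0.6],
                        [0.2, 0.4, 0.7],
                        [0.1, 0.5, 0.6]]))  # ~[0.2, 0.467, 0.633]
```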

Page 21: Topological Learning for Brain Networks

Template-based brain network analysis

Functional network $G_k = (V, w_k)$; structural network template $P$. Align each individual functional network to the structural template:

$$\widehat{\Theta}_k = \arg\min_{\Theta} \; L_F(\Theta, G_k) + \lambda \, L_{top}(\Theta, P)$$

$L_F$: Frobenius norm; $L_{top}$: topological loss; $\lambda$ controls the amount of topology.

$$\frac{\partial L_{top}(\Theta, P)}{\partial w^{\Theta}_{ij}} =
\begin{cases}
2 \big[\, w^{\Theta}_{ij} - \tau_0^*(w^{\Theta}_{ij}) \,\big] & \text{if } w^{\Theta}_{ij} \in E_0, \\[4pt]
2 \big[\, w^{\Theta}_{ij} - \tau_1^*(w^{\Theta}_{ij}) \,\big] & \text{if } w^{\Theta}_{ij} \in E_1.
\end{cases}$$

Run time: $O(|E| \log |V|)$

Topological gradient descent
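A minimal sketch of one descent step on the topological term alone, assuming networkx is available, that both networks are complete graphs on the same nodes, and that the optimal matching pairs sorted values as in Theorem 2; the Frobenius term, λ, and the data augmentation step are omitted.

```python
import numpy as np
import networkx as nx

def birth_death_edges(w):
    """Split edges of the weighted network w into E0 (maximum spanning tree,
    birth edges) and E1 (remaining edges, death edges)."""
    G = nx.from_numpy_array(w)
    mst = {frozenset(e) for e in nx.maximum_spanning_tree(G).edges()}
    n = w.shape[0]
    E0, E1 = [], []
    for i in range(n):
        for j in range(i + 1, n):
            (E0 if frozenset((i, j)) in mst else E1).append((i, j))
    return E0, E1

def topo_gradient_step(w_theta, w_p, lr=0.1):
    """One gradient-descent step on L_top(Theta, P) w.r.t. the edge weights of Theta."""
    grad = np.zeros_like(w_theta)
    for E_theta, E_p in zip(birth_death_edges(w_theta), birth_death_edges(w_p)):
        E_theta = sorted(E_theta, key=lambda e: w_theta[e])   # sort Theta's edges by weight
        targets = sorted(w_p[e] for e in E_p)                 # matched values from P
        for (i, j), t in zip(E_theta, targets):
            grad[i, j] = grad[j, i] = 2.0 * (w_theta[i, j] - t)
    return w_theta - lr * grad

# Toy example: nudge a random 4-node network toward a template's topology.
rng = np.random.default_rng(0)
w = np.triu(rng.random((4, 4)), 1); w += w.T
p = np.triu(rng.random((4, 4)), 1); p += p.T
print(topo_gradient_step(w, p))
```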

Page 22: Topological Learning for Brain Networks

Optimal amount of topology?

Topological stability: 1.0000 ± 0.0002 over 412 subjects.

[Figure: Frobenius norm, topological loss, and total loss]

Page 23: Topological Learning for Brain Networks

Topological learning at group level

Functional networks $G_1 = (V, w_1), \dots, G_n = (V, w_n)$; structural network template $P$.

Register every functional network to the structural template:

$$\widehat{\Theta} = \arg\min_{\Theta} \; \frac{1}{n} \sum_{k=1}^{n} L_F(\Theta, G_k) + \lambda \, L_{top}(\Theta, P)$$

$L_F$: Frobenius norm; $L_{top}$: topological loss; $\lambda$ controls the amount of topological learning.

Page 24: Topological Learning for Brain Networks

Topological learning at group level: 168 males, 232 females

Page 25: Topological Learning for Brain Networks

The structural template and the learned group-level network are topologically equivalent.

Page 26: Topological Learning for Brain Networks

Simulation study: within-module connection probability $p$, between-module connection probability $1 - p$.

Generate 10 networks vs. 10 networks.
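A minimal sketch of generating one such modular network: within-module edges occur with probability p and between-module edges with probability 1 − p. The uniform random edge weights and the module assignment are illustrative choices, not the paper's exact simulation.

```python
import numpy as np

def modular_network(n_nodes, n_modules, p, rng):
    """Random modular network used to simulate two groups of networks."""
    labels = rng.integers(n_modules, size=n_nodes)        # module membership
    w = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            prob = p if labels[i] == labels[j] else 1.0 - p
            if rng.random() < prob:
                w[i, j] = w[j, i] = rng.random()           # random edge weight
    return w

rng = np.random.default_rng(0)
group1 = [modular_network(24, 3, p=0.8, rng=rng) for _ in range(10)]
group2 = [modular_network(24, 3, p=0.6, rng=rng) for _ in range(10)]
```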

Page 27: Topological Learning for Brain Networks

Permutation test for topological loss

Within-class loss: $L_W \propto \sum_{k} \sum_{i,j \in C_k} L(G_i, G_j)$

Between-class loss: $L_B \propto \sum_{i \in C_1,\, j \in C_2} L(G_i, G_j)$

Test statistic: $\phi = \dfrac{L_B}{L_W}$
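A minimal sketch of the statistic and a label-permutation p-value, assuming a pairwise loss function loss(Gi, Gj) (e.g. a wrapper around the topological loss above); names and defaults are illustrative.

```python
import numpy as np
from itertools import combinations

def ratio_statistic(networks, labels, loss):
    """phi = between-class loss / within-class loss over all network pairs."""
    lw = lb = 0.0
    for i, j in combinations(range(len(networks)), 2):
        d = loss(networks[i], networks[j])
        if labels[i] == labels[j]:
            lw += d
        else:
            lb += d
    return lb / lw

def permutation_pvalue(networks, labels, loss, n_perm=1000, rng=None):
    """Estimate the p-value of phi by randomly permuting the group labels."""
    if rng is None:
        rng = np.random.default_rng()
    labels = np.asarray(labels)
    observed = ratio_statistic(networks, labels, loss)
    count = sum(ratio_statistic(networks, rng.permutation(labels), loss) >= observed
                for _ in range(n_perm))
    return (count + 1) / (n_perm + 1)
```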

Page 28: Topological Learning for Brain Networks

Permutation test is not easy to apply to existing graph matching algorithms!

100 seconds per permutation → a permutation test with 100,000 permutations = 2,778 hours ≈ 115 days

→ Permutation test via random transpositions (Chung et al. 2019, Connectomics in NeuroImaging)

Page 29: Topological Learning for Brain Networks

Transposition test on loss functions: for example, subject 2 in group 1 is swapped with subject 8 in group 2.

Compute the incremental change of the loss functions over a transposition:

$$L_W \to L_W + \Delta(\text{transposition}), \qquad L_B \to L_B + \Delta(\text{transposition})$$
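A minimal sketch of the incremental update, assuming the pairwise losses have been precomputed into a matrix: swapping one subject from each group only changes the pairs that touch the two swapped subjects, so LW and LB are updated in O(n) instead of being recomputed from scratch. The bookkeeping details are an assumption, not the paper's exact implementation.

```python
import numpy as np

def transposition_update(loss_matrix, labels, lw, lb, a, b):
    """Swap the group labels of subjects a and b (from different groups) and
    incrementally update the within-class (lw) and between-class (lb) losses."""
    n = len(labels)
    for k in (a, b):
        for m in range(n):
            if m in (a, b):
                continue
            d = loss_matrix[k, m]
            if labels[k] == labels[m]:     # pair was within-class, becomes between-class
                lw -= d; lb += d
            else:                          # pair was between-class, becomes within-class
                lb -= d; lw += d
    labels = np.asarray(labels).copy()
    labels[a], labels[b] = labels[b], labels[a]
    return labels, lw, lb
```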

Page 30: Topological Learning for Brain Networks

Average p-value in 50 independent simulations

[Table: average p-values by number of nodes and modules, comparing against graduated assignment, spectral matching, integer projected fixed point, and reweighted random walk matching]

Page 31: Topological Learning for Brain Networks

[Figure: connections with HI above 1.00, from the original Pearson correlations vs. after topological learning]

Heritability index HI = 2 (corr(MZ) − corr(DZ))
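A minimal sketch of the formula above (Falconer's heritability index) applied edgewise; the twin-pair correlations below are toy numbers, not results from the study.

```python
import numpy as np

def heritability_index(corr_mz, corr_dz):
    """HI = 2 * (corr(MZ) - corr(DZ)), applied edgewise."""
    return 2.0 * (np.asarray(corr_mz) - np.asarray(corr_dz))

# Toy edgewise twin-pair correlations.
print(heritability_index([0.8, 0.6, 0.7], [0.4, 0.5, 0.3]))  # ≈ [0.8 0.2 0.8]
```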

Page 32: Topological Learning for Brain Networks

Pearson correlation over sliding window

116-node functional network overlaid on white matter fiber tracts

Thank you. What next? Dynamic Topological Data Analysis (TDA)