Page 1:

Learning Important Features Through Propagating Activation Differences

Avanti Shrikumar, Peyton Greenside, Anshul Kundaje

Stanford University

ICML, 2017
Presenter: Ritambhara Singh

1 / 39

Page 2:

Outline

1 Introduction
  Motivation
  Background
  State-of-the-art
  Drawbacks

2 Proposed Approach
  DeepLIFT Method
  Defining Reference
  Solution
  Multipliers and Chain Rule
  Separating positive and negative contributions
  Rules for assigning contributions

3 Results
  MNIST digit classification
  DNA sequence classification


Page 4:

Motivation

Interpretability of neural networks: assign an importance score to each input for a given output.

Importance is defined in terms of differences from a ‘reference’ state.

Propagates importance signal even when gradient is zero.

Gives separate consideration to positive and negative contributions.

Page 9:

Interpretation of Neural Networks

Page 11:

State-of-the-art

Perturbation-based forward propagation approaches: Zeiler and Fergus (2013), Zhou and Troyanskaya (2015).

Backpropagation-based approaches: Saliency maps: Simonyan et al. (2013); Guided Backpropagation: Springenberg et al. (2014).
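The perturbation-based idea can be sketched in a few lines: replace each input in turn by a baseline value (a simplification of the occlusion windows used by Zeiler and Fergus) and record how much the output drops. The toy network `f` and the zero baseline below are illustrative assumptions, not from the slides.

```python
import numpy as np

def perturbation_importance(f, x, baseline=0.0):
    """Score each input by how much the output drops when that
    input is replaced by a baseline value (occlusion-style)."""
    y = f(x)
    scores = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        x_pert = x.copy()
        x_pert[i] = baseline          # occlude one input
        scores[i] = y - f(x_pert)     # output drop = importance
    return scores

# toy network: y = relu(2*x0 + x1)
f = lambda v: max(0.0, 2.0 * v[0] + 1.0 * v[1])
scores = perturbation_importance(f, np.array([1.0, 1.0]))
print(scores)  # x0 scores twice x1
```

Note that each score costs an extra forward pass, which is part of why backpropagation-based methods scale better.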

Page 13:

Saturation problem
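The slide's figure is not reproduced here; the following minimal sketch (a toy unit with values chosen for illustration) shows the effect the slide refers to: once the unit saturates, the gradient is exactly zero even though the inputs clearly drive the output.

```python
import numpy as np

relu = lambda z: np.maximum(0.0, z)

def y(x1, x2):
    # toy unit that saturates at 1 once x1 + x2 >= 1
    return 1.0 - relu(1.0 - (x1 + x2))

eps = 1e-6
out = y(1.0, 1.0)                       # a saturated point
grad = (y(1.0 + eps, 1.0) - out) / eps  # numerical dy/dx1
print(out, grad)  # output 1.0, gradient 0.0
```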

Page 14:

Saturation problem

Page 15:

Thresholding Problem

y = max(0, x − 10)
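A numerical sketch of the problem for this unit: with gradients, the importance of x jumps discontinuously from 0 to 1 at the threshold, while a difference-from-reference multiplier (using a hypothetical reference of 0) varies smoothly.

```python
def y(x):
    return max(0.0, x - 10.0)

def grad(x, eps=1e-6):                    # numerical derivative
    return (y(x + eps) - y(x)) / eps

def dl_multiplier(x, x_ref=0.0):
    # difference-from-reference multiplier: Δy / Δx
    return (y(x) - y(x_ref)) / (x - x_ref)

for x in (9.9, 10.1, 15.0):
    print(x, round(grad(x), 2), round(dl_multiplier(x), 2))
```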

Page 17:

Philosophy

Explains the difference in output from some ‘reference’ output in terms of differences of the inputs from their ‘reference’ inputs.

Summation-to-delta property:

\[ \sum_{i=1}^{n} C_{\Delta x_i \Delta t} = \Delta t \tag{1} \]

Blame ∆t on ∆x1,∆x2, . . .

\( C_{\Delta x_i \Delta t} \) can be non-zero even when \( \partial t / \partial x_i \) is zero.

Page 22:

Defining Reference

Given neuron \( x \) with inputs \( i_1, i_2, \ldots \) such that \( x = f(i_1, i_2, \ldots) \).

Given reference activations \( i_1^0, i_2^0, \ldots \) of the inputs:

\[ x^0 = f(i_1^0, i_2^0, \ldots) \tag{2} \]

Choose a reference input and propagate activations through the net.

A good reference relies on domain knowledge: “What am I interested in measuring difference against?”
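Computing the reference activations of Eq. (2) is just a second forward pass. A sketch with a hypothetical two-layer network (the weights and the all-zeros reference are illustrative; the right reference is domain-dependent):

```python
import numpy as np

relu = lambda z: np.maximum(0.0, z)

# hypothetical 2-layer net, weights chosen for illustration
W1, b1 = np.array([[1.0, -1.0], [0.5, 0.5]]), np.array([0.0, -0.2])
W2, b2 = np.array([[1.0, 2.0]]), np.array([0.1])

def forward(x):
    h = relu(W1 @ x + b1)       # record each layer's activations
    y = W2 @ h + b2
    return h, y

x_ref = np.zeros(2)             # e.g. an all-zeros reference input
h_ref, y_ref = forward(x_ref)   # reference activations, Eq. (2)

x = np.array([1.0, 0.5])
h, y = forward(x)
print(h - h_ref, y - y_ref)     # the Δ activations to be explained
```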

Page 24:

Saturation Problem

Page 25:

Thresholding Problem

Page 27:

Multipliers

\[ m_{\Delta x \Delta t} = \frac{C_{\Delta x \Delta t}}{\Delta x} \tag{3} \]

The multiplier is the contribution of \( \Delta x \) to \( \Delta t \), divided by \( \Delta x \).

Compare: the partial derivative \( \partial t / \partial x \) is the infinitesimal contribution of \( \partial x \) to \( \partial t \), divided by \( \partial x \).
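The distinction matters in flat regions. A sketch with a toy clipped-linear unit (function and reference chosen for illustration): at a saturated point the derivative is 0, but the multiplier relative to a reference of 0 is still 0.5.

```python
def f(x):                         # toy nonlinearity with a saturated region
    return min(1.0, x)

x_ref = 0.0
x = 2.0                           # saturated: the derivative here is 0

dt = f(x) - f(x_ref)              # Δt = 1.0
m = dt / (x - x_ref)              # multiplier m_{Δx Δt} = C_{ΔxΔt} / Δx

eps = 1e-6
grad = (f(x + eps) - f(x)) / eps  # partial derivative ≈ 0
print(m, grad)
```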

Page 28:

Chain Rule

\[ m_{\Delta x_i \Delta z} = \sum_j m_{\Delta x_i \Delta y_j} \, m_{\Delta y_j \Delta z} \tag{4} \]

Can be computed efficiently via backpropagation
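A sketch of Eq. (4) on a hypothetical two-layer network, using multipliers equal to the weights for the linear layers and the ratio Δy/Δpre for the ReLU (the Rescale idea); the composed multipliers still satisfy summation-to-delta at the input.

```python
import numpy as np

relu = lambda z: np.maximum(0.0, z)

# hypothetical net: x -> y = relu(W1 x) -> z = w2 · y
W1 = np.array([[1.0, 1.0], [-1.0, 1.0]])
w2 = np.array([1.0, 3.0])

x_ref = np.zeros(2)
x = np.array([2.0, 1.0])

pre, pre_ref = W1 @ x, W1 @ x_ref
y, y_ref = relu(pre), relu(pre_ref)

# per-layer multipliers (weights for W1, Δy/Δpre for the ReLU)
m_relu = (y - y_ref) / (pre - pre_ref)   # assumes Δpre_j != 0
m_x_y = m_relu[:, None] * W1             # m_{Δx_i Δy_j}, shape (j, i)

# chain rule, Eq. (4): m_{Δx_i Δz} = Σ_j m_{Δx_i Δy_j} m_{Δy_j Δz}
m_x_z = m_x_y.T @ w2

dz = w2 @ y - w2 @ y_ref
print(m_x_z @ (x - x_ref), dz)           # equal: summation-to-delta holds
```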

Page 30:

Separating positive and negative contribution

In some cases, it is important to treat positive and negative contributions differently.

Introduce \( \Delta x_i^+ \) and \( \Delta x_i^- \), such that:

\[ \Delta x_i = \Delta x_i^+ + \Delta x_i^- ; \qquad C_{\Delta x_i \Delta t} = C_{\Delta x_i^+ \Delta t} + C_{\Delta x_i^- \Delta t} \]
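For a linear unit, the identity above takes a simple form: group the positive and negative terms of the sum. The weights and inputs below are illustrative (the paper's general definition splits Δx_i itself; this sketch shows only the linear case).

```python
import numpy as np

w = np.array([2.0, -3.0, 1.0])            # linear unit t = w · x
x_ref = np.zeros(3)
x = np.array([1.0, 1.0, -2.0])

terms = w * (x - x_ref)                   # per-input contributions C_{Δx_i Δt}
C_pos = np.where(terms > 0, terms, 0.0)   # positive part, C_{Δx_i^+ Δt}
C_neg = np.where(terms < 0, terms, 0.0)   # negative part, C_{Δx_i^- Δt}

print(C_pos, C_neg)                       # the two parts sum back to terms
```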

Page 32:

Linear Rule
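The slide's figure is not shown; per the Linear rule, for \( t = \sum_i w_i x_i + b \) the contribution of each input is \( C_{\Delta x_i \Delta t} = w_i \Delta x_i \), so the multiplier is simply the weight. A sketch with illustrative numbers:

```python
import numpy as np

w, b = np.array([1.5, -0.5]), 2.0
t = lambda v: float(w @ v + b)

x, x_ref = np.array([3.0, 1.0]), np.array([1.0, -1.0])
dx = x - x_ref
C = w * dx                      # Linear rule: C_{Δx_i Δt} = w_i Δx_i
m = C / dx                      # multipliers equal the weights w_i

print(m, C.sum() == t(x) - t(x_ref))   # summation-to-delta holds exactly
```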

Page 33:

Rescale Rule
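The Rescale rule figure is not reproduced; the rule sets the multiplier of a nonlinearity to Δy/Δx, falling back to the gradient as Δx approaches 0. A scalar ReLU sketch with illustrative values:

```python
import numpy as np

relu = lambda z: np.maximum(0.0, z)

def rescale_multiplier(x, x_ref):
    """Rescale rule for a nonlinearity: m_{Δx Δy} = Δy / Δx,
    with a gradient fallback when Δx is near zero."""
    dx = x - x_ref
    if abs(dx) < 1e-7:
        return 1.0 if x > 0 else 0.0   # gradient of ReLU
    return float((relu(x) - relu(x_ref)) / dx)

m = rescale_multiplier(-2.0, 3.0)
print(m)   # Δy/Δx = (0 - 3) / (-5) = 0.6, nonzero despite a zero gradient
```

Under Rescale, the positive and negative parts of Δx are scaled by the same ratio; relaxing that is what the RevealCancel rule later addresses.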

Page 34:

Where it works

Page 35:

Where it works

Page 36:

Where it fails: “min” (AND) relation

Page 37:

Where it fails: “min” (AND) relation

Page 38:

RevealCancel Rule
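The RevealCancel figure is not reproduced; per the paper, the rule averages (Shapley-style) the impact of Δx⁺ computed both before and after adding Δx⁻, and vice versa. A scalar sketch:

```python
def reveal_cancel(f, x_ref, dx_pos, dx_neg):
    """RevealCancel rule (sketch): average the effect of Δx+ with
    and without Δx- already applied, and symmetrically for Δx-."""
    f0 = f(x_ref)
    C_pos = 0.5 * ((f(x_ref + dx_pos) - f0) +
                   (f(x_ref + dx_neg + dx_pos) - f(x_ref + dx_neg)))
    C_neg = 0.5 * ((f(x_ref + dx_neg) - f0) +
                   (f(x_ref + dx_pos + dx_neg) - f(x_ref + dx_pos)))
    return C_pos, C_neg

relu = lambda z: max(0.0, z)

# canceling inputs: Δy = 0, yet both sides receive credit
C_pos, C_neg = reveal_cancel(relu, 0.0, 2.0, -2.0)
print(C_pos, C_neg)   # (1.0, -1.0)
```

In this canceling case the Rescale ratio Δy/Δx is 0/0, hiding both effects; RevealCancel instead credits +1 and −1, which is what rescues the "min" (AND) relation on the next slide.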

Page 39:

Solution: “min” (AND) relation

Page 41:

MNIST digit classification

Page 43:

DNA sequence classification

Page 44:

DNA sequence classification

Page 45:

Summary

Novel approach for computing importance scores based on differences from a ‘reference’.

Using difference-from-reference allows information to propagate even when the gradient is zero.

Separates contributions from positive and negative terms.

Video at: https://www.youtube.com/watch?v=v8cxYjNZAXc&index=1&list=PLJLjQOkqSRTP3cLB2cOOi_bQFw6KPGKML

Slides at: https://drive.google.com/file/d/0B15F_QN41VQXbkVkcTVQYTVQNVE/view

Future Directions
  Applying DeepLIFT to RNNs
  Computing the ‘reference’ empirically from data
