Commit 4371d00

Move LRP into RelevancePropagation.jl (#157)
1 parent a41476a commit 4371d00
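Based on the commit description, a minimal sketch of what this move means for user code (a hedged sketch, not part of the commit: it assumes RelevancePropagation.jl exports `LRP` and that both packages share the XAIBase `analyze`/`heatmap` interface shown in the diffs below; `model` and `input` stand for a pre-trained Flux classifier and its input array):

```julia
# Before this commit, LRP shipped with ExplainableAI.jl:
#   using ExplainableAI
#   analyzer = LRP(model)

# After this commit, LRP lives in RelevancePropagation.jl:
using ExplainableAI          # gradient-based analyzers remain here
using RelevancePropagation   # assumed to now provide LRP and its rules

analyzer = LRP(model)              # `model`: a Flux model (assumed defined)
expl = analyze(input, analyzer)    # `input`: input array (assumed defined)
heatmap(expl)
```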

File tree

124 files changed, +41 −4421 lines


Project.toml

Lines changed: 0 additions & 16 deletions
@@ -4,38 +4,22 @@ authors = ["Adrian Hill <[email protected]>"]
 version = "1.0.0-DEV"
 
 [deps]
-ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
 Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
-Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
 ImageCore = "a09fc81d-aa75-5fe9-8630-4744c3626534"
 ImageTransformations = "02fcd773-0e25-5acc-982a-7f6622650795"
-MacroTools = "1914dd2f-81c6-5fcd-8719-6d5c9610ff09"
-Markdown = "d6f4376e-aef5-505a-96c1-9c027394607a"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
 Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
-Tullio = "bc48ee85-29a4-5162-ae0b-a64e1601d4bc"
 XAIBase = "9b48221d-a747-4c1b-9860-46a1d8ba24a7"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
-[weakdeps]
-Tullio = "bc48ee85-29a4-5162-ae0b-a64e1601d4bc"
-
-[extensions]
-TullioLRPRulesExt = "Tullio"
-
 [compat]
-ColorSchemes = "3.18"
 Distributions = "0.25"
-Flux = "0.13, 0.14"
 ImageCore = "0.9, 0.10"
 ImageTransformations = "0.9, 0.10"
-MacroTools = "0.5"
-Markdown = "1"
 Random = "1"
 Reexport = "1"
 Statistics = "1"
-Tullio = "0.3"
 XAIBase = "1.2"
 Zygote = "0.6"
 julia = "1.6"

docs/make.jl

Lines changed: 1 addition & 12 deletions
@@ -34,18 +34,7 @@ makedocs(;
     "Heatmapping" => "generated/heatmapping.md",
     "Input augmentations" => "generated/augmentations.md",
 ],
-"LRP" => Any[
-    "Basic usage" => "generated/lrp/basics.md",
-    "Assigning rules to layers" => "generated/lrp/composites.md",
-    "Supporting new layer types" => "generated/lrp/custom_layer.md",
-    "Custom LRP rules" => "generated/lrp/custom_rules.md",
-    "Concept Relevance Propagation" => "generated/lrp/crp.md",
-    "Developer documentation" => "lrp/developer.md"
-],
-"API Reference" => Any[
-    "General" => "api.md",
-    "LRP" => "lrp/api.md"
-],
+"API Reference" => "api.md",
 ],
 #! format: on
 linkcheck=true,

docs/src/api.md

Lines changed: 0 additions & 8 deletions
@@ -8,7 +8,6 @@ heatmap
 
 # Analyzers
 ```@docs
-LRP
 Gradient
 InputTimesGradient
 SmoothGrad
@@ -24,13 +23,6 @@ NoiseAugmentation
 InterpolationAugmentation
 ```
 
-# Model preparation
-```@docs
-strip_softmax
-canonize
-flatten_model
-```
-
 # Input preprocessing
 ```@docs
 preprocess_imagenet

docs/src/index.md

Lines changed: 1 addition & 19 deletions
@@ -4,7 +4,7 @@ CurrentModule = ExplainableAI
 
 # ExplainableAI.jl
 
-Explainable AI in Julia using [Flux.jl](https://fluxml.ai).
+Explainable AI in Julia.
 
 ## Installation
 To install this package and its dependencies, open the Julia REPL and run
@@ -22,27 +22,9 @@ Pages = [
 ]
 Depth = 3
 ```
-### LRP
-```@contents
-Pages = [
-    "generated/lrp/basics.md",
-    "generated/lrp/composites.md",
-    "generated/lrp/custom_layer.md",
-    "generated/lrp/custom_rules.md",
-    "lrp/developer.md",
-]
-Depth = 3
-```
 
 ## API reference
-### General
 ```@contents
 Pages = ["api.md"]
 Depth = 2
 ```
-
-### LRP
-```@contents
-Pages = ["lrp/api.md"]
-Depth = 2
-```

docs/src/literate/augmentations.jl

Lines changed: 4 additions & 7 deletions
@@ -56,9 +56,8 @@ heatmap(input, analyzer)
 
 # Is is also possible to define your own distributions or mixture distributions.
 #
-# `NoiseAugmentation` can be combined with any analyzer type, for example [`LRP`](@ref):
-analyzer = NoiseAugmentation(LRP(model), 50)
-heatmap(input, analyzer)
+# `NoiseAugmentation` can be combined with any analyzer type from the Julia-XAI ecosystem,
+# for example `LRP` from [RelevancePropagation.jl](https://github.com/Julia-XAI/RelevancePropagation.jl).
 
 # ## Integration augmentation
 # The [`InterpolationAugmentation`](@ref) wrapper computes explanations
@@ -80,7 +79,5 @@ analyzer = InterpolationAugmentation(Gradient(model), 50)
 expl = analyzer(input; input_ref=matrix_of_ones)
 heatmap(expl)
 
-# Once again, `InterpolationAugmentation` can be combined with any analyzer type,
-# for example [`LRP`](@ref):
-analyzer = InterpolationAugmentation(LRP(model), 50)
-heatmap(input, analyzer)
+# Once again, `InterpolationAugmentation` can be combined with any analyzer type from the Julia-XAI ecosystem,
+# for example `LRP` from [RelevancePropagation.jl](https://github.com/Julia-XAI/RelevancePropagation.jl).
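The updated docs above now only describe the LRP combination in prose. A hedged sketch of what that combination could look like after the move (an illustration, not part of the commit: it assumes RelevancePropagation.jl exports `LRP`, and `model` and `input` stand for the Flux model and input array used elsewhere in these docs):

```julia
using ExplainableAI          # provides NoiseAugmentation and heatmap
using RelevancePropagation   # assumed to provide LRP after this commit

# `model` and `input` are assumed to be defined as in the docs examples.
analyzer = NoiseAugmentation(LRP(model), 50)  # average over 50 noisy samples
heatmap(input, analyzer)
```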

docs/src/literate/example.jl

Lines changed: 2 additions & 66 deletions
@@ -11,13 +11,6 @@ model
 #md # !!! note "Supported models"
 #md #
 #md # ExplainableAI.jl can be used on any differentiable classifier.
-#md #
-#md # Only LRP requires models from Flux.jl.
-
-# ## Preparing the model
-# For models with softmax activations on the output,
-# it is necessary to call [`strip_softmax`](@ref) before analyzing.
-model = strip_softmax(model);
 
 # ## Preparing the input data
 # We use MLDatasets to load a single image from the MNIST dataset:
@@ -44,7 +37,7 @@ input = reshape(x, 28, 28, 1, :);
 # ## Explanations
 # We can now select an analyzer of our choice and call [`analyze`](@ref)
 # to get an [`Explanation`](@ref):
-analyzer = LRP(model)
+analyzer = InputTimesGradient(model)
 expl = analyze(input, analyzer);
 
 # The return value `expl` is of type [`Explanation`](@ref) and bundles the following data:
@@ -57,13 +50,12 @@ expl = analyze(input, analyzer);
 # * `expl.extras`: optional named tuple that can be used by analyzers
 #   to return additional information.
 #
-# We used an LRP analyzer, so `expl.analyzer` is `:LRP`.
+# We used `InputTimesGradient`, so `expl.analyzer` is `:InputTimesGradient`.
 expl.analyzer
 
 # By default, the explanation is computed for the maximally activated output neuron.
 # Since our digit is a 9 and Julia's indexing is 1-based,
 # the output neuron at index `10` of our trained model is maximally activated.
-expl.output_selection
 
 # Finally, we obtain the result of the analyzer in form of an array.
 expl.val
@@ -81,29 +73,6 @@ heatmap(input, analyzer)
 # refer to the [heatmapping section](@ref docs-heatmapping).
 
 # ## [List of analyzers](@id docs-analyzers-list)
-# Currently, the following analyzers are implemented:
-# - [`Gradient`](@ref)
-# - [`InputTimesGradient`](@ref)
-# - [`SmoothGrad`](@ref)
-# - [`IntegratedGradients`](@ref)
-# - [`LRP`](@ref)
-#   - Rules
-#     - [`ZeroRule`](@ref)
-#     - [`EpsilonRule`](@ref)
-#     - [`GammaRule`](@ref)
-#     - [`GeneralizedGammaRule`](@ref)
-#     - [`WSquareRule`](@ref)
-#     - [`FlatRule`](@ref)
-#     - [`ZBoxRule`](@ref)
-#     - [`ZPlusRule`](@ref)
-#     - [`AlphaBetaRule`](@ref)
-#     - [`PassRule`](@ref)
-#   - [`Composite`](@ref)
-#     - [`EpsilonGammaBox`](@ref)
-#     - [`EpsilonPlus`](@ref)
-#     - [`EpsilonPlusFlat`](@ref)
-#     - [`EpsilonAlpha2Beta1`](@ref)
-#     - [`EpsilonAlpha2Beta1Flat`](@ref)
 
 # ## Neuron selection
 # By passing an additional index to our call to [`analyze`](@ref),
@@ -135,36 +104,3 @@ heatmap(expl)
 
 # For more information on heatmapping batches,
 # refer to the [heatmapping documentation](@ref docs-heatmapping-batches).
-
-# ## [GPU support](@id gpu-docs)
-# All analyzers support GPU backends,
-# building on top of [Flux.jl's GPU support](https://fluxml.ai/Flux.jl/stable/gpu/).
-# Using a GPU only requires moving the input array and model weights to the GPU.
-#
-# For example, using [CUDA.jl](https://github.com/JuliaGPU/CUDA.jl):
-
-# ```julia
-# using CUDA, cuDNN
-# using Flux
-# using ExplainableAI
-#
-# # move input array and model weights to GPU
-# input = input |> gpu # or gpu(input)
-# model = model |> gpu # or gpu(model)
-#
-# # analyzers don't require calling `gpu`
-# analyzer = LRP(model)
-#
-# # explanations are computed on the GPU
-# expl = analyze(input, analyzer)
-# ```
-
-# Some operations, like saving, require moving explanations back to the CPU.
-# This can be done using Flux's `cpu` function:
-
-# ```julia
-# val = expl.val |> cpu # or cpu(expl.val)
-#
-# using BSON
-# BSON.@save "explanation.bson" val
-# ```

docs/src/literate/heatmapping.jl

Lines changed: 4 additions & 3 deletions
@@ -24,8 +24,9 @@ convert2image(MNIST, x)
 # ## Automatic heatmap presets
 # The function [`heatmap`](@ref) automatically applies common presets for each method.
 #
-# Since [`InputTimesGradient`](@ref) and [`LRP`](@ref) both compute attributions,
-# their presets are similar. Gradient methods however are typically shown in grayscale:
+# Since [`InputTimesGradient`](@ref) computes attributions,
+# heatmaps are shown in a blue-white-red color scheme.
+# Gradient methods however are typically shown in grayscale:
 analyzer = Gradient(model)
 heatmap(input, analyzer)
 #-
@@ -35,7 +36,7 @@ heatmap(input, analyzer)
 # ## Custom heatmap settings
 # ### Color schemes
 # We can partially or fully override presets by passing keyword arguments to [`heatmap`](@ref).
-# For example, we can use a custom color scheme from ColorSchemes.jl using the keyword argument `cs`:
+# For example, we can use a custom color scheme from ColorSchemes.jl using the keyword argument `colorscheme`:
 using ColorSchemes
 
 expl = analyze(input, analyzer)
