1 file changed, +5 -1 lines changed

@@ -51,7 +51,11 @@ model = strip_softmax(model)
 # Applying the [`GammaRule`](@ref) to two linear layers in a row will yield different results
 # than first fusing the two layers into one linear layer and then applying the rule.
 # This fusing is called "canonization" and can be done using the [`canonize`](@ref) function:
-model = canonize(model)
+model_canonized = canonize(model)
+
+# After canonization, the first `BatchNorm` layer has been fused into the preceding `Conv` layer.
+# The second `BatchNorm` layer wasn't fused
+# since its preceding `Conv` layer has a ReLU activation function.
 
 # ### [Flattening the model](@id docs-lrp-flatten-model)
 # ExplainableAI.jl's LRP implementation supports nested Flux Chains and Parallel layers.
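As a usage sketch of the canonization step shown in this diff: the snippet below is a minimal, hypothetical example, assuming Flux.jl layers and the `canonize` function exported by ExplainableAI.jl. The toy architecture is an assumption chosen to exercise both cases described in the added comments, one `BatchNorm` that can be fused and one that cannot.

using Flux
using ExplainableAI

# Hypothetical toy model (not from the diff): the first Conv has no
# activation, so the BatchNorm following it can be fused into it; the
# second Conv has a ReLU activation, so its BatchNorm cannot be fused.
model = Chain(
    Conv((3, 3), 3 => 8),        # identity activation
    BatchNorm(8),
    Conv((3, 3), 8 => 8, relu),  # ReLU activation
    BatchNorm(8),
)

model_canonized = canonize(model)
# Expected structure after canonization: the first Conv/BatchNorm pair
# collapses into a single Conv layer, while the trailing BatchNorm stays
# separate because its preceding Conv has a ReLU activation.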