@@ -54,8 +54,8 @@ _apply_scale_bias(x, scale, bias) = x .* scale .+ bias
 
 Shared code path for all built-in norm functions.
 
-`μ` and `σ²` should be calculated on the fly using [`NNlib.norm_stats`](@ref),
-or extracted from an existing collection such as [`NNlib.RunningStats`](@ref).
+`μ` and `σ²` should be calculated on the fly using [`norm_stats`](@ref),
+or extracted from an existing collection such as [`RunningStats`](@ref).
 `bias` and `scale` are consistent with cuDNN and Flux.Scale.
 We opt for `scale` over `weight` to avoid confusion with dense layers.
 If the size of the statistics and affine parameters differ,
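The one-line helper in the hunk header above can be exercised directly. A minimal sketch, with illustrative array shapes that are assumptions rather than anything taken from the PR:

```julia
# Illustrative only: `scale` and `bias` broadcast against `x`, so parameters sized
# like the statistics apply across the remaining dimensions.
_apply_scale_bias(x, scale, bias) = x .* scale .+ bias   # definition shown in the hunk header

x     = randn(Float32, 4, 8)    # hypothetical (features, batch) input
scale = fill(2.0f0, 4, 1)       # per-feature scale
bias  = zeros(Float32, 4, 1)    # per-feature bias
y     = _apply_scale_bias(x, scale, bias)
@assert size(y) == size(x)      # broadcasting preserves the input shape
```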
@@ -79,7 +79,7 @@ Contains running mean and variance estimates for stateful norm functions.
 If the parameters are mutable, they will be updated in-place.
 Otherwise, they will be replaced wholesale.
 
-See also [`NNlib.update_running_stats!`](@ref).
+See also [`update_running_stats!`](@ref).
 """
 mutable struct RunningStats{M <: AbstractArray, V <: AbstractArray, MT <: Real}
     mean::M
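The in-place versus wholesale update behaviour described above can be illustrated with a toy struct; `ToyStats` is a hypothetical stand-in, not the PR's `RunningStats` (whose remaining fields are not visible in this hunk):

```julia
# Hypothetical stand-in used only to show the two update modes the docstring describes.
mutable struct ToyStats{M}
    mean::M
end

vec_stats = ToyStats(zeros(Float32, 3))
vec_stats.mean .= Float32[1, 2, 3]        # mutable array: updated in-place

tup_stats = ToyStats((0.0f0, 0.0f0, 0.0f0))
tup_stats.mean = (1.0f0, 2.0f0, 3.0f0)    # immutable container: replaced wholesale
```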
@@ -129,10 +129,10 @@ end
                            reduce_dims) where {N}
 
 Performs a moving average update for layers with tracked statistics.
-`μ` and `σ²` are the sample mean and variance, most likely from [`NNlib.norm_stats`](@ref).
-`reduce_dims` should also match the `dims` argument of [`NNlib.norm_stats`](@ref).
+`μ` and `σ²` are the sample mean and variance, most likely from [`norm_stats`](@ref).
+`reduce_dims` should also match the `dims` argument of [`norm_stats`](@ref).
 
-See also [`NNlib.RunningStats`](@ref).
+See also [`RunningStats`](@ref).
 """
 function update_running_stats!(stats::RunningStats, x, μ, σ², reduce_dims::Dims)
     V = eltype(σ²)
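The hunk does not show the update rule itself; the sketch below assumes the conventional exponential moving average with a scalar momentum, and the helper name `ema_update!` is hypothetical:

```julia
using Statistics: mean, var

# Sketch of a conventional moving-average update, assuming a scalar momentum;
# the PR's exact weighting is not visible in this hunk.
function ema_update!(running_mean, running_var, x, momentum, reduce_dims)
    μ  = mean(x; dims = reduce_dims)
    σ² = var(x; dims = reduce_dims, corrected = false)
    running_mean .= (1 - momentum) .* running_mean .+ momentum .* vec(μ)
    running_var  .= (1 - momentum) .* running_var  .+ momentum .* vec(σ²)
    return running_mean, running_var
end

x = randn(Float32, 4, 16)                     # hypothetical (features, batch) input
rm, rv = zeros(Float32, 4), ones(Float32, 4)
ema_update!(rm, rv, x, 0.1f0, 2)              # reduce over the batch dimension
```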
@@ -168,7 +168,7 @@ Normalizes `x` along the first `S` dimensions.
 
 For an additional learned affine transform, provide a `S`-dimensional `scale` and `bias`.
 
-See also [`NNlib.batchnorm`](@ref), [`NNlib.instancenorm`](@ref), and [`NNlib.groupnorm`](@ref).
+See also [`batchnorm`](@ref), [`instancenorm`](@ref), and [`groupnorm`](@ref).
 
 # Examples
 
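Since only the cross-references changed here, a hand-rolled version of the operation the docstring describes may help; `manual_layernorm` is a reference sketch of the stated reduction rule, not the PR's implementation, and it omits the optional affine transform:

```julia
using Statistics: mean, var

# Reference sketch of "normalize `x` along the first `S` dimensions".
function manual_layernorm(x::AbstractArray, S::Integer; ε = 1f-5)
    dims = ntuple(identity, S)                   # (1, 2, ..., S)
    μ  = mean(x; dims = dims)
    σ² = var(x; dims = dims, corrected = false)
    return (x .- μ) ./ sqrt.(σ² .+ ε)
end

x = randn(Float32, 5, 3, 8)    # hypothetical (features..., batch) input
y = manual_layernorm(x, 2)     # statistics over the first two dimensions
```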
@@ -205,14 +205,14 @@ Functional [Batch Normalization](https://arxiv.org/abs/1502.03167) operation.
 Normalizes `x` along each ``D_1×...×D_{N-2}×1×D_N`` input slice,
 where `N-1` is the "channel" (or "feature", for 2D inputs) dimension.
 
-Provide a [`NNlib.RunningStats`](@ref) to fix a estimated mean and variance.
+Provide a [`RunningStats`](@ref) to fix an estimated mean and variance.
 `batchnorm` will renormalize the input using these statistics during inference,
 and update them using batch-level statistics when training.
 To override this behaviour, manually set a value for `training`.
 
 If specified, `scale` and `bias` will be applied as an additional learned affine transform.
 
-See also [`NNlib.layernorm`](@ref), [`NNlib.instancenorm`](@ref), and [`NNlib.groupnorm`](@ref).
+See also [`layernorm`](@ref), [`instancenorm`](@ref), and [`groupnorm`](@ref).
 """
 function batchnorm(x::AbstractArray{<:Any, N},
                    running_stats::Union{RunningStats, Nothing} = nothing,
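As a concrete reading of the slice description above, the statistics are reduced over every dimension except the channel one. `manual_batchnorm` below is a reference sketch of that reduction only (no running statistics, no affine transform), not the PR's code:

```julia
using Statistics: mean, var

# Reference sketch: per-channel statistics, reducing over every dimension
# except N-1 (the channel dimension), matching the slice description above.
function manual_batchnorm(x::AbstractArray{T, N}; ε = 1f-5) where {T, N}
    reduce_dims = (ntuple(identity, N - 2)..., N)   # e.g. (1, 2, 4) for N = 4
    μ  = mean(x; dims = reduce_dims)
    σ² = var(x; dims = reduce_dims, corrected = false)
    return (x .- μ) ./ sqrt.(σ² .+ ε)
end

y = manual_batchnorm(randn(Float32, 7, 7, 3, 16))   # hypothetical (W, H, C, N) input
```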
@@ -247,7 +247,7 @@ To override this behaviour, manually set a value for `training`.
 
 If specified, `scale` and `bias` will be applied as an additional learned affine transform.
 
-See also [`NNlib.layernorm`](@ref), [`NNlib.batchnorm`](@ref), and [`NNlib.groupnorm`](@ref).
+See also [`layernorm`](@ref), [`batchnorm`](@ref), and [`groupnorm`](@ref).
 """
 function instancenorm(x::AbstractArray{<:Any, N},
                       running_stats::Union{RunningStats, Nothing} = nothing,
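For comparison with `batchnorm`, conventional instance normalization keeps separate statistics for every channel of every sample, reducing over the spatial dimensions only. The sketch below shows that convention; it is an assumption about the intent, not this function's verified behaviour:

```julia
using Statistics: mean, var

# Conventional instance norm: statistics over the spatial dims 1:N-2 only,
# kept separate for every channel of every sample.
function manual_instancenorm(x::AbstractArray{T, N}; ε = 1f-5) where {T, N}
    reduce_dims = ntuple(identity, N - 2)   # e.g. (1, 2) for a (W, H, C, N) input
    μ  = mean(x; dims = reduce_dims)
    σ² = var(x; dims = reduce_dims, corrected = false)
    return (x .- μ) ./ sqrt.(σ² .+ ε)
end

y = manual_instancenorm(randn(Float32, 7, 7, 3, 16))
```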
@@ -281,7 +281,7 @@ The number of channels must be an integer multiple of the number of groups.
 
 If specified, `scale` and `bias` will be applied as an additional learned affine transform.
 
-See also [`NNlib.layernorm`](@ref), [`NNlib.batchnorm`](@ref), and [`NNlib.instancenorm`](@ref).
+See also [`layernorm`](@ref), [`batchnorm`](@ref), and [`instancenorm`](@ref).
 
 # Examples
 
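The hunk header notes that the channel count must divide evenly into groups; the usual realization folds channels into groups before reducing. `manual_groupnorm` below sketches that convention and is not taken from the PR:

```julia
using Statistics: mean, var

# Conventional group norm for a (W, H, C, N) input with G groups:
# fold C into (C ÷ G, G), then reduce over the spatial dims and the within-group dim.
function manual_groupnorm(x::AbstractArray{T, 4}, G::Integer; ε = 1f-5) where {T}
    W, H, C, N = size(x)
    @assert C % G == 0 "channels must be an integer multiple of the number of groups"
    xg = reshape(x, W, H, C ÷ G, G, N)
    μ  = mean(xg; dims = (1, 2, 3))
    σ² = var(xg; dims = (1, 2, 3), corrected = false)
    return reshape((xg .- μ) ./ sqrt.(σ² .+ ε), W, H, C, N)
end

y = manual_groupnorm(randn(Float32, 7, 7, 6, 16), 3)   # 6 channels in 3 groups
```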