@@ -51,7 +51,7 @@ Leaky [Rectified Linear Unit](https://en.wikipedia.org/wiki/Rectifier_(neural_ne
activation function.
You can also specify the coefficient explicitly, e.g. `leakyrelu(x, 0.01)`.
"""
-leakyrelu(x::Real, a = oftype(x/1, 0.01)) = max(a*x, x/1)
+leakyrelu(x::Real, a = oftype(x/1, 0.01)) = max(a*x, x/one(x))


"""
@@ -62,7 +62,7 @@ Exponential Linear Unit activation function.
See [Fast and Accurate Deep Network Learning by Exponential Linear Units](https://arxiv.org/abs/1511.07289).
You can also specify the coefficient explicitly, e.g. `elu(x, 1)`.
"""
-elu(x, α = one(x)) = ifelse(x ≥ 0, x/1, α * (exp(x) - one(x)))
+elu(x, α = one(x)) = ifelse(x ≥ 0, x/one(x), α * (exp(x) - one(x)))


"""
@@ -99,7 +99,7 @@ See [Self-Normalizing Neural Networks](https://arxiv.org/pdf/1706.02515.pdf).
function selu(x::Real)
  λ = oftype(x/1, 1.0507009873554804934193349852946)
  α = oftype(x/1, 1.6732632423543772848170429916717)
-  λ * ifelse(x > 0, x/1, α * (exp(x) - 1))
+  λ * ifelse(x > 0, x/one(x), α * (exp(x) - one(x)))
end
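For reference, here is a minimal, self-contained Julia sketch (not part of the patch): it copies the patched definitions from the hunks above and exercises them on a few inputs. The sample calls and the approximate values in the comments are illustrative additions, not taken from the commit.

```julia
# Standalone copies of the patched definitions from the diff above.
leakyrelu(x::Real, a = oftype(x/1, 0.01)) = max(a*x, x/one(x))

elu(x, α = one(x)) = ifelse(x ≥ 0, x/one(x), α * (exp(x) - one(x)))

function selu(x::Real)
  λ = oftype(x/1, 1.0507009873554804934193349852946)
  α = oftype(x/1, 1.6732632423543772848170429916717)
  λ * ifelse(x > 0, x/one(x), α * (exp(x) - one(x)))
end

# Negative inputs exercise the leaky / exponential branches;
# integer inputs are promoted to floats by the x/1 and x/one(x) divisions.
@show leakyrelu(-2)    # ≈ -0.02
@show elu(-1.0)        # ≈ exp(-1) - 1 ≈ -0.632
@show selu(3)          # ≈ 1.0507 * 3 ≈ 3.152
@show typeof(selu(3))  # Float64
```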