This is also called the ["softplus"](https://en.wikipedia.org/wiki/Rectifier_(neural_networks))
transformation, being a smooth approximation to `max(0,x)`. Its inverse is [`logexpm1`](@ref).
The generalized `softplus` function (Wiemann et al., 2024) takes an additional optional parameter `a` that controls
the approximation error with respect to the linear spline. It defaults to `a=1.0`, in which case the softplus is
equivalent to `log1pexp`.
See:
* Martin Maechler (2012) [“Accurately Computing log(1 − exp(− |a|))”](http://cran.r-project.org/web/packages/Rmpfr/vignettes/log1mexp-note.pdf)
* Wiemann, P. F., Kneib, T., & Hambuckers, J. (2024). Using the softplus function to construct alternative link functions in generalized linear models and beyond. Statistical Papers, 65(5), 3155-3180.
"""
log1pexp(x::Real) = _log1pexp(float(x)) # ensures that BigInt/BigFloat, Int/Float64 etc. dispatch to the same algorithm
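As a hedged illustration of the branching scheme in Python (the Float64 thresholds follow Maechler (2012); the `softplus(x, a) = log1pexp(a*x)/a` parameterization is an assumption based on Wiemann et al. (2024), not necessarily this library's exact API):

```python
import math

def log1pexp(x):
    # Stable log(1 + exp(x)): choose a branch so that exp(x) never
    # overflows and 1 + exp(x) never loses precision (Maechler, 2012).
    if x <= -37.0:
        return math.exp(x)              # 1 + exp(x) rounds to 1 here
    elif x <= 18.0:
        return math.log1p(math.exp(x))  # direct evaluation is safe
    elif x <= 33.3:
        return x + math.exp(-x)         # log(1 + exp(x)) = x + log1p(exp(-x)) ~ x + exp(-x)
    else:
        return x                        # exp(-x) underflows relative to x

def softplus(x, a=1.0):
    # Assumed generalized softplus (Wiemann et al., 2024): log1pexp(a*x)/a.
    # a = 1.0 recovers log1pexp; larger a tracks the spline max(0, x) more closely.
    return log1pexp(a * x) / a
```

For example, `softplus(5.0, a=10.0)` is already indistinguishable from `max(0, 5.0)` in double precision, while `softplus(5.0)` exceeds it by roughly `exp(-5)`.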
# Approximations based on Maechler (2012)
Return `log(exp(x) - 1)` or the “invsoftplus” function. It is the inverse of
[`log1pexp`](@ref) (aka “softplus”).
"""
logexpm1(x::Real) = x <= 18.0 ? log(_expm1(x)) : x <= 33.3 ? x - exp(-x) : oftype(exp(-x), x)
logexpm1(x::Float32) = x <= 9.0f0 ? log(expm1(x)) : x <= 16.0f0 ? x - exp(-x) : oftype(exp(-x), x)
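The inverse transformation can be sketched the same way in Python (illustrative only; mirrors the Float64 branch thresholds used above):

```python
import math

def logexpm1(x):
    # Stable log(exp(x) - 1), the inverse of log1pexp ("invsoftplus").
    if x <= 18.0:
        return math.log(math.expm1(x))  # expm1 keeps precision for small x
    elif x <= 33.3:
        return x - math.exp(-x)         # log(exp(x) - 1) = x + log(1 - exp(-x)) ~ x - exp(-x)
    else:
        return x                        # exp(-x) underflows relative to x
```

A round trip `logexpm1(log1pexp(x))` should recover `x` to within a few ulps in each of the three branches.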