Commit bf85177

add own docstring to softplus
1 parent 0fdda7a commit bf85177

2 files changed: +43 −4 lines changed


activate.jl

Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@
+using Revise
+using Pkg
+
+# Package
+Pkg.activate("C:/Users/domma/Dropbox/Software/LogExpFunctions.jl/")
+
+using LogExpFunctions
+using CairoMakie
+
+
+xrange = range(-1.5, 1.5, length=100)
+yexp = exp.(xrange)
+ysoftplus1 = softplus.(xrange)
+ysoftplus2 = softplus.(xrange; a=2)
+ysoftplus3 = softplus.(xrange; a=3)
+
+ysoftplus5 = softplus.(xrange; a=5)
+ysoftplus10 = softplus.(xrange; a=10)
+
+
+# f = lines(xrange, yexp, color=:black)
+f = lines(xrange, ysoftplus1, color=:red)
+lines!(xrange, ysoftplus2, color=:orange)
+lines!(xrange, ysoftplus3, color=:darkorange)
+lines!(xrange, ysoftplus5, color=:green)
+lines!(xrange, ysoftplus10, color=:blue)
+
+ablines!(0, 1, color=:grey, linestyle=:dash)
+f
+
+softplus(0; a=3)
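The script overlays the generalized softplus for increasing `a` against the linear spline `max(0, x)` (the dashed `ablines!`). A small numerical sketch of what the plot shows, written in Python for portability; `softplus_a` is a hypothetical stand-in for the package's generalized `softplus`, defined as in this commit by `log1pexp(a*x)/a`:

```python
import math

def log1pexp(x):
    # Numerically stable log(1 + exp(x)).
    return x + math.log1p(math.exp(-x)) if x > 0 else math.log1p(math.exp(x))

def softplus_a(x, a=1.0):
    # Generalized softplus from the commit: log1pexp(a * x) / a.
    return log1pexp(a * x) / a

# a = 1 recovers the ordinary softplus, log(1 + exp(x)).
assert abs(softplus_a(0.3) - math.log(1 + math.exp(0.3))) < 1e-12

# The gap to the spline max(0, x) is largest at x = 0, where it equals
# log(2) / a, so larger a hugs the spline more tightly.
for a in (1, 2, 3, 5, 10):
    print(f"a={a:2d}  gap at 0 = {softplus_a(0.0, a):.4f}")
```

The printed gap shrinks as `1/a`, which is why the `a=10` curve in the script sits closest to the dashed spline.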

src/basicfuns.jl

Lines changed: 12 additions & 4 deletions
@@ -165,13 +165,11 @@ Return `log(1+exp(x))` evaluated carefully for largish `x`.
 This is also called the ["softplus"](https://en.wikipedia.org/wiki/Rectifier_(neural_networks))
 transformation, being a smooth approximation to `max(0,x)`. Its inverse is [`logexpm1`](@ref).
 
-The generalized `softplus` function (Wiemann et al., 2024) takes an additional optional parameter `a` that control
-the approximation error with respect to the linear spline. It defaults to `a=1.0`, in which case the softplus is
-equivalent to `log1pexp`.
+This is also called the ["softplus"](https://en.wikipedia.org/wiki/Rectifier_(neural_networks))
+transformation (in its default parametrization, see [`softplus`](@ref)), being a smooth approximation to `max(0,x)`.
 
 See:
 * Martin Maechler (2012) [“Accurately Computing log(1 − exp(− |a|))”](http://cran.r-project.org/web/packages/Rmpfr/vignettes/log1mexp-note.pdf)
-* Wiemann, P. F., Kneib, T., & Hambuckers, J. (2024). Using the softplus function to construct alternative link functions in generalized linear models and beyond. Statistical Papers, 65(5), 3155-3180.
 """
 log1pexp(x::Real) = _log1pexp(float(x)) # ensures that BigInt/BigFloat, Int/Float64 etc. dispatch to the same algorithm

@@ -262,6 +260,16 @@ Return `log(exp(x) - 1)` or the “invsoftplus” function. It is the inverse o
 logexpm1(x::Real) = x <= 18.0 ? log(_expm1(x)) : x <= 33.3 ? x - exp(-x) : oftype(exp(-x), x)
 logexpm1(x::Float32) = x <= 9f0 ? log(expm1(x)) : x <= 16f0 ? x - exp(-x) : oftype(exp(-x), x)
 
+"""
+$(SIGNATURES)
+
+The generalized `softplus` function (Wiemann et al., 2024) takes an additional optional parameter `a` that control
+the approximation error with respect to the linear spline. It defaults to `a=1.0`, in which case the softplus is
+equivalent to [`log1pexp`](@ref).
+
+See:
+* Wiemann, P. F., Kneib, T., & Hambuckers, J. (2024). Using the softplus function to construct alternative link functions in generalized linear models and beyond. Statistical Papers, 65(5), 3155-3180.
+"""
 softplus(x::Real) = log1pexp(x)
 softplus(x::Real, a::Real) = log1pexp(a * x) / a
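The relocated docstring says the default `a=1.0` makes `softplus` equivalent to `log1pexp`, and the `log1pexp` docstring names `logexpm1` as its inverse. A round-trip sketch of both claims, using simplified Python stand-ins (unbranched implementations, not the package's carefully guarded versions):

```python
import math

def log1pexp(x):
    # Stable log(1 + exp(x)); the Julia version branches similarly.
    return x + math.log1p(math.exp(-x)) if x > 0 else math.log1p(math.exp(x))

def softplus(x, a=1.0):
    # Generalized softplus from the diff: log1pexp(a * x) / a.
    return log1pexp(a * x) / a

def logexpm1(x):
    # Inverse softplus: log(exp(x) - 1) (naive form; fine for moderate x).
    return math.log(math.expm1(x))

# With the default a = 1.0, softplus is identical to log1pexp.
assert softplus(1.7) == log1pexp(1.7)

# logexpm1 inverts softplus at the default a = 1.
x = 0.42
assert abs(logexpm1(softplus(x)) - x) < 1e-12
print("round trip ok")
```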

0 commit comments
