
Commit eec6d74

fix formatting errors, remove original definition
1 parent 671f155 commit eec6d74

File tree

1 file changed

+5 -4 lines changed

src/softmax.jl

Lines changed: 5 additions & 4 deletions
@@ -3,23 +3,23 @@ export softmax, softmax!, ∇softmax, ∇softmax!,
 
 """
     softmax(xs) = exp.(xs) ./ sum(exp.(xs))
+
 [Softmax](https://en.wikipedia.org/wiki/Softmax_function) takes
 log-probabilities (any real vector) and returns a probability distribution that
 sums to 1.
+
 If given a matrix it will treat it as a batch of vectors, with each column
 independent.
+
     julia> softmax([1,2,3.])
     3-element Array{Float64,1}:
      0.0900306
      0.244728
      0.665241
 """
-softmax(xs) = softmax!(similar(xs), xs)
-
 function softmax(xs::AbstractArray{T}; dims=1) where {T}
     max = maximum(xs, dims=dims)
-    out = exp.(xs .- max)
-    out = out ./ sum(out, dims=dims)
+    out = exp.(xs .- max) ./ sum(exp.(xs .- max), dims=dims)
 end
 
 function softmax!(out::AbstractVecOrMat{T}, xs::AbstractVecOrMat{T}) where {T}
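
The rewritten `softmax` method subtracts the column-wise maximum before exponentiating, so `exp` never receives a large positive argument and cannot overflow; dividing by the sum of the shifted exponentials leaves the result mathematically unchanged. A minimal standalone sketch of the same idea, assuming the illustrative name `stable_softmax` (not part of this commit):

    # Shifting by the maximum leaves softmax unchanged, since
    # exp.(xs .- m) ./ sum(exp.(xs .- m)) == exp.(xs) ./ sum(exp.(xs)),
    # but keeps every argument to exp at or below zero.
    function stable_softmax(xs; dims=1)
        m = maximum(xs, dims=dims)
        e = exp.(xs .- m)
        e ./ sum(e, dims=dims)
    end

    stable_softmax([1000.0, 1001.0, 1002.0])  # ≈ [0.0900, 0.2447, 0.6652]; naive exp.(xs) would overflow to Inf

Unlike the committed one-liner, this sketch reuses `e` rather than evaluating `exp.(xs .- max)` twice; both forms are numerically equivalent.
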
@@ -60,6 +60,7 @@ end
 
 """
     logsoftmax(xs) = log.(exp.(xs) ./ sum(exp.(xs)))
+
 `logsoftmax(xs)` computes the log of `softmax(xs)`, but in a more numerically stable
 way than directly taking the log of the softmax function, which is commonly used in
 computing cross entropy loss.
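
As the docstring notes, `logsoftmax` is more stable than `log.(softmax(xs))`. The same max-shift trick gives a direct form, since log(exp(x - m) / sum(exp(x - m))) = (x - m) - log(sum(exp(x - m))). A hedged sketch of that identity, using the illustrative name `stable_logsoftmax` (not the code in this commit):

    # log(softmax(xs)) without ever forming softmax(xs) or a huge exp value.
    function stable_logsoftmax(xs; dims=1)
        shifted = xs .- maximum(xs, dims=dims)
        shifted .- log.(sum(exp.(shifted), dims=dims))
    end

    stable_logsoftmax([1, 2, 3.])  # ≈ [-2.4076, -1.4076, -0.4076], i.e. log.(softmax([1,2,3.]))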

0 commit comments