384 | 384 | Return how much the predicted distribution `ŷ` diverges from the expected Poisson
385 | 385 | distribution `y`. Calculated as
386 | 386 |
387 |     | - sum(ŷ .- y .* log.(ŷ)) / size(y, 2)
    | 387 | + agg(ŷ .- y .* log.(ŷ))
388 | 388 |
389 | 389 | [More information](https://peltarion.com/knowledge-center/documentation/modeling-view/build-an-ai-model/loss-functions/poisson).
390 | 390 | """
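To see what the new `agg` form computes, here is a minimal standalone sketch of the documented formula. It is only an illustration: `poisson_sketch` is a hypothetical name (not Flux's API), and the `mean` default for `agg` is assumed from the other losses touched by this change.

```julia
using Statistics: mean

# Sketch of the documented formula; `poisson_sketch` is a stand-in name,
# and `agg = mean` is assumed from the rest of this change.
poisson_sketch(ŷ, y; agg = mean) = agg(ŷ .- y .* log.(ŷ))

ŷ = [0.5 1.5; 2.0 0.3]   # predicted rates (must be positive for log)
y = [1.0 2.0; 2.0 0.0]   # observed counts
poisson_sketch(ŷ, y)     # default: mean over all elements
poisson_sketch(ŷ, y; agg = x -> sum(x) / size(y, 2))  # the old per-column normalisation
```

The closure in the last call reproduces the removed `sum(...) / size(y, 2)` behaviour, which normalises by the number of columns rather than averaging over every element.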

399 | 399 | Return the [hinge_loss loss](https://en.wikipedia.org/wiki/Hinge_loss) given the
400 | 400 | prediction `ŷ` and true labels `y` (containing 1 or -1). Calculated as
401 | 401 |
402 |     | - sum(max.(0, 1 .- ŷ .* y)) / size(y, 2)
    | 402 | + agg(max.(0, 1 .- ŷ .* y))
403 | 403 |
404 | 404 | See also: [`squared_hinge_loss`](@ref).
405 | 405 | """
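As a quick check of the hinge formula, a self-contained sketch under the same assumptions (`hinge_sketch` is a stand-in name, `agg = mean` assumed):

```julia
using Statistics: mean

# Sketch of the documented hinge loss; `hinge_sketch` is a stand-in name
# and the `mean` default for `agg` is assumed.
hinge_sketch(ŷ, y; agg = mean) = agg(max.(0, 1 .- ŷ .* y))

ŷ = [0.7 -1.2 0.1]    # raw scores
y = [1.0 -1.0 -1.0]   # labels in {-1, 1}
hinge_sketch(ŷ, y)    # only the second prediction clears the margin and contributes zero
```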

412 | 412 | squared_hinge_loss(ŷ, y)
413 | 413 |
414 | 414 | Return the squared hinge_loss loss given the prediction `ŷ` and true labels `y`
415 |     | -(containing 1 or -1); calculated as `sum((max.(0, 1 .- ŷ .* y)).^2) / size(y, 2)`.
    | 415 | +(containing 1 or -1). Calculated as
    | 416 | +
    | 417 | + agg((max.(0, 1 .- ŷ .* y)).^2)
416 | 418 |
417 | 419 | See also [`hinge_loss`](@ref).
418 | 420 | """
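The squared variant uses the same margin term, squared element-wise before aggregation. A hedged sketch, again with a hypothetical name and an assumed `mean` default:

```julia
using Statistics: mean

# Sketch of the documented squared hinge loss; same margin term as above,
# squared element-wise before `agg` (assumed `mean`) is applied.
squared_hinge_sketch(ŷ, y; agg = mean) = agg(max.(0, 1 .- ŷ .* y) .^ 2)

ŷ = [0.7 -1.2 0.1]
y = [1.0 -1.0 -1.0]
squared_hinge_sketch(ŷ, y)   # squaring weights the misclassified third score more heavily
```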

458 | 460 | binary_focal_loss(ŷ, y; agg=mean, γ=2, ϵ=eps(ŷ))
459 | 461 |
460 | 462 | Return the [binary_focal_loss](https://arxiv.org/pdf/1708.02002.pdf)
461 |     | -The input, 'ŷ', is expected to be normalized (i.e. [`softmax`](@ref) output).
    | 463 | +The input `ŷ` is expected to be normalized (i.e. [`softmax`](@ref) output).
462 | 464 |
463 | 465 | For `γ == 0`, the loss is mathematically equivalent to [`binarycrossentropy`](@ref).
464 | 466 |
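The focal formula itself is not shown in this hunk; the sketch below follows the linked paper (Lin et al., 2017) for probability inputs, purely to illustrate the signature's keywords. `binary_focal_sketch`, the `eps(eltype(ŷ))` stabiliser, and the exact placement of `ϵ` inside the logs are assumptions, not necessarily Flux's implementation.

```julia
using Statistics: mean

# Sketch of the binary focal loss from the linked paper; `binary_focal_sketch`
# is a stand-in name, `ŷ` is assumed to hold probabilities in (0, 1), and the
# ϵ stabiliser is added inside the logs. Not necessarily Flux's exact code.
function binary_focal_sketch(ŷ, y; agg = mean, γ = 2, ϵ = eps(eltype(ŷ)))
    agg(@. -y * (1 - ŷ)^γ * log(ŷ + ϵ) - (1 - y) * ŷ^γ * log(1 - ŷ + ϵ))
end

ŷ = [0.9 0.2 0.6]   # predicted probabilities of the positive class
y = [1.0 0.0 1.0]   # binary targets
binary_focal_sketch(ŷ, y)          # confident, correct predictions are down-weighted by (1-ŷ)^γ
binary_focal_sketch(ŷ, y; γ = 0)   # reduces to plain binary cross-entropy, as noted above
```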