Commit 514dd09

fix CrossEntropyLoss op en doc, test=release/2.0 (#24150)
1 parent 1887749 commit 514dd09

File tree

1 file changed: +10 −5 lines changed

python/paddle/nn/layer/loss.py

Lines changed: 10 additions & 5 deletions
@@ -26,20 +26,22 @@
 
 class CrossEntropyLoss(fluid.dygraph.Layer):
     """
-    This operator implements the cross entropy loss function. This OP combines `softmax`,
-    `cross_entropy`, and `reduce_sum`/`reduce_mean` together.
+    This operator implements the cross entropy loss function. This OP combines ``softmax``,
+    ``cross_entropy``, and ``reduce_sum``/``reduce_mean`` together.
 
-    It is useful when training a classification problem with `C` classes.
-    If provided, the optional argument `weight` should be a 1D Variable assigning
+    It is useful when training a classification problem with ``C`` classes.
+    If provided, the optional argument ``weight`` should be a 1D Variable assigning
     weight to each of the classes.
 
     For predictions label, and target label, the loss is calculated as follows.
+
     .. math::
 
         loss_j = -\\text{input[class]} +
                  \\log\\left(\\sum_{i=0}^{K}\\exp(\\text{input}_i)\\right), j = 1,..., K
 
-    If weight is not `None`:
+    If weight is not ``None``:
+
     .. math::
 
         loss_j = \\text{weight[class]}(-\\text{input[class]} +
@@ -59,9 +61,12 @@ class CrossEntropyLoss(fluid.dygraph.Layer):
             If :attr:`size_average` is ``'sum'``, the reduced sum loss is returned.
            If :attr:`reduction` is ``'none'``, the unreduced loss is returned.
            Default is ``'mean'``.
+
    Returns:
        The tensor variable storing the cross_entropy_loss of input and label.
+
    Return type: Variable.
+
    Examples:
        .. code-block:: python
 
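For reference, here is a minimal NumPy sketch of the calculation the docstring describes: the per-sample softmax cross entropy in log-sum-exp form, an optional 1D per-class weight vector, and the 'mean'/'sum'/'none' reduction. The function and variable names are illustrative, not part of the patch, and the weighted 'mean' case is simplified to a plain mean (the real op may normalize by the sum of weights).

# Illustrative sketch of the docstring's formula; not the Paddle implementation.
import numpy as np

def cross_entropy_loss(logits, labels, weight=None, reduction='mean'):
    # logits: (N, C) raw scores; labels: (N,) integer class indices in [0, C)
    # log-sum-exp over classes, stabilized by subtracting the row maximum
    row_max = logits.max(axis=1, keepdims=True)
    log_sum_exp = np.log(np.exp(logits - row_max).sum(axis=1)) + row_max[:, 0]
    # loss_j = -input[class] + log(sum_i exp(input_i))
    per_sample = log_sum_exp - logits[np.arange(len(labels)), labels]
    if weight is not None:
        # weighted variant: loss_j is scaled by weight[class]
        per_sample = per_sample * weight[labels]
    if reduction == 'mean':
        return per_sample.mean()   # note: simplified; the op may divide by sum of weights
    if reduction == 'sum':
        return per_sample.sum()
    return per_sample              # reduction == 'none'

# Example: 3 samples, 4 classes
logits = np.array([[2.0, 0.5, 0.1, -1.0],
                   [0.3, 1.2, 0.7,  0.0],
                   [0.0, 0.0, 3.0,  0.2]])
labels = np.array([0, 1, 2])
print(cross_entropy_loss(logits, labels))                    # mean reduction
print(cross_entropy_loss(logits, labels, reduction='none'))  # per-sample losses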
