class CrossEntropyLoss(fluid.dygraph.Layer):
"""
- This operator implements the cross entropy loss function. This OP combines `softmax`,
- `cross_entropy`, and `reduce_sum`/`reduce_mean` together.
+ This operator implements the cross entropy loss function. This OP combines ``softmax``,
+ ``cross_entropy``, and ``reduce_sum``/``reduce_mean`` together.

- It is useful when training a classification problem with `C` classes.
- If provided, the optional argument `weight` should be a 1D Variable assigning
+ It is useful when training a classification problem with ``C`` classes.
+ If provided, the optional argument ``weight`` should be a 1D Variable assigning
weight to each of the classes.
For the predicted logits and the target label, the loss is calculated as follows.
+
.. math::
loss_j = -\\text{input[class]} +
\\log\\left(\\sum_{i=0}^{K}\\exp(\\text{input}_i)\\right), j = 1,..., K
- If weight is not `None`:
+ If weight is not ``None``:
+
.. math::
loss_j = \\text{weight[class]}(-\\text{input[class]} +
@@ -59,9 +61,12 @@ class CrossEntropyLoss(fluid.dygraph.Layer):
If :attr:`reduction` is ``'sum'``, the reduced sum loss is returned.
If :attr:`reduction` is ``'none'``, the unreduced loss is returned.
Default is ``'mean'``.
+
Returns:
The tensor variable storing the cross_entropy_loss of input and label.
+
Return type: Variable.
+
Examples:
.. code-block:: python
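
    # A minimal usage sketch, not the original example from this PR: it
    # assumes the class is exposed as fluid.dygraph.CrossEntropyLoss, takes
    # a `reduction` argument as documented above, and is called with
    # (input, label).
    import numpy as np
    import paddle.fluid as fluid

    with fluid.dygraph.guard():
        # Logits for a batch of 5 samples over 100 classes.
        input = fluid.dygraph.to_variable(
            np.random.random([5, 100]).astype('float32'))
        label = fluid.dygraph.to_variable(
            np.random.randint(0, 100, size=[5, 1]).astype('int64'))
        ce_loss = fluid.dygraph.CrossEntropyLoss(reduction='mean')
        loss = ce_loss(input, label)  # scalar, since reduction='mean'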
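    # Continuing the sketch: a plain NumPy check of the formulas in the
    # docstring above (names here are illustrative only).
    # Unweighted, reduction='none':  loss_j = -input[class] + log(sum_i exp(input_i))
    logits = np.random.random([5, 100]).astype('float32')
    classes = np.random.randint(0, 100, size=[5])
    log_sum_exp = np.log(np.exp(logits).sum(axis=1))
    loss_np = -logits[np.arange(5), classes] + log_sum_exp

    # Weighted variant: scale each sample's loss by weight[class].
    weight = np.random.random([100]).astype('float32')
    weighted_loss = weight[classes] * loss_np

    # reduction='mean' / 'sum' over the unweighted per-sample losses.
    mean_loss = loss_np.mean()
    sum_loss = loss_np.sum()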