Commit 3fc8760

[cost] unique name
1 parent 6893f44

1 file changed: +6 −2 lines changed

tensorlayer/cost.py

Lines changed: 6 additions & 2 deletions
@@ -9,7 +9,7 @@
 from tensorflow.python.ops import standard_ops

 ## Cost Functions
-def cross_entropy(output, target, name="cross_entropy_loss"):
+def cross_entropy(output, target, name=None):
     """Returns the TensorFlow expression of cross-entropy of two distributions, implement
     softmax internally.
@@ -19,16 +19,20 @@ def cross_entropy(output, target, name=None):
         A distribution with shape: [batch_size, n_feature].
     target : Tensorflow variable
         A batch of index with shape: [batch_size, ].
+    name : string
+        Name of this loss.

     Examples
     --------
-    >>> ce = tl.cost.cross_entropy(y_logits, y_target_logits)
+    >>> ce = tl.cost.cross_entropy(y_logits, y_target_logits, 'my_loss')

     References
     -----------
     - About cross-entropy: `wiki <https://en.wikipedia.org/wiki/Cross_entropy>`_.\n
     - The code is borrowed from: `here <https://en.wikipedia.org/wiki/Cross_entropy>`_.
     """
+    assert name is not None, print("Please give a unique name to tl.cost.cross_entropy")
+
     if tf.__version__ <= "0.12":
         return tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(logits=output, targets=target, name=name))
     else: # TF 1.0
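One caveat with the guard added in this commit: in `assert name is not None, print(...)`, the `print(...)` call executes unconditionally when the statement is evaluated, and its return value (`None`) becomes the assertion message, so a failing assert reports `AssertionError: None`. A minimal TensorFlow-free sketch of the pitfall and the conventional fix (function names here are illustrative, not part of the library):

```python
def guard_buggy(name=None):
    # As committed: print(...) runs even when the assert passes, and the
    # AssertionError carries print's return value, which is always None.
    assert name is not None, print("Please give a unique name to tl.cost.cross_entropy")
    return name

def guard_fixed(name=None):
    # Pass the string itself: it is only surfaced when the assert fails.
    assert name is not None, "Please give a unique name to tl.cost.cross_entropy"
    return name

try:
    guard_fixed(None)
except AssertionError as e:
    # The message is preserved in the exception.
    assert "unique name" in str(e)
```

Passing the message string directly keeps the check side-effect free and makes the failure self-describing.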
