Commit 67dc5c7 (1 parent: 279ebdd)

Author: Yibing Liu

Polish the doc of nce layer

File tree

2 files changed: +31 lines, -3 lines

paddle/fluid/operators/nce_op.cc (4 additions, 2 deletions)

@@ -128,8 +128,10 @@ class NCEOpMaker : public framework::OpProtoAndCheckerMaker {
           "user should avoid setting this attribute.")
           .SetDefault({});
     AddComment(R"DOC(
-Compute and return the noise-contrastive estimation training loss.
-See [Noise-contrastive estimation: A new estimation principle for unnormalized statistical models](http://www.jmlr.org/proceedings/papers/v9/gutmann10a/gutmann10a.pdf).
+Compute and return the noise-contrastive estimation training loss. See
+`Noise-contrastive estimation: A new estimation principle for unnormalized
+statistical models
+<http://www.jmlr.org/proceedings/papers/v9/gutmann10a/gutmann10a.pdf>`_.
 By default this operator uses a uniform distribution for sampling.
 )DOC");
   }
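The Gutmann & Hyvärinen objective that the polished doc links to, specialized to the uniform noise distribution this operator samples from by default, can be sketched in a few lines of NumPy. This is an illustrative per-example formula only — the function name `nce_loss` and the scalar logits are invented for the sketch, and it is not the operator's actual kernel:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(true_logit, noise_logits, num_classes):
    """Per-example NCE loss with a uniform noise distribution q.

    `true_logit` is the model's score for the observed label and
    `noise_logits` are scores for the k sampled negative labels.
    Under uniform sampling, q(w) = 1 / num_classes for every class.
    (Hypothetical sketch, not PaddlePaddle's kernel.)
    """
    k = len(noise_logits)
    log_kq = np.log(k / num_classes)  # log(k * q(w)) for uniform q
    # P(sample came from data | w) = sigmoid(score - log(k * q(w)))
    p_true = sigmoid(true_logit - log_kq)
    p_noise = sigmoid(np.asarray(noise_logits) - log_kq)
    # Binary classification loss: true sample vs. the k noise samples.
    return -(np.log(p_true) + np.sum(np.log(1.0 - p_noise)))

loss = nce_loss(true_logit=2.0, noise_logits=[-1.0, 0.5, -2.0], num_classes=10000)
```

A model that scores the observed label high and the sampled negatives low drives this loss toward zero, which is what makes NCE a cheap surrogate for a full softmax over `num_classes` outputs.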

python/paddle/fluid/layers/nn.py (27 additions, 1 deletion)

@@ -3472,7 +3472,33 @@ def nce(input,
         num_neg_samples (int): ${num_neg_samples_comment}

     Returns:
-        Variable: output of nce layer.
+        Variable: The output nce loss.
+
+    Examples:
+        .. code-block:: python
+
+            window_size = 5
+            words = []
+            for i in xrange(window_size):
+                words.append(layers.data(
+                    name='word_{0}'.format(i), shape=[1], dtype='int64'))
+
+            dict_size = 10000
+            label_word = int(window_size / 2) + 1
+
+            embs = []
+            for i in xrange(window_size):
+                if i == label_word:
+                    continue
+
+                emb = layers.embedding(input=words[i], size=[dict_size, 32],
+                                       param_attr='emb.w', is_sparse=True)
+                embs.append(emb)
+
+            embs = layers.concat(input=embs, axis=1)
+            loss = layers.nce(input=embs, label=words[label_word],
+                              num_total_classes=dict_size, param_attr='nce.w',
+                              bias_attr='nce.b')
     """
     helper = LayerHelper('nce', **locals())
     assert isinstance(input, Variable)
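The docstring example added above targets the (Python 2) fluid graph API, so it does not run standalone. Purely to illustrate the data flow it sets up — embed each context word, skip the label position, concatenate — here is a self-contained NumPy sketch; the word ids and the random embedding table are made up for illustration:

```python
import numpy as np

window_size = 5
dict_size = 10000
emb_dim = 32
label_word = int(window_size / 2) + 1  # position treated as the positive label

# Stand-in for the learned 'emb.w' lookup table (random, for illustration only).
rng = np.random.default_rng(seed=0)
emb_table = rng.standard_normal((dict_size, emb_dim))

window = [7, 42, 99, 5, 1234]  # one window of word ids (made up)

# Embed every context word except the label position, then concatenate,
# mirroring the layers.embedding + layers.concat calls in the example.
embs = [emb_table[window[i]] for i in range(window_size) if i != label_word]
context = np.concatenate(embs)

print(context.shape)  # (window_size - 1) * emb_dim = (128,)
```

The concatenated context vector is what the example feeds to `layers.nce` as `input`, with the word at `label_word` serving as the positive label.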
