Commit 9cd020c (1 parent: 291e849)

luotao1 authored and guoshengCS committed

Merge pull request #11626 from ktlichkid/fix-log

Fix log and relu layer

File tree (1 file changed: +9 −9 lines)

  • python/paddle/fluid/layers/nn.py
python/paddle/fluid/layers/nn.py

Lines changed: 9 additions & 9 deletions
@@ -4920,16 +4920,16 @@ def random_crop(x, shape, seed=None):
     return out
 
 
-def log(x):
+def log(input):
     """
     Calculates the natural log of the given input tensor, element-wise.
 
     .. math::
 
-        Out = \\ln(x)
+        Out = \\ln(input)
 
     Args:
-        x (Variable): Input tensor.
+        input (Variable): Input tensor.
 
     Returns:
         Variable: The natural log of the input tensor computed element-wise.
@@ -4938,7 +4938,7 @@ def log(x):
 
     .. code-block:: python
 
-        output = fluid.layers.log(x)
+        output = fluid.layers.log(input)
     """
     helper = LayerHelper('log', **locals())
     dtype = helper.input_dtype()
@@ -4947,18 +4947,18 @@ def log(x):
     return out
 
 
-def relu(x):
+def relu(input):
     """
     Relu takes one input data (Tensor) and produces one output data (Tensor)
-    where the rectified linear function, y = max(0, x), is applied to
+    where the rectified linear function, y = max(0, input), is applied to
     the tensor elementwise.
 
     .. math::
 
-        Out = \\max(0, x)
+        Out = \\max(0, input)
 
     Args:
-        x (Variable): The input tensor.
+        input (Variable): The input tensor.
 
     Returns:
         Variable: The output tensor with the same shape as input.
@@ -4967,7 +4967,7 @@ def relu(x):
 
     .. code-block:: python
 
-        output = fluid.layers.relu(x)
+        output = fluid.layers.relu(input)
     """
     helper = LayerHelper('relu', **locals())
     dtype = helper.input_dtype()
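The rename only changes the argument name; both layers still compute the element-wise operations stated in their docstrings, Out = ln(input) and Out = max(0, input). As a minimal illustration of that math only (plain Python, not PaddlePaddle code; the `log_ref` and `relu_ref` names are made up for this sketch):

```python
import math

def log_ref(values):
    # Element-wise natural log: Out = ln(input)
    return [math.log(v) for v in values]

def relu_ref(values):
    # Element-wise rectified linear unit: Out = max(0, input)
    return [max(0.0, v) for v in values]

print(relu_ref([-2.0, -0.5, 0.0, 3.0]))  # [0.0, 0.0, 0.0, 3.0]
print(log_ref([1.0, math.e]))            # approximately [0.0, 1.0]
```

In the real API the change means call sites should now pass `input=` rather than `x=` when using keyword arguments, e.g. `fluid.layers.relu(input=some_var)`.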

0 commit comments