Commit 9d1685b

BigFishMaster authored and guoshengCS committed
Fix relu and log function by changing input parameter name from 'input' to 'x' (#11683)
* Fix relu and log
* Update nn.py
1 parent b7634a8 commit 9d1685b
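The rename matters because both functions build their op through `LayerHelper('...', **locals())` and then call `helper.input_dtype(input_param_name='x')`: the helper snapshots the caller's local variables and looks the input tensor up under the name `'x'`, so a parameter named `input` is never found. A minimal sketch of the failure mode, using a hypothetical `ToyLayerHelper` stand-in rather than Paddle's real class:

```python
import numpy as np


class ToyLayerHelper:
    """Hypothetical stand-in for fluid's LayerHelper, for illustration only."""

    def __init__(self, op_type, **kwargs):
        # Like LayerHelper(op_type, **locals()): snapshot the caller's locals.
        self.op_type = op_type
        self.kwargs = kwargs

    def input_dtype(self, input_param_name):
        # Looks the input up by *parameter name*; raises KeyError if the
        # caller named its parameter something else.
        return self.kwargs[input_param_name].dtype


def log_before_fix(input):
    helper = ToyLayerHelper('log', **locals())
    return helper.input_dtype(input_param_name='x')  # KeyError: 'x'


def log_after_fix(x):
    helper = ToyLayerHelper('log', **locals())
    return helper.input_dtype(input_param_name='x')  # finds 'x' as expected


print(log_after_fix(np.zeros([3], dtype='float32')))  # float32
```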

File tree

  • python/paddle/fluid/layers/nn.py

1 file changed: +9 −9 lines

python/paddle/fluid/layers/nn.py

Lines changed: 9 additions & 9 deletions
```diff
@@ -4920,16 +4920,16 @@ def random_crop(x, shape, seed=None):
     return out
 
 
-def log(input):
+def log(x):
     """
     Calculates the natural log of the given input tensor, element-wise.
 
     .. math::
 
-        Out = \\ln(input)
+        Out = \\ln(x)
 
     Args:
-        input (Variable): Input tensor.
+        x (Variable): Input tensor.
 
     Returns:
         Variable: The natural log of the input tensor computed element-wise.
@@ -4938,7 +4938,7 @@ def log(input):
 
     .. code-block:: python
 
-        output = fluid.layers.log(input)
+        output = fluid.layers.log(x)
     """
     helper = LayerHelper('log', **locals())
     dtype = helper.input_dtype(input_param_name='x')
@@ -4947,18 +4947,18 @@ def log(input):
     return out
 
 
-def relu(input):
+def relu(x):
     """
     Relu takes one input data (Tensor) and produces one output data (Tensor)
-    where the rectified linear function, y = max(0, input), is applied to
+    where the rectified linear function, y = max(0, x), is applied to
     the tensor elementwise.
 
     .. math::
 
-        Out = \\max(0, input)
+        Out = \\max(0, x)
 
     Args:
-        input (Variable): The input tensor.
+        x (Variable): The input tensor.
 
     Returns:
         Variable: The output tensor with the same shape as input.
@@ -4967,7 +4967,7 @@ def relu(input):
 
     .. code-block:: python
 
-        output = fluid.layers.relu(input)
+        output = fluid.layers.relu(x)
     """
     helper = LayerHelper('relu', **locals())
     dtype = helper.input_dtype(input_param_name='x')
```
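After this change the docstring examples line up with the real signatures. As a quick usage check, a hedged sketch against the fluid API of this period (graph construction only; `fluid.layers.data` and the surrounding setup are assumed, not part of this diff):

```python
import paddle.fluid as fluid

# Build a small graph with the renamed layers; both now take `x`.
x = fluid.layers.data(name='x', shape=[32], dtype='float32')
h = fluid.layers.relu(x)   # parameter was named `input` before this commit
y = fluid.layers.log(h)
```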
