
Commit cfee9c1

[cherry-pick2.4]for CodeStyle (#47608)

* only run pre-commit

1 parent 99c872f


57 files changed: +13869 −9290 lines

python/paddle/autograd/py_layer.py

Lines changed: 55 additions & 57 deletions
@@ -54,16 +54,16 @@ def __init__(self):
     def save_for_backward(self, *tensors):
         """
         Saves given tensors that backward need. Use ``saved_tensor`` in the `backward` to get the saved tensors.
-
+
         .. note::
-            This API should be called at most once, and only inside `forward`.
+            This API should be called at most once, and only inside `forward`.

         Args:
             tensors(list of Tensors): Tensors to be stored.

         Returns:
             None
-
+
         Examples:
             .. code-block:: python
@@ -94,7 +94,7 @@ def saved_tensor(self):
         Get the tensors stored by ``save_for_backward``.

         Returns:
-            list of Tensors or None: If context contains tensors stored by `save_for_backward`,
+            list of Tensors or None: If context contains tensors stored by `save_for_backward`,
             then return these tensors, otherwise return None.

         Examples:
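The two hunks above only reindent the `save_for_backward`/`saved_tensor` docstrings, but the contract they document is easy to see in isolation. A minimal sketch, using a plain-Python stand-in context (`FakeContext` is illustrative, not Paddle's class):

```python
# Sketch of the save_for_backward / saved_tensor contract from the docstrings
# above. FakeContext is a hypothetical stand-in for Paddle's context object.
class FakeContext:
    def save_for_backward(self, *tensors):
        # Store whatever forward wants to reuse later in backward.
        self.saved = tensors

    def saved_tensor(self):
        # Return the stored tensors, or None if nothing was saved.
        return getattr(self, "saved", None)

ctx = FakeContext()
before = ctx.saved_tensor()        # nothing saved yet -> None
ctx.save_for_backward("x", "y")    # strings stand in for real tensors
after = ctx.saved_tensor()
```

Calling `saved_tensor` before anything is stored returns `None`, matching the "otherwise return None" clause in the docstring.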
@@ -124,17 +124,14 @@ def backward(ctx, dy):


 def with_mateclass(meta, *bases):
-
     class impl(meta):
-
         def __new__(cls, name, temp_bases, attrs):
             return meta(name, bases, attrs)

     return type.__new__(impl, "impl", (), {})


 class CPyLayer(object):
-
     @classmethod
     @dygraph_only
     def apply(cls, *args, **kwargs):
@@ -147,7 +144,7 @@ def apply(cls, *args, **kwargs):

         Returns:
             tensors or other types : output of PyLayer.
-
+
         Examples:
             .. code-block:: python
@@ -182,12 +179,14 @@ def backward(ctx, dy):


 class PyLayerBackward(LegacyPyLayerContext):
-
     def backward(self, *args, **kwargs):
         with paddle.fluid.dygraph.guard():
             with paddle.fluid.dygraph.no_grad():
-                if self._amp_state and 'enable' in self._amp_state and self._amp_state[
-                        'enable']:
+                if (
+                    self._amp_state
+                    and 'enable' in self._amp_state
+                    and self._amp_state['enable']
+                ):
                     with auto_cast(**args[0]._amp_state):
                         return self._forward_cls.backward(*args, **kwargs)
                 else:
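The rewrap of the `if` guard above is layout-only; parenthesized wrapping does not change how the three-clause truthiness chain evaluates. A small sketch (with `amp_state` standing in for `self._amp_state`):

```python
# The guard checks, in order: the dict is not None/empty, the key exists,
# and its value is truthy. Wrapping it in parentheses is purely cosmetic.
def amp_enabled(amp_state):
    return bool(
        amp_state
        and 'enable' in amp_state
        and amp_state['enable']
    )

# Short-circuiting makes the chain safe even when amp_state is None.
cases = [None, {}, {'enable': False}, {'enable': True}]
results = [amp_enabled(c) for c in cases]
```

Only the last case is truthy; the earlier clauses short-circuit before `amp_state['enable']` could raise on `None` or a missing key.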
@@ -197,10 +196,10 @@ def backward(self, *args, **kwargs):


 class LayerMeta(type):
-
     def __init__(cls, name, bases, attrs):
-        cls._backward_function = type(name + '_backward', (PyLayerBackward, ),
-                                      {"_forward_cls": cls})
+        cls._backward_function = type(
+            name + '_backward', (PyLayerBackward,), {"_forward_cls": cls}
+        )

         return super(LayerMeta, cls).__init__(name, bases, attrs)
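The reflowed `type(...)` call in `LayerMeta` uses the three-argument form of `type(name, bases, namespace)` to build a companion backward class at class-creation time. A minimal sketch of that mechanism (the stub class and names here are illustrative):

```python
# Sketch of what LayerMeta.__init__ does: synthesize a '<name>_backward'
# class dynamically with type(name, bases, namespace).
class PyLayerBackwardStub:
    pass  # hypothetical stand-in for PyLayerBackward

name = 'ExampleLayer'
backward_cls = type(
    name + '_backward', (PyLayerBackwardStub,), {"_forward_cls": object}
)
```

The resulting class carries the generated name, inherits from the given base, and keeps a back-reference to the forward class in `_forward_cls` — exactly the shape the diff's `type(...)` call produces.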

@@ -210,15 +209,15 @@ class LegacyPyLayer(with_mateclass(LayerMeta, CPyLayer)):
    Build a custom `Layer` by creating subclasses. Subclasses need to follow the following rules:
    1. Subclasses contain `forward` and `backward` function. Both forward and backward are @staticmethod.
    Their first argument should be a context and `None` can not be included in the returned result.
-    2. Input of backward contains a context as the first argument, and the rest arguments are the
-    gradient of forward's output tensors. so the number of backward's input tensors equal to
-    the number of forward output tensors. If you need the forward's inputs or outputs in `backward`,
+    2. Input of backward contains a context as the first argument, and the rest arguments are the
+    gradient of forward's output tensors. so the number of backward's input tensors equal to
+    the number of forward output tensors. If you need the forward's inputs or outputs in `backward`,
    you can use `save_for_backward` to store the required tensors, and then use them in the backward.
    3. Output of backward function can only be `Tensor` or tuple/list of `Tensor`.
-    Output tensors of backward are the gradient of forward's input tensors,
+    Output tensors of backward are the gradient of forward's input tensors,
    so the number of backward's output tensors equal to the number of forward input tensors.
    After building the custom Layer, run it through the `apply` method.
-
+

    Examples:
        .. code-block:: python
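The three rules in the `LegacyPyLayer` docstring above pin down the forward/backward contract: backward takes one gradient per forward output and returns one gradient per forward input. A dependency-free sketch of a round trip through that contract, with floats standing in for tensors and a hand-rolled `Ctx` in place of Paddle's context:

```python
# Framework-free sketch of the PyLayer rules: y = x^2 forward,
# dx = 2*x*dy backward, with the context carrying x between the two.
class TinyPyLayer:
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x            # one output ...

    @staticmethod
    def backward(ctx, dy):
        (x,) = ctx.saved_tensor()
        return 2.0 * x * dy     # ... so backward takes one grad, returns one

class Ctx:                      # hypothetical stand-in for PyLayerContext
    def save_for_backward(self, *t):
        self.saved = t

    def saved_tensor(self):
        return self.saved

ctx = Ctx()
y = TinyPyLayer.forward(ctx, 3.0)
dx = TinyPyLayer.backward(ctx, 1.0)
```

Here backward has exactly one gradient argument because forward returned exactly one value, matching rule 2, and it returns one gradient for forward's single tensor input, matching rule 3.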
@@ -259,8 +258,8 @@ def backward(ctx, dy):
     @staticmethod
     def forward(ctx, *args, **kwargs):
         """
-        It is to be overloaded by subclasses. It must accept a object of `PyLayerContext` as
-        the first argument, followed by any number of arguments (tensors or other types).
+        It is to be overloaded by subclasses. It must accept a object of `PyLayerContext` as
+        the first argument, followed by any number of arguments (tensors or other types).
         `None` can not be included in the returned result.

         Args:
@@ -269,7 +268,7 @@ def forward(ctx, *args, **kwargs):

         Returns:
             tensors or other types : output of PyLayer.
-
+
         Examples:
             .. code-block:: python
@@ -292,14 +291,15 @@ def backward(ctx, dy):
             return grad
         """
         raise NotImplementedError(
-            "You must implement the forward function for PyLayer.")
+            "You must implement the forward function for PyLayer."
+        )

     @staticmethod
     def backward(ctx, *args, **kwargs):
         """
-        This is a function to calculate the gradient. It is to be overloaded by subclasses.
-        It must accept a object of `PyLayerContext` as the first argument, and the rest
-        arguments are the gradient of forward's output tensors. Output tensors of backward
+        This is a function to calculate the gradient. It is to be overloaded by subclasses.
+        It must accept a object of `PyLayerContext` as the first argument, and the rest
+        arguments are the gradient of forward's output tensors. Output tensors of backward
         are the gradient of forward's input tensors.

         Args:
@@ -308,7 +308,7 @@ def backward(ctx, *args, **kwargs):

         Returns:
             Tensor or list of Tensors: The gradient of forward's input tensor(s).
-
+
         Examples:
             .. code-block:: python
@@ -332,24 +332,24 @@ def backward(ctx, dy):
         """

         raise NotImplementedError(
-            "You must implement the backward function for PyLayer.")
+            "You must implement the backward function for PyLayer."
+        )


 class EagerPyLayerContext(object):
-
     def save_for_backward(self, *tensors):
         """
         Saves given tensors that backward need. Use ``saved_tensor`` in the `backward` to get the saved tensors.
-
+
         .. note::
-            This API should be called at most once, and only inside `forward`.
+            This API should be called at most once, and only inside `forward`.

         Args:
             tensors(list of Tensors): Tensors to be stored.

         Returns:
             None
-
+
         Examples:
             .. code-block:: python
@@ -380,7 +380,7 @@ def saved_tensor(self):
         Get the tensors stored by ``save_for_backward``.

         Returns:
-            list of Tensors or None: If context contains tensors stored by `save_for_backward`,
+            list of Tensors or None: If context contains tensors stored by `save_for_backward`,
             then return these tensors, otherwise return None.

         Examples:
@@ -410,11 +410,11 @@ def backward(ctx, dy):
     def mark_not_inplace(self, *args):
         """
         Marks inputs as not inplace.
-        This should be called at most once, only from inside the `forward` method,
+        This should be called at most once, only from inside the `forward` method,
         and all arguments should be Tensor inputs.

-        If the Tensor returned by `forward` method is the same as the Tensor input of forward,
-        and this Tensor is marked as not_inplace, then Paddle will help the user create a new Tensor as output.
+        If the Tensor returned by `forward` method is the same as the Tensor input of forward,
+        and this Tensor is marked as not_inplace, then Paddle will help the user create a new Tensor as output.
         Thereby preventing the auto grad information of the input Tensor from being overwritten.

         Examples:
@@ -427,7 +427,7 @@ class Exp(paddle.autograd.PyLayer):
                 def forward(ctx, x):
                     ctx.mark_not_inplace(x)
                     return x
-
+
                 @staticmethod
                 def backward(ctx, grad_output):
                     out = grad_output.exp()
@@ -438,7 +438,7 @@ def backward(ctx, grad_output):
                 attn_layers = []
                 for idx in range(0, 2):
                     attn_layers.append(Exp())
-
+
                 for step in range(0, 2):
                     a = x
                     for j in range(0,2):
@@ -450,7 +450,7 @@ def backward(ctx, grad_output):
     def mark_non_differentiable(self, *args):
         """
         Marks outputs as non-differentiable.
-        This should be called at most once, only from inside the `forward` method,
+        This should be called at most once, only from inside the `forward` method,
         and all arguments should be tensor outputs.

         This will mark outputs as not requiring gradients, increasing the
@@ -542,30 +542,27 @@ def backward(ctx, grad, grad2):


 class EagerPyLayerBackward(core.eager.PyLayer, EagerPyLayerContext):
-
     def backward(self, *args):
         return self._forward_cls.backward(self, *args)


 class EagerPyLayerMeta(type):
-
     def __init__(cls, name, bases, attrs):
-        cls._backward_function = type(name + '_backward',
-                                      (EagerPyLayerBackward, ),
-                                      {"_forward_cls": cls})
+        cls._backward_function = type(
+            name + '_backward', (EagerPyLayerBackward,), {"_forward_cls": cls}
+        )

         return super(EagerPyLayerMeta, cls).__init__(name, bases, attrs)


 class EagerPyLayer(
-        with_mateclass(EagerPyLayerMeta, core.eager.PyLayer,
-                       EagerPyLayerContext)):
-
+    with_mateclass(EagerPyLayerMeta, core.eager.PyLayer, EagerPyLayerContext)
+):
     @staticmethod
     def forward(ctx, *args, **kwargs):
         """
-        It is to be overloaded by subclasses. It must accept a object of `PyLayerContext` as
-        the first argument, followed by any number of arguments (tensors or other types).
+        It is to be overloaded by subclasses. It must accept a object of `PyLayerContext` as
+        the first argument, followed by any number of arguments (tensors or other types).
         `None` can not be included in the returned result.

         Args:
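`EagerPyLayer` is built through `with_mateclass` (seen earlier in this diff), the classic "temporary metaclass" trick for applying a metaclass to a set of bases in a way that works across Python versions. A self-contained sketch of the same pattern, with an illustrative `Meta` that tags the classes it creates:

```python
# Sketch of the with_mateclass trick: with_metaclass returns a dummy class
# whose instantiation (i.e. being subclassed) applies `meta` to `bases`.
class Meta(type):
    def __init__(cls, name, bases, attrs):
        cls.tagged = True       # proof that the metaclass actually ran
        super().__init__(name, bases, attrs)

def with_metaclass(meta, *bases):
    class impl(meta):
        def __new__(cls, name, temp_bases, attrs):
            # Discard the temporary base; build the real class with `meta`.
            return meta(name, bases, attrs)
    return type.__new__(impl, "impl", (), {})

class Base:
    pass

class Child(with_metaclass(Meta, Base)):
    pass
```

When Python builds `Child`, it calls the temporary class, whose `__new__` swaps in the real bases and metaclass, so `Child` ends up as a `Meta`-created subclass of `Base` with no trace of the scaffolding.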
@@ -574,7 +571,7 @@ def forward(ctx, *args, **kwargs):

         Returns:
             tensors or other types : output of PyLayer.
-
+
         Examples:
             .. code-block:: python
@@ -597,14 +594,15 @@ def backward(ctx, dy):
             return grad
         """
         raise NotImplementedError(
-            "You must implement the forward function for PyLayer.")
+            "You must implement the forward function for PyLayer."
+        )

     @staticmethod
     def backward(ctx, *args):
         """
-        This is a function to calculate the gradient. It is to be overloaded by subclasses.
-        It must accept a object of `PyLayerContext` as the first argument, and the rest
-        arguments are the gradient of forward's output tensors. Output tensors of backward
+        This is a function to calculate the gradient. It is to be overloaded by subclasses.
+        It must accept a object of `PyLayerContext` as the first argument, and the rest
+        arguments are the gradient of forward's output tensors. Output tensors of backward
         are the gradient of forward's input tensors.

         Args:
@@ -613,7 +611,7 @@ def backward(ctx, *args):

         Returns:
             Tensor or list of Tensors: The gradient of forward's input tensor(s).
-
+
         Examples:
             .. code-block:: python
@@ -637,11 +635,11 @@ def backward(ctx, dy):
         """

         raise NotImplementedError(
-            "You must implement the backward function for PyLayer.")
+            "You must implement the backward function for PyLayer."
+        )


 def once_differentiable(backward):
-
     def wrapper(ctx, *args):
         with paddle.fluid.dygraph.no_grad():
             outputs = backward(ctx, *args)
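The truncated `once_differentiable` helper at the end of the diff is a standard decorator pattern: run the user's backward under a no-grad scope so its operations are not themselves recorded for differentiation. A minimal sketch, with a `contextlib`-based `no_grad` standing in for `paddle.fluid.dygraph.no_grad()`:

```python
# Sketch of the once_differentiable wrapper pattern: the decorator returns a
# wrapper that invokes the original backward inside a no-grad scope.
import contextlib

@contextlib.contextmanager
def no_grad():            # hypothetical stand-in for dygraph.no_grad()
    yield                 # a real guard would disable gradient recording here

def once_differentiable(backward):
    def wrapper(ctx, *args):
        with no_grad():
            return backward(ctx, *args)
    return wrapper

@once_differentiable
def my_backward(ctx, dy):
    return 2.0 * dy

grad = my_backward(None, 3.0)
```

The decorated function keeps its call signature; only the execution context changes, which is why the diff's `wrapper(ctx, *args)` simply forwards its arguments.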
