backends/arm/_passes/fold_qdq_with_annotated_qparams_pass.py
15 lines changed: 0 additions & 15 deletions
@@ -105,21 +105,6 @@ def fold_and_annotate_arg(
     for arg in arg_list:
         if not isinstance(arg, Node):
             return
-
-        """
-        Make sure arg has requires_grad set to False
-        For parameters that are not quantized, sometimes (i.e. convolution)
-        the Parameter(FakeTensor(...)) has requires_grad set to True, which
-        causes the retracing of the graph to fail with:
-
-        E RuntimeError: isDifferentiableType(variable.scalar_type()) INTERNAL ASSERT FAILED at "/Users/runner/work/pytorch/pytorch/pytorch/torch/csrc/autograd/functions/utils.h":74, please report a bug to PyTorch.
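
For context, the deleted comment described a workaround: non-quantized parameters (convolution weights, for example) could come out of tracing as Parameter(FakeTensor(...)) with requires_grad set to True, which made retracing fail inside autograd. A minimal sketch of that kind of workaround follows, assuming a torch.fx GraphModule; the helper name clear_requires_grad is hypothetical and not part of this pass:

from torch.fx import GraphModule


def clear_requires_grad(graph_module: GraphModule) -> None:
    # Hypothetical sketch of the workaround the deleted comment described:
    # parameters traced as Parameter(FakeTensor(...)) with requires_grad=True
    # trip autograd's isDifferentiableType assert when the graph is retraced,
    # so the flag is dropped in place before retracing.
    for param in graph_module.parameters():
        if param.requires_grad:
            param.requires_grad_(False)

Note that this PR removes the workaround outright (0 additions, 15 deletions), so the sketch reflects the old behaviour rather than what the pass does now.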