
Commit 1e690b6

zeshengzong authored and pytorchmergebot committed

Replace TORCH_INTERNAL_ASSERT with TORCH_CHECK in set_history (pytorch#155453)
Fixes pytorch#154357

## Test Result

```bash
>>> import torch
>>>
>>> x = torch.tensor(1, device=torch.device('cpu'))
>>> y = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
>>> z0 = (x.abs() * y).prod(dtype=torch.int16)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: Autograd not support dtype: Short
```

Pull Request resolved: pytorch#155453
Approved by: https://github.com/albanD, https://github.com/soulitzer
1 parent 110ae0f commit 1e690b6

File tree: 2 files changed, +5 -4 lines changed


test/inductor/test_torchinductor.py

Lines changed: 1 addition & 3 deletions
```diff
@@ -3872,9 +3872,7 @@ def forward(self, x):
         with self.assertRaisesRegex(RuntimeError, msg):
             with torch.no_grad():
                 torch.compile(fn)(t)
-        # TODO: Autograd internal assertion
-        msg = r".*isDifferentiableType\(variable.scalar_type\(\)\) INTERNAL ASSERT FAILED.*"
-        with self.assertRaisesRegex(RuntimeError, msg):
+        with self.assertRaisesRegex(RuntimeError, "Autograd not support dtype:.*"):
             torch.compile(fn)(t)

     @unittest.skipIf(
```
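
For reference, a standalone sketch of what the updated assertion checks. The class and method names here are illustrative, and it uses the eager-mode repro from the commit message rather than the suite's compiled `fn`/`t` fixtures:

```python
import unittest

import torch


class AutogradDtypeCheckTest(unittest.TestCase):
    # Mirrors the updated assertion: an integer-dtype reduction on an
    # autograd-tracked result should raise the new user-facing message.
    def test_unsupported_dtype_message(self):
        x = torch.tensor(1, device=torch.device("cpu"))
        y = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
        with self.assertRaisesRegex(RuntimeError, "Autograd not support dtype:.*"):
            (x.abs() * y).prod(dtype=torch.int16)


if __name__ == "__main__":
    unittest.main()
```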

torch/csrc/autograd/functions/utils.h

Lines changed: 4 additions & 1 deletion
```diff
@@ -71,7 +71,10 @@ inline void set_history(
     // If the codegen triggers this, you most likely want to add your newly
     // added function to the DONT_REQUIRE_DERIVATIVE list in
     // tools/autograd/gen_variable_type.py
-    TORCH_INTERNAL_ASSERT(isDifferentiableType(variable.scalar_type()));
+    TORCH_CHECK(
+        isDifferentiableType(variable.scalar_type()),
+        "Autograd not support dtype: ",
+        variable.scalar_type());
     auto output_nr = grad_fn->add_input_metadata(variable);
     impl::set_gradient_edge(variable, {grad_fn, output_nr});
   } else {
```
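
With TORCH_CHECK in place, the failure surfaces in Python as an ordinary, catchable RuntimeError that names the offending ScalarType, rather than an INTERNAL ASSERT FAILED message. A minimal sketch reusing the repro from the commit message (the try/except is illustrative, not part of the commit):

```python
import torch

x = torch.tensor(1, device=torch.device("cpu"))
y = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

try:
    # Integer-dtype reduction on an autograd-tracked result trips the
    # TORCH_CHECK above instead of an internal assert.
    (x.abs() * y).prod(dtype=torch.int16)
except RuntimeError as err:
    # TORCH_CHECK appends the ScalarType, so torch.int16 is reported as
    # "Short": "Autograd not support dtype: Short"
    print(err)
```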
