Commit 2f6f135

janeyx99 authored and pytorchmergebot committed
[BE] Actually suppress vmap warning from gradcheck (pytorch#144287)
This is the much safer change compared to pytorch#144283

Before:

```
PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=1 PYTORCH_TEST_WITH_SLOW_GRADCHECK=1 python test/test_optim.py -k TestDifferentiableOptimizer.test_sgd
/data/users/janeyx/pytorch/torch/autograd/gradcheck.py:1156: FutureWarning: Please use torch.vmap instead of torch._vmap_internals.vmap.
  result = vmap(vjp)(torch.stack(grad_outputs))
/data/users/janeyx/pytorch/torch/autograd/gradcheck.py:1156: FutureWarning: Please use torch.vmap instead of torch._vmap_internals.vmap.
  result = vmap(vjp)(torch.stack(grad_outputs))
.
----------------------------------------------------------------------
Ran 1 test in 0.028s
```

(the env vars aren't necessary)

After:

```
PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=1 PYTORCH_TEST_WITH_SLOW_GRADCHECK=1 python test/test_optim.py -k TestDifferentiableOptimizer.test_sgd
.
----------------------------------------------------------------------
Ran 1 test in 0.028s
```

Pull Request resolved: pytorch#144287
Approved by: https://github.com/cyyever, https://github.com/soulitzer
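For background, the fix relies on the standard library's scoped warning filters. Below is a minimal sketch, not PyTorch's gradcheck code; `noisy_op` is a hypothetical stand-in for the call that emits the FutureWarning. `warnings.catch_warnings()` restores the previous filters on exit, and `filterwarnings(..., message=...)` treats the pattern as a regex matched against the start of the warning text.

```python
# Minimal sketch (not PyTorch's gradcheck code): `noisy_op` is a hypothetical
# stand-in for the call that emits the FutureWarning being silenced.
import warnings


def noisy_op():
    warnings.warn(
        "Please use `torch.vmap` instead of `torch._vmap_internals.vmap`.",
        FutureWarning,
    )
    return 42


with warnings.catch_warnings():
    # `message` is a regex matched against the start of the warning text,
    # so it has to agree with the real message, backticks included.
    warnings.filterwarnings("ignore", message="Please use `torch.vmap`")
    noisy_op()  # no FutureWarning is reported inside this block

noisy_op()  # outside the block the previous filters apply again
```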
1 parent 61c0a3d commit 2f6f135


1 file changed: 1 addition, 1 deletion


torch/autograd/gradcheck.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -1151,7 +1151,7 @@ def vjp(v):
     # NB: this doesn't work for CUDA tests: https://github.com/pytorch/pytorch/issues/50209
     with warnings.catch_warnings():
         warnings.filterwarnings("ignore", message="There is a performance drop")
-        warnings.filterwarnings("ignore", message="Please use torch.vmap")
+        warnings.filterwarnings("ignore", message="Please use `torch.vmap`")
         try:
             result = vmap(vjp)(torch.stack(grad_outputs))
         except RuntimeError as ex:
```
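As an illustration of why this one-character change matters (assuming the warning text really begins with "Please use `torch.vmap`", backticks included, which is what the updated filter implies): `filterwarnings` compiles `message` as a regex and matches it against the start of the warning text, so the old pattern without backticks stops matching at the first backtick.

```python
# Why the backtick matters. `actual_message` is an assumption about the exact
# warning text, inferred from the updated filter, not copied from the source.
import re

actual_message = "Please use `torch.vmap` instead of `torch._vmap_internals.vmap`."

old_filter = re.compile("Please use torch.vmap")    # pattern before this commit
new_filter = re.compile("Please use `torch.vmap`")  # pattern after this commit

# warnings.filterwarnings matches the compiled `message` regex against the
# start of the warning text, just like re.match does here.
print(bool(old_filter.match(actual_message)))  # False -> warning still printed
print(bool(new_filter.match(actual_message)))  # True  -> warning suppressed
```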
