Specialize BroadcastIndexesRange for the case where there is only 1 contiguous input
In this case, broadcasting is not possible if I understand correctly.
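To illustrate why the single-input case can be specialized, here is a minimal Python sketch of the index mapping that a broadcast-indexes range computes. This is not the actual C++ implementation, and all names in it are hypothetical; it just shows that when the one contiguous input has the same shape as the output, the mapping degenerates to the identity, so the range can skip the delinearize/relinearize math entirely.

```python
def broadcast_input_index(out_index, out_shape, in_shape):
    """Map a linear output index to the linear index of a (possibly
    broadcast) input, assuming contiguous row-major layouts.
    Hypothetical helper for illustration only."""
    # Delinearize the output index into per-dimension coordinates.
    coords = []
    rem = out_index
    for dim in reversed(out_shape):
        coords.append(rem % dim)
        rem //= dim
    coords.reverse()
    # Relinearize against the input shape; broadcast dims (size 1)
    # contribute coordinate 0.
    ndiff = len(out_shape) - len(in_shape)
    idx = 0
    for d, size in enumerate(in_shape):
        c = coords[d + ndiff] if size != 1 else 0
        idx = idx * size + c
    return idx

out_shape = (2, 3)
# With a single contiguous input whose shape equals the output shape,
# the mapping is the identity for every linear index.
assert all(
    broadcast_input_index(i, out_shape, out_shape) == i for i in range(6)
)
```

In other words, the specialized range can iterate a single counter instead of maintaining per-dimension coordinates, which is the win this PR is after.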
NOTE TO REVIEWERS: I deleted a failing test because I believe it was testing functionality that doesn't actually exist in PyTorch. Please let me know if I've made a mistake. I tried to exercise the behavior that the test implied existed, like so:
```python
>>> t = torch.tensor([1, 2, 3])
>>> t2 = torch.tensor(4)
>>> torch.abs(t2, out=t)
<stdin>:1: UserWarning: An output with one or more elements was resized since it had shape [3], which does not match the required output shape []. This behavior is deprecated, and in a future PyTorch release outputs will not be resized unless they have zero elements. You can explicitly reuse an out tensor t by resizing it, inplace, to zero elements with t.resize_(0). (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/aten/src/ATen/native/Resize.cpp:38.)
tensor(4)
```
I think that if the test were correct, the result would have been `torch.tensor([1, 2, 3])` with no warning. Also, none of our operator tests seem to be failing. Have I missed anything?
ghstack-source-id: 37448a6
ghstack-comment-id: 3010027375
Pull-Request-resolved: #12023