Why do tensors with different dimensions not work in PyTorch? #988
Unanswered
Harsha-lohana asked this question in Q&A
Replies: 1 comment
-
The `torch.tensor` call fails with the provided code because the nested lists have different lengths. PyTorch tensors require every sub-list at a given nesting level to have the same length, so that the whole structure forms a regular grid; NumPy arrays have the same requirement. One solution is to pad the shorter lists with zeros. A sketch of that fix (the original data is not fully shown, so the second row here is illustrative):

```python
import torch

# Pad the shorter inner lists with zeros so every row has length 3
Tensor = torch.tensor([[[1, 0, 0],
                        [2, 3, 0]]])
print(Tensor)
```
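If the ragged lists come from variable-length sequences, the zero-padding can also be done automatically with PyTorch's `torch.nn.utils.rnn.pad_sequence`. A minimal sketch (the `rows` data is illustrative, not taken from the original post):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Ragged data as a list of 1-D tensors of different lengths
rows = [torch.tensor([1]), torch.tensor([2, 3])]

# pad_sequence pads trailing positions with padding_value so all
# rows reach the length of the longest one
padded = pad_sequence(rows, batch_first=True, padding_value=0)
print(padded)  # tensor([[1, 0], [2, 3]])
```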
-
This code works, but this one does not. Is there any way to create this type of tensor, or what is the reason tensors only work when all the sub-lists have the same length?
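To illustrate the question above: `torch.tensor` accepts a rectangular nesting but rejects a ragged one outright, because a tensor's shape must assign one fixed size to each dimension (the values below are illustrative):

```python
import torch

# Rectangular nesting works: every inner list has length 2
ok = torch.tensor([[1, 2], [3, 4]])
print(ok.shape)  # torch.Size([2, 2])

# Ragged nesting fails: dimension 1 cannot be both 2 and 1
try:
    torch.tensor([[1, 2], [3]])
except ValueError as e:
    print("error:", e)
```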