[PyTorch fundamentals] Change in numpy array does not change the tensor. #766
Replies: 2 comments
-
Because `torch.from_numpy` wraps the NumPy array's buffer without copying it, the array and the tensor share the same memory. An in-place indexed assignment such as `array[0] = 100` writes into that shared buffer, so the change is visible through the tensor as well.
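To make the sharing concrete, here is a minimal sketch (assuming NumPy and PyTorch are installed) that checks the shared buffer directly with `np.shares_memory`:

```python
import numpy as np
import torch

array = np.arange(1, 8)
tensor = torch.from_numpy(array)  # zero-copy: tensor views the same buffer

array[0] = 100  # in-place write through the NumPy array
# The tensor sees the change because both objects view one buffer.
print(tensor[0].item())                         # 100
print(np.shares_memory(array, tensor.numpy()))  # True
```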
-
The video is wrong here, and it could cause serious problems if people believe that those objects (array and tensor) don't share the same memory. The following code helps illustrate the issue with the video:

```python
# Change the value of the array; what will this do to `tensor`?
array_copy = array + 1   # new allocation, separate from `array`
array[0] = 100           # in-place write into the shared buffer
array_copy[0] = 99       # only affects the copy
array_copy, array, tensor
```
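For reference, a self-contained version of that snippet (assuming the same `array`/`tensor` setup as in the original question), with the resulting values noted in comments:

```python
import numpy as np
import torch

array = np.arange(1, 8)
tensor = torch.from_numpy(array)  # shares memory with `array`

array_copy = array + 1  # new array with its own buffer
array[0] = 100          # in-place: also visible through `tensor`
array_copy[0] = 99      # only changes the copy

print(array_copy)  # values: [99, 3, 4, 5, 6, 7, 8]
print(array)       # values: [100, 2, 3, 4, 5, 6, 7]
print(tensor)      # values: [100, 2, 3, 4, 5, 6, 7]
```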
-
Hi,
First of all, thanks for the PyTorch video. I am going through the PyTorch fundamentals section. There you show a demo where a change in the NumPy array does not affect the tensor, which works as expected for me as well. But when I change the NumPy array using an index, it actually changes the tensor, which means they share the same memory.

My question is: if I do

```python
numpy_array = numpy_array + 1
```

it does not change the tensor, which suggests they do not share the same memory. So why do they share memory in the scenario below?

```python
array = np.arange(1, 8)
tensor = torch.from_numpy(array)
tensor, tensor.dtype
array[0] = 100
array, tensor
```
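A short sketch of the distinction (assuming NumPy and PyTorch): `numpy_array = numpy_array + 1` rebinds the name to a freshly allocated array, leaving the original buffer untouched, whereas `array[0] = 100` writes into the original buffer that the tensor also views:

```python
import numpy as np
import torch

array = np.arange(1, 8)
tensor = torch.from_numpy(array)  # tensor views `array`'s buffer

# `array + 1` allocates a NEW array; the original buffer is untouched,
# so the tensor does not change.
rebound = array + 1
print(np.shares_memory(rebound, tensor.numpy()))  # False

# Indexed assignment writes into the ORIGINAL buffer in place,
# so the change is visible through the tensor.
array[0] = 100
print(tensor[0].item())  # 100
```

If an independent tensor is wanted from the start, `torch.tensor(array)` copies the data instead of sharing it.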