Hello, hoping to get some help on a quick question. I am trying to create a duplicate of a `Batch` object with separate gradient information. I have seen this problem pop up before with regular PyTorch tensors, where calling just `.clone()` causes the cloned tensor to share gradient history with the original. Normally this is solved with `.detach()`, but the `Batch` object doesn't have this built in. If I want a clean copy, do I need to clone a `Batch` object and manually call `.detach()` on all of the tensors within it? Is there a better way to create a copy of a `Batch` object?
Thanks!
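(For reference, a minimal sketch of the plain-tensor behavior described above: `.clone()` alone keeps the copy attached to the original's autograd graph, while `.clone().detach()` yields an independent copy.)

```python
import torch

a = torch.ones(3, requires_grad=True)

b = a.clone()           # still attached: backward through b reaches a
c = a.clone().detach()  # independent copy with no gradient history

b.sum().backward()
print(a.grad)           # tensor([1., 1., 1.]) -- gradients flowed back to a
print(c.requires_grad)  # False
print(c.grad_fn)        # None
```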
Replies: 1 comment · 1 reply
Great question. Duplicating: `copied_data = data.clone()`. In case you also want to detach the copy from the computation graph: `copied_data.apply(lambda x: x.detach())`.
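Put together, a minimal runnable sketch, assuming `torch_geometric` is available (the graph shapes and feature sizes below are placeholders, not from the original thread):

```python
import torch
from torch_geometric.data import Data, Batch

# Two tiny graphs with node features that track gradients.
d1 = Data(x=torch.randn(3, 4, requires_grad=True),
          edge_index=torch.tensor([[0, 1], [1, 2]]))
d2 = Data(x=torch.randn(2, 4, requires_grad=True),
          edge_index=torch.tensor([[0], [1]]))
batch = Batch.from_data_list([d1, d2])

# 1. clone() duplicates the underlying tensors ...
copied = batch.clone()
# 2. ... apply() then detaches every tensor attribute in the copy,
#    cutting it out of the autograd graph.
copied = copied.apply(lambda x: x.detach())

print(batch.x.requires_grad)   # True  -> original still tracks gradients
print(copied.x.requires_grad)  # False -> copy is a clean, detached duplicate
```

Note that `apply` modifies the clone in place (and returns it), so the original `data` keeps its autograd history untouched.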