I use the batch sampler for my data loader, and it yields each batch with size (1, 5, 256, 256, 256) when I enumerate the loader and check the data size.
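Since the original code example did not survive here, below is a minimal sketch of one plausible way to end up with that shape; the `VolumeDataset` class, the dummy data, and the loader configuration are all hypothetical reconstructions, not the original code:

```python
import torch
from torch.utils.data import Dataset, DataLoader, BatchSampler, SequentialSampler

# Hypothetical stand-in data: 20 single-channel 3D volumes, no channel dim yet.
volumes = torch.zeros(20, 256, 256, 256)

class VolumeDataset(Dataset):
    def __init__(self, inputs):
        self.inputs = inputs

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, index):
        # When the batch sampler is passed as `sampler=` below, `index`
        # arrives as a whole list of 5 indices, so this returns a stack
        # of shape (5, 256, 256, 256) per fetch.
        return self.inputs[index]

dataset = VolumeDataset(volumes)
batch_sampler = BatchSampler(SequentialSampler(dataset), batch_size=5, drop_last=True)

# With automatic batching left on (the default batch_size=1), the loader
# wraps each already-batched fetch in one more leading dimension.
loader = DataLoader(dataset, sampler=batch_sampler, batch_size=1)

for i, inputs in enumerate(loader):
    print(i, inputs.shape)  # torch.Size([1, 5, 256, 256, 256])
    break
```

Whatever the exact cause, 3D conv layers expect an explicit channel axis, i.e. a (B, C, D, H, W) layout such as (5, 1, 256, 256, 256), which is what the unsqueeze fix further down produces.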
The code example looks like a pure PyTorch implementation. Could you provide more details? I guess the …
I tested adding `inputs = torch.unsqueeze(inputs, 1)` after the line `inputs = self.transform(self.inputs[index])`, and now it is working. I had previously posted this question on the PyTorch forum as well, and the current solution has also been posted there. Thanks for your help!
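For anyone hitting the same issue, here is a self-contained sketch of where that unsqueeze fix sits. Apart from the two quoted lines, everything (the `VolumeDataset` class, the dummy data, and the `batch_size=None` loader configuration) is an assumption made to keep the example runnable, not the poster's actual setup:

```python
import torch
from torch.utils.data import Dataset, DataLoader, BatchSampler, SequentialSampler

class VolumeDataset(Dataset):
    def __init__(self, inputs, transform=None):
        self.inputs = inputs                    # tensor of shape (N, 256, 256, 256)
        self.transform = transform or (lambda x: x)

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, index):
        # `index` is the list of 5 indices yielded by the batch sampler,
        # so `inputs` is (5, 256, 256, 256) after the transform.
        inputs = self.transform(self.inputs[index])
        # The fix: insert a channel axis at dim 1, giving (5, 1, 256, 256, 256),
        # i.e. the (B, C, D, H, W) layout that 3D conv layers expect.
        inputs = torch.unsqueeze(inputs, 1)
        return inputs

dataset = VolumeDataset(torch.zeros(20, 256, 256, 256))
batch_sampler = BatchSampler(SequentialSampler(dataset), batch_size=5, drop_last=True)

# batch_size=None disables automatic batching, so each list of indices is
# passed straight to __getitem__ and the result is yielded un-wrapped.
loader = DataLoader(dataset, sampler=batch_sampler, batch_size=None)

for i, inputs in enumerate(loader):
    print(i, inputs.shape)  # torch.Size([5, 1, 256, 256, 256])
    break
```

Note that `torch.unsqueeze(inputs, 1)` only adds the channel axis in the right place if `inputs` already carries the batch dimension first; for a single (256, 256, 256) volume per item, the equivalent fix would be `torch.unsqueeze(inputs, 0)`.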