Hmm, interesting problem. How do you solve this in plain PyTorch? One thing I could think of is to write a WrapperDataset that applies a different transform to each copy of the same data object:

import copy

import torch


class WrapperDataset(torch.utils.data.Dataset):
    def __init__(self, dataset, transform1=None, transform2=None):
        self.dataset = dataset
        self.transform1 = transform1
        self.transform2 = transform2

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, idx):
        data1 = self.dataset[idx]
        # Shallow-copy so each transform sees its own data object:
        data2 = copy.copy(data1)
        if self.transform1 is not None:
            data1 = self.transform1(data1)
        if self.transform2 is not None:
            data2 = self.transform2(data2)
        return data1, data2


dataset = WrapperDataset(dataset, transform1, transform2)
loader = DataLo…
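Here is a minimal, self-contained sketch of how such a wrapper could be used, assuming a toy base dataset of scalar floats and two arbitrary transforms (the names `base`, `transform1`, and `transform2` are illustrative, not from the original question); the class is repeated so the snippet runs on its own:

```python
import copy

import torch


class WrapperDataset(torch.utils.data.Dataset):
    """Yields two independently transformed views of each underlying item."""

    def __init__(self, dataset, transform1=None, transform2=None):
        self.dataset = dataset
        self.transform1 = transform1
        self.transform2 = transform2

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, idx):
        data1 = self.dataset[idx]
        data2 = copy.copy(data1)  # shallow copy: one object per transform
        return self.transform1(data1), self.transform2(data2)


# Hypothetical base dataset and two toy "augmentations".
base = [float(i) for i in range(4)]
dataset = WrapperDataset(base,
                         transform1=lambda x: x + 1.0,
                         transform2=lambda x: x * 2.0)

view1, view2 = dataset[0]  # two views of the same underlying item

# The default collate function stacks each view into its own batch tensor.
loader = torch.utils.data.DataLoader(dataset, batch_size=2)
batch1, batch2 = next(iter(loader))
```

Note that `copy.copy` is a shallow copy; for transforms that mutate nested attributes in place (e.g. tensors inside a graph data object), `copy.deepcopy` may be needed instead.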

Answer selected by AnimeshSinha1309