
We keep attributes separate by design (e.g. you may wish to do different things with them, such as treating coordinates in a more principled way in an EGNN).

So, to solve your problem, you can simply concatenate the features you want:

def forward(self, batch: Batch):
    # Concatenate the desired node features along the feature dimension
    x = torch.cat([batch.amino_acid_one_hot, batch.some_other_feature], dim=1)
    x = self.conv(x, batch.edge_index)
    return x

If you want to do this faster (and are using PyTorch Lightning modules), you can do the pre-processing in on_after_batch_transfer, as sketched below.
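A minimal sketch, assuming a LightningModule (here named MyModel) whose batches are torch_geometric Batch objects; self.conv and the feature names are carried over from the snippet above and are placeholders for your own layer and attributes.

import torch
import pytorch_lightning as pl
from torch_geometric.data import Batch

class MyModel(pl.LightningModule):
    def on_after_batch_transfer(self, batch: Batch, dataloader_idx: int) -> Batch:
        # Build the concatenated feature matrix once, right after the batch is
        # moved to the device, so forward() can use batch.x directly.
        batch.x = torch.cat(
            [batch.amino_acid_one_hot, batch.some_other_feature], dim=1
        )
        return batch

    def forward(self, batch: Batch):
        return self.conv(batch.x, batch.edge_index)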

Answer selected by a-r-j