This is not a problem: a linear layer transforms each node feature vector independently, so the node dimension simply acts as the batch dimension here:

from torch.nn import Linear

x = ...  # [num_nodes, 256]
lin = Linear(256, 512)
x = lin(x)  # [num_nodes, 512]

You don't have to worry about this (even for LSTMs), as long as the node dimension is used as the batch dimension.
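As a quick sanity check of both claims, here is a minimal sketch (the sizes `num_nodes=7` and `seq_len=5` are arbitrary illustrations, not from the original discussion): `nn.Linear` applies to the last dimension and treats all leading dimensions as batch, and `nn.LSTM` with `batch_first=True` treats the first dimension, here the node dimension, as the batch:

```python
import torch
from torch import nn

num_nodes, seq_len, in_dim, hidden = 7, 5, 256, 512

# Linear: applied per node; the node dimension acts as the batch dimension.
x = torch.randn(num_nodes, in_dim)
lin = nn.Linear(in_dim, hidden)
out = lin(x)
assert out.shape == (num_nodes, hidden)

# LSTM: with batch_first=True, each node is one sequence in the batch.
seq = torch.randn(num_nodes, seq_len, in_dim)
lstm = nn.LSTM(in_dim, hidden, batch_first=True)
out, (h, c) = lstm(seq)
assert out.shape == (num_nodes, seq_len, hidden)
assert h.shape == (1, num_nodes, hidden)  # (num_layers, batch, hidden)
```

No reshaping is needed in either case; the modules never mix features across nodes.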

Answer selected by dottipr