The GCNConv class uses torch_geometric.nn.dense.Linear because it supports lazy initialization, where the parameters are only materialized after the first call to forward. A plain torch.nn.Parameter does not support this, since its shape must be known when it is created.
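
A minimal sketch of what this looks like from the user side, assuming a recent PyTorch Geometric release (the node count and feature sizes are made up for illustration):

```python
import torch
from torch_geometric.nn import GCNConv

# Passing in_channels=-1 requests lazy initialization: the weight
# matrix is only materialized once an input is actually seen.
conv = GCNConv(in_channels=-1, out_channels=64)

x = torch.randn(10, 32)                            # 10 nodes, 32 features
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])  # 3 directed edges
out = conv(x, edge_index)  # parameters are created here as a (32 -> 64) map
print(out.shape)           # torch.Size([10, 64])
```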

Why was lazy initialization needed?

  1. Users don't need to track `in_channels`; the layer infers it from the first forward call, as in the sketch above.
  2. More importantly, lazy initialization makes working with heterogeneous graphs easier. For example, check out `to_hetero`, which automatically converts a homogeneous GNN model into a heterogeneous one; for this to work, GNN layers like GCNConv or SAGEConv need to support lazy initialization (see the sketch after this list).
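
Here is a short sketch of that `to_hetero` workflow, assuming current PyTorch Geometric APIs; the node types, edge types, and feature sizes below are invented for illustration:

```python
import torch
from torch_geometric.data import HeteroData
from torch_geometric.nn import SAGEConv, to_hetero

class GNN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # (-1, -1): lazily infer the (source, target) feature sizes, so the
        # same module can serve edge types with different input dimensions.
        self.conv = SAGEConv((-1, -1), 64)

    def forward(self, x, edge_index):
        return self.conv(x, edge_index)

# Toy heterogeneous graph with two node types of different feature sizes.
data = HeteroData()
data['user'].x = torch.randn(4, 16)
data['item'].x = torch.randn(6, 32)
data['user', 'buys', 'item'].edge_index = torch.tensor([[0, 1], [2, 3]])
data['item', 'rev_buys', 'user'].edge_index = torch.tensor([[2, 3], [0, 1]])

# to_hetero clones the layer per edge type; lazy initialization lets each
# copy pick up its own input sizes on the first forward pass.
model = to_hetero(GNN(), data.metadata(), aggr='sum')
out = model(data.x_dict, data.edge_index_dict)
print(out['user'].shape, out['item'].shape)  # [4, 64] [6, 64]
```

Because the `(-1, -1)` trick defers shape inference, nothing about the per-type feature dimensions has to be hard-coded into the homogeneous model before `to_hetero` duplicates it.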
