Lazy initialisation and torch functions #5112
Unanswered
MattBortoletto asked this question in Q&A
Replies: 1 comment, 2 replies
-
Looks like a bug, to be honest. Do you have a simple example to reproduce this? What does your model look like?
-
Hi everyone,

I'm building a heterogeneous GNN by converting a homogeneous one using to_hetero or to_hetero_with_bases. I would then like to compare the number of parameters between the two versions, so I define a simple function for counting parameters. The first two lines inside the function initialise the model parameters, since the GNN class uses lazy initialisation. When I pass the GNN obtained with to_hetero it runs fine, but when I pass the one obtained with to_hetero_with_bases I get the same error as if I hadn't initialised the model at all.

Is this a bug, or am I missing something? Is there a smarter way to check the number of parameters?

Thank you for your answers!
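For context, here is a minimal sketch of the parameter-counting pattern described above, using plain PyTorch rather than the (unposted) GNN from the question. `nn.LazyLinear` is a hypothetical stand-in for a PyG layer with lazy initialisation (`in_channels=-1`): its weights are `UninitializedParameter`s until the first forward pass, so counting parameters before that pass fails in the same way.

```python
import torch
import torch.nn as nn

# Stand-in for the lazily initialised GNN: LazyLinear infers its
# input dimension (here 3) on the first forward pass.
model = nn.LazyLinear(out_features=4)

def count_parameters(model, dummy_input):
    """Run one forward pass to materialise lazy parameters, then count them."""
    with torch.no_grad():
        model(dummy_input)  # replaces UninitializedParameter with real tensors
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

n = count_parameters(model, torch.randn(2, 3))
print(n)  # 3 * 4 weights + 4 biases = 16
```

The same idea applies to a converted heterogeneous model: call it once with a dummy `x_dict` / `edge_index_dict` before summing `p.numel()` over `model.parameters()`.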