Find out what layer is triggering the UninitializedParameter error #12972
Unanswered
DanielPerezJensen asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
AFAIK, it is not an error but just a warning, so you can ignore it if it doesn't bother you while training your model. If you do care about the warning, you should be able to see which of the parameters are uninitialized with the following code block:

```python
# private helper from PL's model summary utilities
from pytorch_lightning.utilities.model_summary import _is_lazy_weight_tensor

model = HeteroGCLSTM(...)

for name, param in model.named_parameters():
    if _is_lazy_weight_tensor(param):
        print(f"{name} {param}")
```

`_is_lazy_weight_tensor` is the same check that PL's model summary uses internally to detect them (it lives in `pytorch_lightning/utilities/model_summary.py`, the file shown in the warning).
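If you'd rather not rely on a private helper, you can do the same check directly against PyTorch's `UninitializedParameter` type. A minimal sketch, assuming the same `model` object as above:

```python
import torch

# Lazy layers keep their parameters as torch.nn.parameter.UninitializedParameter
# until the first forward pass materializes them.
uninitialized = [
    name
    for name, param in model.named_parameters()
    if isinstance(param, torch.nn.parameter.UninitializedParameter)
]
print(uninitialized)  # names of the parameters that are still uninitialized
```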
-
I have a model imported from PyTorch Geometric Temporal that I wish to train using PyTorch Lightning. The model I am using is defined below:
When I train this model using trainer.fit, as per instructions, I get the following error:
.../lib/python3.8/site-packages/pytorch_lightning/utilities/model_summary.py:407: UserWarning: A layer with UninitializedParameter was found. Thus, the total number of parameters detected may be inaccurate.
Is there an easy way to figure out what this UninitializedParameter is? I think it might have to do with the h_dict and c_dict, but I am not sure, and the error message does not give any indication.
The code above was gathered from here.
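For context, this warning generally comes from PyTorch's lazy modules, whose parameters remain `UninitializedParameter` objects until the first forward pass. A minimal, self-contained sketch (using a toy `LazyLinear` model purely for illustration, not the original HeteroGCLSTM code) that shows where the model-summary warning comes from:

```python
import torch
from torch import nn
import pytorch_lightning as pl
from pytorch_lightning.utilities.model_summary import ModelSummary


class ToyModel(pl.LightningModule):
    """Hypothetical toy model, not the original one: LazyLinear keeps its
    weight as an UninitializedParameter until the first forward pass."""

    def __init__(self):
        super().__init__()
        self.layer = nn.LazyLinear(out_features=8)

    def forward(self, x):
        return self.layer(x)


model = ToyModel()
print(ModelSummary(model))  # counting parameters here triggers the UserWarning

# One dummy forward pass materializes the lazy weight,
# after which the parameter counts in the summary are exact.
model(torch.randn(4, 16))
print(ModelSummary(model))
```

If HeteroGCLSTM (or the PyTorch Geometric layers it wraps) uses lazy input channels, the same applies: its parameters only become real once the model has seen one batch.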