Nested hparams? #10263
Unanswered
FeryET asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule

Hi.
I have a nested model inside my LightningModule which has a set of unique variables. The nested model can change (it can be an LSTM, a Transformer, etc.), but the overall PL black box should not change. What I want to do is store the hyperparameters related to the nested model in hparams, but I want to make it hierarchical and have nested hparams. Something like this:

self.hparams['net'] = {}  # a dict

How can I achieve this? Currently, if I call self.save_hyperparameters(net_config), the hparams dict will update its values with the net_config dictionary that I have, and it will be unnecessarily crowded.
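A minimal sketch of the flattening behavior described above (the module and the config values are made up for illustration):

```python
import pytorch_lightning as pl


class FlatModel(pl.LightningModule):
    def __init__(self, net_config: dict):
        super().__init__()
        # Passing the dict itself merges every key of `net_config`
        # directly into the top level of hparams.
        self.save_hyperparameters(net_config)


model = FlatModel({"type": "lstm", "input_size": 16, "hidden_size": 32})
print(model.hparams)  # "type", "input_size", "hidden_size" all sit at the top level
```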
Replies: 1 comment · 5 replies
Hi @FeryET, Prince Canuma here, a Data Scientist at Neptune.ai. I have a couple of questions and a potential answer to your issue. You mentioned that your nested model can change (it can be either an LSTM, a Transformer, etc.).
You also said that you want to store the hyperparameters related to the nested model in the hparams variable, but that they have to be in a hierarchical/nested structure.
Potential solution
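A minimal sketch of the idea, assuming the nested config is passed as a single named constructor argument so that save_hyperparameters() stores it under one key (the class name and config contents are illustrative); printing hparams at the end gives roughly the output shown below:

```python
import pytorch_lightning as pl
from torch import nn


class WrapperModel(pl.LightningModule):
    def __init__(self, net_config: dict, lr: float = 1e-3):
        super().__init__()
        # Saving by init-argument name keeps `net_config` as a single
        # nested entry instead of spreading its keys across hparams.
        self.save_hyperparameters()

        # Build whichever nested model the config asks for.
        if self.hparams.net_config["type"] == "lstm":
            self.net = nn.LSTM(
                input_size=self.hparams.net_config["input_size"],
                hidden_size=self.hparams.net_config["hidden_size"],
            )
        # elif ...: build a Transformer (or any other architecture) here.


model = WrapperModel(net_config={"type": "lstm", "input_size": 16, "hidden_size": 32})
print(model.hparams)
```

Keeping the whole config under one key means the rest of hparams (e.g. lr) stays uncluttered, and the nested dict travels with the checkpoint like any other hyperparameter.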
Output:
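```
"lr":         0.001
"net_config": {'type': 'lstm', 'input_size': 16, 'hidden_size': 32}
```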
PTL Docs: https://pytorch-lightning.readthedocs.io/en/latest/common/lightning_module.html#save-hyperparameters