Integration of the WeightCheckpoint Metric #1144
Unanswered
OliviaGraeupner asked this question in Q&A
-
The metrics and logging system works best with scalar values. To save a weight checkpoint, I suggest using a normal plugin, something like:
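A minimal sketch of such a plugin (hypothetical: the class name `WeightSavingPlugin` and the stand-in `strategy`/model objects below are my own illustrations of Avalanche's plugin-callback pattern, not code from the library — in real Avalanche code you would subclass `SupervisedPlugin` and receive the actual strategy object in the callback):

```python
import copy

class WeightSavingPlugin:
    """Hypothetical plugin that copies the model weights after each
    training experience, mirroring Avalanche's plugin-callback pattern."""

    def __init__(self):
        self.checkpoints = []  # one weight snapshot per experience

    def after_training_exp(self, strategy, **kwargs):
        # In Avalanche, strategy.model is the torch.nn.Module being trained;
        # state_dict() returns its named parameters. deepcopy detaches the
        # snapshot from later training updates.
        snapshot = copy.deepcopy(strategy.model.state_dict())
        self.checkpoints.append(snapshot)

# Stand-in objects so the sketch runs without Avalanche or PyTorch installed.
class _FakeModel:
    def state_dict(self):
        return {"fc.weight": [0.1, 0.2], "fc.bias": [0.0]}

class _FakeStrategy:
    model = _FakeModel()

plugin = WeightSavingPlugin()
plugin.after_training_exp(_FakeStrategy())
print(len(plugin.checkpoints))  # one snapshot recorded
```

In Avalanche you would pass such a plugin to the strategy's `plugins` list, and each recorded snapshot could then be written to disk (e.g. with `torch.save`).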
Depending on when and how often you need to save the model, you can hook into a different callback.
-
Hello everybody,
I'm using the following basic code for implementing a continual learning model:
My goal is to log the weights at certain checkpoints during training. (The performance of the model is incidental.) For this purpose I plan to use the .update() and .result() methods so I can log the weights of all layers of the SimpleMLP model. My problem is that I don't quite know where in my code to call the WeightCheckpoint methods and how to output the results.
Accordingly, there are no values in the metric_dict. In addition, I would like to save the weights to the log.txt file using the TextLogger. How can I integrate this feature into the TextLogger source code?
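(For the file-logging part, one option sketched here with plain file I/O — the helper name `log_weights` and the tab-separated line format are my own assumptions, not part of Avalanche's TextLogger — is to append each layer's weights to the same text file the logger writes to:)

```python
def log_weights(path, step, weights):
    """Append one line per layer to a text log file.

    `weights` is a mapping from layer name to a list of parameter values
    (e.g. a flattened state_dict); `step` identifies the checkpoint.
    """
    with open(path, "a") as f:
        for name, values in weights.items():
            f.write(f"step={step}\t{name}\t{values}\n")

# Usage: append two layers' weights at checkpoint 0.
log_weights("log.txt", 0, {"fc.weight": [0.1, 0.2], "fc.bias": [0.0]})
```

A subclassed or modified TextLogger could call such a helper from whichever callback fires at your checkpoints, reusing the logger's own open file handle instead of reopening the path each time.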
Thanks for your help!