I already use your first approach, but setting `default_hp_metric` to False removes `hp_metric` from the "hparams" tab (the tab disappears entirely, even though I have set some hyperparameters). Adding a final `log_hyperparams` call recreates the hparams tab, but the `hp_metric` graph then shows the final value at step 0 instead of the final step, and that call is also skipped if the job is killed.

Here is what I've tried so far:

  • `default_hp_metric=True`: the hparams tab is visible in TensorBoard and `hp_metric` is updated during training, but `hp_metric` gets a wrong initial value that makes the graph unusable with log scale and smoothing enabled.
  • `default_hp_metric=False`: no hparams tab in TensorBoard, hp_…
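For reference, the pattern being discussed can be sketched roughly as follows. This is a hedged sketch, not a verified fix: `lr`, `batch_size`, and the metric value are placeholder names, and the exact `TensorBoardLogger` signature may differ between PyTorch Lightning versions.

```python
# Sketch: disable the automatic hp_metric, then log hyperparameters together
# with an explicit metrics dict so TensorBoard still creates the hparams tab.
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger

# default_hp_metric=False suppresses the bogus initial hp_metric value...
logger = TensorBoardLogger("logs", default_hp_metric=False)

# ...but then the hparams tab only appears if log_hyperparams is called
# explicitly with a metrics dict (e.g. at the end of training).
logger.log_hyperparams(
    {"lr": 1e-3, "batch_size": 32},   # hypothetical hyperparameters
    metrics={"hp_metric": 0.123},     # hypothetical final metric value
)
```

As noted above, the drawback of this pattern is that the explicit `log_hyperparams` call records `hp_metric` at step 0, and it never runs if the job is killed before reaching it.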

Answer selected by Borda

This discussion was converted from issue #4832 on February 09, 2021 23:44.