How to log hyperparams, so they can be viewed in "HPARAMS" tab of Tensorboard? #19960
Unanswered
adosar asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
Check the `log_hyperparams` source of the stock `TensorBoardLogger`; for some reason it writes the hparams summaries directly instead of calling `add_hparams`.
I made my own subclass with a proper implementation:

```python
import collections
import typing as t

import pytorch_lightning as pl


class TensorBoardMultiLogger(pl.loggers.TensorBoardLogger):
    """TensorBoard logger that groups metrics into subdirectories by name prefix.

    Splits metric names on the first '/' separator: the prefix becomes a
    TensorBoard tag group and the suffix becomes the scalar name within that
    group. Metrics without a '/' are logged individually as-is.

    For example, 'loss/nt_xent/trn' is grouped under 'loss' with key
    'nt_xent/trn', so TensorBoard displays all loss variants together.
    """

    ROOT_SPLITTER: t.Final[str] = '/'

    @t.override
    @pl.utilities.rank_zero_only
    def log_metrics(self, metrics: t.Mapping[str, float], step: t.Optional[int] = None) -> None:
        assert pl.utilities.rank_zero_only.rank == 0, 'experiment tried to log from global_rank != 0'
        collection_metrics: dict[str, dict[str, float]] = collections.defaultdict(dict)
        individual_metrics: dict[str, float] = {}
        for name, value in metrics.items():
            # Split the name into the metric root name and the metric value name.
            name_parts: list[str] = name.split(self.ROOT_SPLITTER, maxsplit=1)
            match len(name_parts):
                case 2:
                    collection_metrics[name_parts[0]][name_parts[1]] = value
                case 1:
                    individual_metrics[name] = value
                case name_parts_count:
                    raise RuntimeError('the name split failed (empty name?): ' + str(name_parts_count))
        # Log all collection metrics. The method signature claims it does not
        # support nested dicts, but the implementation actually does (refer to
        # the sources).
        # noinspection PyTypeChecker
        super().log_metrics(collection_metrics, step)
        # Log individual metrics.
        super().log_metrics(individual_metrics, step)

    @t.override
    @pl.utilities.rank_zero_only
    def log_hyperparams(
        self,
        params: dict[str, t.Any],
        metrics: t.Optional[dict[str, t.Any]] = None,
        step: t.Optional[int] = None,
    ) -> None:
        assert pl.utilities.rank_zero_only.rank == 0, 'experiment tried to log from global_rank != 0'
        # noinspection PyProtectedMember
        from lightning_fabric.utilities.logger import _add_prefix, _convert_params, _flatten_dict
        params = _convert_params(params)
        params = _flatten_dict(params)
        params = self._sanitize_params(params)
        # Guard against the default None so _add_prefix always receives a dict.
        metrics = _add_prefix(metrics or {}, 'hparam', self.ROOT_SPLITTER)
        # noinspection PyTypeChecker
        return self.experiment.add_hparams(
            hparam_dict=params,
            metric_dict=metrics,
            global_step=step,
        )
```
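The prefix grouping that `log_metrics` performs is plain string handling, so it can be tried without Lightning at all. A minimal stdlib sketch of the same split (`split_metrics` is a hypothetical helper name, not part of Lightning):

```python
import collections

ROOT_SPLITTER = '/'

def split_metrics(metrics):
    """Split metric names on the first '/' into grouped and individual dicts."""
    grouped = collections.defaultdict(dict)
    individual = {}
    for name, value in metrics.items():
        parts = name.split(ROOT_SPLITTER, maxsplit=1)
        if len(parts) == 2:
            # Prefix becomes the tag group, suffix the scalar name within it.
            grouped[parts[0]][parts[1]] = value
        else:
            individual[name] = value
    return dict(grouped), individual

grouped, individual = split_metrics({
    'loss/nt_xent/trn': 0.5,
    'loss/nt_xent/val': 0.6,
    'lr': 1e-3,
})
print(grouped)     # {'loss': {'nt_xent/trn': 0.5, 'nt_xent/val': 0.6}}
print(individual)  # {'lr': 0.001}
```

With this grouping, TensorBoard shows all `loss/*` scalars together under one `loss` section instead of scattering them across unrelated tags.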
I am trying to log my hyperparameters together with their metric values, so I can view them (and compare runs with different hparams) under the "HPARAMS" tab of TensorBoard.
From the docs (I don't know if there is something more relevant), I tried the following snippet:
However, I am getting the following:

TL;DR: Is there any way to get the output of `add_hparams` with PyTorch Lightning?
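For context on what the subclass above does before calling `add_hparams`: Lightning's `_flatten_dict` collapses nested config dicts into single-level keys so TensorBoard can display them as columns. A minimal stdlib sketch of that idea (`flatten` is a hypothetical name; the '/' separator matches the subclass's `ROOT_SPLITTER`, not necessarily Lightning's internals):

```python
def flatten(params, delimiter='/', parent=''):
    """Flatten a nested dict: {'opt': {'lr': 0.1}} -> {'opt/lr': 0.1}."""
    out = {}
    for key, value in params.items():
        full = f'{parent}{delimiter}{key}' if parent else str(key)
        if isinstance(value, dict):
            # Recurse into nested dicts, accumulating the delimited prefix.
            out.update(flatten(value, delimiter, full))
        else:
            out[full] = value
    return out

print(flatten({'optimizer': {'name': 'adam', 'lr': 1e-3}, 'seed': 42}))
# {'optimizer/name': 'adam', 'optimizer/lr': 0.001, 'seed': 42}
```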