Found a rather simple solution that worked for me: put each feature set (or each model) under its own subdirectory and fire up TensorBoard on the root directory, so all runs show up side by side.

import pytorch_lightning as pl


class LightningWrapper(pl.LightningModule):
    def __init__(self, features):
        super().__init__()  # must be called before assigning attributes
        self.features = features

    def training_step(self, batch, batch_idx):
        ...
        # Same tag in every run; TensorBoard separates runs by directory
        self.logger.experiment.add_scalar("loss", loss)


for feature_set in potential_feature_sets:
    # One subdirectory per feature set; pinning version keeps paths predictable
    logger = pl.loggers.TensorBoardLogger(f"lightning_logs/{len(feature_set)}", version="0")
    trainer = pl.Trainer(logger=logger)
    model = LightningWrapper(feature_set)
    trainer.fit(model)
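To see the layout this produces, here is a minimal sketch using only the standard library (the feature-set sizes 5, 10, 20 are hypothetical, and the exact subpaths TensorBoardLogger creates may differ slightly by version):

```python
from pathlib import Path
import tempfile

# Mimic the per-run layout: one subdirectory per feature set under a
# common root, so a single TensorBoard instance on the root sees every run.
root = Path(tempfile.mkdtemp()) / "lightning_logs"
feature_set_sizes = [5, 10, 20]  # hypothetical sizes

for n in feature_set_sizes:
    # Roughly where TensorBoardLogger(f"lightning_logs/{n}", version="0") writes
    run_dir = root / str(n) / "0"
    run_dir.mkdir(parents=True, exist_ok=True)

print(sorted(p.relative_to(root).as_posix() for p in root.glob("*/0")))
# → ['10/0', '20/0', '5/0']
```

Then running `tensorboard --logdir lightning_logs` from the project root lists each feature set as a separate run in the UI.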

Answer selected by ajayrfhp