WandB Logging DDP #7316
Unanswered
benjaminrwilson asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 3 replies
-
You can use our wrapper for wandb. It logs only on the first process, so only one experiment gets created.

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import WandbLogger

logger = WandbLogger(project="the coconut", name="is not a nut")
trainer = Trainer(logger=logger)
```
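If you make direct logging calls of your own alongside the wrapper, one way to keep them on a single process is Lightning's `rank_zero_only` decorator. A minimal sketch; the `log_extra_metrics` helper is hypothetical:

```python
from pytorch_lightning.utilities import rank_zero_only

@rank_zero_only
def log_extra_metrics(logger, metrics):
    # Runs only on global rank 0; on every other DDP process the call
    # returns immediately, so no duplicate log entries are written.
    logger.log_metrics(metrics)

# e.g. from inside a LightningModule hook:
# log_extra_metrics(self.logger, {"custom/metric": 0.5})
```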
-
What is the proper way to prevent repeated WandB experiments when using DDP with multiple GPUs?
The WandB docs discuss this here: https://docs.wandb.ai/guides/track/advanced/distributed-training (Method 1). Has anyone had any success preventing each GPU from spawning its own experiment?
Thanks in advance!
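For reference, "Method 1" in the linked docs amounts to calling `wandb.init` from a single process only. A rough sketch outside of Lightning, under that assumption; the project name and logged value are placeholders:

```python
import os

import wandb

# Rank is set by the usual torch.distributed launchers (e.g. torchrun).
rank = int(os.environ.get("RANK", "0"))

run = None
if rank == 0:
    # Only rank 0 creates a run; the other GPU processes never touch wandb,
    # so only a single experiment shows up in the UI.
    run = wandb.init(project="my-project")  # placeholder project name

# ... training loop ...
if run is not None:
    run.log({"loss": 0.123})  # placeholder value; logged from rank 0 only
```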