Trainer logger is None when given a DummyLogger instance #15017
Unanswered
MetaHG asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi!
I am encountering an issue with `DummyLogger` (from `pytorch_lightning.loggers.logger`). I have implemented a PL Module where I sometimes call my logger like this:
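For illustration, a simplified version of such a call (the exact method and arguments in my code differ; `log_image` here just stands for the kind of media-logging method a `WandbLogger` provides):

```python
import pytorch_lightning as pl


class MyModule(pl.LightningModule):
    def validation_step(self, batch, batch_idx):
        images, _ = batch
        # Logger-specific call: with a WandbLogger this logs an image.
        # Statements like this one are where the crash described below happens.
        self.logger.log_image(key="val_samples", images=[images[0]])
```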
Such calls are typically used when logging "media" objects, for example with a WandbLogger.
When doing code adjustments and debugging, it is sometimes useful to disable the logger. (Note: I know there is a parameter called `fast_dev_run` that partially serves this purpose.) To disable the logger, I tried the two following `Trainer` instantiations:
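A minimal sketch of the two instantiations (other `Trainer` arguments omitted; this is illustrative rather than my exact configuration):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers.logger import DummyLogger

# Attempt 1: explicitly pass a DummyLogger instance.
trainer = Trainer(logger=DummyLogger())

# Attempt 2: disable logging entirely.
trainer = Trainer(logger=False)

# With either trainer, the media-logging call inside my module fails
# because self.logger turns out to be None.
```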
However, in both cases, my code crashes when reaching statements such as the logger call above, and the error shows that `self.logger` is `None`, even when setting the logger to a `DummyLogger` instance. I don't understand why the PL Module's `self.logger` is `None` when I explicitly set it to a `DummyLogger`, and I think this should not be the case. What am I doing wrong? Am I missing something? Please let me know if this looks more like a bug and whether I should open an issue instead.