Raise a custom exception in on_fit_end
#12569
Unanswered
nzw0301
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 2 comments 2 replies
-
More specifically, it is used in a callback developed by Optuna for PyTorch Lightning. CC: @Borda @akihironitta because I saw your reactions to an issue.
-
Hi @nzw0301! Sorry for replying very late. I'm sure the change in the behaviour was introduced in #10896. @awaelchli Could I ask for your help here? Is it possible for a user to catch an exception of their interest instead of …
-
Hi,

I would like to raise an exception in the `on_fit_end` hook of a callback during DDP training. For example, the following code calls `optuna.exceptions.TrialPruned()` in that method.

If I use pl `1.5.*`, `on_fit_end` is called once, so I could handle the exception. However, if I use pl `1.6.0`, `on_fit_end` is called by every child process. As a result, we got `torch.multiprocessing.spawn.ProcessRaisedException` due to `optuna.exceptions.TrialPruned()`.

How should I raise a custom exception with pl `1.6.0`, as I could with pl `1.5.*`, without getting `torch.multiprocessing.spawn.ProcessRaisedException`?

Note that my env is as follows:
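For reference, here is a minimal pure-Python sketch of the pattern being described: a callback raises a pruning exception from `on_fit_end`, and the driver catches it. The names `TrialPruned`, `PruningCallback`, and `run_fit` are hypothetical stand-ins for the Optuna/Lightning pieces, since the original snippet is not shown.

```python
class TrialPruned(Exception):
    """Hypothetical stand-in for optuna.exceptions.TrialPruned."""


class PruningCallback:
    """Raises TrialPruned from on_fit_end when the trial should stop."""

    def __init__(self, should_prune: bool) -> None:
        self.should_prune = should_prune

    def on_fit_end(self) -> None:
        if self.should_prune:
            raise TrialPruned("trial pruned at fit end")


def run_fit(callback: PruningCallback) -> str:
    # With pl 1.5.*, on_fit_end ran once in the main process, so the
    # exception could be caught directly like this. With pl 1.6.0 and
    # spawned DDP workers, the raise happens inside child processes and
    # surfaces as torch.multiprocessing.spawn.ProcessRaisedException
    # instead, so this except clause no longer matches.
    try:
        callback.on_fit_end()
    except TrialPruned:
        return "pruned"
    return "completed"


print(run_fit(PruningCallback(should_prune=True)))   # pruned
print(run_fit(PruningCallback(should_prune=False)))  # completed
```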
Best regards,