How can I apply grad_clip to different losses separately, or define two optimizers? #747
Unanswered
hollow-503 asked this question in Q&A
My loss_dict contains `{"loss1": 1.3455, "loss2": 1.792, "loss3": 5.79}`. During training I originally summed the three losses and then applied grad_clip to the combined gradient, but I found that loss3 is prone to gradient explosion while loss1 and loss2 stay relatively stable. How can I apply gradient clipping to loss3 alone, so that loss1 and loss2 are not affected by loss3 during training? Could this be achieved by defining two optimizers, and if so, how do I define two optimizers in mmdet3d?
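
One possible approach, sketched below in plain PyTorch, is to run two backward passes: backprop loss1 + loss2 first and stash those gradients, then backprop loss3 alone, clip only that contribution, add the stashed gradients back, and take a single optimizer step. The toy model, inputs, and `max_norm=1.0` here are illustrative assumptions, not mmdet3d code.

```python
import torch
import torch.nn as nn
from torch.nn.utils import clip_grad_norm_

# Toy stand-ins for the real model and losses (hypothetical, for illustration):
# three loss terms computed from one shared set of parameters.
model = nn.Linear(8, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 8)
out = model(x)
loss1 = out[:, 0].mean()
loss2 = out[:, 1].mean()
loss3 = (out[:, 2] ** 2).sum()

# Step 1: backprop the stable losses and stash their (unclipped) gradients.
optimizer.zero_grad()
(loss1 + loss2).backward(retain_graph=True)  # keep the graph for loss3's pass
stable_grads = [p.grad.detach().clone() if p.grad is not None else None
                for p in model.parameters()]

# Step 2: backprop loss3 alone and clip only its contribution.
optimizer.zero_grad()
loss3.backward()
clip_grad_norm_(model.parameters(), max_norm=1.0)

# Step 3: add the unclipped stable gradients back, then take one step.
for p, g in zip(model.parameters(), stable_grads):
    if g is not None:
        p.grad = p.grad + g if p.grad is not None else g
optimizer.step()
```

As far as I know, mmcv's built-in `optimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))` clips the gradients of all parameters after the combined loss has been backpropagated, so it cannot isolate loss3; wiring the two-pass scheme above into mmdet3d would likely require a custom `OptimizerHook` (or a custom `train_step`), and the same would hold for driving two optimizers.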