None of the inputs have requires_grad=True. Gradients will be None #912
TigerHH6866 asked this question in Q&A (Unanswered) · 1 comment, 1 reply
-
I get both of those messages, and my training works fine. I don't think they're a problem, although it would be nice if they were removed via code so as not to worry people. Just ignore them for now.
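For context on the second message: `torch.utils.checkpoint` emits that warning when none of the tensors entering a checkpointed segment require gradients, which is common when a frozen model part (e.g. a frozen UNet or text encoder during LoRA-style training) is run with gradient checkpointing. A minimal sketch in plain PyTorch that reproduces and then silences the warning (the module and tensor names are illustrative, not from sd-scripts):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

layer = nn.Linear(4, 4)
for p in layer.parameters():
    p.requires_grad_(False)  # frozen module, as in LoRA-style fine-tuning

x = torch.randn(1, 4)  # the input does not require grad either

# The reentrant checkpoint path validates its inputs up front and warns:
# "None of the inputs have requires_grad=True. Gradients will be None"
# (use_reentrant needs a recent PyTorch; older versions are always reentrant)
y = checkpoint(layer, x, use_reentrant=True)

# Making the input require grad silences the warning and lets gradients
# flow through the checkpointed segment to earlier trainable parameters.
x.requires_grad_(True)
y = checkpoint(layer, x, use_reentrant=True)
```

This matches the reply above: when the frozen segment never needs to pass gradients anywhere, the warning is harmless and training proceeds normally.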
-
because max_grad_norm is set, clip_grad_norm is enabled. consider set to 0 / max_grad_normが設定されているためclip_grad_normが有効になります。0に設定して無効にしたほうがいいかもしれません
running training / 学習開始
num train images * repeats / 学習画像の数×繰り返し回数: 1050
num reg images / 正則化画像の数: 0
num batches per epoch / 1epochのバッチ数: 525
num epochs / epoch数: 12
batch size per device / バッチサイズ: 2
gradient accumulation steps / 勾配を合計するステップ数 = 1
total optimization steps / 学習ステップ数: 6300
steps: 0%| | 0/6300 [00:00<?, ?it/s]
epoch 1/12
/root/miniconda3/lib/python3.10/site-packages/torch/utils/checkpoint.py:31: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
warnings.warn("None of the inputs have requires_grad=True. Gradients will be None")
First one: where should max_grad_norm or clip_grad_norm be set?
Second one: "None of the inputs have requires_grad=True. Gradients will be None". This warning seems like it would make the result wrong. How do I set things correctly?
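On the first question: max_grad_norm is normally a command-line option of the training script rather than something set in code, and the startup message suggests passing 0 to disable clipping. A minimal sketch of what gradient-norm clipping does inside a generic PyTorch training step (an illustration of the mechanism, not the actual sd-scripts code):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
max_grad_norm = 1.0  # 0 would skip clipping entirely, as the log suggests

loss = model(torch.randn(2, 4)).sum()
loss.backward()

if max_grad_norm != 0:
    # Rescale all gradients so their combined global norm is at most
    # max_grad_norm; this only bounds the update size and should not
    # by itself make results wrong.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)

optimizer.step()
optimizer.zero_grad()
```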