This is quite strange and I could not reproduce the issue. Your suggestion to add |
-
Hey, I know there was a similar question to this one, but I am facing a problem.
I want to update not only the LoRA layers but also the embedding layer, so I use
modules_to_save=["embed_tokens"]
After training there should be a difference between the base model's embed_tokens weight and the modules_to_save Embedding weight. But when I check, the mean of their difference is 0, which suggests the embedding layer weight was not updated, even though the LoRA layer weights were. So I guess my embedding layer is somehow not connected to the graph or not participating in training.
merged_model_f_n.base_model.model.model.lm_head.original_module.weight - merged_model_f_n.base_model.model.model.lm_head.modules_to_save.default.weight
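One thing worth checking before concluding the weights are identical: the raw mean of a signed difference can come out near 0 even when the weights did change, because positive and negative updates cancel. A standalone illustration (the tensors here are made-up stand-ins for the base weight and the modules_to_save copy):

```python
import torch

torch.manual_seed(0)
base = torch.randn(100, 64)                    # stand-in for the base embed weight
trained = base + 0.01 * torch.randn(100, 64)   # stand-in for the trained copy

diff = trained - base
print(diff.mean().item())        # close to 0 even though the weights differ
print(diff.abs().mean().item())  # clearly non-zero
```

So comparing with .abs().mean() (or torch.allclose) is a safer test than .mean() alone.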
I will be happy to hear any suggestions. Thanks!
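For reference, the behavior modules_to_save is supposed to provide can be mimicked in plain PyTorch without PEFT (TinyModel and the training step below are made-up stand-ins, not the real model): freeze the original module, train a separate copy, and confirm the copy drifts away from the original. If the analogous check on the real PEFT model shows the modules_to_save copy with requires_grad=False, or its parameters missing from the optimizer's param groups, that would explain a zero difference.

```python
import copy
import torch
import torch.nn as nn

class TinyModel(nn.Module):  # hypothetical stand-in for the real LM
    def __init__(self):
        super().__init__()
        self.embed_tokens = nn.Embedding(100, 16)

torch.manual_seed(0)
model = TinyModel()

# Mimic modules_to_save: freeze the original weight, train a deep copy.
original = model.embed_tokens
trainable_copy = copy.deepcopy(original)
original.weight.requires_grad_(False)

opt = torch.optim.SGD(trainable_copy.parameters(), lr=0.1)
ids = torch.randint(0, 100, (8,))
loss = trainable_copy(ids).pow(2).mean()  # dummy loss, just to get gradients
loss.backward()
opt.step()

# After one step the copy should differ from the frozen original.
drift = (trainable_copy.weight - original.weight).abs().mean()
print(drift.item())
```

If the copy really receives gradients and sits in the optimizer, drift is non-zero; a drift of exactly 0 would point at the copy never being trained, which matches the symptom above.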