Can I customize the loss function used during fine-tuning in the Lag-Llama model? #136
XinyuChen-hey asked this question in Q&A (unanswered)
Hello,
I am currently working with the Lag-Llama model and would like to know if it is possible to customize the loss function during the fine-tuning process. Is there a way to modify or replace the default loss function with a custom one, and if so, how can I implement this change?
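For context, here is a rough sketch of the kind of customization I have in mind. Since Lag-Llama's training loop is built on GluonTS and PyTorch Lightning, I am guessing the loss might be swappable via a `DistributionLoss` object, but the `PenalizedNLL` class name, the import path, and the `loss=` keyword on `LagLlamaEstimator` are my assumptions, not confirmed API:

```python
# Rough sketch only -- the GluonTS import path and the `loss=` keyword on
# LagLlamaEstimator are assumptions on my part and may differ by version.
import torch
from gluonts.torch.modules.loss import DistributionLoss, NegativeLogLikelihood


class PenalizedNLL(DistributionLoss):
    """Illustrative custom loss: NLL plus an L1 penalty on the predicted mean."""

    def __call__(
        self, input: torch.distributions.Distribution, target: torch.Tensor
    ) -> torch.Tensor:
        nll = NegativeLogLikelihood()(input, target)  # per-element negative log-likelihood
        mae = torch.abs(input.mean - target)          # extra penalty on the point forecast
        return nll + 0.1 * mae


# Hypothetical usage -- I have not verified that LagLlamaEstimator accepts a
# `loss` argument the way other GluonTS torch estimators do:
#
# from lag_llama.gluon.estimator import LagLlamaEstimator
# estimator = LagLlamaEstimator(..., loss=PenalizedNLL())
# predictor = estimator.train(train_dataset)
```

If the estimator does not expose such an argument, would the recommended route instead be to subclass the Lightning module used for fine-tuning and override its `training_step`?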
Any guidance or examples would be greatly appreciated!
Thank you in advance!