Conversation
… URI, save only the LoRA parameters when saving a checkpoint
Hi @Amirjab21! Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (eg your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged. If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
Why?
Add a recipe so that people can easily get started with finetuning the model using parameter-efficient fine-tuning (PEFT). This makes it possible to finetune the model on consumer hardware.
How?
I built a custom LoRA implementation because other libraries did not support the fairseq2 modules used in omnilingual-asr.
I also used a public dataset to demonstrate this training recipe, finetuning the model on an Arabic dialect.
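To make the approach concrete, here is a minimal sketch of the kind of LoRA wrapper the recipe relies on: a frozen linear layer plus a trainable low-rank update, and a helper that keeps only the adapter weights when saving a checkpoint. The class and function names (`LoRALinear`, `lora_state_dict`) are illustrative, not the PR's actual API, and the real recipe wraps fairseq2 modules rather than plain `nn.Linear`:

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank update:
    y = W x + (alpha / r) * B A x, where A and B are the LoRA factors."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the original weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.scale = alpha / r
        # A starts small and random, B starts at zero, so training begins
        # from the unmodified base model.
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.lora_a.T @ self.lora_b.T)


def lora_state_dict(model: nn.Module) -> dict:
    """Keep only the LoRA parameters when saving a checkpoint."""
    return {k: v for k, v in model.state_dict().items() if "lora_" in k}
```

Saving only the `lora_` tensors is what keeps checkpoints small enough for consumer hardware; at load time they are merged back onto the frozen base weights.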
Test plan
The script can be run on most consumer devices, and you can watch the model's performance improve over the course of training.
I wrote some tests to verify that the LoRA implementation works as expected but did not include them in this PR. I can add them if necessary.
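For reference, a typical sanity check of this kind verifies that a training step updates only the adapter parameters and leaves the frozen base weights untouched. The sketch below uses a tiny stand-in module with hypothetical `lora_a` / `lora_b` names; it is not the PR's actual test code:

```python
import torch
import torch.nn as nn


# Minimal stand-in for a LoRA-wrapped linear layer (illustrative only;
# the PR's real implementation wraps fairseq2 modules).
class TinyLoRA(nn.Module):
    def __init__(self, d_in: int = 4, d_out: int = 3, r: int = 2):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(r, d_in) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(d_out, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + x @ self.lora_a.T @ self.lora_b.T


def test_only_lora_params_train() -> None:
    torch.manual_seed(0)
    layer = TinyLoRA()
    before = {k: v.clone() for k, v in layer.state_dict().items()}
    opt = torch.optim.SGD(
        [p for p in layer.parameters() if p.requires_grad], lr=0.1
    )
    loss = layer(torch.randn(8, 4)).pow(2).sum()
    loss.backward()
    opt.step()
    after = layer.state_dict()
    # the frozen base weights must not move
    assert torch.equal(before["base.weight"], after["base.weight"])
    # lora_b receives a nonzero gradient on the first step and must change
    assert not torch.equal(before["lora_b"], after["lora_b"])
```

Note that with B initialized to zero, A's gradient is also zero on the first step, so the check targets `lora_b`.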