Transformers Version 4.29.2 is unsuitable for LoRA #152

@hy-0003

Description

First of all, thank you for your excellent work on DNABERT-2. I've encountered an issue while fine-tuning the model with LoRA. According to your requirements.txt file, the recommended library versions are transformers==4.29.2 and peft==0.3.0.

However, when I run the finetune/train.py script to fine-tune the model with LoRA, training completes successfully, but the script then saves a full pytorch_model.bin file instead of the expected LoRA adapter files.

Therefore, I believe the transformers version recommended in requirements.txt is unsuitable for LoRA fine-tuning. I hope you can update the recommended versions, and the corresponding code if a different version introduces incompatibilities.
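
For reference, here is a minimal sketch of how the adapter alone can be saved explicitly with peft==0.3.0 after training. The checkpoint id, num_labels, LoRA hyperparameters, target_modules, and output path below are illustrative placeholders, not the values actually used in finetune/train.py:

```python
# Minimal sketch, not the repository's code. All names marked "placeholder"
# or "assumed" are illustrative and may differ from finetune/train.py.
import transformers
from peft import LoraConfig, TaskType, get_peft_model

model = transformers.AutoModelForSequenceClassification.from_pretrained(
    "zhihan1996/DNABERT-2-117M",  # assumed checkpoint id
    num_labels=2,                 # placeholder
    trust_remote_code=True,
)

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                          # placeholder hyperparameters
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["Wqkv"],      # placeholder; depends on the model's module names
)
model = get_peft_model(model, lora_config)

# ... train with transformers.Trainer as usual ...

# Calling save_pretrained on the PeftModel writes only the adapter files
# (adapter_config.json and adapter_model.bin), whereas, as reported above,
# trainer.save_model() with transformers==4.29.2 ends up writing a full
# pytorch_model.bin checkpoint.
model.save_pretrained("output/lora_adapter")
```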
