Confusion about adapter_dir usage in from_pretrained('conf/model.yaml', args.adapter_dir) #15

@tahsirmunna

Description

Hello,

I'm trying to understand how to correctly use the following line from your codebase:

model, tokenizer = from_pretrained('conf/model.yaml', args.adapter_dir)

Specifically, I’m confused about the adapter_dir argument. I couldn't find clear documentation or examples that explain:

  • What kind of directory should this be?
  • What files must it contain?
  • Does it refer to LoRA adapters, PEFT adapters, or something else?

Could you please:

  1. Provide an example of a valid adapter_dir path.
  2. List the required files (e.g., adapter_config.json, adapter_model.bin) that should be inside that directory.
  3. Clarify whether the adapters are format-specific (e.g., Hugging Face PEFT-compatible).
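
For reference, here is a minimal sketch of what I currently assume a valid adapter_dir looks like: a Hugging Face PEFT-style checkpoint containing adapter_config.json plus a weights file. The helper name looks_like_peft_adapter and the expected file names are my assumptions about the format, not something I found in your code, so please correct me if the layout differs:

```python
import os
import tempfile

# Assumption: a Hugging Face PEFT-style adapter directory. These file names
# are what peft's save_pretrained() typically writes; I could not confirm
# from this repo's docs that the same layout is expected here.
REQUIRED = {"adapter_config.json"}
WEIGHT_FILES = {"adapter_model.bin", "adapter_model.safetensors"}

def looks_like_peft_adapter(path: str) -> bool:
    """Heuristic: does `path` resemble a PEFT adapter checkpoint?"""
    if not os.path.isdir(path):
        return False
    files = set(os.listdir(path))
    # Needs the config file and at least one weights file.
    return REQUIRED <= files and bool(files & WEIGHT_FILES)

# Quick demo with a fake adapter directory.
with tempfile.TemporaryDirectory() as d:
    for name in ("adapter_config.json", "adapter_model.safetensors"):
        open(os.path.join(d, name), "w").close()
    print(looks_like_peft_adapter(d))  # True
```

If this matches what from_pretrained expects, a one-line note in the README to this effect would already answer most of my questions.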

Thank you for your great work! Looking forward to your response to better integrate this with my project.
