
Conversation

@timofeev1995
Contributor

Have you read the Contributing Guidelines?

Issue: LoRA dropout can be set to values outside the 0-1 range

Describe your changes

Add range validation for the lora_dropout parameter.
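
For reference, a minimal sketch of what such a check could look like (the helper name validate_lora_dropout, the half-open [0, 1) bound, and the error message are illustrative assumptions, not taken from the diff):

def validate_lora_dropout(lora_dropout: float | None) -> None:
    # Hypothetical sketch; the PR's actual bounds, inclusivity, and message may differ.
    # None means "not set", so only explicitly provided values are range-checked.
    if lora_dropout is not None and not 0.0 <= lora_dropout < 1.0:
        raise ValueError("lora_dropout must be in the [0, 1) range")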

@timofeev1995 requested review from artek0chumak and mryab and removed the request for mryab on June 9, 2025 at 10:30
@mryab requested a review from sbassam on June 9, 2025 at 10:31
@artek0chumak
Contributor

Don't forget to bump up the version before merging!

f"LoRA adapters are not supported for the selected model ({model_or_checkpoint})."
)

if lora_dropout is not None:
Contributor

this can be simplified to if lora_dropout
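
For illustration, a sketch of the two guard styles (assuming the validation body shown in the diff; not the PR's actual code). Note that 0.0 is falsy in Python, so the two forms differ when lora_dropout is explicitly 0.0:

# Current form: validates any explicitly provided value, including 0.0.
if lora_dropout is not None:
    ...

# Suggested form: also skips the branch when lora_dropout == 0.0.
if lora_dropout:
    ...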

Contributor

@sbassam left a comment

Just left a few comments, but nothing major. LGTM otherwise.

Comment on lines +105 to +111
create_finetune_request(
    model_limits=_MODEL_LIMITS,
    model=_MODEL_NAME,
    training_file=_TRAINING_FILE,
    lora=True,
    lora_dropout=lora_dropout,
)
Contributor

I'm not sure why we need this, but if it successfully detects out-of-range dropouts, that's fine with me.
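
For context, a sketch of how the quoted call is presumably wrapped in the test (the pytest.raises usage, the ValueError type, and the parametrized values are assumptions, not taken from the diff):

import pytest

@pytest.mark.parametrize("lora_dropout", [-0.1, 1.5])
def test_lora_dropout_out_of_range(lora_dropout):
    # Hypothetical: expects the new validation to reject out-of-range values.
    with pytest.raises(ValueError):
        create_finetune_request(
            model_limits=_MODEL_LIMITS,
            model=_MODEL_NAME,
            training_file=_TRAINING_FILE,
            lora=True,
            lora_dropout=lora_dropout,
        )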

@timofeev1995 merged commit c6353ae into main on Jun 9, 2025
10 checks passed
@timofeev1995 deleted the egor/lora-dropout branch on June 9, 2025 at 17:25