
Conversation

@sayakpaul (Member) commented on Dec 4, 2024

Currently, users can set the scale of a loaded LoRA in a pipeline in two ways:

  1. set_adapters(adapter_name, weight).
  2. Supply something like attention_kwargs={"scale": weight} while calling the pipeline.

This PR adds a test to ensure that the outputs from methods 1 and 2 match under identical settings.
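The equivalence the test checks can be illustrated with a toy stand-in for a LoRA-augmented linear layer. This is a plain NumPy sketch, not the diffusers API; `ToyLoRALinear` and `set_adapter_scale` are hypothetical names mirroring the two paths described above:

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyLoRALinear:
    """Minimal linear layer with a LoRA update: y = W x + scale * B (A x)."""

    def __init__(self, d_in=8, d_out=8, rank=4):
        self.W = rng.normal(size=(d_out, d_in))  # frozen base weight
        self.A = rng.normal(size=(rank, d_in))   # LoRA down-projection
        self.B = rng.normal(size=(d_out, rank))  # LoRA up-projection
        self.scale = 1.0                         # default adapter scale

    def set_adapter_scale(self, scale):
        # Path 1: persist the scale on the layer (analogous to set_adapters()).
        self.scale = scale

    def __call__(self, x, scale=None):
        # Path 2: an explicit per-call scale (analogous to passing
        # attention_kwargs={"scale": weight}) overrides the stored one.
        s = self.scale if scale is None else scale
        return self.W @ x + s * (self.B @ (self.A @ x))

layer = ToyLoRALinear()
x = rng.normal(size=(8,))

# Path 1: fix the scale up front, then call.
layer.set_adapter_scale(0.5)
out_set_adapters = layer(x)

# Path 2: reset the stored scale and supply it at call time instead.
layer.set_adapter_scale(1.0)
out_call_kwarg = layer(x, scale=0.5)

# Under identical settings the two paths must agree.
assert np.allclose(out_set_adapters, out_call_kwarg)
```

The test in this PR does the analogous comparison on real pipeline outputs: run once with the scale set via `set_adapters`, run again with the scale passed as a call-time kwarg, and assert the results match.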

@BenjaminBossan (Member) left a comment


Thanks, LGTM. Only a couple of nits from my side.

@sayakpaul sayakpaul merged commit a6a18cf into main Dec 12, 2024
14 checks passed
@sayakpaul sayakpaul deleted the add-set-adapters-tests branch December 12, 2024 07:22
sayakpaul added a commit that referenced this pull request Dec 23, 2024
…ch (#10110)

* add a test to ensure set_adapters() and attn kwargs outs match

* remove print

* fix

* Apply suggestions from code review

Co-authored-by: Benjamin Bossan <[email protected]>

* assertFalse.

---------

Co-authored-by: Benjamin Bossan <[email protected]>

4 participants