@DrJKL DrJKL commented Jun 26, 2024

What?

Adds a secondary output to Power Lora Loader in the LORA_STACK format. It also takes in a lora_stack input to allow chaining.
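A rough sketch of what this means in practice, assuming the LORA_STACK convention used by Efficiency Nodes (a plain list of `(lora_name, model_strength, clip_strength)` tuples); the function and field names below are illustrative, not the actual rgthree-comfy implementation:

```python
# Hedged sketch: a LORA_STACK is assumed to be a list of
# (lora_name, model_strength, clip_strength) tuples. Chaining means
# extending an incoming stack with this node's own enabled LoRAs
# before passing it downstream.

def build_lora_stack(own_loras, lora_stack=None):
    """Append this node's enabled LoRAs to an optional incoming stack."""
    stack = list(lora_stack) if lora_stack else []
    for lora in own_loras:
        if lora.get("on"):
            strength = lora["strength"]
            clip_strength = lora.get("strengthTwo", strength)
            stack.append((lora["lora"], strength, clip_strength))
    return stack

# Example: a downstream node receives the upstream stack and adds its own LoRA.
upstream = build_lora_stack(
    [{"on": True, "lora": "style.safetensors", "strength": 0.8}])
combined = build_lora_stack(
    [{"on": True, "lora": "detail.safetensors", "strength": 0.5}], upstream)
# combined holds both entries, upstream first.
```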

Why?

I have a multi-phase workflow in which I want to share some common LoRAs across phases and then augment them with others, without having to clone nodes or keep them in sync. I like the Power Lora Loader better than the Efficiency Nodes' LoRA Stacker.

@rgthree I could also break this into a separate Power Lora Stacker (rgthree) node. Or set up a property to allow either/or (I figure by just hiding the input/output slots?)

Note

There is an issue with this implementation when all of the outputs and inputs are chained:
the LoRAs are applied in each node, leading to weird results.
I put together a test workflow: Power_Lora_Loader_Stack_Test.json
(Open in a text-editor and Find/Replace <<<CHECKPOINT>>>, <<<LORA_1>>>, <<<LORA_2>>> for convenience)
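The chaining pitfall described above can be illustrated with a toy counter, under the assumption that each chained node both applies its LoRAs to the model/clip and forwards the full stack downstream (all names here are hypothetical):

```python
# Hypothetical illustration of the double-application issue: if every node
# both applies its LoRAs AND forwards the full stack, a downstream node
# that applies the forwarded stack re-applies the upstream LoRAs.
from collections import Counter

applied = Counter()

def apply_stack(stack):
    for name, *_strengths in stack:
        applied[name] += 1  # stands in for the real model/clip patching

# Node A applies its own LoRA and forwards its stack.
stack_a = [("shared.safetensors", 1.0, 1.0)]
apply_stack(stack_a)

# Node B chains: it applies the incoming stack plus its own LoRA.
stack_b = stack_a + [("extra.safetensors", 1.0, 1.0)]
apply_stack(stack_b)

# "shared.safetensors" has now been applied twice.
```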

DrJKL added 2 commits June 26, 2024 13:34
This has the issue of re-applying the LoRAs to the model/clip for each PLL node...
@DrJKL DrJKL marked this pull request as ready for review June 26, 2024 21:15