
Unloading multiple LoRAs: norms do not return to their original values #10745


Description

@christopher5106

When unloading after loading multiple LoRAs on the Flux pipeline, I believe the norm layers are not restored here.

Shouldn't we have:

        if len(transformer_norm_state_dict) > 0:
            original_norm_layers_state_dict = self._load_norm_into_transformer(
                transformer_norm_state_dict,
                transformer=transformer,
                discard_original_layers=False,
            )
            if not hasattr(transformer, "_transformer_norm_layers"):
                transformer._transformer_norm_layers = original_norm_layers_state_dict
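To illustrate why the `hasattr` guard matters, here is a minimal, self-contained sketch. The `ToyTransformer`, `load_norms`, and `unload` names are stand-ins invented for this example, not the actual diffusers internals: each LoRA load overwrites the norm weight, and without the guard the second load replaces the cached originals with the first LoRA's (already modified) norms, so unloading restores the wrong values.

```python
# Toy stand-in for the transformer; norm_scale plays the role of a norm
# layer weight that LoRA loading overwrites.
class ToyTransformer:
    def __init__(self):
        self.norm_scale = 1.0  # the true pre-LoRA value


def load_norms(transformer, new_scale, guard):
    # Snapshot the state *before* this load, then overwrite the norm.
    original = {"norm_scale": transformer.norm_scale}
    transformer.norm_scale = new_scale
    # With guard=True, only the FIRST load caches the originals
    # (mirroring the proposed hasattr check); with guard=False, every
    # load clobbers the cache (the behavior the issue reports).
    if not guard or not hasattr(transformer, "_transformer_norm_layers"):
        transformer._transformer_norm_layers = original


def unload(transformer):
    # Restore whatever was cached as the "original" norm state.
    transformer.norm_scale = transformer._transformer_norm_layers["norm_scale"]


# Without the guard: unload restores the first LoRA's norms, not the originals.
t = ToyTransformer()
load_norms(t, 2.0, guard=False)
load_norms(t, 3.0, guard=False)
unload(t)
print(t.norm_scale)  # 2.0 -- wrong, the original was 1.0

# With the guard: the pre-LoRA state survives both loads.
t = ToyTransformer()
load_norms(t, 2.0, guard=True)
load_norms(t, 3.0, guard=True)
unload(t)
print(t.norm_scale)  # 1.0 -- original value restored
```

The sketch only models the caching logic; in the real pipeline the cached object is a state dict of norm layer tensors rather than a single float.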


Labels: stale (issues that haven't received updates)
