When unloading multiple LoRAs on the Flux pipeline, I believe the norm layers are not restored here. Shouldn't we have:
```python
if len(transformer_norm_state_dict) > 0:
    original_norm_layers_state_dict = self._load_norm_into_transformer(
        transformer_norm_state_dict,
        transformer=transformer,
        discard_original_layers=False,
    )
    if not hasattr(transformer, "_transformer_norm_layers"):
        transformer._transformer_norm_layers = original_norm_layers_state_dict
```
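To illustrate why the `hasattr` guard matters, here is a minimal, self-contained sketch of the caching logic; it is not the diffusers code itself, and the names `Transformer`, `load_norms`, and `unload` are hypothetical stand-ins. Without the guard, a second LoRA load would overwrite `_transformer_norm_layers` with the *first* LoRA's norm weights, so unloading could never recover the pristine state.

```python
# Hypothetical stand-in for the transformer and its norm-caching logic;
# Transformer, load_norms, and unload are illustrative, not diffusers APIs.
import copy


class Transformer:
    def __init__(self):
        # Pretend state dict of the norm layers.
        self.norm_state = {"norm.scale": 1.0}


def load_norms(transformer, incoming_norm_state):
    """Overwrite the norm layers with a LoRA's norm weights."""
    replaced = copy.deepcopy(transformer.norm_state)
    transformer.norm_state = dict(incoming_norm_state)
    # Cache the ORIGINAL norms only on the first load; a second load would
    # otherwise cache the first LoRA's norms and lose the true originals.
    if not hasattr(transformer, "_transformer_norm_layers"):
        transformer._transformer_norm_layers = replaced
    return replaced


def unload(transformer):
    """Restore the norm layers cached before any LoRA was loaded."""
    if hasattr(transformer, "_transformer_norm_layers"):
        transformer.norm_state = transformer._transformer_norm_layers
        del transformer._transformer_norm_layers


t = Transformer()
load_norms(t, {"norm.scale": 2.0})  # first LoRA replaces the norms
load_norms(t, {"norm.scale": 3.0})  # second LoRA replaces them again
unload(t)
assert t.norm_state == {"norm.scale": 1.0}  # original norms are restored
```

In this sketch, the guard makes the cache write-once: only the very first load snapshots the pre-LoRA norms, which is exactly what the unload path needs to restore.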