10 changes: 9 additions & 1 deletion swift/model/utils.py
@@ -296,7 +296,15 @@ def save_checkpoint(model: Optional[PreTrainedModel],
                     additional_saved_files: Optional[List[str]] = None) -> None:
     if model is not None:
         if model.__class__.__name__ != 'SentenceTransformer':
-            model.save_pretrained(output_dir, safe_serialization=safe_serialization, max_shard_size=max_shard_size)
+            # Pass save_original_format=False to avoid the transformers>=5.5 revert_weight_conversion bug
+            # that corrupts state_dict keys for composite models (e.g. Qwen3.5).
+            # See: https://github.com/modelscope/ms-swift/issues/9046
+            save_kwargs = {}
+            import inspect
Contributor (severity: medium):
According to PEP 8 style guidelines, imports should be placed at the top of the file. Please move import inspect to the top-level imports of this module. This improves code organization and avoids re-importing the module on every function call.

References
  1. PEP 8 states that imports should be at the top of the file, just after any module comments and docstrings, and before module globals and constants. Placing imports inside functions is discouraged. (link)

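The reviewer's point can be shown with a minimal sketch (the helper name `call_with_supported_kwargs` is hypothetical, not part of ms-swift): `inspect` is imported once at module level, and the signature check is reused for any callable whose accepted keywords vary across library versions.

```python
import inspect  # module-level import, per PEP 8


def call_with_supported_kwargs(fn, *args, **kwargs):
    """Call fn, silently dropping keyword arguments its signature rejects.

    Hypothetical helper illustrating the reviewer's suggestion: the
    signature check lives behind a top-level `import inspect` instead of
    re-importing inside every call site.
    """
    params = inspect.signature(fn).parameters
    # If fn declares **kwargs (a VAR_KEYWORD parameter), every keyword
    # is accepted and nothing needs to be filtered.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return fn(*args, **kwargs)
    supported = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **supported)


def save(path, safe_serialization=True):
    return (path, safe_serialization)


# save() has no `save_original_format` parameter, so that kwarg is dropped.
print(call_with_supported_kwargs(save, "out", save_original_format=False))
# → ('out', True)
```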
+            if 'save_original_format' in inspect.signature(model.save_pretrained).parameters:
+                save_kwargs['save_original_format'] = False
+            model.save_pretrained(
+                output_dir, safe_serialization=safe_serialization, max_shard_size=max_shard_size, **save_kwargs)
         else:
             model.save_pretrained(output_dir, safe_serialization=safe_serialization)
         # copy sentencetransformers files
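The core of the diff is version tolerance: `save_original_format` only exists in newer `transformers` releases, so it is passed only when `save_pretrained` actually declares it. A self-contained sketch of that feature detection, with stand-in classes (`OldModel`/`NewModel` and `build_save_kwargs` are illustrative names, not ms-swift or transformers APIs):

```python
import inspect


class OldModel:
    """Stand-in for a model whose save_pretrained predates the new kwarg."""

    def save_pretrained(self, output_dir, safe_serialization=True, max_shard_size="5GB"):
        pass


class NewModel:
    """Stand-in for a model whose save_pretrained accepts save_original_format."""

    def save_pretrained(self, output_dir, safe_serialization=True,
                        max_shard_size="5GB", save_original_format=True):
        pass


def build_save_kwargs(model):
    # Mirror the diff: pass save_original_format=False only when the
    # installed save_pretrained signature accepts that parameter, so the
    # same call site works across transformers versions.
    save_kwargs = {}
    if 'save_original_format' in inspect.signature(model.save_pretrained).parameters:
        save_kwargs['save_original_format'] = False
    return save_kwargs


print(build_save_kwargs(OldModel()))  # → {}
print(build_save_kwargs(NewModel()))  # → {'save_original_format': False}
```

Because `inspect.signature` is taken on the bound method, `self` is excluded and the membership test sees only the real call parameters.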