Commit 38b8201: "Make style"

Author: nemo
Parent: f4f3ee0

2 files changed (+4, -5 lines)


src/peft/tuners/osf/config.py

Lines changed: 2 additions & 3 deletions
@@ -16,9 +16,8 @@ class OSFConfig(PeftConfig):
         effective_rank (`int` or `float`, *optional*):
             Preserved SVD rank ("high" subspace). The top-``effective_rank`` singular directions are frozen and
             retained across tasks; the remaining dimensions form the trainable low-rank subspace. If `None`, defaults
-            to 50% of the smaller weight dimension per target module.
-            Note: This differs from LoRA's `r` (trainable rank). In OSF, the trainable rank is
-            `min(weight.shape) - effective_rank`.
+            to 50% of the smaller weight dimension per target module. Note: This differs from LoRA's `r` (trainable
+            rank). In OSF, the trainable rank is `min(weight.shape) - effective_rank`.
         target_modules (`Union[list[str], str]`, *optional*):
             The names of the modules to apply OSF to. Can be a list of module names or `"all-linear"`.
         rank_pattern (`dict[str, int|float]`, *optional*):
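The reflowed docstring spells out the rank arithmetic; as a quick illustration, here is a minimal sketch of it. The weight shape is a made-up example; the 50% default and the `min(weight.shape) - effective_rank` formula are taken directly from the docstring above.

# Minimal sketch of the rank split described in the OSFConfig docstring.
# `weight_shape` is a hypothetical example, not a value from the source.
weight_shape = (4096, 1024)  # (out_features, in_features) of an example Linear layer

effective_rank = min(weight_shape) // 2              # default: 50% of the smaller dimension
trainable_rank = min(weight_shape) - effective_rank  # the rank that is actually trained

print(effective_rank, trainable_rank)  # 512 512: frozen "high" subspace vs. trainable subspace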

src/peft/tuners/osf/model.py

Lines changed: 2 additions & 2 deletions
@@ -23,8 +23,8 @@ def __init__(self, model, config, adapter_name, low_cpu_mem_usage: bool = False):
     def __getattr__(self, name: str):
         """Forward missing attributes to the wrapped base model.
 
-        This mirrors the behavior of other tuners (e.g., LoRA), ensuring attributes
-        like `device` resolve to the underlying transformers model.
+        This mirrors the behavior of other tuners (e.g., LoRA), ensuring attributes like `device` resolve to the
+        underlying transformers model.
         """
         try:
             return super().__getattr__(name)  # defer to nn.Module's logic
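For reference, the forwarding pattern the docstring describes fits in a few lines. `WrapperModel` below is a hypothetical stand-in for the tuner class, not the OSF code itself, but the `__getattr__` body mirrors the try/except shown in the diff.

import torch.nn as nn

class WrapperModel(nn.Module):
    # Hypothetical stand-in illustrating the attribute-forwarding pattern.
    def __init__(self, model: nn.Module):
        super().__init__()
        self.model = model  # the wrapped base model (e.g., a transformers model)

    def __getattr__(self, name: str):
        try:
            return super().__getattr__(name)  # defer to nn.Module's logic
        except AttributeError:
            if name == "model":  # avoid infinite recursion before `model` is registered
                raise
            return getattr(self.model, name)  # fall back to the wrapped model

With a transformers base model, `WrapperModel(base).device` then resolves to `base.device` even though the wrapper never defines `device` itself.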
