Commit 7255790

Fix param_init_fn: move if-statement out of lambda (meta-llama#487)

2 parents 5f11aeb + fb2e802

src/llama_recipes/finetuning.py

Lines changed: 1 addition & 1 deletion
@@ -188,7 +188,7 @@ def main(**kwargs):
         device_id=device_id,
         limit_all_gathers=True,
         sync_module_states=train_config.low_cpu_fsdp,
-        param_init_fn=lambda module: module.to_empty(device=torch.device("cuda"), recurse=False)
+        param_init_fn=(lambda module: module.to_empty(device=torch.device("cuda"), recurse=False))
         if train_config.low_cpu_fsdp and rank != 0 else None,
     )
     if fsdp_config.fsdp_activation_checkpointing:
