
Commit a8e9f4e

Fix numpy seed in finetuning.py (meta-llama#728)

Merge of 2 parents: 4e6e7e4 + b75a79e

File tree: 1 file changed, +2 −0 lines


src/llama_recipes/finetuning.py

Lines changed: 2 additions & 0 deletions
@@ -9,6 +9,7 @@
 import random
 import torch
 import torch.optim as optim
+import numpy as np
 from peft import get_peft_model, PeftModel
 from torch.distributed.fsdp import (
     FullyShardedDataParallel as FSDP,
@@ -82,6 +83,7 @@ def main(**kwargs):
     torch.xpu.manual_seed(train_config.seed)
     torch.manual_seed(train_config.seed)
     random.seed(train_config.seed)
+    np.random.seed(train_config.seed)

     if train_config.enable_fsdp:
         setup()
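For context on why this one-line change matters: `torch.manual_seed` does not seed NumPy's global RNG, so any NumPy-based randomness (e.g. shuffling or sampling in data preparation) stayed nondeterministic even with the other seeds set. A minimal sketch of the seeding pattern the patch completes (the `set_seed` helper is hypothetical, not the repo's API; the real code seeds inline in `main()` and also calls `torch.manual_seed` and `torch.xpu.manual_seed`):

```python
import random

import numpy as np

def set_seed(seed: int) -> None:
    """Seed every RNG source a training run may draw from (hypothetical helper)."""
    random.seed(seed)     # Python's built-in RNG (random.shuffle, random.sample, ...)
    np.random.seed(seed)  # NumPy's global RNG -- the line this commit adds
    # In the actual finetuning.py, torch.manual_seed(seed) and
    # torch.xpu.manual_seed(seed) are called alongside these.

set_seed(7)
first = np.random.rand(3).tolist()
set_seed(7)
second = np.random.rand(3).tolist()
assert first == second  # re-seeding reproduces the same NumPy draws
```

Without the `np.random.seed` call, the two `np.random.rand(3)` draws above would almost surely differ between runs, which is exactly the reproducibility gap this commit closes.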
