
Commit fe16ce8

Fix fsdp2 example (#3657)
1 parent 5987d79 commit fe16ce8

File tree

1 file changed: +1 −1 lines changed


examples/fsdp2/README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -32,5 +32,5 @@ In our example, we use a 8B Llama3.1 model, which has a hidden dimension of 4096
 The figures above were generated on 8x H100 SXM GPUs, with 8192 sequence length and 1000 steps. To run the example, you can use the following command, where you can specify the precision to train in:
 
 ```bash
-accelerate launch --fsdp2_fp8.py --sequence_length 8192 --num_steps 1000 --log_with wandb --precision [fp8 | bf16]
+accelerate launch fsdp2_fp8.py --sequence-length 8192 --num-steps 1000 --log_with wandb --precision [fp8 | bf16]
 ```
````
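The fix passes the script name positionally to `accelerate launch` (it is not a flag) and renames the options to their hyphenated spellings. The hyphenated form is consistent with argparse-style parsing, where an option like `--sequence-length` is exposed as the attribute `sequence_length`. A minimal sketch of such parsing, assuming hypothetical defaults (this is not the actual `fsdp2_fp8.py`):

```python
import argparse

# Hypothetical sketch of argument parsing for a script like fsdp2_fp8.py.
# argparse maps hyphenated option names (--sequence-length) to
# underscore attributes (args.sequence_length) automatically.
def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="FSDP2 FP8 training example (sketch)"
    )
    parser.add_argument("--sequence-length", type=int, default=8192)
    parser.add_argument("--num-steps", type=int, default=1000)
    parser.add_argument("--log_with", type=str, default=None)
    parser.add_argument("--precision", choices=["fp8", "bf16"], default="bf16")
    return parser.parse_args(argv)

args = parse_args(
    ["--sequence-length", "8192", "--num-steps", "1000", "--precision", "fp8"]
)
print(args.sequence_length, args.num_steps, args.precision)  # → 8192 1000 fp8
```

Note that `--log_with` keeps its underscore spelling in the committed command, so only the two renamed flags change form.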
