
Commit 9ab1652

fix the fsdp cmd in llm_sparsity
Signed-off-by: Kai Xu <[email protected]>
1 parent 5b02483

2 files changed: +3 -2 lines

examples/llm_sparsity/launch_finetune.sh

Lines changed: 2 additions & 2 deletions
@@ -91,8 +91,8 @@ CMD="accelerate launch --multi_gpu --mixed_precision bf16 finetune.py \
     --warmup_ratio 0.0 \
     --lr_scheduler_type cosine \
     --logging_steps 1 \
-    --fsdp 'full_shard auto_wrap' \
-    --fsdp_transformer_layer_cls_to_wrap 'LlamaDecoderLayer' \
+    --fsdp full_shard auto_wrap \
+    --fsdp_transformer_layer_cls_to_wrap LlamaDecoderLayer \
     --tf32 True \
     --modelopt_restore_path $MODELOPT_RESTORE_PATH \
     --report_to tensorboard \
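
Why dropping the quotes fixes the command: launch_finetune.sh builds the training invocation in a shell variable (the CMD="accelerate launch ..." string in the hunk header) and presumably runs it via plain $CMD expansion. Unquoted expansion performs word splitting but no quote removal, so the embedded single quotes reach finetune.py as literal characters and the FSDP options arrive mangled. A minimal sketch of the failure mode, using a hypothetical show_args function as a stand-in for finetune.py:

#!/usr/bin/env bash
# Stand-in for finetune.py: print each argument it receives.
show_args() { for a in "$@"; do echo "arg: <$a>"; done; }

# Before the fix: quotes stored inside a variable are NOT stripped on expansion.
CMD="show_args --fsdp 'full_shard auto_wrap'"
$CMD
# arg: <--fsdp>
# arg: <'full_shard>
# arg: <auto_wrap'>

# After the fix: no embedded quotes, so word splitting yields clean tokens.
CMD="show_args --fsdp full_shard auto_wrap"
$CMD
# arg: <--fsdp>
# arg: <full_shard>
# arg: <auto_wrap>

The same mangling hit the quoted 'LlamaDecoderLayer' value on the next flag; removing both pairs of quotes lets the program receive the intended tokens.
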
examples/llm_sparsity/requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -1,3 +1,4 @@
 flash-attn
 sentencepiece>=0.2.0
 tensorboardX
+transformers>=4.57.0
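
To pick up the new minimum transformers version, reinstall the example's requirements and confirm what resolved; the path below assumes the repository layout shown above:

# Reinstall the example dependencies and check the installed transformers version.
pip install -r examples/llm_sparsity/requirements.txt
python -c "import transformers; print(transformers.__version__)"   # expect >= 4.57.0
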
