
Commit dae64cc

【AutoParallel】Change benchmark config for llama2-7b (#8667)
* change benchmark config for llama2-7b
* change logging_steps
1 parent 69e04c0 commit dae64cc

File tree

1 file changed: +2 −1 lines


tests/test_tipc/static/auto_parallel/llama2/pretrain_config_llama2_7b/pretrain-llama2_7b.json

Lines changed: 2 additions & 1 deletion
@@ -9,6 +9,7 @@
     "tensor_parallel_degree": 1,
     "pipeline_parallel_degree": 1,
     "sharding": "stage2",
+    "data_parallel_config": "enable_allreduce_avg_in_gradinent_scale gradient_sync_after_accumulate",
     "sharding_parallel_config": "enable_stage2_overlap",
     "tensor_parallel_config": "enable_mp_async_allreduce",
     "pipeline_parallel_config": "",
@@ -24,7 +25,7 @@
     "learning_rate": 3e-05,
     "min_learning_rate": 3e-06,
     "warmup_steps": 30,
-    "logging_steps": 1,
+    "logging_steps": 10,
     "max_steps": 50,
     "save_steps": 5000,
     "eval_steps": 1000,
