
Commit f41eb2c

fix: add default optim arg in training arg (#607)
Signed-off-by: yashasvi <[email protected]>
1 parent bc39f95 commit f41eb2c

File tree

1 file changed: +7 −0 lines changed

tuning/config/configs.py

Lines changed: 7 additions & 0 deletions
@@ -251,6 +251,13 @@ class TrainingArguments(transformers.TrainingArguments):
             Other possible values are 'debug', 'info', 'warning', 'error' and 'critical'"
         },
     )
+    optim: str = field(
+        default="adamw_torch",
+        metadata={
+            "help": "Pass optimizer name to use during training. \
+                Please only use the optimizers that are supported with HF transformers"
+        },
+    )
     enable_reduce_loss_sum: bool = field(
         default=False,
         metadata={
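The added field follows the standard dataclass pattern used throughout the config file. As a rough sketch, the following standalone dataclass (a hypothetical stand-in for the real class, which subclasses transformers.TrainingArguments) shows how the new default would be picked up when no optimizer is passed explicitly:

```python
from dataclasses import dataclass, field

# Hypothetical minimal stand-in mirroring the pattern added in
# tuning/config/configs.py; the real class inherits from
# transformers.TrainingArguments and carries many more fields.
@dataclass
class TrainingArguments:
    optim: str = field(
        default="adamw_torch",
        metadata={
            "help": "Pass optimizer name to use during training. "
            "Please only use the optimizers that are supported with HF transformers"
        },
    )

# With no value supplied, the new default applies.
args = TrainingArguments()
print(args.optim)  # adamw_torch

# An explicit value still overrides the default as before.
args = TrainingArguments(optim="adafactor")
print(args.optim)  # adafactor
```

Because the field now has a default, constructing the config without an `optim` value no longer fails, which is the fix this commit delivers.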
