
Commit aafc307: Fix commands in pretrain.md (#2097)

1 parent ebef170

File tree

1 file changed: +3, -2 lines

tutorials/pretrain.md

Lines changed: 3 additions & 2 deletions
````diff
@@ -81,7 +81,8 @@ litgpt pretrain pythia-14m \
   --tokenizer_dir EleutherAI/pythia-14m \
   --data TextFiles \
   --data.train_data_path custom_pretraining_data \
-  --train.lr_warmup_steps=200
+  --train.lr_warmup_steps=200 \
+  --optimizer AdamW \
   --optimizer.lr 0.005
 ```
 
@@ -121,7 +122,7 @@ The following subsections illustrate three typical scenarioes:
 For instance, let's assume we download a Pythia model:
 
 ```bash
-litgpt download EleutherAI/pythia-14m
+litgpt download EleutherAI/pythia-160m
 ```
 
 Next, assume we have a custom dataset stored in text files similar to the *Pretrain on custom data* above. We can further pretrain the Pythia model via the `--initial_checkpoint_dir` setting as follows:
````
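Putting the first hunk together, the corrected pretraining command reads in full as below. This is reconstructed from the diff context lines, so the surrounding flags are taken as shown there; the commit's fix is the trailing line continuation after `--train.lr_warmup_steps=200` and the explicit `--optimizer AdamW`, without which the `--optimizer.lr 0.005` argument was orphaned:

```shell
# Corrected command from tutorials/pretrain.md after this commit.
# custom_pretraining_data is the example data directory used in the tutorial.
litgpt pretrain pythia-14m \
  --tokenizer_dir EleutherAI/pythia-14m \
  --data TextFiles \
  --data.train_data_path custom_pretraining_data \
  --train.lr_warmup_steps=200 \
  --optimizer AdamW \
  --optimizer.lr 0.005
```

Note that every line except the last must end in `\`; the original doc dropped the backslash on the warmup flag, which silently truncated the command there.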

0 commit comments