
Commit 2e6e687

Merge pull request #8 from IBM/jjasghar/wrong_model
update the commands for training and data generation
2 parents 53f1639 + 4f17957

File tree

1 file changed: +5 -12 lines


docs/lab-4/README.md

Lines changed: 5 additions & 12 deletions
@@ -4,15 +4,8 @@ Now that you've set up InstructLab, lets get tuning the Granite Model.
 
 ## Sanity check
 
-First thing you should do is verify you can talk to the Granite model, go ahead and run
-the following commands to verify you can.
-
-```bash
-cd instructlab
-source venv/bin/activate
-ilab model chat
-/q
-```
+Take a moment to verify that you are not running `ilab model chat` or `ilab model serve` anywhere,
+as either one will clash with the training and tuning commands that follow.
 
 The Granite family of foundation models span an increasing variety of modalities, including language, code, time series, and science (e.g., materials) - with much more to come. We're building them with transparency and with focus on fulfilling rigorous enterprise requirements that are emerging for AI. If you'd like to learn more about the models themselves and how we build them, check out Granite Models.
 
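For anyone who wants a quick, non-interactive way to confirm nothing is still chatting or serving, a process check from the shell is enough. This is a sketch rather than part of the lab instructions, and it assumes a Unix-like system with `pgrep` available:

```bash
# Sketch only: print the PIDs of any running `ilab model chat` or `ilab model serve`
# processes. No output means nothing is running and it is safe to continue.
pgrep -f "ilab model" || echo "no ilab model processes running"
```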
@@ -39,7 +32,7 @@ Knowledge in the taxonomy tree consists of a few more elements than skills:
 
 Format of the `qna.yaml`:
 
-- `version`: The chache verion of the qna.yaml file, this is the format of the file used for SDG. The value must be the number 3.
+- `version`: The cache version of the qna.yaml file; this is the format of the file used for SDG. The value must be the number 3.
 - `created_by`: Your GitHub username.
 - `domain`: Specify the category of the knowledge.
 - `seed_examples`: A collection of key/value entries.
@@ -271,7 +264,7 @@ ilab model download
 2) Next we need to generate the data, this is done with the following command:
 
 ```bash
-ilab data generate --pipeline full --model ~/.cache/instructlab/models/mistral-7b-instruct-v0.2.Q4_K_M.gguf --model-family mixtral
+ilab data generate
 ```
 
 This can take some time, take note of the time in the right hand corner, this is building 1000 questions off of your initial 15.
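If you want to confirm the generation step actually produced output before moving on, listing the datasets directory is one option. The path below is the usual default on Linux installs and is an assumption here; check `ilab config show` for the locations your install actually uses:

```bash
# Assumed default location for generated datasets; adjust if your configuration
# points elsewhere.
ls -lh ~/.local/share/instructlab/datasets/
```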
@@ -281,7 +274,7 @@ This takes the granite model, leverages the tokenized version of it, and runs th
 hopefully you can take a lunch break or something while this is running.
 
 ```bash
-ilab model train --pipeline full --effective-batch size 64 --is-padding-free false --device mps --max-batch-len 4000 --model-dir instructlab/granite-7b-lab --tokenizer-dir models/granite-7b-lab --model-name instructlab/granite-7b-lab
+ilab model train --pipeline simple
 ```
 
 4) When this is completed, you'll need to test this model, which is the following command:
