
Commit 14091d5

Update tutorial
1 parent 1c85e9d commit 14091d5

1 file changed (+7 −31 lines)


docs/tutorial.md

Lines changed: 7 additions & 31 deletions
@@ -33,19 +33,11 @@ Hello, world!
 ## Calling an LLM

 ```yaml
-description: Hello world calling a model
-text:
-- Hello,
-- model: watsonx/ibm/granite-34b-code-instruct
-  parameters:
-    decoding_method: greedy
-    stop:
-    - '!'
-    include_stop_sequence: true
+--8<-- "./examples/tutorial/calling_llm.pdl"
 ```

-In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `Hello,`, and we call a model (`watsonx/ibm/granite-34b-code-instruct`) with this as input prompt. Notice the Watsonx model id on LiteLLM.
-The model is passed some parameters including the `decoding_method` and `stop`, which corresponds to the `stop_sequences` parameter in Watsonx. The stop sequences are to be included in the output.
+In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `Hello,`, and we call a model (`watsonx/ibm/granite-34b-code-instruct`) with this as input prompt. Notice the watsonx model id on LiteLLM.
+The model is passed some parameters including the `decoding_method` and `stop`, which corresponds to the `stop_sequences` parameter in watsonx. The stop sequences are to be included in the output.

 A PDL program computes 2 data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs, where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is what is used to make calls to LLMs via LiteLLM.
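As an illustrative sketch (not part of this commit), the background context accumulated by the first example can be pictured as a list of role/content pairs; the roles and message boundaries shown are assumptions about how PDL maps blocks to chat messages, not interpreter output:

```yaml
# Hypothetical background context after the "Hello, world!" program
# (field names follow common chat APIs; the message split is illustrative)
- role: user
  content: "Hello,"
- role: assistant
  content: " world!"
```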

@@ -62,12 +54,7 @@ where the portion ` world!` has been generated by Granite.

 Here's another model call that includes an `input` field ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input.pdl)):
 ```yaml
-description: Hello world calling a model
-text:
-- "Hello,"
-- model: watsonx/ibm/granite-20b-multilingual
-  input:
-    Translate the word 'world' to French
+--8<-- "./examples/tutorial/calling_llm_with_input.pdl"
 ```

 In this case, we make a call to the granite multilingual model, and the input passed to the model is the sentence: `Translate the word 'world' to French` and nothing else from the surrounding document. When we execute this program, we obtain:
@@ -78,9 +65,9 @@ Le mot 'world' en français est 'monde'.
 ```
 where everything after the `:`, including it, was generated by the model.

-### Parameter defaults for Watsonx Granite models
+### Parameter defaults for watsonx Granite models

-PDL provides the following defaults for Watsonx Granite models, when the following parameters are missing:
+PDL provides the following defaults for watsonx Granite models, when the following parameters are missing:
 - `decoding_method`: `greedy`
 - `max_new_tokens`: 1024
 - `min_new_tokens`: 1
@@ -128,18 +115,7 @@ GEN is equal to: world!
 In PDL, we can declaratively chain models together as in the following example ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/model_chaining.pdl)):

 ```yaml
-description: Model chaining
-text:
-- Hello,
-- model: watsonx/ibm/granite-34b-code-instruct
-  parameters:
-    stop: ["!"]
-    include_stop_sequence: true
-- "\nTranslate this to French\n"
-- model: watsonx/ibm/granite-20b-multilingual
-  parameters:
-    stop: ["!"]
-    include_stop_sequence: true
+--8<-- "./examples/tutorial/model_chaining.pdl"
 ```

 In this program, the first call is to a granite model to complete the sentence `Hello, world!`. The following block in the document prints out the sentence: `Translate this to French`. The final line of the program takes the entire document produced so far and passes it as input to the granite multilingual model. Notice that the input passed to this model is the document up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions.
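To make the chaining concrete, a sketch of the conversation the second model might receive is shown below (the roles and message boundaries are assumptions for illustration, not interpreter output):

```yaml
# Hypothetical conversation seen by granite-20b-multilingual
- role: user
  content: |
    Hello, world!
    Translate this to French
```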
