README.md: 7 additions & 7 deletions
@@ -5,7 +5,7 @@ LLMs will continue to change the way we build software systems. They are not onl
PDL is based on the premise that interactions between users, LLMs and rule-based systems form a *document*. Consider for example the interactions between a user and a chatbot. At each interaction, the exchanges form a document that gets longer and longer. Similarly, chaining models together or using tools for specific tasks result in outputs that together form a document. PDL allows users to specify the shape and contents of such documents in a declarative way (in YAML), and is agnostic of any programming language. Because of its document-oriented nature, it can be used to easily express a variety of data generation tasks (inference, data synthesis, data generation for model training, etc...).
PDL provides the following features:
- - Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's Watsonx](https://www.ibm.com/watsonx)
+ - Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's watsonx](https://www.ibm.com/watsonx)
- Ability to templatize not only prompts for one LLM call, but also composition of LLMs with tools (code and APIs). Templates can encompass tasks of larger granularity than a single LLM call
- Control structures: variable definitions and use, conditionals, loops, functions
- Ability to read from files and stdin, including JSON data
@@ -24,7 +24,7 @@ See below for a quick reference, followed by [installation notes](#interpreter_i
- In order to run the examples that use foundation models hosted on [Watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a WatsonX account (a free plan is available) and set up the following environment variables:
- - `WATSONX_URL`, the API url (set to `https://{region}.ml.cloud.ibm.com`) of your WatsonX instance
+ In order to run the examples that use foundation models hosted on [watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a watsonx account (a free plan is available) and set up the following environment variables:
+ - `WATSONX_URL`, the API url (set to `https://{region}.ml.cloud.ibm.com`) of your watsonx instance
- `WATSONX_APIKEY`, the API key (see information on [key creation](https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui#create_user_key))
- `WATSONX_PROJECT_ID`, the project hosting the resources (see information about [project creation](https://www.ibm.com/docs/en/watsonx/saas?topic=projects-creating-project) and [finding project ID](https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-project-id.html?context=wx)).
@@ -231,7 +231,7 @@ assign the result to variable `CODE`.
Next we define a `text`, where the first block is simply a string and writes out the source code. This is done by accessing the variable `CODE`. The syntax `${ var }` means accessing the value of a variable in the scope. Since `CODE` contains YAML data, we can also access fields such as `CODE.source_code`.
- The second block calls a granite model on WatsonX via LiteLLM. Here we explicitly provide an `input` field which means that we do not pass the entire text produced so far to the model, but only what is specified in this field. In this case, we specify our template by using the variable `CODE` as shown above.
+ The second block calls a granite model on watsonx via LiteLLM. Here we explicitly provide an `input` field which means that we do not pass the entire text produced so far to the model, but only what is specified in this field. In this case, we specify our template by using the variable `CODE` as shown above.
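For orientation, here is a rough sketch of the shape being described; the prompt wording is illustrative, and `CODE` is assumed to have been defined by the earlier block mentioned above:

```yaml
text:
- "\n${ CODE.source_code }\n"                # first block: write out the source code held in CODE
- model: watsonx/ibm/granite-34b-code-instruct
  input: |                                   # only this templated text is sent to the model
    Explain the following code:
    ${ CODE.source_code }
```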
When we execute this program with the PDL interpreter, we obtain the following text:
@@ -401,7 +401,7 @@ This is similar to a spreadsheet for tabular data, where data is in the forefron
## Additional Notes
- When using Granite models on Watsonx, we use the following defaults for model parameters (except `granite-20b-code-instruct-r1.1`):
+ When using Granite models on watsonx, we use the following defaults for model parameters (except `granite-20b-code-instruct-r1.1`):
- `decoding_method`: `greedy`
- `max_new_tokens`: 1024
- `min_new_tokens`: 1
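These defaults can be overridden by setting the parameters explicitly in the model call; as a rough sketch (the values here are arbitrary):

```yaml
- model: watsonx/ibm/granite-34b-code-instruct
  parameters:
    decoding_method: greedy
    max_new_tokens: 256      # illustrative override of the 1024 default
    min_new_tokens: 1
```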
@@ -417,4 +417,4 @@ For a complete list of issues see [here](https://github.com/IBM/prompt-declarati
## Contributing to the Project
- See [Contributing to PDL](https://ibm.github.io/prompt-declaration-language/contrib)
+ See [Contributing to PDL](https://ibm.github.io/prompt-declaration-language/contrib).
docs/README.md: 7 additions & 7 deletions
@@ -10,7 +10,7 @@ LLMs will continue to change the way we build software systems. They are not onl
PDL is based on the premise that interactions between users, LLMs and rule-based systems form a *document*. Consider for example the interactions between a user and a chatbot. At each interaction, the exchanges form a document that gets longer and longer. Similarly, chaining models together or using tools for specific tasks result in outputs that together form a document. PDL allows users to specify the shape and contents of such documents in a declarative way (in YAML), and is agnostic of any programming language. Because of its document-oriented nature, it can be used to easily express a variety of data generation tasks (inference, data synthesis, data generation for model training, etc...).
PDL provides the following features:
- - Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's Watsonx](https://www.ibm.com/watsonx)
+ - Ability to use any LLM locally or remotely via [LiteLLM](https://www.litellm.ai/), including [IBM's watsonx](https://www.ibm.com/watsonx)
- Ability to templatize not only prompts for one LLM call, but also composition of LLMs with tools (code and APIs). Templates can encompass tasks of larger granularity than a single LLM call
- Control structures: variable definitions and use, conditionals, loops, functions
- Ability to read from files and stdin, including JSON data
@@ -29,7 +29,7 @@ See below for a quick reference, followed by [installation notes](#interpreter_i
- In order to run the examples that use foundation models hosted on [Watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a WatsonX account (a free plan is available) and set up the following environment variables:
- - `WATSONX_URL`, the API url (set to `https://{region}.ml.cloud.ibm.com`) of your WatsonX instance
+ In order to run the examples that use foundation models hosted on [watsonx](https://www.ibm.com/watsonx) via LiteLLM, you need a watsonx account (a free plan is available) and set up the following environment variables:
+ - `WATSONX_URL`, the API url (set to `https://{region}.ml.cloud.ibm.com`) of your watsonx instance
- `WATSONX_APIKEY`, the API key (see information on [key creation](https://cloud.ibm.com/docs/account?topic=account-userapikey&interface=ui#create_user_key))
- `WATSONX_PROJECT_ID`, the project hosting the resources (see information about [project creation](https://www.ibm.com/docs/en/watsonx/saas?topic=projects-creating-project) and [finding project ID](https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-project-id.html?context=wx)).
@@ -236,7 +236,7 @@ assign the result to variable `CODE`.
Next we define a `text`, where the first block is simply a string and writes out the source code. This is done by accessing the variable `CODE`. The syntax `${ var }` means accessing the value of a variable in the scope. Since `CODE` contains YAML data, we can also access fields such as `CODE.source_code`.
- The second block calls a granite model on WatsonX via LiteLLM. Here we explicitly provide an `input` field which means that we do not pass the entire text produced so far to the model, but only what is specified in this field. In this case, we specify our template by using the variable `CODE` as shown above.
+ The second block calls a granite model on watsonx via LiteLLM. Here we explicitly provide an `input` field which means that we do not pass the entire text produced so far to the model, but only what is specified in this field. In this case, we specify our template by using the variable `CODE` as shown above.
When we execute this program with the PDL interpreter, we obtain the following text:
@@ -406,7 +406,7 @@ This is similar to a spreadsheet for tabular data, where data is in the forefron
## Additional Notes
- When using Granite models on Watsonx, we use the following defaults for model parameters (except `granite-20b-code-instruct-r1.1`):
+ When using Granite models on watsonx, we use the following defaults for model parameters (except `granite-20b-code-instruct-r1.1`):
- `decoding_method`: `greedy`
- `max_new_tokens`: 1024
- `min_new_tokens`: 1
@@ -422,4 +422,4 @@ For a complete list of issues see [here](https://github.com/IBM/prompt-declarati
## Contributing to the Project
- See [Contributing to PDL](https://ibm.github.io/prompt-declaration-language/contrib)
+ See [Contributing to PDL](https://ibm.github.io/prompt-declaration-language/contrib).
docs/tutorial.md: 16 additions & 46 deletions
@@ -11,9 +11,7 @@ All the examples in this tutorial can be found in `examples/tutorial`.
The simplest PDL program is one that generates a small text ([file](https://github.com/IBM/prompt-declaration-language/blob/main/examples/tutorial/simple_program.pdl)):
```yaml
- description: Hello world!
- text:
-   Hello, world!
+ --8<-- "./examples/tutorial/simple_program.pdl"
```
This program has a `description` field, which contains a title. The `description` field is optional. It also has a `text` field, which can be either a string, a *block*, or a list of strings and blocks. A block is a recipe for how to obtain data (e.g., model call, code call, etc...). In this case, there are no calls to an LLM or other tools, and `text` consists of a simple string.
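As a rough sketch of a block that obtains data from code rather than from a string or a model, the following shape could be used; the `lang`/`code` fields and the convention that the value assigned to `result` becomes the block's output are assumptions here, not shown in this diff:

```yaml
description: Sketch of a code block
text:
- lang: python
  code: |
    # assumption: assigning to `result` yields the block's output
    result = "Hello from Python!"
```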
@@ -35,19 +33,11 @@ Hello, world!
## Calling an LLM
36
34
37
35
```yaml
- description: Hello world calling a model
- text:
- - Hello,
- - model: watsonx/ibm/granite-34b-code-instruct
-   parameters:
-     decoding_method: greedy
-     stop:
-     - '!'
-     include_stop_sequence: true
+ --8<-- "./examples/tutorial/calling_llm.pdl"
```
- In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `Hello,`, and we call a model (`watsonx/ibm/granite-34b-code-instruct`) with this as input prompt. Notice the Watsonx model id on LiteLLM.
- The model is passed some parameters including the `decoding_method` and `stop`, which corresponds to the `stop_sequences` parameter in Watsonx. The stop sequences are to be included in the output.
+ In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `Hello,`, and we call a model (`watsonx/ibm/granite-34b-code-instruct`) with this as input prompt. Notice the watsonx model id on LiteLLM.
+ The model is passed some parameters including the `decoding_method` and `stop`, which corresponds to the `stop_sequences` parameter in watsonx. The stop sequences are to be included in the output.
A PDL program computes 2 data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs, where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is what is used to make calls to LLMs via LiteLLM.
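As an illustrative sketch of these two structures for the greeting example above (the role names and exact strings are assumptions, not actual interpreter output):

```yaml
# The aggregated result of the program:
result: "Hello, world!"
# The conversational background context, as a list of role/content pairs:
context:
- role: user
  content: "Hello,"
- role: assistant
  content: " world!"
```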
@@ -64,12 +54,7 @@ where the portion ` world!` has been generated by Granite.
Here's another example of a model call that includes an `input` field ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input.pdl)):
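A rough sketch of such a call is shown below; the model id is borrowed from the chaining example later in the tutorial, and the actual example file may differ:

```yaml
- model: watsonx/ibm/granite-20b-multilingual
  input: Translate the word 'world' to French
```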
In this case, we make a call to the granite multilingual model, and the input passed to the model is the sentence: `Translate the word 'world' to French` and nothing else from the surrounding document. When we execute this program, we obtain:
@@ -80,9 +65,15 @@ Le mot 'world' en français est 'monde'.
```
where everything after the `:`, including the `:` itself, was generated by the model.
- ### Parameter defaults for Watsonx Granite models
- PDL provides the following defaults for Watsonx Granite models, when the following parameters are missing:
+ Using the `input` field, we can also directly give an array of messages (`role`/`content`) to the model ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input_messages.pdl)):
+ PDL provides the following defaults for watsonx Granite models, when the following parameters are missing:
- `decoding_method`: `greedy`
- `max_new_tokens`: 1024
- `min_new_tokens`: 1
@@ -99,22 +90,12 @@ The user can override these defaults by explicitly including them in the model c
## Variable Definition and Use
- Any block can have a variable definition using a `def: <var>` field. This means that the output of that block is assigned to the variable `<var>`, which may be reused at a later point in the document.
+ Any block can define a variable using a `def: <var>` field. This means that the output of that block is assigned to the variable `<var>`, which may be reused at a later point in the document.
Consider the following example ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/variable_def_use.pdl)):
```yaml
- description: Hello world with variable def and use
- text:
- - Hello,
- - model: watsonx/ibm/granite-34b-code-instruct
-   def: GEN
-   parameters:
-     decoding_method: greedy
-     stop:
-     - '!'
-     include_stop_sequence: true
- - "\nGEN is equal to: ${ GEN }"
+ --8<-- "./examples/tutorial/variable_def_use.pdl"
```
Here we assign the output of the model to variable `GEN` using the `def` field. The last line of the program prints out the value of `GEN`. Notice the notation `${ }` for accessing the value of a variable.
@@ -130,18 +111,7 @@ GEN is equal to: world!
In PDL, we can declaratively chain models together as in the following example ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/model_chaining.pdl)):
```yaml
- description: Model chaining
- text:
- - Hello,
- - model: watsonx/ibm/granite-34b-code-instruct
-   parameters:
-     stop: ["!"]
-     include_stop_sequence: true
- - "\nTranslate this to French\n"
- - model: watsonx/ibm/granite-20b-multilingual
-   parameters:
-     stop: ["!"]
-     include_stop_sequence: true
+ --8<-- "./examples/tutorial/model_chaining.pdl"
```
146
116
147
117
In this program, the first call is to a granite model to complete the sentence `Hello, world!`. The following block in the document prints out the sentence: `Translate this to French`. The final line of the program takes the entire document produced so far and passes it as input to the granite multilingual model. Notice that the input passed to this model is the document up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions.