
Commit ebede69

Migrate some examples from Replicate to Ollama (#522)
* Update examples to use ollama

  Signed-off-by: Jing Chen <[email protected]>

* Revert some examples from ollama back to replicate

  Signed-off-by: Jing Chen <[email protected]>

* Set ollama granite default temperature to 0

---------

Signed-off-by: Jing Chen <[email protected]>
1 parent e502e8e commit ebede69

Some content is hidden: large commits have some content hidden by default.

44 files changed: +110 -109 lines

docs/tutorial.md

Lines changed: 2 additions & 2 deletions
@@ -653,7 +653,7 @@ This is similar to a spreadsheet for tabular data, where data is in the forefron
 ## Using Ollama models
 
 1. Install Ollama e.g., `brew install --cask ollama`
-2. Run a model e.g., `ollama run granite-code:34b-instruct-q5_K_M`. See [the Ollama library for more models](https://ollama.com/library/granite-code/tags)
+2. Run a model e.g., `ollama run granite-code:8b`. See [the Ollama library for more models](https://ollama.com/library/granite-code/tags)
 3. An OpenAI style server is running locally at [http://localhost:11434/](http://localhost:11434/), see [the Ollama blog](https://ollama.com/blog/openai-compatibility) for more details.
 
 
@@ -662,7 +662,7 @@ Example:
 ```
 text:
 - Hello,
-- model: ollama_chat/granite-code:34b-instruct-q5_K_M
+- model: ollama_chat/granite-code:8b
   parameters:
     stop:
     - '!'
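For reference, the tutorial fragment above is not a complete program on its own; a minimal runnable PDL file against the locally served model might look like the sketch below. The `description:` field and the `temperature: 0` parameter are assumptions added for illustration (the latter mirrors this commit's note about defaulting Granite on Ollama to temperature 0); the rest is taken from the tutorial example.

```yaml
# Hedged sketch: a minimal PDL program that talks to a local Ollama server.
# Assumes Ollama is running (step 3) and granite-code:8b has been pulled (step 2).
description: Minimal Ollama hello world   # assumed field, not part of this diff
text:
- Hello,
- model: ollama_chat/granite-code:8b
  parameters:
    # Stop generating once the model emits '!'
    stop:
    - '!'
    # Assumed: deterministic output, per the commit message's temperature note
    temperature: 0
```

If the model responds, the OpenAI-style endpoint from step 3 is reachable and the tag from step 2 is available locally.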

examples/callback/repair_prompt.pdl

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ lastOf:
     Please repair the code!
 
 - def: raw_output
-  model: replicate/ibm-granite/granite-3.1-8b-instruct
+  model: ollama/granite-code:8b
   parameters:
     #stop_sequences: "\n\n"
     temperature: 0

examples/chatbot/chatbot.pdl

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ text:
 - repeat:
     text:
     # Send context to Granite model hosted at replicate.com
-    - model: replicate/ibm-granite/granite-3.1-8b-instruct
+    - model: ollama/granite-code:8b
     # Allow the user to type 'yes', 'no', or anything else, storing
     # the input into a variable named `eval`. The input is also implicitly
     # added to the context.
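This hunk only swaps the model string; as context, the loop that the surrounding comments describe might look roughly like the sketch below. The `read:`, `message:`, and `until:` fields are assumptions based on other examples in this repository, not lines from this diff.

```yaml
# Hedged sketch of the chat loop the comments above describe.
# The read/message/until fields are assumed from other PDL examples, not this diff.
text:
- repeat:
    text:
    # Send the accumulated context to the locally served Granite model
    - model: ollama/granite-code:8b
    # Ask whether the answer was good; store the reply in `eval`
    - read:
      def: eval
      message: "\nIs this a good answer [yes/no]?\n"
  # Keep chatting until the user answers 'yes'
  until: ${ eval == 'yes' }
```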

examples/code/code-eval.pdl

Lines changed: 3 additions & 3 deletions
@@ -10,13 +10,13 @@ defs:
 text:
 # Print the source code to the console
 - "\n${ CODE.source_code }\n"
-# Use replicate.com to invoke a Granite model with a prompt. Output AND
+# Use ollama to invoke a Granite model with a prompt. Output AND
 # set the variable `EXPLANATION` to the output.
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite-code:8b
   def: EXPLANATION
   input: |
     Here is some info about the location of the function in the repo.
-    repo: 
+    repo:
     ${ CODE.repo_info.repo }
     path: ${ CODE.repo_info.path }
     Function_name: ${ CODE.repo_info.function_name }

examples/code/code-json.pdl

Lines changed: 3 additions & 3 deletions
@@ -6,13 +6,13 @@ defs:
   TRUTH:
     read: ./ground_truth.txt
 text:
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite-code:8b
   def: EXPLANATION
   contribute: []
   input:
     |
     Here is some info about the location of the function in the repo.
-    repo: 
+    repo:
     ${ CODE.repo_info.repo }
     path: ${ CODE.repo_info.path }
     Function_name: ${ CODE.repo_info.function_name }
@@ -37,7 +37,7 @@ text:
     """
     # (In PDL, set `result` to the output you wish for your code block.)
     result = textdistance.levenshtein.normalized_similarity(expl, truth)
-- data: 
+- data:
   input: ${ CODE }
   output: ${ EXPLANATION }
   metric: ${ EVAL }

examples/code/code.pdl

Lines changed: 4 additions & 4 deletions
@@ -7,16 +7,16 @@ defs:
 text:
 # Output the `source_code:` of the YAML to the console
 - "\n${ CODE.source_code }\n"
-# Use replicate.com to invoke a Granite model with a prompt
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+# Use ollama to invoke a Granite model with a prompt
+- model: ollama/granite-code:8b
   input: |
     Here is some info about the location of the function in the repo.
-    repo: 
+    repo:
     ${ CODE.repo_info.repo }
     path: ${ CODE.repo_info.path }
     Function_name: ${ CODE.repo_info.function_name }
 
-
+
     Explain the following code:
     ```
     ${ CODE.source_code }```

examples/demo/3-weather.pdl

Lines changed: 4 additions & 4 deletions
@@ -2,7 +2,7 @@ description: Using a weather API and LLM to make a small weather app
 text:
 - def: QUERY
   text: "What is the weather in Madrid?\n"
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite-code:8b
   input: |
     Extract the location from the question.
     Question: What is the weather in London?
@@ -19,14 +19,14 @@ text:
 - lang: python
   code: |
     import requests
-    #result = requests.get('https://api.weatherapi.com/v1/current.json?key==XYZ=${ LOCATION }') 
+    #result = requests.get('https://api.weatherapi.com/v1/current.json?key==XYZ=${ LOCATION }')
     #Mock result:
     result = '{"location": {"name": "Madrid", "region": "Madrid", "country": "Spain", "lat": 40.4, "lon": -3.6833, "tz_id": "Europe/Madrid", "localtime_epoch": 1732543839, "localtime": "2024-11-25 15:10"}, "current": {"last_updated_epoch": 1732543200, "last_updated": "2024-11-25 15:00", "temp_c": 14.4, "temp_f": 57.9, "is_day": 1, "condition": {"text": "Partly cloudy", "icon": "//cdn.weatherapi.com/weather/64x64/day/116.png", "code": 1003}, "wind_mph": 13.2, "wind_kph": 21.2, "wind_degree": 265, "wind_dir": "W", "pressure_mb": 1017.0, "pressure_in": 30.03, "precip_mm": 0.01, "precip_in": 0.0, "humidity": 77, "cloud": 75, "feelslike_c": 12.8, "feelslike_f": 55.1, "windchill_c": 13.0, "windchill_f": 55.4, "heatindex_c": 14.5, "heatindex_f": 58.2, "dewpoint_c": 7.3, "dewpoint_f": 45.2, "vis_km": 10.0, "vis_miles": 6.0, "uv": 1.4, "gust_mph": 15.2, "gust_kph": 24.4}}'
   def: WEATHER
   parser: json
   contribute: []
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite-code:8b
   input: |
     Explain the weather from the following JSON:
     ${ WEATHER }
-
+

examples/demo/4-translator.pdl

Lines changed: 2 additions & 2 deletions
@@ -1,7 +1,7 @@
 description: PDL program
 text:
 - "What is APR?\n"
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+- model: ollama/granite-code:8b
 - repeat:
     text:
     - read:
@@ -11,5 +11,5 @@ text:
       then:
         text:
         - "\n\nTranslate the above to ${ language }\n"
-        - model: replicate/ibm-granite/granite-3.1-8b-instruct
+        - model: ollama/granite-code:8b
   until: ${ language == 'stop' }

examples/fibonacci/fib.pdl

Lines changed: 2 additions & 3 deletions
@@ -6,7 +6,7 @@ text:
 # Use IBM Granite to author a program that computes the Nth Fibonacci number,
 # storing the generated program into the variable `CODE`.
 - def: CODE
-  model: replicate/ibm-granite/granite-3.1-8b-instruct
+  model: ollama/granite-code:8b
   input: "Write a Python function to compute the Fibonacci sequence. Do not include a doc string.\n\n"
   parameters:
     # Request no randomness when generating code
@@ -42,5 +42,4 @@ text:
 
 # Invoke the LLM again to explain the PDL context
 - "\n\nExplain what the above code does and what the result means\n\n"
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
-
+- model: ollama/granite-code:8b
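Taken together, the two hunks express a generate-then-explain pattern: the first model call stores its output in `CODE`, and a later call explains the accumulated context. A condensed sketch of that pattern, assembled from the lines visible in this diff, is below; the intervening blocks of fib.pdl are omitted, and the `temperature: 0` value is an assumption since the actual parameter line falls outside the hunk.

```yaml
# Condensed sketch of fib.pdl's generate-then-explain pattern (intermediate blocks omitted).
text:
# First call: generate code and store it in the variable CODE
- def: CODE
  model: ollama/granite-code:8b
  input: "Write a Python function to compute the Fibonacci sequence. Do not include a doc string.\n\n"
  parameters:
    # Request no randomness when generating code (assumed value; not shown in the hunk)
    temperature: 0
# Second call: the earlier prompt and model output are already part of the context
- "\n\nExplain what the above code does and what the result means\n\n"
- model: ollama/granite-code:8b
```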

examples/hello/hello-def-use.pdl

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
 description: Hello world with variable use
 text:
 - "Hello\n"
-# Define GEN to be the result of a Granite LLM using replicate.com
-- model: replicate/ibm-granite/granite-3.1-8b-instruct
+# Define GEN to be the result of a Granite LLM using ollama
+- model: ollama/granite-code:8b
   parameters:
     # "greedy" sampling tells the LLM to use the most likely token at each step
     decoding_method: greedy
