Commit bc13e62

avihu111 authored and Luis Lastras <lastrasl@us.ibm.com> committed

Minor fix to tutorial.md (generative-computing#275)

1 parent c756dc8 commit bc13e62

File tree

1 file changed: +1 −1 lines changed

docs/tutorial.md

Lines changed: 1 addition & 1 deletion
@@ -80,7 +80,7 @@ Here, we initialized a backend running Ollama on a local machine using the grani
 We then ask the model to generate an email and print it to the console.

 > [!NOTE]
-> Mellea supports many other models and backends. By default, a new Mellea session will run IBM's capable Granite 8B model on your own laptop. This is a good (and free!) way to get started. If you would like to try out other models or backends, you can explicitly specify the backend and model in the start_session method. For example, `mellea.start_session(backend_name="ollama", model_id=mellea.model_ids.IBM_GRANITE_3_3_8B)`.
+> Mellea supports many other models and backends. By default, a new Mellea session will run IBM's capable Granite 3B model on your own laptop. This is a good (and free!) way to get started. If you would like to try out other models or backends, you can explicitly specify the backend and model in the start_session method. For example, `mellea.start_session(backend_name="ollama", model_id=mellea.model_ids.IBM_GRANITE_4_MICRO_3B)`.


 Before continuing, let's wrap this call into a function with some arguments:
0 commit comments