Using Ollama
The easiest way to use Ollama out of the box with coco is to set the `service` option in your coco config to `ollama`.
The default service configuration objects behind the shorthand aliases `openai` and `ollama` can be found here.
```json
{
  "$schema": "https://git-co.co/schema.json",
  "service": "ollama",
  "verbose": true,
  "ignoredFiles": [
    "package-lock.json"
  ],
  "ignoredExtensions": [
    "map",
    "lock"
  ]
}
```
If you want to go deeper than the shorthand aliases of `openai` and `ollama`, you can tinker with the underlying model, temperature, endpoint, or other settings exposed by the `OpenAILLMService` or `OllamaLLMService` interfaces, as sketched below.
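As a rough sketch only: the expanded `service` object might look something like the following. The field names here (`provider`, `model`, `endpoint`, `temperature`) are assumptions based on the settings mentioned above, not a verified schema, so check the `OllamaLLMService` interface for the authoritative list of supported fields. The endpoint shown is Ollama's default local address.

```json
{
  "$schema": "https://git-co.co/schema.json",
  "service": {
    "provider": "ollama",
    "model": "llama2",
    "endpoint": "http://localhost:11434",
    "temperature": 0.4
  },
  "verbose": true
}
```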