
Commit 6ec91f1

stevenhmdelapenya authored (with a co-author)

feat(ollama): support calling the Ollama local process (#2923)
* feat: support running ollama from the local binary
* fix: wrong working dir at CI
* chore: extract wait to a function
* chore: print local binary logs on error
* chore: remove debug logs
* fix(ci): kill ollama before the tests
* chore: stop ollama using systemctl
* chore: support setting log file from the env
* chore: support running ollama commands, only
* fix: release lock on error
* chore: add more test coverage for the option
* chore: simplify useLocal checks
* chore: simplify
* chore: pass context to runLocal
* chore: move ctx to the right scope
* chore: remove not needed
* chore: use a container function
* chore: support reading OLLAMA_HOST
* chore: return error with copy APIs
* chore: simply execute the script
* chore: simplify var initialisation
* chore: return nil
* fix: return errors on terminate
* chore: remove options type
* chore: use a map
* chore: simplify error on wait
* chore: wrap start logic around the localContext
* chore: fold
* chore: merge wait into start
* fix: use proper ContainersState
* fix: remove extra conversion
* chore: handle remove log file errors properly
* chore: go back to string in env vars
* refactor(ollama): local process

  Refactor local process handling for Ollama using a container implementation, avoiding the wrapping methods. This defaults to running the binary with an ephemeral port to avoid port conflicts. This behaviour can be overridden by setting OLLAMA_HOST either in the parent environment or in the values passed via WithUseLocal.

  Improve API compatibility with:
  - Multiplexed output streams
  - State reporting
  - Exec option processing
  - WaitingFor customisation

  Fix Container implementation:
  - Port management
  - Running checks
  - Terminate processing
  - Endpoint argument definition
  - Add missing methods
  - Consistent environment handling

* chore(ollama): refactor local to use log sub match

  Refactor local processing to use the new log sub match functionality.
* feat(ollama): validate container request

  Validate the container request to ensure the user configuration can be processed and no fields that would be ignored are present.
* chore(ollama): remove temporary test

  Remove the temporary simple test.
* feat(ollama): configurable local process binary

  Allow the local ollama binary name to be configured using the image name.
* docs(ollama): detail local process supported fields

  Detail the container request supported fields.
* docs(ollama): update local process site docs

  Update local process site docs to match recent changes.
* chore: refactor to support TerminateOption

  Refactor Terminate to support testcontainers.TerminateOption.
* fix: remove unused var

---------

Co-authored-by: Manuel de la Peña <[email protected]>
1 parent 632249a commit 6ec91f1

File tree

10 files changed: +1630 −16 lines changed
Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@
+#!/usr/bin/env bash
+
+curl -fsSL https://ollama.com/install.sh | sh
+
+# kill any running ollama process so that the tests can start from a clean state
+sudo systemctl stop ollama.service

.github/workflows/ci-test-go.yml

Lines changed: 10 additions & 0 deletions

@@ -107,6 +107,16 @@ jobs:
         working-directory: ./${{ inputs.project-directory }}
         run: go build

+      - name: Install dependencies
+        shell: bash
+        run: |
+          SCRIPT_PATH="./.github/scripts/${{ inputs.project-directory }}/install-dependencies.sh"
+          if [ -f "$SCRIPT_PATH" ]; then
+            $SCRIPT_PATH
+          else
+            echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
+          fi
+
       - name: go test
         # only run tests on linux, there are a number of things that won't allow the tests to run on anything else
         # many (maybe, all?) images used can only be build on Linux, they don't have Windows in their manifest, and

docs/modules/ollama.md

Lines changed: 50 additions & 0 deletions

@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

 ## Usage example

+The module allows you to run either the Ollama container or the local Ollama binary.
+
 <!--codeinclude-->
 [Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
+[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
 <!--/codeinclude-->

+If the local Ollama binary fails to execute, the module falls back to the container version of Ollama.
+
 ## Module Reference

 ### Run function
@@ -48,6 +53,51 @@ When starting the Ollama container, you can pass options in a variadic way to configure it.
 If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
 E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

+#### Use Local
+
+- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>
+
+!!!warning
+    Please make sure the local Ollama binary is not running when using the local version of the module:
+    Ollama can be started as a system service, or as part of the Ollama application,
+    and interacting with the logs of a running Ollama process not managed by the module is not supported.
+
+If you need to run the local Ollama binary, you can set the `UseLocal` option in the `Run` function.
+This option accepts a list of environment variables, as strings, that will be applied to the Ollama binary when executing commands.
+
+E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
+
+All the container methods are available when using the local Ollama binary, but they are executed locally instead of inside the container.
+Please consider the following differences when using the local Ollama binary:
+
+- The local Ollama binary creates a log file in the current working directory, identified by the session ID. E.g. `local-ollama-<session-id>.log`. It's possible to set the log file name using the `OLLAMA_LOGFILE` environment variable, so if you're running Ollama yourself, from the Ollama app or the standalone binary, you could use this environment variable to point to the same log file.
+    - For the Ollama app, the default log file resides at `$HOME/.ollama/logs/server.log`.
+    - For the standalone binary, you should start it redirecting the logs to a file. E.g. `ollama serve > /tmp/ollama.log 2>&1`.
+- `ConnectionString` returns the connection string to connect to the local Ollama binary started by the module, instead of the container.
+- `ContainerIP` returns the bound host IP, `127.0.0.1` by default.
+- `ContainerIPs` returns the bound host IPs, `["127.0.0.1"]` by default.
+- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` return an error if called.
+- `GetLogProductionErrorChannel` returns a nil channel.
+- `Endpoint` returns the endpoint to connect to the local Ollama binary started by the module, instead of the container.
+- `Exec` passes the command to the local Ollama binary started by the module, instead of running it inside the container. The first element must be the command to execute, followed by the list of its arguments; otherwise, an error is returned.
+- `GetContainerID` returns the container ID of the local Ollama binary started by the module, which maps to `local-ollama-<session-id>`.
+- `Host` returns the bound host IP, `127.0.0.1` by default.
+- `Inspect` returns a ContainerJSON with the state of the local Ollama binary started by the module.
+- `IsRunning` returns true if the local Ollama binary process started by the module is running.
+- `Logs` returns the logs from the local Ollama binary started by the module, instead of the container.
+- `MappedPort` returns the port mapping for the local Ollama binary started by the module, instead of the container.
+- `Start` starts the local Ollama binary process.
+- `State` returns the current state of the local Ollama binary process: `stopped` or `running`.
+- `Stop` stops the local Ollama binary process.
+- `Terminate` calls the `Stop` method and then removes the log file.
+
+The local Ollama binary creates a log file in the current working directory, and it is available through the container's `Logs` method.
+
+!!!info
+    The local Ollama binary uses the `OLLAMA_HOST` environment variable to set the host and port to listen on.
+    If the environment variable is not set, it defaults to `localhost:0`,
+    which binds to a loopback address on an ephemeral port to avoid port conflicts.
+
 {% include "../features/common_functional_options.md" %}

 ### Container Methods

modules/ollama/examples_test.go

Lines changed: 70 additions & 0 deletions

@@ -173,3 +173,73 @@ func ExampleRun_withModel_llama2_langchain() {

 	// Intentionally not asserting the output, as we don't want to run this example in the tests.
 }
+
+func ExampleRun_withLocal() {
+	ctx := context.Background()
+
+	// localOllama {
+	ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13", tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
+	defer func() {
+		if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
+			log.Printf("failed to terminate container: %s", err)
+		}
+	}()
+	if err != nil {
+		log.Printf("failed to start container: %s", err)
+		return
+	}
+	// }
+
+	model := "llama3.2:1b"
+
+	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", model})
+	if err != nil {
+		log.Printf("failed to pull model %s: %s", model, err)
+		return
+	}
+
+	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "run", model})
+	if err != nil {
+		log.Printf("failed to run model %s: %s", model, err)
+		return
+	}
+
+	connectionStr, err := ollamaContainer.ConnectionString(ctx)
+	if err != nil {
+		log.Printf("failed to get connection string: %s", err)
+		return
+	}
+
+	var llm *langchainollama.LLM
+	if llm, err = langchainollama.New(
+		langchainollama.WithModel(model),
+		langchainollama.WithServerURL(connectionStr),
+	); err != nil {
+		log.Printf("failed to create langchain ollama: %s", err)
+		return
+	}
+
+	completion, err := llm.Call(
+		context.Background(),
+		"how can Testcontainers help with testing?",
+		llms.WithSeed(42),         // a fixed seed makes the completion reproducible
+		llms.WithTemperature(0.0), // the lower the temperature, the more deterministic the completion
+	)
+	if err != nil {
+		log.Printf("failed to call the model: %s", err)
+		return
+	}
+
+	words := []string{
+		"easy", "isolation", "consistency",
+	}
+	lwCompletion := strings.ToLower(completion)
+
+	for _, word := range words {
+		if strings.Contains(lwCompletion, word) {
+			fmt.Println(true)
+		}
+	}
+
+	// Intentionally not asserting the output, as we don't want to run this example in the tests.
+}
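The `Exec` calls in the example above always pass `ollama` as the first element of the command slice, which matters in local mode: per the site docs, the local process only accepts commands addressed to the `ollama` binary. A hedged sketch of that validation rule; `validateLocalExec` is a hypothetical illustration, not the module's code.

```go
package main

import (
	"errors"
	"fmt"
)

// validateLocalExec illustrates the documented Exec restriction for the
// local binary: the first element of the command slice must be "ollama",
// followed by its subcommand and arguments; anything else is rejected.
// Hypothetical helper, not the module's implementation.
func validateLocalExec(cmd []string) error {
	if len(cmd) == 0 {
		return errors.New("exec: no command provided")
	}
	if cmd[0] != "ollama" {
		return fmt.Errorf("exec: only ollama commands are supported, got %q", cmd[0])
	}
	return nil
}

func main() {
	fmt.Println(validateLocalExec([]string{"ollama", "pull", "llama3.2:1b"}) == nil) // true
	fmt.Println(validateLocalExec([]string{"ls", "-l"}) == nil)                     // false
}
```

This restriction exists because, unlike the container path, there is no shell inside a sandboxed environment to run arbitrary commands: the module can only drive the one local process it manages.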

modules/ollama/go.mod

Lines changed: 1 addition & 1 deletion

@@ -4,6 +4,7 @@ go 1.22

 require (
 	github.com/docker/docker v27.1.1+incompatible
+	github.com/docker/go-connections v0.5.0
 	github.com/google/uuid v1.6.0
 	github.com/stretchr/testify v1.9.0
 	github.com/testcontainers/testcontainers-go v0.34.0
@@ -22,7 +23,6 @@ require (
 	github.com/davecgh/go-spew v1.1.1 // indirect
 	github.com/distribution/reference v0.6.0 // indirect
 	github.com/dlclark/regexp2 v1.8.1 // indirect
-	github.com/docker/go-connections v0.5.0 // indirect
 	github.com/docker/go-units v0.5.0 // indirect
 	github.com/felixge/httpsnoop v1.0.4 // indirect
 	github.com/go-logr/logr v1.4.1 // indirect

0 commit comments
