feat(ollama): support calling the Ollama local process (#2923)
* feat: support running ollama from the local binary
* fix: wrong working dir at CI
* chore: extract wait to a function
* chore: print local binary logs on error
* chore: remove debug logs
* fix(ci): kill ollama before the tests
* chore: stop ollama using systemctl
* chore: support setting log file from the env
* chore: support running ollama commands, only
* fix: release lock on error
* chore: add more test coverage for the option
* chore: simplify useLocal checks
* chore: simplify
* chore: pass context to runLocal
* chore: move ctx to the right scope
* chore: remove not needed
* chore: use a container function
* chore: support reading OLLAMA_HOST
* chore: return error with copy APIs
* chore: simply execute the script
* chore: simplify var initialisation
* chore: return nil
* fix: return errors on terminate
* chore: remove options type
* chore: use a map
* chore: simplify error on wait
* chore: wrap start logic around the localContext
* chore: fold
* chore: merge wait into start
* fix: use proper ContainersState
* fix: remove extra conversion
* chore: handle remove log file errors properly
* chore: go back to string in env vars
* refactor(ollama): local process
Refactor local process handling for Ollama using a container implementation
avoiding the wrapping methods.
This defaults to running the binary with an ephemeral port to avoid port
conflicts. This behaviour can be overridden by setting OLLAMA_HOST either
in the parent environment or in the values passed via WithUseLocal.
Improve API compatibility with:
- Multiplexed output streams
- State reporting
- Exec option processing
- WaitingFor customisation
Fix Container implementation:
- Port management
- Running checks
- Terminate processing
- Endpoint argument definition
- Add missing methods
- Consistent environment handling
* chore(ollama): refactor local to use log sub match.
Refactor local processing to use the new log sub match functionality.
* feat(ollama): validate container request
Validate the container request to ensure the user configuration can be processed
and no fields that would be ignored are present.
* chore(ollama): remove temporary test
Remove temporary simple test.
* feat(ollama): configurable local process binary
Allow the local ollama binary name to be configured using the image name.
* docs(ollama): detail local process supported fields
Detail the container request supported fields.
* docs(ollama): update local process site docs
Update local process site docs to match recent changes.
* chore: refactor to support TerminateOption
Refactor Terminate to support testcontainers.TerminateOption.
* fix: remove unused var
---------
Co-authored-by: Manuel de la Peña <[email protected]>
docs/modules/ollama.md: 50 additions & 0 deletions
@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

## Usage example

The module allows you to run the Ollama container or the local Ollama binary.

<!--codeinclude-->
[Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
<!--/codeinclude-->

If the local Ollama binary fails to execute, the module falls back to the container version of Ollama.

## Module Reference

### Run function
@@ -48,6 +53,51 @@ When starting the Ollama container, you can pass options in a variadic way to co

If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

#### Use Local

- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>

!!!warning
    Please make sure the local Ollama binary is not running when using the local version of the module:
    Ollama can be started as a system service, or as part of the Ollama application,
    and interacting with the logs of a running Ollama process not managed by the module is not supported.

If you need to run the local Ollama binary, you can set the `UseLocal` option in the `Run` function.
This option accepts a list of environment variables, as strings, that will be applied to the Ollama binary when executing commands.

E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
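The environment entries passed to `WithUseLocal` follow the usual `KEY=value` shape. A minimal sketch of how such strings can be split and validated before being applied, assuming a `parseEnv` helper that is purely illustrative and not part of the module's API:

```go
package main

import (
	"fmt"
	"strings"
)

// parseEnv splits "KEY=value" entries into a map, rejecting malformed
// entries. Illustrative only; the module performs its own validation.
func parseEnv(entries []string) (map[string]string, error) {
	env := make(map[string]string, len(entries))
	for _, e := range entries {
		key, value, ok := strings.Cut(e, "=")
		if !ok || key == "" {
			return nil, fmt.Errorf("invalid environment entry %q", e)
		}
		env[key] = value
	}
	return env, nil
}

func main() {
	env, err := parseEnv([]string{"OLLAMA_DEBUG=true", "OLLAMA_HOST=localhost:0"})
	if err != nil {
		panic(err)
	}
	fmt.Println(env["OLLAMA_DEBUG"], env["OLLAMA_HOST"])
}
```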
All the container methods are available when using the local Ollama binary, but they are executed locally instead of inside the container.
Please consider the following differences when using the local Ollama binary:

- The local Ollama binary will create a log file in the current working directory, identified by the session ID. E.g. `local-ollama-<session-id>.log`. It's possible to set the log file name using the `OLLAMA_LOGFILE` environment variable, so if you're running Ollama yourself, from the Ollama app, or the standalone binary, you can use this environment variable to point at the same log file.
- For the Ollama app, the default log file resides in `$HOME/.ollama/logs/server.log`.
- For the standalone binary, you should start it redirecting the logs to a file. E.g. `ollama serve > /tmp/ollama.log 2>&1`.
- `ConnectionString` returns the connection string to connect to the local Ollama binary started by the module instead of the container.
- `ContainerIP` returns the bound host IP, `127.0.0.1` by default.
- `ContainerIPs` returns the bound host IPs, `["127.0.0.1"]` by default.
- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` return an error if called.
- `GetLogProductionErrorChannel` returns a nil channel.
- `Endpoint` returns the endpoint to connect to the local Ollama binary started by the module instead of the container.
- `Exec` passes the command to the local Ollama binary started by the module instead of inside the container. The first argument must be the command to execute and the second the list of arguments; otherwise, an error is returned.
- `GetContainerID` returns the container ID of the local Ollama binary started by the module instead of the container, which maps to `local-ollama-<session-id>`.
- `Host` returns the bound host IP, `127.0.0.1` by default.
- `Inspect` returns a `ContainerJSON` with the state of the local Ollama binary started by the module.
- `IsRunning` returns true if the local Ollama binary process started by the module is running.
- `Logs` returns the logs from the local Ollama binary started by the module instead of the container.
- `MappedPort` returns the port mapping for the local Ollama binary started by the module instead of the container.
- `Start` starts the local Ollama binary process.
- `State` returns the current state of the local Ollama binary process, `stopped` or `running`.
- `Stop` stops the local Ollama binary process.
- `Terminate` calls the `Stop` method and then removes the log file.
The local Ollama binary will create a log file in the current working directory, and it will be available through the container's `Logs` method.

!!!info
    The local Ollama binary uses the `OLLAMA_HOST` environment variable to set the host and port to listen on.
    If the environment variable is not set, it defaults to `localhost:0`,
    which binds to a loopback address on an ephemeral port to avoid port conflicts.
{% include "../features/common_functional_options.md" %}