This image contains the ollama-benchmark tool from Cloud Mercato.
- Pull it from Docker Hub, download the package from Releases, or build it with
builder/build.sh
This container image runs ollama-benchmark as its entrypoint and takes the command as arguments.
The environment variable SERVER_URL specifies the Ollama instance to use and defaults to
http://host.containers.internal:11434. It can be overridden by setting the environment variable or by passing the --host argument.
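For example, both override styles could look like this; a minimal sketch, assuming an Ollama instance reachable at the hypothetical address http://my-ollama:11434:

```shell
# Override via the environment variable (hypothetical host "my-ollama"):
docker run --rm -e SERVER_URL=http://my-ollama:11434 madebytimo/ollama-benchmark \
    speed --model llama3.2

# Or pass the --host argument through to ollama-benchmark:
docker run --rm madebytimo/ollama-benchmark \
    speed --model llama3.2 --host http://my-ollama:11434
```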
Example command for running a speed benchmark with the model llama3.2:
docker run --add-host "host.containers.internal:host-gateway" --rm madebytimo/ollama-benchmark \
speed --model llama3.2
Take a look at the project's readme for a detailed usage guide.
Alternatively, the help message can be printed with:
docker run --rm madebytimo/ollama-benchmark --help
For development, run:
docker compose --file docker-compose-dev.yaml up --build