Docker Production Setup Fails: Env Variables Missing and Ollama Connection Error #353

@av1155

Description

Describe the bug

I have attempted to set up the production environment using Docker, following the steps outlined in the README, as well as additional troubleshooting steps, but the setup consistently fails. My primary goal is to integrate Ollama.

Even though I create the .env.local file and fill in all the necessary values (OpenAI and Anthropic API keys, the Ollama base URL), the following warnings and errors persist when executing npm run dockerbuild:prod and docker-compose --profile production up:

Issues:

  1. Environment Variables Not Set:
    Despite creating a .env.local file with the required environment variables (the OpenAI and Anthropic API keys, the Ollama base URL, etc.), they are not recognized during the build process. The following warning messages appear:

    WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
    WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
    WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
  2. Ollama Error in UI:
    When attempting to send a message via Ollama, the UI shows the error:
    "There was an error processing your request: No details were returned."
    The logs include the following error message (see the connectivity check sketched after this list):

    Error: Network connection lost.
        at async postToApi ...
        at async OllamaChatLanguageModel.doStream ...
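
I suspect the container simply cannot reach Ollama: inside a Linux container, 127.0.0.1 refers to the container itself, not the macOS host, so an OLLAMA_API_BASE_URL of http://127.0.0.1:11434 would never hit the host's Ollama server. A quick check I can think of (a sketch only: it assumes the compose service is named bolt-ai, as the container name in the log suggests, and that curl exists in the image):

    # Run curl from inside the app container against Docker Desktop's
    # host alias instead of loopback (service name and curl availability
    # are my assumptions).
    docker compose exec bolt-ai curl -s http://host.docker.internal:11434
    # If the host's Ollama is reachable, this should print: Ollama is running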

Additional Context:

  • My local Ollama instance is running at http://127.0.0.1:11434, and I have verified it is accessible (see the workaround sketched after this list).
    • Ollama is installed with Homebrew (macOS) and the server is running:
      ❯ curl http://127.0.0.1:11434
      Ollama is running
  • I am using the latest version of the project and its dependencies, including npm and docker-compose:
      ❯ docker -v
      Docker version 27.3.1, build ce12230
      ❯ docker-compose -v
      Docker Compose version v2.29.7-desktop.1
      ❯ npm -v
      10.9.0
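
If loopback inside the container is indeed the problem, the workaround I am considering is pointing the app at Docker Desktop's host alias instead (a sketch of the .env.local change; host.docker.internal is resolvable from containers on Docker Desktop for macOS):

    # .env.local: point the containerized app at the host's Ollama server
    OLLAMA_API_BASE_URL=http://host.docker.internal:11434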

Full Log Output:

❯ docker-compose --profile production up
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
[+] Running 2/2
 ✔ Network boltnew-any-llm_default      Created                                                                                                   0.0s
 ✔ Container boltnew-any-llm-bolt-ai-1  Created                                                                                                   0.1s
Attaching to bolt-ai-1
bolt-ai-1  |
bolt-ai-1  | > bolt@ dockerstart /app
bolt-ai-1  | > bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session
bolt-ai-1  |
bolt-ai-1  | ./bindings.sh: line 12: .env.local: No such file or directory
bolt-ai-1  |
bolt-ai-1  |  ⛅️ wrangler 3.63.2 (update available 3.88.0)
bolt-ai-1  | ---------------------------------------------
bolt-ai-1  |
bolt-ai-1  | ✨ Compiled Worker successfully
bolt-ai-1  | [wrangler:inf] Ready on http://0.0.0.0:5173
[wrangler:inf] - http://127.0.0.1:5173
[wrangler:inf] - http://172.19.0.2:5173
⎔ Starting local server...
[wrangler:inf] GET / 200 OK (326ms)
[wrangler:inf] GET /assets/manifest-64f35b7b.js 200 OK (4ms)
[wrangler:inf] GET /assets/_index-CYAjcA4A.js 200 OK (18ms)
[wrangler:inf] GET /assets/index-CPTzpSUP.css 200 OK (76ms)
[wrangler:inf] GET /assets/root-BpC-Gj3n.css 200 OK (82ms)
[wrangler:inf] GET /assets/tailwind-compat-CC20SAMN.css 200 OK (82ms)
[wrangler:inf] GET /assets/xterm-lQO2bNqs.css 200 OK (81ms)
[wrangler:inf] GET /assets/ReactToastify-CYivYX3d.css 200 OK (81ms)
[wrangler:inf] GET /assets/_index-D_NZK3VS.css 200 OK (3ms)
[wrangler:inf] GET /assets/root-CbOXwCh9.js 200 OK (3ms)
[wrangler:inf] GET /assets/theme-9upYr29Y.js 200 OK (5ms)
[wrangler:inf] GET /assets/components-VKHSbR2h.js 200 OK (6ms)
[wrangler:inf] GET /assets/entry.client-kdhVl-y0.js 200 OK (26ms)
[wrangler:inf] GET /assets/_index-CgRH4noA.js 200 OK (26ms)
[wrangler:inf] GET /assets/wasm-CsTmP73Z.js 200 OK (9ms)
[wrangler:inf] GET /assets/light-plus-BsvsQ1iS.js 200 OK (6ms)
[wrangler:inf] GET /assets/dark-plus-KEYLhlmT.js 200 OK (7ms)
[wrangler:inf] GET /assets/shellscript-BZfs-ost.js 200 OK (8ms)
Error: Network connection lost.
bolt-ai-1  |     at async postToApi (file:///app/node_modules/.pnpm/@ai-sdk+provider-utils@1.0.20_zod@3.23.8/node_modules/@ai-sdk/provider-utils/src/post-to-api.ts:65:22)
bolt-ai-1  |     at async OllamaChatLanguageModel.doStream (file:///app/node_modules/.pnpm/ollama-ai-provider@0.15.2_zod@3.23.8/node_modules/ollama-ai-provider/src/ollama-chat-language-model.ts:230:50)
bolt-ai-1  |     at async fn (file:///app/node_modules/.pnpm/ai@3.4.9_react@18.3.1_sswr@2.1.0_svelte@4.2.18__svelte@4.2.18_vue@3.4.30_typescript@5.5.2__zod@3.23.8/node_modules/ai/core/generate-text/stream-text.ts:345:23)
bolt-ai-1  |     at null.<anonymous> (async file:///app/.wrangler/tmp/dev-RIKdeU/functionsWorker-0.4237590511921838.js:30634:22)
bolt-ai-1  |     at async _retryWithExponentialBackoff (file:///app/node_modules/.pnpm/ai@3.4.9_react@18.3.1_sswr@2.1.0_svelte@4.2.18__svelte@4.2.18_vue@3.4.30_typescript@5.5.2__zod@3.23.8/node_modules/ai/util/retry-with-exponential-backoff.ts:37:12)
bolt-ai-1  |     at async startStep (file:///app/node_modules/.pnpm/ai@3.4.9_react@18.3.1_sswr@2.1.0_svelte@4.2.18__svelte@4.2.18_vue@3.4.30_typescript@5.5.2__zod@3.23.8/node_modules/ai/core/generate-text/stream-text.ts:310:13)
bolt-ai-1  |     at async fn (file:///app/node_modules/.pnpm/ai@3.4.9_react@18.3.1_sswr@2.1.0_svelte@4.2.18__svelte@4.2.18_vue@3.4.30_typescript@5.5.2__zod@3.23.8/node_modules/ai/core/generate-text/stream-text.ts:387:11)
bolt-ai-1  |     at null.<anonymous> (async file:///app/.wrangler/tmp/dev-RIKdeU/functionsWorker-0.4237590511921838.js:30634:22)
bolt-ai-1  |     at async chatAction (file:///app/build/server/index.js:1169:20)
bolt-ai-1  |     at async Object.callRouteAction (file:///app/node_modules/.pnpm/@remix-run+server-runtime@2.10.2_typescript@5.5.2/node_modules/@remix-run/server-runtime/dist/data.js:37:16) {
bolt-ai-1  |   retryable: true
bolt-ai-1  | }
[wrangler:inf] POST /api/chat 500 Internal Server Error (18ms)
bolt-ai-1  |
^CGracefully stopping... (press Ctrl+C again to force)
[+] Stopping 1/1
 ✔ Container boltnew-any-llm-bolt-ai-1  Stopped                                                                                                   0.1s
canceled
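
Note the line ./bindings.sh: line 12: .env.local: No such file or directory in the log above: the file does not seem to exist inside the container at all. If that is the root cause, something along these lines in docker-compose.yaml might fix it (a sketch only; the env_file key and the bind-mount path are my guesses rather than the project's actual config, and I am assuming the production service is named bolt-ai):

    services:
      bolt-ai:
        env_file:
          - .env.local                     # load the key/value pairs into the container environment
        volumes:
          - ./.env.local:/app/.env.local   # also expose the file itself, since bindings.sh reads it directly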

Link to the Bolt URL that caused the error

http://0.0.0.0:5173/

Steps to reproduce

  1. Create a .env.local file with the following content:
     OPENAI_API_KEY=<your-key>
     ANTHROPIC_API_KEY=<your-key>
     OLLAMA_API_BASE_URL=http://127.0.0.1:11434
  2. Run the following commands (see the note after these steps):
     npm run dockerbuild:prod
     docker-compose --profile production up
  3. Attempt to interact with the application UI and send a message using Ollama.
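
On the warnings themselves: as far as I understand, docker-compose only reads a file literally named .env for ${VAR} substitution in docker-compose.yaml, so values in .env.local are invisible to it unless passed explicitly. Pointing compose at the file may silence the WARN lines (sketch):

    # Tell compose to use .env.local for variable substitution
    docker-compose --env-file .env.local --profile production up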

Expected behavior

The Docker setup should:

  • Load all environment variables from .env.local.
  • Connect to the Ollama API without errors and receive an answer.

Actual behavior

  • The environment variables are not recognized during the Docker build process.
  • The UI displays an error when sending a message via Ollama, and logs indicate network connection issues.

If I use the simple pnpm process, it works well, but I want to set this up in a Docker container so it's always running and I can just open a browser and use it.
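
For the always-running part, I assume a restart policy on the service would cover host reboots once the env issues are sorted out (a sketch; the service name is taken from the log above):

    services:
      bolt-ai:
        restart: unless-stopped   # keep the container running across daemon/host restarts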

Screen Recording / Screenshot

No response

Platform

  • OS: macOS Version 15.1 (24B2083)
  • Browser: Arc Browser

Additional context

No response

Metadata

Assignees

No one assigned

Labels

question (Further information is requested)
