Agent mode with Ollama ignores provider settings #1177

@schltgn

Description

What happened?

When trying the new agent mode with Ollama, I do not get any response and I see the following error in IDEA's log file:

2026-01-23 09:51:08,836 [1319762]   INFO - STDERR - [DefaultDispatcher-worker-25] ERROR ee.carlrobert.codegpt.agent.ProxyAIAgent - Agent execution failed: AgentExecutionFailedContext(eventId=62f3cc57-1dae-4fac-8ce0-a106939fa8d6, executionInfo=AgentExecutionInfo(parent=null, partName=a265439f-b9b0-4195-8100-f2a4d19cce91), agentId=a265439f-b9b0-4195-8100-f2a4d19cce91, runId=9b272c3a-12ee-4605-8eef-f48c8fb0dbbd, throwable=org.apache.hc.client5.http.HttpHostConnectException: Connect to http://localhost:11434 [localhost/127.0.0.1, localhost/0:0:0:0:0:0:0:1] failed: Connection refused)

It tries to connect to http://localhost:11434, but my Ollama provider's host is not configured as localhost. In Agent mode, ProxyAI seems to ignore the provider settings. Chat mode works fine.
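
For clarity, here is a minimal sketch of the behaviour I would expect, using hypothetical class and property names (not the actual ProxyAI internals): Agent mode should resolve the base URL from the configured Ollama provider settings, exactly as chat mode does, and fall back to http://localhost:11434 only when nothing is configured.

```kotlin
// Hypothetical sketch of the expected behaviour; the names below are assumptions,
// not the actual ProxyAI classes.
data class OllamaProviderSettings(
    // Host configured in the plugin's provider settings, e.g. "http://remote-machine:11434".
    val host: String? = null
)

fun resolveOllamaBaseUrl(settings: OllamaProviderSettings?): String {
    // Honour the configured host; only fall back to the default when no host is set.
    return settings?.host?.takeIf { it.isNotBlank() } ?: "http://localhost:11434"
}
```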

Steps to reproduce

Run an Ollama LLM on a remote machine (not localhost) and configure the ProxyAI plugin to use it in Agent mode. Then send any prompt to the agent.
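
To rule out a network problem, the remote Ollama instance can be queried directly from the same machine as the IDE. Something like the sketch below, where the host value is a placeholder for the configured provider host, returns the model list from Ollama's /api/tags endpoint while Agent mode still fails with "Connection refused" on localhost.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Placeholder: replace with the host configured in the ProxyAI Ollama provider settings.
    val ollamaHost = "http://remote-machine:11434"

    val client = HttpClient.newHttpClient()
    val request = HttpRequest.newBuilder()
        .uri(URI.create("$ollamaHost/api/tags")) // Ollama endpoint that lists available models
        .GET()
        .build()

    // A 200 response here confirms the remote instance is reachable,
    // so the localhost connection attempt comes from the plugin, not the network.
    val response = client.send(request, HttpResponse.BodyHandlers.ofString())
    println("${response.statusCode()}: ${response.body()}")
}
```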

Environment

  • IntelliJ IDEA 2025.3.2 Build #IU-253.30387.90
  • JDK: 21.0.9; VM: OpenJDK 64-Bit Server VM; Vendor: JetBrains s.r.o.
  • OS: Mac OS X
  • ProxyAI version: 3.7.0-241.1

    Labels

    bug (Something isn't working)
