Fix: Ensure cloud providers use their default endpoints#113

Open
Mortasen wants to merge 1 commit into hydropix:main from Mortasen:fix/cloud-provider-endpoints

Conversation

@Mortasen Mortasen commented Mar 10, 2026

Fixed an issue where cloud providers (DeepSeek, Mistral, Poe) were incorrectly defaulting to the local Ollama endpoint from the global configuration. This was causing connection failures when using these cloud APIs (issue #112).

@hydropix (Owner) commented

Thanks for the correction! I'm definitely having quite a few issues with all the ways to use LLMs on this project ;)

I can confirm that cloud providers receive the Ollama endpoint when using the CLI/GenericTranslator path, which causes connection failures...

However, I opted for a simpler approach: in factory.py, cloud service providers now always use their dedicated endpoint constant (DEEPSEEK_API_ENDPOINT, MISTRAL_API_ENDPOINT, POE_API_ENDPOINT) from config.py instead of reading the generic endpoint kwarg. These constants are already configurable via .env. This avoids duplicating the filtering logic in both factory.py and llm_client.py, as well as the heuristic comparison against known default values, which could have silently ignored a legitimate custom endpoint.

The bug only affected the CLI path; the web UI path via create_llm_client() already did not pass api_endpoint to cloud providers.

Thanks again for identifying the issue!
