Fix: Ensure cloud providers use their default endpoints #113
Mortasen wants to merge 1 commit into hydropix:main from
Conversation
Thanks for the correction! I'm definitely having quite a few issues with all the ways to use LLMs on this project ;) I can confirm that cloud providers receive the Ollama endpoint when using the CLI/GenericTranslator path, which causes connection failures. However, I opted for a simpler approach: in factory.py, cloud providers now always use their dedicated endpoint constant (DEEPSEEK_API_ENDPOINT, MISTRAL_API_ENDPOINT, POE_API_ENDPOINT) from config.py instead of reading the generic endpoint kwarg. These constants are already configurable via .env. This avoids duplicating the filtering logic in both factory.py and llm_client.py, and avoids a heuristic comparison against known default values, which could have silently ignored a legitimate custom endpoint. The bug only affected the CLI path; the web UI path via create_llm_client() already did not pass api_endpoint to cloud providers. Thanks again for identifying the issue!
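A minimal sketch of the described approach, assuming hypothetical names (resolve_endpoint, CLOUD_ENDPOINTS, the placeholder URLs) that are not from the actual codebase; only the constant names DEEPSEEK_API_ENDPOINT, MISTRAL_API_ENDPOINT, and POE_API_ENDPOINT come from the PR discussion:

```python
# Sketch of the factory.py fix. Constant values here are placeholders;
# in the real project they come from config.py and can be overridden via .env.
DEEPSEEK_API_ENDPOINT = "https://example.com/deepseek"  # placeholder
MISTRAL_API_ENDPOINT = "https://example.com/mistral"    # placeholder
POE_API_ENDPOINT = "https://example.com/poe"            # placeholder
OLLAMA_DEFAULT_ENDPOINT = "http://localhost:11434"      # placeholder

# Cloud providers are mapped to their dedicated endpoint constants.
CLOUD_ENDPOINTS = {
    "deepseek": DEEPSEEK_API_ENDPOINT,
    "mistral": MISTRAL_API_ENDPOINT,
    "poe": POE_API_ENDPOINT,
}

def resolve_endpoint(provider: str, **kwargs) -> str:
    """Return the endpoint a provider should actually use (hypothetical helper)."""
    if provider in CLOUD_ENDPOINTS:
        # Cloud providers always use their dedicated constant and ignore the
        # generic kwarg, which may carry the local Ollama endpoint on the CLI path.
        return CLOUD_ENDPOINTS[provider]
    # Local providers (e.g. Ollama) still honor the generic api_endpoint kwarg.
    return kwargs.get("api_endpoint", OLLAMA_DEFAULT_ENDPOINT)
```

With this shape, passing the Ollama endpoint through the generic kwarg no longer leaks into cloud providers, and no comparison against known defaults is needed.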
Fixed an issue where cloud providers (DeepSeek, Mistral, Poe) were incorrectly defaulting to the local Ollama endpoint from the global configuration. This was causing connection failures when using these cloud APIs (issue #112).