[Config Support]: OpenAI GenAI provider ignores base_url — all requests sent to api.openai.com #22054
You've identified a known issue with the OpenAI provider in Frigate 0.16.x.

**The Workaround**

The documented workaround is to set `OPENAI_BASE_URL` through the top-level `environment_vars` config section:

```yaml
environment_vars:
  OPENAI_BASE_URL: http://my-local-server:8002/v1
genai:
  enabled: true
  provider: openai
  api_key: sk-any-key
  model: my-local-model
```

The Python `openai` library picks this variable up automatically when no explicit `base_url` is passed to the client.

**Additional Notes**

This issue has been reported in the context of OpenRouter and other OpenAI-compatible APIs. A community member confirmed that setting `OPENAI_BASE_URL` this way works. For Home Assistant add-on users specifically, there was discussion about how to set this environment variable.
Have another question? Just tag @inkeep.
### Describe the problem you are having

When configuring the `genai` section with `provider: openai` and a custom `base_url` pointing to a self-hosted OpenAI-compatible API (e.g. vLLM, Ollama), the `base_url` value is accepted by config validation and stored internally, but never passed to the `OpenAI()` client constructor. All GenAI requests are sent to `api.openai.com` instead of the configured endpoint.

### Version
0.16.4
### Frigate config file

### Relevant Frigate log output

### Relevant go2rtc log output

### Frigate stats

No response
### Operating system

Home Assistant OS

### Install method

Home Assistant Add-on

### docker-compose file or Docker CLI command

n/a

### Object Detector

Coral

### Screenshots of the Frigate UI's System metrics pages

No response
### Any other information that may be helpful

In `frigate/genai/openai.py`, the `_init_provider` method only passes `api_key` to the `OpenAI()` constructor. The `base_url` from `self.genai_config.base_url` is never included, so the client falls back to its default endpoint. The fix would be to forward `base_url` to the constructor when it is set.
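A sketch of that difference follows. The helper and config object here are hypothetical, for illustration only; per the report, the actual `_init_provider` simply passes `api_key` alone:

```python
from types import SimpleNamespace

def build_client_kwargs(genai_config) -> dict:
    """Hypothetical helper showing the proposed fix: forward base_url
    to the OpenAI() constructor when it is configured."""
    kwargs = {"api_key": genai_config.api_key}
    # 0.16.x omits this step, so the client defaults to api.openai.com.
    if getattr(genai_config, "base_url", None):
        kwargs["base_url"] = genai_config.base_url
    return kwargs

# Example: a config shaped like the report's self-hosted setup.
config = SimpleNamespace(
    api_key="sk-any-key",
    base_url="http://my-local-server:8002/v1",
)
```

The resulting kwargs would then be splatted into `OpenAI(**kwargs)`, which is a one-line change at the call site.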
### Workaround

The Python `openai` library automatically reads the `OPENAI_BASE_URL` environment variable when no explicit `base_url` is passed to the constructor. Frigate supports setting environment variables via the top-level `environment_vars` config section.
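A config sketch of that workaround; the server URL and model name are placeholders standing in for a real self-hosted endpoint:

```yaml
environment_vars:
  OPENAI_BASE_URL: http://my-local-server:8002/v1
genai:
  enabled: true
  provider: openai
  api_key: sk-any-key
  model: my-local-model
```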
### Additional Context

I understand that v0.17.x introduces `provider_options.base_url`, which may address this. However, the `base_url` field in v0.16.x silently accepts the value without any warning that it won't be used, which makes debugging very difficult: config validation passes, the add-on starts cleanly, and the only clue is 401 errors buried in logs pointing at `api.openai.com`.

If this won't be backported to 0.16.x, it would be helpful to either:

- log a warning when `base_url` is set with the `openai` provider (explaining that it's not used), or
- document the `environment_vars` workaround in the GenAI docs.