Expose ollama-with-open-webui ollama api
#5941
Replies: 3 comments
Add this to Open WebUI: `ENABLE_OLLAMA_API=true`. Then you can either make API calls against your URL at `/api/xxxx`, or pass through to Ollama. If you haven't already, I would recommend setting up an API key: https://docs.openwebui.com/getting-started/env-configuration#api-key-endpoint-restrictions
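A minimal sketch of what an authenticated call through Open WebUI could look like. The base URL, the API key, and the `/ollama/api/tags` proxy path are all assumptions (placeholders to adapt to your own deployment), not values from this thread:

```python
# Sketch: building an authenticated request to Open WebUI's Ollama proxy.
# OPEN_WEBUI_URL and API_KEY are placeholders; the /ollama/... proxy path
# is an assumption about the default pass-through route.
import urllib.request

OPEN_WEBUI_URL = "http://localhost:3000"  # placeholder: your Open WebUI URL
API_KEY = "sk-xxxx"                       # placeholder: key from your account settings

def build_request(path: str) -> urllib.request.Request:
    """Build a request carrying the API key as a Bearer token."""
    return urllib.request.Request(
        url=f"{OPEN_WEBUI_URL}{path}",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )

req = build_request("/ollama/api/tags")
# urllib.request.urlopen(req) would perform the call against a running instance.
```

Keeping the key in a server-side environment variable (rather than hard-coded as here) is the usual practice, for the front-end exposure reason mentioned below.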
Also, if you are only making backend API calls to Ollama, then as long as your app is in the same Docker network you can just use `ollama-api:11434` directly from your backend, since you can't make calls directly from your front end without exposing your API keys.
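The same-network setup above could look like the following compose sketch. The service names (`ollama-api`, `my-backend`) and images are assumptions for illustration, not the actual service definition from this thread:

```yaml
# Sketch: a backend reaching Ollama by service name on the shared compose network.
services:
  ollama-api:
    image: ollama/ollama      # Ollama listens on 11434 inside the network
  my-backend:                 # hypothetical application service
    image: my-backend:latest
    environment:
      # Resolvable by service name; no host port mapping needed for backend-only access:
      OLLAMA_BASE_URL: http://ollama-api:11434
```

Because both services sit on the same network, Ollama's port never needs to be published on the host, which keeps it unreachable from outside.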
Hello @djsisson, @lucasdidur. Today, when I use the Ollama with Open WebUI service on Coolify, I don't see the API Key option under Account.
I'm using Ollama with Open WebUI and I want to expose http://ollama-api:11434 to use with my applications. How can I do this?
This is the original service