Easy Guide on Setting up OpenWebUI with LiteLLM #9628
tan-yong-sheng started this conversation in Show and tell
Replies: 2 comments
- ishaan-jaff: @tan-yong-sheng looks great, can you make a PR to link to your tutorial from our docs: https://docs.litellm.ai/docs/tutorials/openweb_ui (You can create an additional resources section in our doc and link to your post)
- tan-yong-sheng: Hi @ishaan-jaff, thanks a lot! I have just created a pull request here: #9636
-
I previously wrote a blog post about setting up OpenWebUI with LiteLLM:
🔗 Running LiteLLM and OpenWebUI on Windows Localhost – A Comprehensive Guide
One of the trickier parts of configuring OpenWebUI's RAG with LiteLLM is figuring out how to adjust the embedding settings to match the model you're using.
In my guide, I show how to configure everything through a .env file, using the gemini/text-embedding-004 model as an example, with settings such as batch_size = 100.
Sharing this here in case anyone finds it helpful. Thanks! 🚀
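For reference, a minimal .env sketch along these lines. The variable names are assumptions based on OpenWebUI's documented RAG settings (they may differ across OpenWebUI versions), and the LiteLLM proxy URL, port, and key are placeholders:

```shell
# Point OpenWebUI's RAG embeddings at a LiteLLM proxy exposing an
# OpenAI-compatible /embeddings endpoint (URL/port are examples).
RAG_EMBEDDING_ENGINE=openai
RAG_OPENAI_API_BASE_URL=http://localhost:4000/v1
RAG_OPENAI_API_KEY=sk-your-litellm-key

# Embedding model routed through LiteLLM (Gemini in this example).
RAG_EMBEDDING_MODEL=gemini/text-embedding-004

# Keep the batch size at or below the provider's limit; the guide
# uses 100 for gemini/text-embedding-004.
RAG_EMBEDDING_OPENAI_BATCH_SIZE=100
```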