Add vLLM as a provider #1846
ba2512005 started this conversation in Feature requests
Replies: 0 comments
Add vLLM as a provider for faster local LLM inference than Ollama.
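A minimal sketch of what the integration could look like: vLLM ships an OpenAI-compatible HTTP server (started with `vllm serve <model>`, default port 8000), so a provider could largely reuse an existing OpenAI-style client pointed at the local endpoint. The model name, port, and placeholder API key below are illustrative assumptions, not part of this request.

```python
# Sketch only: assumes a local vLLM OpenAI-compatible server is running,
# e.g. started with:  vllm serve meta-llama/Llama-3.1-8B-Instruct
from openai import OpenAI

# vLLM's server defaults to port 8000 and accepts any API key unless one
# is configured, so "EMPTY" is a common placeholder.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # illustrative model name
    messages=[{"role": "user", "content": "Hello from a vLLM-backed provider"}],
)
print(response.choices[0].message.content)
```

Because the surface is OpenAI-compatible, adding vLLM might only require routing the existing OpenAI client path to a configurable local `base_url`.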