Resolving 'No Module Named vllm' error for use of litellm as proxy for vllm #1727
Unanswered · yet-another-schizotypic asked this question in Q&A
Replies: 1 comment
@yet-another-schizotypic thanks for this issue.
Yes, it is possible. Is your vllm backend OpenAI-compatible? We can hop on a call and help get you set up with the litellm proxy.
Original question from yet-another-schizotypic:
I'm encountering an issue integrating litellm with vllm in proxy mode. My objective is to configure litellm to receive OpenAI-compatible requests and then forward them to a local vllm instance.
TL;DR: requests through litellm fail with `<class 'litellm.exceptions.APIConnectionError'>`, Status: 500, Message: None; the full exception is `VLLMException - No module named 'vllm'`.
To ensure that the problem isn't with vllm, I've double-checked its functionality.
Run command:
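A typical launch of vllm's OpenAI-compatible server looks roughly like this; the model name, host, and port are placeholders rather than the exact values used:

```sh
# Start vllm's OpenAI-compatible API server (model name and port are placeholders).
python -m vllm.entrypoints.openai.api_server \
    --model facebook/opt-125m \
    --host 0.0.0.0 \
    --port 8000
```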
vllm logs: the server starts without errors.
vllm test query:
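The test query is a standard call against vllm's OpenAI-compatible endpoint, roughly of this shape (model name, prompt, and port are placeholders):

```sh
# Representative completion request against the vllm server (values are placeholders).
curl http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{
          "model": "facebook/opt-125m",
          "prompt": "San Francisco is a",
          "max_tokens": 16
        }'
```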
vllm logs after the query: the request is received and served successfully.
Now, turning back to litellm: I use the following command to run it.
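In outline, the proxy is started by pointing the litellm CLI at a YAML config file (the path below is a placeholder):

```sh
# Start the litellm proxy with a YAML config (path is a placeholder);
# add --debug for more verbose request/response logging.
litellm --config ./config.yaml
```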
My litellm config is very simple; in outline it looks like the sketch below.
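Judging from the `VLLMException` in the TL;DR, the model entry presumably routes through litellm's `vllm/...` provider; a placeholder reconstruction (the model names are not the originals):

```sh
# Placeholder sketch of a minimal config; model names are illustrative only.
cat > config.yaml <<'EOF'
model_list:
  - model_name: my-vllm-model
    litellm_params:
      model: vllm/facebook/opt-125m   # litellm's vllm provider: runs inference through the
                                      # locally installed vllm package rather than over HTTP
EOF
```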
It appears that there are no issues when starting litellm, as indicated by its logs.
However, when I attempt to execute an OpenAI-compatible POST query against litellm, the request fails with the error quoted in the TL;DR above.
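The request itself is an ordinary chat-completions call to the proxy, roughly of this shape (the proxy port and model alias are placeholders):

```sh
# Representative chat-completions request to the litellm proxy
# (proxy port and model alias are placeholders).
curl http://localhost:4000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
          "model": "my-vllm-model",
          "messages": [{"role": "user", "content": "Hello, who are you?"}]
        }'
```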
The logs from litellm point to the same missing-module error.
Is it possible to use vllm as a backend for litellm, and if so, how can it be configured?