If you'd prefer to have a custom vLLM implementation in community, you're welcome to do that as well. However, as you've found, you can't have this subclass the implementation in langchain-openai.
Feature request
Hi! I want to add a vLLM OpenAI-compatible client with additional parameters.
Motivation
Currently, all additional vLLM generation params must be passed through extra_body without validation. This class would explicitly specify the supported params and add validation for them.
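To make the validation idea concrete, here is a minimal sketch of what explicit checks on vLLM-only sampling params could look like before they are packed into the extra_body dict that an OpenAI-compatible client forwards verbatim. The function name build_extra_body and the exact validation rules are hypothetical illustrations, not part of any existing API:

```python
# Sketch: validate vLLM-specific sampling params explicitly instead of
# passing an arbitrary, unchecked extra_body dict. The helper name and
# the validation rules below are assumptions for illustration only.

def build_extra_body(top_k=None, repetition_penalty=None):
    """Validate vLLM-only params and return an extra_body dict."""
    extra_body = {}
    if top_k is not None:
        # vLLM convention: -1 disables top-k; otherwise a positive int.
        if not isinstance(top_k, int) or top_k < -1 or top_k == 0:
            raise ValueError("top_k must be -1 (disabled) or a positive int")
        extra_body["top_k"] = top_k
    if repetition_penalty is not None:
        if not isinstance(repetition_penalty, (int, float)) or repetition_penalty <= 0:
            raise ValueError("repetition_penalty must be > 0")
        extra_body["repetition_penalty"] = repetition_penalty
    return extra_body
```

With checks like these, a typo or out-of-range value fails fast at construction time rather than surfacing as a server-side error from vLLM.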
Proposal (If applicable)
I've implemented a solution in #23692. The problem with this proposal is that @efriis suggested moving this class to the community package. However, this class would require OpenAI and ChatOpenAI as parent classes, and I'm not sure how to do that without adding additional requirements to langchain_community.
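The dependency problem can be illustrated in plain Python: a subclass that exposes vLLM params as first-class fields must inherit from the class in langchain-openai, so shipping it in langchain_community would pull in that package. The toy classes below are stand-ins (ChatOpenAIStub is hypothetical and only mimics the forwarding behavior, it is not the real ChatOpenAI):

```python
# Toy illustration of the subclassing proposal. ChatOpenAIStub stands in
# for langchain_openai.ChatOpenAI; in the real proposal the subclass
# would import the actual class, which is why it cannot live in
# langchain_community without adding langchain-openai as a dependency.

class ChatOpenAIStub:
    """Stand-in for ChatOpenAI: holds params forwarded via extra_body."""
    def __init__(self, model, extra_body=None):
        self.model = model
        self.extra_body = extra_body or {}

class ChatVLLM(ChatOpenAIStub):
    """Hypothetical subclass exposing a vLLM param as a validated field."""
    def __init__(self, model, top_k=None, **kwargs):
        extra_body = dict(kwargs.pop("extra_body", None) or {})
        if top_k is not None:
            if not isinstance(top_k, int) or top_k < -1 or top_k == 0:
                raise ValueError("top_k must be -1 (disabled) or a positive int")
            extra_body["top_k"] = top_k
        super().__init__(model, extra_body=extra_body, **kwargs)
```

The subclass only repackages validated fields into extra_body, which is why the inheritance (and hence the dependency on the parent's package) is hard to avoid.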