As mentioned at https://dottxt-ai.github.io/outlines/latest/features/models/:
However, for vLLM it seems that all output types are supported, in direct contrast to the other server-based models. Also surprisingly, it uses the OpenAI client, while the actual OpenAI server-based model supports nothing but JSON Schema. As a result, I have 2 questions:
The vLLM server integration indeed handles all output types. I'm not sure what `mistral.rs` uses for structured outputs, or whether it can handle every output type, so you would have to check. Happy to add an integration regardless.