
vllm localhost fails to accept json #147


Description

@rmenziejr

The OpenAI format provided uses the following structure for the request body:

{
  "model": "model_id",
  "parameters": {
    "max_new_tokens": 60,
    ...
  },
  "prompt": "text"
}

However, the format accepted by vLLM does not have a "parameters" field; the fields should sit at the top level instead:

{
  "model": "model_id",
  "max_new_tokens": 60,
  ...
  "prompt": "text"
}

When you set the request body without including "parameters" as a field, it is added automatically, and vLLM then rejects the request with a 400 error.
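For illustration, a minimal sketch of a client-side workaround that flattens the nested "parameters" object into the top level before posting to a local vLLM server. The endpoint URL, port, and model name here are assumptions, not taken from the report:

import requests

# Assumed local vLLM OpenAI-compatible endpoint; adjust host/port as needed.
VLLM_URL = "http://localhost:8000/v1/completions"

def flatten_body(body: dict) -> dict:
    """Lift keys out of a nested "parameters" object to the top level,
    which is the shape vLLM accepts per the report above."""
    flat = {k: v for k, v in body.items() if k != "parameters"}
    flat.update(body.get("parameters", {}))
    return flat

# Body in the rejected shape described above (hypothetical values).
body = {
    "model": "model_id",
    "parameters": {"max_new_tokens": 60},
    "prompt": "text",
}

resp = requests.post(VLLM_URL, json=flatten_body(body))
resp.raise_for_status()
print(resp.json())

Note that flattening alone may not be sufficient: vLLM's OpenAI-compatible server also expects OpenAI parameter names, so "max_new_tokens" would likely need to be renamed to "max_tokens" as well.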
