
Bug: response_format type must be one of "text" or "json_object", but got: json_schema', 'type': 'server_error' #778

@whoisjeremylam

Description


What happened?

I am trying to use Maestro, which supports OpenAI-compatible API endpoints. The API call is failing with the aforementioned error.

I found a related issue in llama.cpp from September 2024: ggml-org/llama.cpp#9527

If I run llama-server from llama.cpp, the API calls work, so I'm not sure whether this is functionality that needs to be brought over to ik_llama.cpp or whether something else is causing it not to work.
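For reference, here is a minimal sketch of the kind of request body that triggers the error, assuming the standard OpenAI structured-output `response_format` shape (the model name and schema below are placeholders, not what Maestro actually sends):

```python
import json

# Hypothetical chat-completions request body in the OpenAI API format.
# The "json_schema" response_format type is what ik_llama.cpp rejects
# with a 500; llama.cpp's llama-server accepts it.
payload = {
    "model": "local-model",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Return a JSON object with a 'status' field."}
    ],
    "response_format": {
        # ik_llama.cpp currently only accepts "text" or "json_object" here
        "type": "json_schema",
        "json_schema": {
            "name": "status_reply",  # placeholder schema name
            "schema": {
                "type": "object",
                "properties": {"status": {"type": "string"}},
                "required": ["status"],
            },
        },
    },
}

print(json.dumps(payload, indent=2))
```

Based on the error message, changing `"type"` to `"json_object"` (and dropping the `json_schema` field) should avoid the 500, though at the cost of losing schema enforcement.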

Name and Version

version: 3850 (f6f56db)
built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu

What operating system are you seeing the problem on?

Linux

Relevant log output

openai.InternalServerError: Error code: 500 - {'error': {'code': 500, 'message': 'response_format type must be one of "text" or "json_object", but got: json_schema', 'type': 'server_error'}}
