
Conversation

@codefromthecrypt
Contributor

What does this pull request do?

Pydantic types are passed directly in as the response_format.

e.g. from DeepSeek:

if self.model_name in structured_outputs_models:
    print(f"structured_outputs_models: {type(schema)}")

    completion = await client.beta.chat.completions.parse(
        model=self.model_name,
        messages=[
            {"role": "user", "content": prompt},
        ],
        response_format=schema,
    )

These are resolved lazily, with logic we are unlikely to want to recreate here, so this change simply maps the default case, where the value is not a string, to "json_schema".

In doing so, this also removes attribute warnings like the one below, since the value is now always a string:

WARNING  opentelemetry.attributes:__init__.py:100 Invalid type ModelMetaclass for attribute 'gen_ai.openai.request.response_format' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
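
For illustration, here is a minimal sketch, not the code from this PR, of the string-versus-non-string mapping described above: a non-string response_format, such as a Pydantic model class passed to client.beta.chat.completions.parse(), is reported as the string "json_schema" so the span attribute stays within OpenTelemetry's allowed value types. The helper name response_format_attribute and the CalendarEvent model are hypothetical and used only for the demo.

    from pydantic import BaseModel


    def response_format_attribute(response_format):
        """Map response_format to a plain string for the span attribute.

        OpenTelemetry attribute values must be str/bool/int/float (or
        sequences of those), so a Pydantic model class cannot be set
        directly as 'gen_ai.openai.request.response_format'.
        """
        if isinstance(response_format, str):
            # Already a string, pass it through unchanged.
            return response_format
        # Default case: anything that is not a string (e.g. a Pydantic
        # model class) is reported as "json_schema".
        return "json_schema"


    class CalendarEvent(BaseModel):  # hypothetical schema, demo only
        name: str
        date: str


    assert response_format_attribute(CalendarEvent) == "json_schema"
    assert response_format_attribute("json_object") == "json_object"

This sketch covers only the case discussed in this PR; it is not a complete description of how the instrumentation derives the attribute.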

Related issues

Closes #64

Signed-off-by: Adrian Cole <[email protected]>
@xrmx merged commit dd5dc9f into main on Mar 18, 2025 (14 checks passed).
@codefromthecrypt deleted the structured-inputs branch on March 18, 2025 at 16:13.

Development

Successfully merging this pull request may close these issues.

openai: add structured outputs support
