Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
Hi, according to the Groq documentation, "Structured Outputs" (also called "JSON Schema Mode") should be supported, at least for a subset of models.
This should allow the user to use NativeOutput (as per https://ai.pydantic.dev/output/#native-output), but that is not the case.
Using the example from the documentation with Groq instead of OpenAI results in pydantic_ai.exceptions.UserError: Native structured output is not supported by the model.
I'm not sure whether this is a bug or a feature request, but I marked it as a bug, since I think a user would reasonably expect this to work based on the documentation. A rough sketch of what Groq's JSON Schema Mode looks like when called directly is included below.
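For reference, this is roughly what Groq's JSON Schema Mode looks like when calling the Groq SDK directly. I'm sketching this from my reading of the Groq "Structured Outputs" docs, so the exact response_format payload shape is an assumption on my part, not a verified call:

from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

# Sketch of Groq's "JSON Schema Mode": an OpenAI-compatible response_format
# payload asking the model to emit JSON matching the given schema.
# The payload shape below is my assumption based on the Groq docs.
completion = client.chat.completions.create(
    model='openai/gpt-oss-120b',
    messages=[{'role': 'user', 'content': 'What is a Ford Explorer?'}],
    response_format={
        'type': 'json_schema',
        'json_schema': {
            'name': 'Vehicle',
            'schema': {
                'type': 'object',
                'properties': {
                    'name': {'type': 'string'},
                    'wheels': {'type': 'integer'},
                },
                'required': ['name', 'wheels'],
            },
        },
    },
)
print(completion.choices[0].message.content)

If Groq accepts requests like this for the supported models, it seems like NativeOutput could be mapped onto it.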
Example Code
from pydantic_ai import Agent, NativeOutput
from pydantic import BaseModel


class Fruit(BaseModel):
    name: str
    color: str


class Vehicle(BaseModel):
    name: str
    wheels: int


agent = Agent(
    'groq:openai/gpt-oss-120b',
    output_type=NativeOutput(
        [Fruit, Vehicle],
        name='Fruit_or_vehicle',
        description='Return a fruit or vehicle.',
    ),
)

result = agent.run_sync('What is a Ford Explorer?')

This results in:
Traceback (most recent call last):
...
result = agent.run_sync('What is a Ford Explorer?')
File "...venv/lib/python3.13/site-packages/pydantic_ai/agent/abstract.py", line 306, in run_sync
return get_event_loop().run_until_complete(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
self.run(
^^^^^^^^^
...<11 lines>...
)
^
)
^
File ".pyenv/versions/3.13.1/lib/python3.13/asyncio/base_events.py", line 720, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File ".venv/lib/python3.13/site-packages/pydantic_ai/agent/abstract.py", line 200, in run
async with self.iter(
~~~~~~~~~^
user_prompt=user_prompt,
^^^^^^^^^^^^^^^^^^^^^^^^
...<7 lines>...
toolsets=toolsets,
^^^^^^^^^^^^^^^^^^
) as agent_run:
^
File ".pyenv/versions/3.13.1/lib/python3.13/contextlib.py", line 214, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.13/site-packages/pydantic_ai/agent/__init__.py", line 552, in iter
output_schema = self._prepare_output_schema(output_type, model_used.profile)
File ".venv/lib/python3.13/site-packages/pydantic_ai/agent/__init__.py", line 1313, in _prepare_output_schema
schema.raise_if_unsupported(model_profile)
~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
File ".venv/lib/python3.13/site-packages/pydantic_ai/_output.py", line 471, in raise_if_unsupported
raise UserError('Native structured output is not supported by the model.')
pydantic_ai.exceptions.UserError: Native structured output is not supported by the model.
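For what it's worth, the traceback points at raise_if_unsupported consulting the model profile. A possible interim workaround might be to override the profile so the check passes. The sketch below is untested and assumes that ModelProfile exposes a supports_json_schema_output flag and that GroqModel accepts a profile= override (both names are assumptions, and even if the check passes, GroqModel would still need to actually send the json_schema response format to Groq):

from pydantic import BaseModel

from pydantic_ai import Agent, NativeOutput
from pydantic_ai.models.groq import GroqModel
from pydantic_ai.profiles import ModelProfile


class Vehicle(BaseModel):
    name: str
    wheels: int


# Assumption: ModelProfile has a supports_json_schema_output flag and
# GroqModel takes a profile= override; the real field/parameter names may differ.
model = GroqModel(
    'openai/gpt-oss-120b',
    profile=ModelProfile(supports_json_schema_output=True),
)
agent = Agent(model, output_type=NativeOutput([Vehicle]))
result = agent.run_sync('What is a Ford Explorer?')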
Python, Pydantic AI & LLM client version
Python | 3.13
pydantic-ai | 0.8.1
pydantic-ai-slim | 0.8.1