
Conversation

@boltzmann-brain (Contributor)
…ator bugfix for certain types of None responses.

This PR contains two fixes: one for the mistral generator and a second, very minor one for OpenAI-style generators.

The first fix works around a bug in the mistralai package, which doesn't account for thinking tokens in Magistral models. If a response fails mistralai's Pydantic validation, we fall back to a raw HTTP request.

We can remove the HTTP workaround once Mistral fixes that bug.
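
For readers who want the gist without opening the diff, here is a minimal sketch of the fallback pattern, not the actual PR code: the helper name `_call_api`, catching pydantic's `ValidationError` as the failure signal, and the handling of list-shaped Magistral content are all illustrative assumptions.

```python
# Illustrative sketch of the SDK-first / raw-HTTP-fallback pattern.
# _call_api and the content flattening below are assumptions, not the PR's exact code.
import logging

import requests
from pydantic import ValidationError


def _call_api(client, api_key: str, model: str, messages: list[dict]) -> str | None:
    """Prefer the mistralai SDK; fall back to a raw HTTP request if the SDK's
    Pydantic validation rejects the response (e.g. Magistral thinking tokens)."""
    try:
        resp = client.chat.complete(model=model, messages=messages)
        return resp.choices[0].message.content
    except ValidationError as err:
        logging.warning("mistralai validation failed, retrying over raw HTTP: %s", err)

    # Raw HTTP fallback against the public chat completions endpoint
    r = requests.post(
        "https://api.mistral.ai/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": messages},
        timeout=60,
    )
    r.raise_for_status()
    message = r.json()["choices"][0]["message"]
    content = message.get("content")
    # Magistral responses may return content as a list of chunks; keep only the text parts
    # (assumption about the response shape, hedged accordingly)
    if isinstance(content, list):
        content = "".join(
            part.get("text", "") for part in content if isinstance(part, dict)
        )
    return content
```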

Possibly related bug: mistralai/client-python#252

The second fix, for OpenAI-style generators, is a minor one that handles certain None responses more gracefully so the run doesn't simply fail. I encountered such None responses occasionally while querying gemini-2.5-flash.
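
Roughly, the handling looks like the sketch below; the function name `_handle_completion` and the choice to record a None output are illustrative assumptions rather than the exact PR code.

```python
# Illustrative sketch only: _handle_completion is a hypothetical name.
import logging


def _handle_completion(response) -> list[str | None]:
    """Return generator outputs, tolerating None/empty responses instead of raising."""
    if response is None or not getattr(response, "choices", None):
        logging.warning("generator returned an empty response; recording a None output")
        # Assumption: a None entry is treated downstream as a non-response
        # rather than aborting the whole run.
        return [None]
    return [choice.message.content for choice in response.choices]
```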

Verification

Try to start a run on magistral-medium-2509 with the previous version of the generator; it doesn't really matter which probe you use. At some point the run will fail with a Pydantic validation error, because the structure of the response doesn't fit mistralai's Pydantic model.

With this version of the generator, the Pydantic validation error will be written to the log, but garak will retry with a raw HTTP request and the run will continue.

@leondz leondz self-requested a review January 12, 2026 08:17
@leondz leondz added the bug (Something isn't working) and generators (Interfaces with LLMs) labels Jan 15, 2026