
Conversation

@mochow13 (Contributor)

This PR adds support for LiteLLM in Pydantic AI. The idea is to use a LiteLLMProvider with OpenAIModel, since the LiteLLM API is OpenAI-compatible.

Closes #1496
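
For illustration, here's a rough sketch of how I picture it being wired up (the import path, constructor argument names, proxy URL, and model name below are placeholders, not necessarily the final API):

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.litellm import LiteLLMProvider  # assumed import path

# Point the OpenAI-compatible client at a running LiteLLM proxy.
# The base URL, API key, and argument names are placeholders for illustration.
provider = LiteLLMProvider(
    api_base='http://localhost:4000',
    api_key='litellm-proxy-key',
)
model = OpenAIModel('gpt-4o', provider=provider)
agent = Agent(model)

result = agent.run_sync('Say hello via LiteLLM.')
print(result.output)
```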

@mochow13 (Contributor, Author)

@DouweM Pipeline is failing due to:

Name                               Stmts   Miss Branch BrPart   Cover   Missing
-------------------------------------------------------------------------------
tests/models/test_model_names.py      68      6      8      0  92.11%   78-98, 140-141
-------------------------------------------------------------------------------
TOTAL                              28027      6   4472      0  99.98%

236 files skipped due to complete coverage.
Coverage failure: total of 99.98 is less than fail-under=100.00
make: *** [test] Error 2

I am not sure what to do. There is no code related to OpenRouter or Together in test_model_names.py, so I'm not sure whether we should add any tests for LiteLLM or not. If we do need one, I imagine it would just exercise the new model-name mapping; a rough sketch of what I have in mind is below (the `litellm:` prefix and the assertion are assumptions for illustration, not existing code).
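
```python
# Hypothetical test sketch; the 'litellm:' prefix and assertion are assumptions.
from pydantic_ai.models import infer_model
from pydantic_ai.models.openai import OpenAIModel


def test_litellm_model_name():
    # Assumed prefix for LiteLLM-backed models; the mapping may differ in the PR.
    model = infer_model('litellm:gpt-4o')
    assert isinstance(model, OpenAIModel)
```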

Collaborator

Can you restore this file, please? That may also fix the coverage issue.


@mochow13 (Contributor, Author)

OK, it's green now. I just needed to rebase; the branch hadn't been rebased before.

@mochow13 force-pushed the mottakin/litellm-integration branch from 36ef733 to 77436fc on September 2, 2025 18:44
@mochow13 force-pushed the mottakin/litellm-integration branch from 77436fc to a5f239d on September 3, 2025 19:00
@DouweM merged commit 85d47a1 into pydantic:main on Sep 3, 2025
30 checks passed
@DouweM (Collaborator) commented on Sep 3, 2025

@mochow13 Thanks Mottakin, nice work!

Linked issue: Adding LiteLLM as model wrap just like how google-adk does it.