
Commit 3641872

feat: Add OVHcloud AI Endpoints provider

1 parent 3a86855

12 files changed: 178 additions, 3 deletions

README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -39,7 +39,7 @@ We built Pydantic AI with one simple aim: to bring that FastAPI feeling to GenAI
    [Pydantic Validation](https://docs.pydantic.dev/latest/) is the validation layer of the OpenAI SDK, the Google ADK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more. _Why use the derivative when you can go straight to the source?_ :smiley:
 
 2. **Model-agnostic**:
-   Supports virtually every [model](https://ai.pydantic.dev/models/overview) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius. If your favorite model or provider is not listed, you can easily implement a [custom model](https://ai.pydantic.dev/models/overview#custom-models).
+   Supports virtually every [model](https://ai.pydantic.dev/models/overview) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius, OVHcloud AI Endpoints. If your favorite model or provider is not listed, you can easily implement a [custom model](https://ai.pydantic.dev/models/overview#custom-models).
 
 3. **Seamless Observability**:
    Tightly [integrates](https://ai.pydantic.dev/logfire) with [Pydantic Logfire](https://pydantic.dev/logfire), our general-purpose OpenTelemetry observability platform, for real-time debugging, evals-based performance monitoring, and behavior, tracing, and cost tracking. If you already have an observability platform that supports OTel, you can [use that too](https://ai.pydantic.dev/logfire#alternative-observability-backends).
```

docs/api/providers.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -43,3 +43,5 @@
 ::: pydantic_ai.providers.litellm.LiteLLMProvider
 
 ::: pydantic_ai.providers.nebius.NebiusProvider
+
+::: pydantic_ai.providers.ovhcloud.OVHcloudAIEndpointsProvider
```

docs/index.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -14,7 +14,7 @@ We built Pydantic AI with one simple aim: to bring that FastAPI feeling to GenAI
    [Pydantic Validation](https://docs.pydantic.dev/latest/) is the validation layer of the OpenAI SDK, the Google ADK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more. _Why use the derivative when you can go straight to the source?_ :smiley:
 
 2. **Model-agnostic**:
-   Supports virtually every [model](models/overview.md) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius. If your favorite model or provider is not listed, you can easily implement a [custom model](models/overview.md#custom-models).
+   Supports virtually every [model](models/overview.md) and provider: OpenAI, Anthropic, Gemini, DeepSeek, Grok, Cohere, Mistral, and Perplexity; Azure AI Foundry, Amazon Bedrock, Google Vertex AI, Ollama, LiteLLM, Groq, OpenRouter, Together AI, Fireworks AI, Cerebras, Hugging Face, GitHub, Heroku, Vercel, Nebius, OVHcloud AI Endpoints. If your favorite model or provider is not listed, you can easily implement a [custom model](models/overview.md#custom-models).
 
 3. **Seamless Observability**:
    Tightly [integrates](logfire.md) with [Pydantic Logfire](https://pydantic.dev/logfire), our general-purpose OpenTelemetry observability platform, for real-time debugging, evals-based performance monitoring, and behavior, tracing, and cost tracking. If you already have an observability platform that supports OTel, you can [use that too](logfire.md#alternative-observability-backends).
```

docs/models/openai.md

Lines changed: 34 additions & 0 deletions

````diff
@@ -724,3 +724,37 @@ result = agent.run_sync('What is the capital of France?')
 print(result.output)
 #> The capital of France is Paris.
 ```
+
+### OVHcloud AI Endpoints
+
+To use OVHcloud AI Endpoints, you need to create a new API key. To do so, go to the [OVHcloud manager](https://ovh.com/manager), then navigate to Public Cloud > AI Endpoints > API keys. Click on `Create a new API key` and copy your new key.
+
+You can explore our [catalog](https://endpoints.ai.cloud.ovh.net/catalog) to find which models are available.
+
+Once you've set the `OVHCLOUD_AI_ENDPOINTS_API_KEY` environment variable with your new API key, you can run the following:
+
+```python
+from pydantic_ai import Agent
+
+agent = Agent('ovhcloud:gpt-oss-120b')
+result = agent.run_sync('What is the capital of France?')
+print(result.output)
+#> The capital of France is Paris.
+```
+
+If you need to configure the provider, you can use the [`OVHcloudAIEndpointsProvider`][pydantic_ai.providers.ovhcloud.OVHcloudAIEndpointsProvider] class:
+
+```python
+from pydantic_ai import Agent
+from pydantic_ai.models.openai import OpenAIChatModel
+from pydantic_ai.providers.ovhcloud import OVHcloudAIEndpointsProvider
+
+model = OpenAIChatModel(
+    'gpt-oss-120b',
+    provider=OVHcloudAIEndpointsProvider(api_key='your-api-key'),
+)
+agent = Agent(model)
+result = agent.run_sync('What is the capital of France?')
+print(result.output)
+#> The capital of France is Paris.
+```
````

docs/models/overview.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -29,6 +29,7 @@ In addition, many providers are compatible with the OpenAI API, and can be used
 - [Cerebras](openai.md#cerebras)
 - [LiteLLM](openai.md#litellm)
 - [Nebius AI Studio](openai.md#nebius-ai-studio)
+- [OVHcloud AI Endpoints](openai.md#ovhcloud-ai-endpoints)
 
 Pydantic AI also comes with [`TestModel`](../api/models/test.md) and [`FunctionModel`](../api/models/function.md)
 for testing and development.
```

pydantic_ai_slim/pydantic_ai/models/__init__.py

Lines changed: 1 addition & 0 deletions

```diff
@@ -688,6 +688,7 @@ def infer_model(model: Model | KnownModelName | str) -> Model:  # noqa: C901
         'vercel',
         'litellm',
         'nebius',
+        'ovhcloud',
     ):
         from .openai import OpenAIChatModel
```
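The hunk above simply adds `'ovhcloud'` to the tuple of provider prefixes that `infer_model` resolves to `OpenAIChatModel`. The pattern can be sketched in plain Python; this is an illustrative stand-in, not Pydantic AI's actual code, and the names (`route_model`, `OPENAI_COMPATIBLE_PREFIXES`) are invented for the example:

```python
# Illustrative sketch of provider-prefix dispatch; not Pydantic AI's real code.
OPENAI_COMPATIBLE_PREFIXES = {
    'openai', 'deepseek', 'azure', 'openrouter', 'grok', 'fireworks',
    'together', 'vercel', 'litellm', 'nebius', 'ovhcloud',
}

def route_model(name: str) -> tuple[str, str, str]:
    """Split a 'provider:model' string and pick a model class for the provider."""
    provider, sep, model = name.partition(':')
    if not sep:
        raise ValueError(f'expected "provider:model", got {name!r}')
    if provider in OPENAI_COMPATIBLE_PREFIXES:
        # All OpenAI-compatible providers share one chat-model implementation.
        return 'OpenAIChatModel', provider, model
    raise ValueError(f'Unknown provider: {provider}')

print(route_model('ovhcloud:gpt-oss-120b'))
# → ('OpenAIChatModel', 'ovhcloud', 'gpt-oss-120b')
```

This is why a one-line tuple change is all the model layer needs: the OVHcloud endpoint speaks the OpenAI API, so the existing `OpenAIChatModel` handles it once routing knows the prefix.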

pydantic_ai_slim/pydantic_ai/models/openai.py

Lines changed: 6 additions & 1 deletion

```diff
@@ -285,6 +285,7 @@ def __init__(
             'vercel',
             'litellm',
             'nebius',
+            'ovhcloud',
         ]
         | Provider[AsyncOpenAI] = 'openai',
         profile: ModelProfileSpec | None = None,
@@ -314,6 +315,7 @@ def __init__(
             'vercel',
             'litellm',
             'nebius',
+            'ovhcloud',
         ]
         | Provider[AsyncOpenAI] = 'openai',
         profile: ModelProfileSpec | None = None,
@@ -342,6 +344,7 @@ def __init__(
             'vercel',
             'litellm',
             'nebius',
+            'ovhcloud',
         ]
         | Provider[AsyncOpenAI] = 'openai',
         profile: ModelProfileSpec | None = None,
@@ -903,7 +906,9 @@ def __init__(
         self,
         model_name: OpenAIModelName,
         *,
-        provider: Literal['openai', 'deepseek', 'azure', 'openrouter', 'grok', 'fireworks', 'together', 'nebius']
+        provider: Literal[
+            'openai', 'deepseek', 'azure', 'openrouter', 'grok', 'fireworks', 'together', 'nebius', 'ovhcloud'
+        ]
         | Provider[AsyncOpenAI] = 'openai',
         profile: ModelProfileSpec | None = None,
         settings: ModelSettings | None = None,
```

pydantic_ai_slim/pydantic_ai/providers/__init__.py

Lines changed: 4 additions & 0 deletions

```diff
@@ -146,6 +146,10 @@ def infer_provider_class(provider: str) -> type[Provider[Any]]:  # noqa: C901
         from .nebius import NebiusProvider
 
         return NebiusProvider
+    elif provider == 'ovhcloud':
+        from .ovhcloud import OVHcloudAIEndpointsProvider
+
+        return OVHcloudAIEndpointsProvider
     else:  # pragma: no cover
         raise ValueError(f'Unknown provider: {provider}')
```

pydantic_ai_slim/pydantic_ai/providers/ovhcloud.py (new file)

Lines changed: 72 additions & 0 deletions

```python
from __future__ import annotations as _annotations

import os
from typing import overload

import httpx

from pydantic_ai import ModelProfile
from pydantic_ai.exceptions import UserError
from pydantic_ai.models import cached_async_http_client
from pydantic_ai.providers import Provider

try:
    from openai import AsyncOpenAI
except ImportError as _import_error:  # pragma: no cover
    raise ImportError(
        'Please install the `openai` package to use OVHcloud AI Endpoints provider. '
        'You can use the `openai` optional group — `pip install "pydantic-ai-slim[openai]"`'
    ) from _import_error


class OVHcloudAIEndpointsProvider(Provider[AsyncOpenAI]):
    """Provider for OVHcloud AI Endpoints."""

    @property
    def name(self) -> str:
        return 'ovhcloud'

    @property
    def base_url(self) -> str:
        return 'https://oai.endpoints.kepler.ai.cloud.ovh.net/v1'

    @property
    def client(self) -> AsyncOpenAI:
        return self._client

    def model_profile(self, model_name: str) -> ModelProfile | None:
        return None

    @overload
    def __init__(self) -> None: ...

    @overload
    def __init__(self, *, api_key: str) -> None: ...

    @overload
    def __init__(self, *, api_key: str, http_client: httpx.AsyncClient) -> None: ...

    @overload
    def __init__(self, *, openai_client: AsyncOpenAI | None = None) -> None: ...

    def __init__(
        self,
        *,
        api_key: str | None = None,
        openai_client: AsyncOpenAI | None = None,
        http_client: httpx.AsyncClient | None = None,
    ) -> None:
        api_key = api_key or os.getenv('OVHCLOUD_AI_ENDPOINTS_API_KEY')
        if not api_key and openai_client is None:
            raise UserError(
                'Set the `OVHCLOUD_AI_ENDPOINTS_API_KEY` environment variable or pass it via '
                '`OVHcloudAIEndpointsProvider(api_key=...)` to use OVHcloud AI Endpoints provider.'
            )

        if openai_client is not None:
            self._client = openai_client
        elif http_client is not None:
            self._client = AsyncOpenAI(base_url=self.base_url, api_key=api_key, http_client=http_client)
        else:
            http_client = cached_async_http_client(provider='ovhcloud')
            self._client = AsyncOpenAI(base_url=self.base_url, api_key=api_key, http_client=http_client)
```
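The constructor above resolves the API key from the argument or the `OVHCLOUD_AI_ENDPOINTS_API_KEY` environment variable, then selects a client by a fixed precedence: a ready-made `openai_client` wins outright, a caller-supplied `http_client` is wrapped in a fresh `AsyncOpenAI`, and otherwise a cached shared HTTP client is used. A dependency-free sketch of that precedence, with string stand-ins instead of real client objects (`resolve_client` is an invented name for illustration):

```python
import os

def resolve_client(api_key=None, openai_client=None, http_client=None):
    """Mirror the provider __init__ precedence with string stand-ins."""
    api_key = api_key or os.getenv('OVHCLOUD_AI_ENDPOINTS_API_KEY')
    if not api_key and openai_client is None:
        raise ValueError('missing API key')  # UserError in the real code
    if openai_client is not None:
        return openai_client                 # 1) fully configured client wins
    if http_client is not None:
        # 2) wrap the caller's HTTP transport in a new client
        return ('AsyncOpenAI', api_key, 'caller-http-client')
    # 3) fall back to the shared cached transport
    return ('AsyncOpenAI', api_key, 'cached-http-client')

print(resolve_client(api_key='k'))
# → ('AsyncOpenAI', 'k', 'cached-http-client')
```

Note that passing a pre-built `openai_client` skips the API-key check entirely, since that client already carries its own credentials.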

tests/providers/test_ovhcloud.py (new file)

Lines changed: 53 additions & 0 deletions

```python
import re

import httpx
import pytest

from pydantic_ai.exceptions import UserError

from ..conftest import TestEnv, try_import

with try_import() as imports_successful:
    import openai

    from pydantic_ai.providers.ovhcloud import OVHcloudAIEndpointsProvider


pytestmark = [
    pytest.mark.skipif(not imports_successful(), reason='openai not installed'),
    pytest.mark.vcr,
    pytest.mark.anyio,
]


def test_ovhcloud_provider():
    provider = OVHcloudAIEndpointsProvider(api_key='your-api-key')
    assert provider.name == 'ovhcloud'
    assert provider.base_url == 'https://oai.endpoints.kepler.ai.cloud.ovh.net/v1'
    assert isinstance(provider.client, openai.AsyncOpenAI)
    assert provider.client.api_key == 'your-api-key'


def test_ovhcloud_provider_need_api_key(env: TestEnv) -> None:
    env.remove('OVHCLOUD_AI_ENDPOINTS_API_KEY')
    with pytest.raises(
        UserError,
        match=re.escape(
            'Set the `OVHCLOUD_AI_ENDPOINTS_API_KEY` environment variable or pass it via '
            '`OVHcloudAIEndpointsProvider(api_key=...)` to use OVHcloud AI Endpoints provider.'
        ),
    ):
        OVHcloudAIEndpointsProvider()


def test_ovhcloud_pass_openai_client() -> None:
    openai_client = openai.AsyncOpenAI(api_key='your-api-key')
    provider = OVHcloudAIEndpointsProvider(openai_client=openai_client)
    assert provider.client == openai_client


def test_ovhcloud_pass_http_client():
    http_client = httpx.AsyncClient()
    provider = OVHcloudAIEndpointsProvider(api_key='your-api-key', http_client=http_client)
    assert isinstance(provider.client, openai.AsyncOpenAI)
    assert provider.client.api_key == 'your-api-key'
```
