Add provider for Anthropic's Vertexai Client #1392
Conversation
```python
# breaking this in multiple lines breaks pycharm type recognition. However, I was unable to stop ruff from
# doing it - # fmt: skip etc didn't work :(
provider: Literal['anthropic', 'anthropic-vertex']
| Provider[AsyncAnthropicVertex]
| Provider[AsyncAnthropic] = # fmt: skip
'anthropic',
```
I think you need to do `# fmt: off` and `# fmt: on` after.
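For reference, a minimal sketch of that suggestion, with `# fmt: off` before the statement and `# fmt: on` after it (these directives generally apply at statement level). The class and parameter list below are simplified stand-ins, not copied from the PR:

```python
from typing import Literal

from anthropic import AsyncAnthropic, AsyncAnthropicVertex

from pydantic_ai.providers import Provider


class AnthropicModelSketch:  # hypothetical stand-in for AnthropicModel
    # fmt: off
    def __init__(
        self,
        model_name: str,
        *,
        provider: Literal['anthropic', 'anthropic-vertex']
        | Provider[AsyncAnthropicVertex]
        | Provider[AsyncAnthropic] = 'anthropic',
    ) -> None:
        self.model_name = model_name
        self.provider = provider
    # fmt: on
```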
```
        instance of `Provider[AsyncAnthropic]`. If not provided, the other parameters will be used.
    provider: The provider to use for the Anthropic API. Can be either the string 'anthropic',
        'anthropic-vertex', or an instance of Provider[AsyncAnthropic] or Provider[AsyncAnthropicVertex].
        Defaults to 'anthropic'.
```
Maybe you should also add an entry to the `KnownModelName` literal at `models/__init__.py`.
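For illustration, the addition might look roughly like the sketch below; the surrounding entries are abbreviated stand-ins, and the exact `anthropic-vertex:` prefix spelling is an assumption rather than anything this PR settled on.

```python
from typing import Literal

# Hypothetical, heavily abridged sketch of pydantic_ai/models/__init__.py - the
# real KnownModelName literal contains many more entries than shown here.
KnownModelName = Literal[
    'anthropic:claude-3-5-sonnet-latest',
    'google-vertex:gemini-1.5-pro',
    # assumed new entries for Anthropic models served through Vertex AI
    'anthropic-vertex:claude-3-5-sonnet@20240620',
    'anthropic-vertex:claude-3-7-sonnet@20250219',
]
```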
```python
elif provider == 'anthropic-vertex':
    from .anthropic_vertex import AnthropicVertexProvider

    return AnthropicVertexProvider()
```
This is tricky. If you want `google-vertex:<anthropic-model>`, the agent should be able to infer this provider instead of the other... Maybe the provider should offer different clients depending on the model?

Right now, `GoogleVertexProvider` is generic over the client, which is currently `httpx.AsyncClient`. What if the provider could offer a client via a method?
```python
class GoogleVertexProvider(Provider):
    def get_client(self, tp: type[T]) -> T:
        if issubclass(tp, httpx.AsyncClient):
            return self.httpx_client
        elif issubclass(tp, AsyncAnthropicVertex):
            return self.anthropic_client
        else:
            raise ValueError('not supported')
```
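If something like that were adopted, the call site could ask the provider for whichever client it needs. A hypothetical usage sketch, assuming the provider holds both clients as above:

```python
import httpx
from anthropic import AsyncAnthropicVertex

# Hypothetical call sites for the proposed get_client() hook; GoogleVertexProvider
# here refers to the sketch above, not the current pydantic-ai implementation.
provider = GoogleVertexProvider()
http_client = provider.get_client(httpx.AsyncClient)        # e.g. for Gemini requests
claude_client = provider.get_client(AsyncAnthropicVertex)   # e.g. for Claude-on-Vertex requests
```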
@Kludex I really want to see this happen - I would propose the following options:

1. Kick out VertexAI to its own top-level provider - e.g. Bedrock - and let the VertexAI model class handle Anthropic/Mistral/Llama/Gemini VertexAI providers. This would be pretty clean, as the string `google-vertex:<model name>` would route to the hypothetical VertexAI model class instead of a `GeminiModel` class.
2. Have routing logic based on `google-vertex:<model>` that would be responsible for routing to Pydantic-supported model types - e.g. it may only support Mistral/Gemini/Anthropic.
3. Change the naming convention for VertexAI Gemini models from `google-vertexai` to `gemini-vertexai` to support a model-provider model name format.

Personally, I like number 1, as I think it would be the cleanest from a top-level abstraction standpoint.
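As a rough illustration of option 1, the routing for a `google-vertex:<model name>` string could dispatch on the model-name prefix. Everything in the sketch below is invented for illustration - pydantic-ai has no classes with these names:

```python
from dataclasses import dataclass


# Placeholder stand-ins for vendor-specific Vertex AI model classes.
@dataclass
class VertexGeminiModel:
    model_name: str


@dataclass
class VertexAnthropicModel:
    model_name: str


def infer_vertex_model(model_name: str):
    """Dispatch the part after 'google-vertex:' to a vendor-specific class."""
    if model_name.startswith('claude'):
        return VertexAnthropicModel(model_name)
    if model_name.startswith('gemini'):
        return VertexGeminiModel(model_name)
    raise ValueError(f'unsupported Vertex AI model: {model_name!r}')
```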
This PR is fine; we still need to work out how to choose this client when passing the model string to the agent.

For those stuck on this too, here is a temporary solution:
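The snippet itself is not preserved in this thread, but a minimal sketch of this kind of workaround - passing an `AsyncAnthropicVertex` client straight to `AnthropicProvider`, assuming the provider accepts an `anthropic_client` argument - might look like:

```python
from anthropic import AsyncAnthropicVertex
from pydantic_ai import Agent
from pydantic_ai.models.anthropic import AnthropicModel
from pydantic_ai.providers.anthropic import AnthropicProvider

# Project, region and model name are placeholders; AsyncAnthropicVertex quacks
# like AsyncAnthropic at runtime, though a type checker may complain about it.
vertex_client = AsyncAnthropicVertex(project_id='my-gcp-project', region='us-east5')
model = AnthropicModel(
    'claude-3-5-sonnet-v2@20241022',
    provider=AnthropicProvider(anthropic_client=vertex_client),
)
agent = Agent(model)
```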
When I do this, I get `cannot import name 'AsyncAnthropicVertex' from 'pydantic_ai.providers.anthropic'` - am I doing something wrong? I am on the latest (0.2.11) pydantic-ai.

@acehand
Worth noting that prompt caching does not work with the Anthropic Python SDK + Vertex... anthropics/anthropic-sdk-python#653 😢
@Kludex Just putting this back on your radar!
This PR is stale, and will be closed in 3 days if no reply is received.

Closing this PR as it has been inactive for 10 days.
This is my approach, just in case anyone finds it useful:

```python
import os

from anthropic import AsyncAnthropicVertex
from pydantic_ai.models.anthropic import AnthropicModel


def create_vertex_model(
    model_name: str = "claude-opus-4",
    project_id: str = "app-prod",
    region: str = "us-east5",
    credentials_file: str | None = None,
):
    """
    Usage:
        model = create_vertex_model("claude-opus-4")
        agent = Agent(model, system_prompt="You are helpful")

        @agent.tool_plain
        def my_tool(x: int): return x * 2

        result = await agent.run("Hello")
    """
    # Set credentials if provided
    if credentials_file:
        os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = credentials_file

    # Create Vertex client
    vertex = AsyncAnthropicVertex(
        project_id=project_id,
        region=region,
    )

    # Add the missing attributes that AnthropicModel expects.
    # This is the key trick - expose vertex itself as 'client'
    vertex.client = vertex
    vertex.model_profile = None  # Default profile
    vertex.name = "anthropic_vertex"  # Provider name

    return AnthropicModel(model_name, provider=vertex)
```
@cristiandley Did you try this approach? #1392 (comment) That shouldn't require manually setting any attributes.

This PR is stale, and will be closed in 3 days if no reply is received.

Not necessary anymore now that we have #3292
Hello folks,
I had some time and started drafting a possible implementation for this issue: #960.
There's still lots of work to do, but I'm not fully confident in my approach and wanted to get some feedback (both general and concrete) if possible before investing more time in it :)

My main questions are:

- `test_init` in `tests/models/test_anthropic.py` - how should I handle this?

Thank you very much for your feedback and help, and have a nice day :)