Closed
Labels
bug: Something isn't working
Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
When creating the HTTP client in infer_provider, an explicit transport with no proxy configuration is specified. As a result, HTTP requests to the LLM provider (e.g., completion requests) are not routed through the proxy and fail to reach the provider when running behind one.
Python, Pydantic AI & LLM client version
Python: 3.12
Pydantic-ai: latest main branch
LLM client version: any