Fix working behind http proxy #2916

@maxnilz

Description

When `infer_provider` creates the HTTP client, it passes a transport with no proxy configured explicitly, so HTTP requests to the LLM provider (e.g., the completion request) are unreachable from behind a proxy.

Python, Pydantic AI & LLM client version

Python: 3.12
Pydantic-ai: the latest main branch
LLM client version: any

Metadata

    Labels

    bug (Something isn't working)
