
Feedback on Tool Usage Issue with Ollama Provider #253

@Shedulab

Description


Hi,

I encountered an issue where automatic tool usage (`tools=` together with `max_turns=`) didn't work for locally downloaded OSS models when using the Ollama provider. I had GPT-5 fix the code, and it works now, though I haven't yet had time to inspect exactly what it changed. At least it works. 🙂

I’m sharing the code below for reference, in case it might be useful for an official fix.
ollama_provider.py
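For context, "automatic tool usage" here means the provider-side loop: call the model, execute any tool calls it returns, feed the results back as `tool` messages, and stop once the model answers without tools or `max_turns` is reached. Below is a minimal, self-contained sketch of that loop; it is not the attached `ollama_provider.py`, and `fake_chat`, `get_weather`, and `run` are hypothetical stand-ins (the stub replaces a real Ollama `/api/chat` call so the example runs without a server).

```python
def get_weather(city: str) -> str:
    """Toy tool the model can request."""
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_chat(messages):
    """Stub model: requests one tool call, then answers using the result."""
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "assistant", "content": "",
                "tool_calls": [{"function": {"name": "get_weather",
                                             "arguments": {"city": "Izmir"}}}]}
    return {"role": "assistant", "content": "It is sunny in Izmir."}

def run(prompt: str, max_turns: int = 5) -> str:
    """The tool-use loop a provider is expected to drive automatically."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_turns):
        reply = fake_chat(messages)
        messages.append(reply)
        calls = reply.get("tool_calls")
        if not calls:                      # no tool call -> final answer
            return reply["content"]
        for call in calls:                 # execute each requested tool
            fn = TOOLS[call["function"]["name"]]
            result = fn(**call["function"]["arguments"])
            messages.append({"role": "tool", "content": result})
    raise RuntimeError("max_turns exceeded")

print(run("What's the weather in Izmir?"))  # -> It is sunny in Izmir.
```

The bug described above would correspond to this loop never dispatching the model's `tool_calls` (or never feeding the `tool` results back) when the backend is a local Ollama model.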

By the way, thank you very much for the incredible work you’ve done creating this amazing library.

Best regards,
Huseyin
