
Fix LiteLLM cost tracking for provider-prefixed models#2257

Merged
enyst merged 1 commit into OpenHands:main from ShreySatapara:fix-issue-1681-cost-tracking on Mar 2, 2026
Conversation

@ShreySatapara
Contributor

Summary

Fixes missing cost tracking in Conversation.run() when using provider-prefixed model names (e.g. vertex_ai/claude-sonnet-4-5@..., azure/gpt-5.2-chat).
Previously, telemetry’s LiteLLM cost fallback stripped the provider and passed only the bare model name to litellm_completion_cost(), causing LiteLLM to warn “LLM Provider NOT provided” and return None for cost. This PR preserves the provider by passing custom_llm_provider=<provider> and model=<bare_model> to LiteLLM’s cost calculator.

Also adds regression tests to ensure provider and model are propagated correctly for cost calculation.
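The fix described above boils down to splitting the provider prefix off the model name and passing both pieces to the cost calculator instead of discarding the prefix. A minimal sketch of that split (the helper name `split_provider_model` is hypothetical, not the actual function in the SDK; the commented call shows the intended shape of the `litellm_completion_cost()` invocation):

```python
def split_provider_model(model: str):
    """Split a provider-prefixed model name like 'vertex_ai/claude-sonnet-4-5'
    into (provider, bare_model). Returns (None, model) when there is no prefix."""
    if "/" in model:
        provider, bare_model = model.split("/", 1)
        return provider, bare_model
    return None, model

# Hypothetical use at the telemetry fallback call site: keep the provider
# instead of dropping it, so LiteLLM can resolve pricing:
#
#   provider, bare_model = split_provider_model(response_model)
#   cost = litellm_completion_cost(
#       completion_response=response,
#       model=bare_model,
#       custom_llm_provider=provider,
#   )

print(split_provider_model("vertex_ai/claude-sonnet-4-5"))
print(split_provider_model("gpt-4o"))
```

Passing `custom_llm_provider` explicitly avoids the "LLM Provider NOT provided" warning, since the cost calculator no longer has to infer the provider from a bare model name.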


Checklist

  • [✔] If the PR is changing/adding functionality, are there tests to reflect this?
    • Added unit tests in tests/sdk/test_telemetry.py that assert litellm_completion_cost() is called with custom_llm_provider and the correct bare model (covers Vertex AI + Azure).
  • [✔] If there is an example, have you run the example to make sure that it works?
    • Ran an experiment before and after the change: cost is now reflected in the telemetry logs for provider-prefixed model names, where it previously showed $0.00.
  • [✔] If there are instructions on how to run the code, have you followed the instructions and made sure that it works?
    • Ran local checks per DEVELOPMENT.md: make format, make lint, uv run pre-commit run --all-files, and uv run pytest.
  • [ ] If the feature is significant enough to require documentation, is there a PR open on the OpenHands/docs repository with the same branch name?
    • N/A (bugfix; no user-facing docs update required).
  • [ ] Is the GitHub CI passing?
    • I ran the pre-commit checks and all the offline checks; I'm not sure which additional workflows need to run.

Collaborator

@enyst enyst left a comment


Thank you! LiteLLM can be unexpected sometimes; LLMs do have "provider/model" names, so it does the split itself in the regular completion method... but maybe not in completion_cost? I set off the OpenHands agent to take a look too 😅

Collaborator

@enyst enyst left a comment


Thank you for the fix!

@enyst enyst enabled auto-merge (squash) March 2, 2026 10:17
@enyst enyst merged commit 2d3d96d into OpenHands:main Mar 2, 2026
28 of 30 checks passed
