
Conversation

@areibman (Contributor)

Summary

  • fetch latest pricing data directly from OpenRouter
  • update docs and refresh script to reflect new source
  • note that OPENROUTER_API_KEY is optional
  • bump version to 0.1.24
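The refresh script presumably pulls OpenRouter's public model list and reads per-token prices out of each entry. A minimal sketch of that parsing step is below; the payload shape is an assumption based on OpenRouter's public `/api/v1/models` response (which reports prices as decimal strings), and the function name is illustrative, not the actual tokencost code. A small inline sample is used so the sketch runs without a network call.

```python
# Hypothetical sketch: convert an OpenRouter-style /api/v1/models payload
# into a {model_id: {"prompt": float, "completion": float}} pricing table.
# The payload shape is an assumption about OpenRouter's public API, which
# reports per-token prices as decimal strings.

def parse_openrouter_pricing(payload: dict) -> dict:
    """Extract per-token prompt/completion prices from a models payload."""
    table = {}
    for model in payload.get("data", []):
        pricing = model.get("pricing", {})
        table[model["id"]] = {
            "prompt": float(pricing.get("prompt", 0)),
            "completion": float(pricing.get("completion", 0)),
        }
    return table

# Inline sample payload so the sketch runs offline.
sample = {
    "data": [
        {
            "id": "openai/gpt-4o",
            "pricing": {"prompt": "0.0000025", "completion": "0.00001"},
        },
    ]
}
prices = parse_openrouter_pricing(sample)
```

In a real refresh run, `payload` would come from a GET request to the models endpoint (with `OPENROUTER_API_KEY` sent as a bearer token when present, which is why the key can stay optional for public pricing data).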

Testing

  • pytest -q (fails: ProxyError downloading tokenization data)

https://chatgpt.com/codex/tasks/task_e_684752af35548321a64f221435080e64

@dot-agi (Member) left a comment

Fetching model data from OpenRouter cuts the model count to roughly one third of what we get from the LiteLLM repo's dictionary.

OpenRouter covers the latest models, while the LiteLLM dictionary also covers other inference providers (SambaNova, Azure, Bedrock, etc.).

I would prefer using LiteLLM instead of switching to OpenRouter.
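Since the two sources largely complement each other (LiteLLM for provider breadth, OpenRouter for the newest models), one compromise would be to keep the LiteLLM dictionary as the base and overlay OpenRouter entries on top. This is only a sketch of the trade-off being discussed; the function and the sample entries are hypothetical, not the tokencost implementation.

```python
# Hypothetical sketch of the coverage trade-off: keep the LiteLLM
# dictionary as the base (broad provider coverage) and overlay
# OpenRouter-derived entries (latest models, fresher prices).

def merge_pricing(litellm_prices: dict, openrouter_prices: dict) -> dict:
    """Overlay OpenRouter entries on a LiteLLM-style price dictionary."""
    merged = dict(litellm_prices)      # base: Azure, Bedrock, SambaNova, ...
    merged.update(openrouter_prices)   # overlay: newer models win on overlap
    return merged

# Illustrative entries only; real dictionaries have many more fields.
litellm_prices = {
    "azure/gpt-4": {"input_cost_per_token": 3e-05},
    "gpt-4o": {"input_cost_per_token": 5e-06},
}
openrouter_prices = {
    "gpt-4o": {"input_cost_per_token": 2.5e-06},
}

merged = merge_pricing(litellm_prices, openrouter_prices)
```

Under this scheme the Azure/Bedrock-style entries survive untouched while overlapping models take the OpenRouter price, which is the coverage loss the comment above is objecting to when OpenRouter replaces LiteLLM outright.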

@areibman areibman closed this Jun 12, 2025
@dot-agi dot-agi deleted the codex/fix-tokencost-routing-to-open-router branch July 30, 2025 01:11
