Add OpenRouterModel as OpenAIChatModel subclass #3089
Hi! This pull request takes a shot at implementing a dedicated `OpenRouterModel`, for issue #2936. The differentiator for this PR is that the implementation minimizes code duplication as much as possible by delegating the main logic to `OpenAIChatModel`, so that the new model class serves as a convenience layer for OpenRouter-specific features.

The main thinking behind this solution is that as long as the OpenRouter API is still fully accessible via the `openai` package, it would be inefficient to reimplement the internal logic on top of that same package; we can instead use hooks to achieve the requested features. I would like to get some thoughts on this implementation before starting to update the docs.
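Roughly, the new class has this shape (a minimal sketch: the constructor signature and defaults are illustrative, only the class names come from this PR):

```python
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openrouter import OpenRouterProvider


class OpenRouterModel(OpenAIChatModel):
    """Thin convenience layer; the request/response logic stays in OpenAIChatModel."""

    def __init__(self, model_name: str, *, provider: OpenRouterProvider | None = None):
        # Default to the existing OpenRouter provider (reads OPENROUTER_API_KEY).
        super().__init__(model_name, provider=provider or OpenRouterProvider())
```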
Addressed issues
Provider metadata can now be accessed via the `downstream_provider` key in `ModelMessage.provider_details`:
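For example (illustrative usage; the key and attribute names are the ones described above, the agent setup is made up):

```python
from pydantic_ai import Agent

agent = Agent(OpenRouterModel('anthropic/claude-3.5-sonnet'))
result = agent.run_sync('Hello!')

# The last message is the ModelResponse produced via OpenRouter.
response = result.all_messages()[-1]
print(response.provider_details['downstream_provider'])
```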
The new `OpenRouterModelSettings` allows for the `reasoning` parameter offered by OpenRouter; the thinking can then be accessed as a `ThinkingPart` in the model response:
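Something along these lines (a sketch: the exact settings field name for `reasoning` is assumed here):

```python
from pydantic_ai import Agent
from pydantic_ai.messages import ThinkingPart

# 'openrouter_reasoning' is a placeholder name for OpenRouter's `reasoning` parameter.
settings = OpenRouterModelSettings(openrouter_reasoning={'effort': 'high'})
agent = Agent(OpenRouterModel('openai/o3-mini'), model_settings=settings)
result = agent.run_sync('How many primes are there below 100?')

for part in result.all_messages()[-1].parts:
    if isinstance(part, ThinkingPart):
        print(part.content)
```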
Error response from OpenRouter as exception instead of validation failure #2323, OpenRouter uses non-compatible finish reason #2844: these depend on some downstream logic from OpenRouter or their own downstream providers (namely that a response of type 'error' carries a >= 400 status code), but for most cases I would say it works as one would expect:
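Roughly (assuming the error surfaces as the library's existing `ModelHTTPError`; the model name is made up):

```python
from pydantic_ai import Agent
from pydantic_ai.exceptions import ModelHTTPError

agent = Agent(OpenRouterModel('some/nonexistent-model'))
try:
    agent.run_sync('Hello!')
except ModelHTTPError as exc:
    # The OpenRouter 'error' payload surfaces as an HTTP-style error
    # rather than a response validation failure.
    print(exc.status_code, exc.body)
```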
`OpenRouterModel` #1870 (comment): adds some additional type support to set the provider routing options from OpenRouter:
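For instance (a sketch: the settings field name is a placeholder, the routing fields mirror OpenRouter's provider routing options):

```python
from pydantic_ai import Agent

# 'openrouter_provider' is a placeholder for the typed provider-routing field.
settings = OpenRouterModelSettings(
    openrouter_provider={
        'order': ['openai', 'azure'],
        'allow_fallbacks': False,
    }
)
agent = Agent(OpenRouterModel('openai/gpt-4o'), model_settings=settings)
```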