Support OpenAI codex-mini
#3677
Conversation
GO GO GO!

Any news?

This looks good to me. As far as I can tell, it only adds support via OpenRouter, since the model isn't available on the usual v1 endpoint for native OpenAI.

I don't think this one works yet.

I don't think the Codex model is going to work for Roo Code; it's been developed for use with the Codex tool, not for general programming tasks.

ok -- does work with opencode and others
Description

You can't pass temperature, and it looks like we need to update our OpenAI client library to use it.

Important

Supports the OpenAI codex-mini model by setting temperature to undefined and adding a DEEP_SEEK_DEFAULT_TOP_P constant.

- In openrouter.ts, OpenRouterHandler sets temperature to undefined for models starting with openai/codex.
- Adds DEEP_SEEK_DEFAULT_TOP_P in constants.ts and uses it in openrouter.ts for deepseek/deepseek-r1 models.
- Updates the ProviderName type in interface.ts, roo-code.d.ts, and types.ts.
- Updates providerNames in interface.ts and roo-code.d.ts.
- Adds codex-mini-latest to openAiNativeModels in api.ts.

This description was created by for 14cc21d.
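The description above can be sketched as a small parameter-selection helper. This is an illustrative sketch only, not the PR's actual code: the function name getModelParams, the ModelParams shape, and the 0.95 value for DEEP_SEEK_DEFAULT_TOP_P are all assumptions made for the example.

```typescript
// Assumed value for illustration; the real constant lives in constants.ts.
const DEEP_SEEK_DEFAULT_TOP_P = 0.95;

interface ModelParams {
  temperature: number | undefined;
  topP?: number;
}

// Hypothetical helper mirroring the logic the PR describes for OpenRouterHandler.
function getModelParams(modelId: string, defaultTemperature = 0): ModelParams {
  // codex models reject the temperature parameter, so it is left undefined
  // and omitted from the request entirely.
  if (modelId.startsWith("openai/codex")) {
    return { temperature: undefined };
  }
  // deepseek-r1 models get the dedicated default top_p.
  if (modelId.startsWith("deepseek/deepseek-r1")) {
    return { temperature: defaultTemperature, topP: DEEP_SEEK_DEFAULT_TOP_P };
  }
  return { temperature: defaultTemperature };
}
```

The design choice here is to treat "no temperature" as undefined rather than 0, so a JSON serializer drops the field instead of sending a value the codex endpoint would reject.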