add glm-5 to model_list.json#427
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 8be72c3072
```json
      "fireworks"
    ]
  },
  "fireworks_ai/accounts/fireworks/models/glm-5": {
```
Register glm-5 in the endpoint type map
This new model entry is only added to model_list.json, but getModelEndpointTypes in packages/proxy/schema/index.ts resolves provider routing from AvailableEndpointTypes first and only falls back to DefaultEndpointTypes by format. Because fireworks_ai/accounts/fireworks/models/glm-5 is missing from that map (I checked the AvailableEndpointTypes block in index.ts), it resolves to the OpenAI/Azure defaults instead of Fireworks. Whenever the lookup falls back (for example in edge/index.ts when credential lookup fails), that can mis-route requests and select the wrong secret type.
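A minimal sketch of the resolution order the review describes, and of the fix it asks for. The map contents and type names here are assumptions for illustration; the actual shapes live in packages/proxy/schema/index.ts and may differ:

```typescript
type EndpointType = "openai" | "azure" | "fireworks";

// Assumed shape mirroring AvailableEndpointTypes: an explicit per-model
// registration consulted first. The fix is adding the glm-5 key here.
const AvailableEndpointTypes: Record<string, EndpointType[]> = {
  "fireworks_ai/accounts/fireworks/models/glm-5": ["fireworks"],
};

// Assumed fallback when a model has no explicit registration.
const DefaultEndpointTypes: EndpointType[] = ["openai", "azure"];

function getModelEndpointTypes(model: string): EndpointType[] {
  // Explicit registration wins; otherwise fall back to the defaults,
  // which is what currently mis-routes the unregistered glm-5 entry.
  return AvailableEndpointTypes[model] ?? DefaultEndpointTypes;
}
```

With the registration in place, glm-5 resolves to Fireworks; any model still missing from the map falls through to the OpenAI/Azure defaults, which is the mis-routing the review flags.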
This PR adds the glm-5 model to the proxy model list for Fireworks.
Pulled cost and max_input_tokens from:
- https://fireworks.ai/models/fireworks/glm-5
- https://docs.z.ai/guides/llm/glm-5

Pulled max_output_tokens from:
- https://docs.z.ai/guides/llm/glm-5