feat: add Grok-4 AI model support with 2M context window and OpenAI-compatible API client #288
This pull request adds support for xAI's Grok-4-fast-reasoning model to the SecureFlow CLI, covering configuration, the client implementation, and usage tracking. The changes ensure Grok can be selected and invoked, and that its token usage is tracked correctly, alongside updates to recommendations and documentation.
Grok model integration:
- Registered Grok-4-fast-reasoning in `model-context-limits.json`, specifying its context window, output limits, and metadata notes (a sketch of the entry follows below).
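The diff for `model-context-limits.json` is not reproduced in this description. A plausible entry, assuming a per-model keyed schema, might look like the following; the field names and the output-token value are illustrative, and only the 2M context window comes from the PR title:

```json
{
  "grok-4-fast-reasoning": {
    "contextWindow": 2000000,
    "maxOutputTokens": 32768,
    "notes": "xAI Grok-4-fast-reasoning; reasoning tokens are reported separately in the usage payload"
  }
}
```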
Codebase support for Grok:
- New `GrokClient` implementation in `grok-client.js`, supporting both standard and streaming requests to the xAI API (see the sketch after this list).
- Registered the Grok client in the AI client factory (`ai-client-factory.js`) so it is selected when Grok is requested.
- Extended the `AIModel` type definition to include Grok-4-fast-reasoning.
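The client source itself is not shown in this description. As a rough sketch, an OpenAI-compatible client for the xAI API could look like the following; the class name and file come from the PR, while the method names, option fields, and the `XAI_API_KEY` environment variable are assumptions for illustration:

```javascript
// Minimal sketch (not the PR's actual implementation) of an OpenAI-compatible
// client for the xAI API, usable from Node 18+ via the global fetch.
class GrokClient {
  constructor({ apiKey = process.env.XAI_API_KEY, baseUrl = 'https://api.x.ai/v1' } = {}) {
    this.apiKey = apiKey;
    this.baseUrl = baseUrl;
  }

  // Standard (non-streaming) chat completion request.
  async complete({ model = 'grok-4-fast-reasoning', messages, maxTokens }) {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model, messages, max_tokens: maxTokens }),
    });
    if (!res.ok) {
      throw new Error(`xAI API error ${res.status}: ${await res.text()}`);
    }
    return res.json(); // { choices: [...], usage: {...} }
  }

  // Streaming request: parses the SSE response and yields content deltas.
  async *stream({ model = 'grok-4-fast-reasoning', messages, maxTokens }) {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model, messages, max_tokens: maxTokens, stream: true }),
    });
    if (!res.ok) {
      throw new Error(`xAI API error ${res.status}: ${await res.text()}`);
    }
    const decoder = new TextDecoder();
    let buffer = '';
    for await (const chunk of res.body) {
      buffer += decoder.decode(chunk, { stream: true });
      let newline;
      while ((newline = buffer.indexOf('\n')) !== -1) {
        const line = buffer.slice(0, newline).trim();
        buffer = buffer.slice(newline + 1);
        if (!line.startsWith('data:')) continue;
        const payload = line.slice(5).trim();
        if (payload === '[DONE]') return;
        const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
        if (delta) yield delta;
      }
    }
  }
}
```

The factory change would presumably map the Grok model identifier to this client; that wiring lives in `ai-client-factory.js` in the PR.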
Usage and token tracking:
- Extended `TokenTracker` to handle Grok's usage format, including reasoning token tracking and fallback logic for Grok-specific fields (a sketch follows below).
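The `TokenTracker` changes are likewise not reproduced here. A minimal sketch of the normalization and fallback idea, assuming an OpenAI-compatible usage payload (the helper name and the exact fallback chain are hypothetical):

```javascript
// Hypothetical helper illustrating how TokenTracker might normalize Grok usage.
// Field names follow the OpenAI-compatible usage object; the fallback chain
// for reasoning tokens is an assumption for illustration.
function normalizeGrokUsage(usage = {}) {
  const promptTokens = usage.prompt_tokens ?? usage.input_tokens ?? 0;
  const completionTokens = usage.completion_tokens ?? usage.output_tokens ?? 0;
  const reasoningTokens =
    usage.completion_tokens_details?.reasoning_tokens ??
    usage.reasoning_tokens ??
    0;
  return {
    promptTokens,
    completionTokens,
    reasoningTokens,
    totalTokens: usage.total_tokens ?? promptTokens + completionTokens,
  };
}
```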