---
sidebar_label: xAI (Grok)
---

# Using xAI (Grok) With Roo Code

xAI is the company behind Grok, a large language model known for its conversational abilities and large context window. Grok models are designed to provide helpful, informative, and contextually relevant responses.

**Website:** [https://x.ai/](https://x.ai/)

## Getting an API Key

1. **Sign Up/Sign In:** Go to the [xAI Console](https://console.x.ai/). Create an account or sign in.
2. **Navigate to API Keys:** Go to the API keys section in your dashboard.
3. **Create a Key:** Click to create a new API key. Give your key a descriptive name (e.g., "Roo Code").
4. **Copy the Key:** **Important:** Copy the API key *immediately*. You will not be able to see it again. Store it securely.
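
If you want to confirm the key works before configuring Roo Code, you can send a quick test request. The snippet below is a minimal sketch, assuming the official `openai` Python package, an `XAI_API_KEY` environment variable, and xAI's OpenAI-compatible endpoint at `https://api.x.ai/v1`:

```python
import os

from openai import OpenAI  # pip install openai

# Point the OpenAI-compatible client at xAI's endpoint.
# XAI_API_KEY is assumed to hold the key you just created.
client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

# A one-line prompt is enough to confirm the key and endpoint work.
response = client.chat.completions.create(
    model="grok-3-beta",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```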

## Supported Models

Roo Code supports the following xAI Grok models:

### Grok-3 Models
* `grok-3-beta` (Default) - xAI's Grok-3 beta model with 131K context window
* `grok-3-fast-beta` - xAI's Grok-3 fast beta model with 131K context window
* `grok-3-mini-beta` - xAI's Grok-3 mini beta model with 131K context window
* `grok-3-mini-fast-beta` - xAI's Grok-3 mini fast beta model with 131K context window

### Grok-2 Models
* `grok-2-latest` - xAI's Grok-2 model - latest version with 131K context window
* `grok-2` - xAI's Grok-2 model with 131K context window
* `grok-2-1212` - xAI's Grok-2 model (version 1212) with 131K context window

### Grok Vision Models
* `grok-2-vision-latest` - xAI's Grok-2 Vision model - latest version with image support and 32K context window
* `grok-2-vision` - xAI's Grok-2 Vision model with image support and 32K context window
* `grok-2-vision-1212` - xAI's Grok-2 Vision model (version 1212) with image support and 32K context window
* `grok-vision-beta` - xAI's Grok Vision Beta model with image support and 8K context window

### Legacy Models
* `grok-beta` - xAI's Grok Beta model (legacy) with 131K context window
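
Model availability can change over time, so it can be worth confirming which IDs your key can actually access. A minimal sketch, assuming the `openai` Python package, an `XAI_API_KEY` environment variable, and that xAI's OpenAI-compatible models endpoint is available for your account:

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

# Print the model IDs available to this key and compare them with the lists above.
for model in client.models.list():
    print(model.id)
```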

## Configuration in Roo Code

1. **Open Roo Code Settings:** Click the gear icon (<Codicon name="gear" />) in the Roo Code panel.
2. **Select Provider:** Choose "xAI" from the "API Provider" dropdown.
3. **Enter API Key:** Paste your xAI API key into the "xAI API Key" field.
4. **Select Model:** Choose your desired Grok model from the "Model" dropdown.

## Reasoning Capabilities

Grok 3 Mini models feature specialized reasoning capabilities, allowing them to "think before responding," which is particularly useful for complex problem-solving tasks.

### Reasoning-Enabled Models

Reasoning is only supported by:
* `grok-3-mini-beta`
* `grok-3-mini-fast-beta`

The standard `grok-3-beta` and `grok-3-fast-beta` models do not support reasoning.

### Controlling Reasoning Effort

When using reasoning-enabled models, you can control how hard the model thinks with the `reasoning_effort` parameter:

* `low`: Minimal thinking time, using fewer tokens for quick responses
* `high`: Maximum thinking time, leveraging more tokens for complex problems

Choose `low` for simple queries that should complete quickly, and `high` for harder problems where response latency is less important.

### Key Features

* **Step-by-Step Problem Solving**: The model thinks through problems methodically before delivering an answer
* **Math & Quantitative Strength**: Excels at numerical challenges and logic puzzles
* **Reasoning Trace Access**: The model's thinking process is available via the `reasoning_content` field in the response completion object
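
Roo Code manages these settings for you, but if you want to experiment with reasoning models directly, the sketch below shows one way to pass `reasoning_effort` and read back the reasoning trace over xAI's OpenAI-compatible API. It assumes the `openai` Python package and an `XAI_API_KEY` environment variable; treat it as a starting point rather than a definitive integration.

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

# reasoning_effort is sent via extra_body so the call also works on
# OpenAI SDK versions that don't expose it as a named argument.
response = client.chat.completions.create(
    model="grok-3-mini-beta",
    messages=[{"role": "user", "content": "What is 101 * 3?"}],
    extra_body={"reasoning_effort": "high"},
)

message = response.choices[0].message
print("Answer:", message.content)

# The thinking trace is returned in reasoning_content; getattr guards
# against SDK versions that don't surface extra fields as attributes.
print("Reasoning:", getattr(message, "reasoning_content", None))
```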

## Tips and Notes

* **Context Window:** Most Grok models feature large context windows (up to 131K tokens), allowing you to include substantial amounts of code and context in your prompts.
* **Vision Capabilities:** Select vision-enabled models (`grok-2-vision-latest`, `grok-2-vision`, etc.) when you need to process or analyze images; a request sketch follows this list.
* **Pricing:** Pricing varies by model, with input costs ranging from $0.30 to $5.00 per million tokens and output costs from $0.50 to $25.00 per million tokens. Refer to the xAI documentation for the most current pricing information.
* **Performance Tradeoffs:** "Fast" variants typically offer quicker response times but may have higher costs, while "mini" variants are more economical but may have reduced capabilities.
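
As a rough illustration of the vision note above, here is a sketch of an image request against a vision-enabled model, assuming the `openai` Python package, an `XAI_API_KEY` environment variable, and the OpenAI-style `image_url` content format; the image URL is a placeholder:

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

# Mixed text + image content; replace the placeholder URL with a real,
# publicly reachable image you want analyzed.
response = client.chat.completions.create(
    model="grok-2-vision-latest",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                {"type": "image_url", "image_url": {"url": "https://example.com/screenshot.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```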