### Is Roo Code free to use?
The Roo Code extension itself is free and open-source. However, Roo Code relies on external API providers (like [Anthropic](providers/anthropic), [OpenAI](providers/openai), [OpenRouter](providers/openrouter), etc.) for its AI capabilities. These providers typically charge for API usage based on the number of tokens processed. You will need to create an account and obtain an API key from your chosen provider. See [Setting Up Your First AI Provider](getting-started/connecting-api-provider) for details.

### What are the risks of using Roo Code?

Roo Code supports a wide range of API providers, including:

* [Anthropic (Claude)](/providers/anthropic)
* [OpenAI (GPT models)](/providers/openai)
* [OpenRouter (access to multiple models)](/providers/openrouter)
* [Google Gemini](/providers/gemini)
* [Glama](/providers/glama)
* [AWS Bedrock](/providers/bedrock)
* [GCP Vertex AI](/providers/vertex)
* [Ollama (local models)](/providers/ollama)
* [LM Studio (local models)](/providers/lmstudio)
* [DeepSeek](/providers/deepseek)
* [Mistral](/providers/mistral)
* [Unbound](/providers/unbound)
* [Requesty](/providers/requesty)
* [VS Code Language Model API](/providers/vscode-lm)

### How do I get an API key?

Each API provider has its own process for obtaining an API key. See the [Setting Up Your First AI Provider](/getting-started/connecting-api-provider) for links to the relevant documentation for each provider.

### Can I use Roo Code with local models?

Yes, Roo Code supports running models locally using [Ollama](/providers/ollama) and [LM Studio](/providers/lmstudio). See [Using Local Models](/advanced-usage/local-models) for instructions.

## Usage

### How do I start a new task?

Open the Roo Code panel (<Codicon name="rocket" />) and type your task in the chat box. Be clear and specific about what you want Roo Code to do. See [Typing Your Requests](/basic-usage/typing-your-requests) for best practices.

### What are modes in Roo Code?

[Modes](/basic-usage/modes) are different personas that Roo Code can adopt, each with a specific focus and set of capabilities. The built-in modes are:

* **Code:** For general-purpose coding tasks.
* **Architect:** For planning and technical leadership.
* **Ask:** For answering questions and providing information.

You can also create [Custom Modes](/advanced-usage/custom-modes).

### How do I switch between modes?

Use the dropdown menu in the chat input area to select a different mode, or use the `/` command to switch to a specific mode.

### What are tools and how do I use them?

[Tools](/basic-usage/using-tools) are how Roo Code interacts with your system. Roo Code automatically selects and uses the appropriate tools to complete your tasks. You don't need to call tools directly. You will be prompted to approve or reject each tool use.

### What are context mentions?

[Context mentions](/basic-usage/context-mentions) are a way to provide Roo Code with specific information about your project, such as files, folders, or problems. Use the "@" symbol followed by the item you want to mention (e.g., `@/src/file.ts`, `@problems`).

### Can Roo Code access the internet?

Yes, you can customize Roo Code in several ways:

* **Settings:** Adjust various settings, such as auto-approval, diff editing, and more.
### Does Roo Code have any auto approval settings?

Yes, Roo Code has a few settings that, when enabled, will automatically approve actions. Find out more [here](/advanced-usage/auto-approving-actions).

## Advanced Features

### Can I use Roo offline?

Yes, if you use a [local model](/advanced-usage/local-models).

### What is MCP (Model Context Protocol)?

[MCP](/advanced-usage/mcp) is a protocol that allows Roo Code to communicate with external servers, extending its capabilities with custom tools and resources.
# Anthropic

Anthropic is an AI safety and research company that builds reliable, interpretable, and steerable AI systems. Their Claude models are known for their strong reasoning abilities, helpfulness, and honesty.

## Getting an API Key

1. **Sign Up/Sign In:** Go to the [Anthropic Console](https://console.anthropic.com/). Create an account or sign in.
2. **Navigate to API Keys:** Find the API keys section in your account settings. The exact location might vary, but it's typically in a "Developer" or "API" section.
3. **Create a Key:** Click "Create Key" (or similar). Give your key a descriptive name (e.g., "Roo Code").
4. **Copy the Key:** **Important:** Copy the API key *immediately*. You will not be able to see it again. Store it securely.
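
Since the key is shown only once, one way to store it securely is a permission-restricted file that you load into an environment variable when needed, rather than pasting it into scripts or dotfiles. A minimal sketch (the path and key value are placeholders, not a Roo Code requirement):

```shell
# Hypothetical workflow: keep the key in a file only you can read,
# then load it on demand. A temp file stands in for a real secure location.
keyfile="$(mktemp)"
printf 'sk-ant-example' > "$keyfile"
chmod 600 "$keyfile"                           # readable/writable only by you
export ANTHROPIC_API_KEY="$(cat "$keyfile")"
echo "key loaded: ${#ANTHROPIC_API_KEY} chars"
```

You can then paste the key from this variable into the Roo Code settings field without it ever landing in your shell history as plain text.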

## Supported Models

Roo Code supports the following Anthropic Claude models:

* `claude-3-5-sonnet-20241022` (Recommended)
* `claude-3-5-haiku-20241022`
* `claude-3-opus-20240229`
* `claude-3-haiku-20240307`

See [Anthropic's Model Documentation](https://docs.anthropic.com/claude/docs/models-overview) for more details on each model's capabilities.

## Configuration in Roo Code

1. **Open Roo Code Settings:** Click the gear icon (<Codicon name="gear" />) in the Roo Code panel.
2. **Select Provider:** Choose "Anthropic" from the "API Provider" dropdown.
3. **Enter API Key:** Paste your Anthropic API key into the "API Key" field.
4. **Select Model:** Choose your desired Claude model from the "Model ID" dropdown.
5. **(Optional) Custom Base URL:** If you need to use a custom base URL for the Anthropic API, check "Use custom base URL" and enter the URL. Leave this blank for most users.

## Tips and Notes

* **Prompt Caching:** Claude 3 models support [prompt caching](https://docs.anthropic.com/claude/docs/prompt-caching), which can significantly reduce costs and latency for repeated prompts.
* **Context Window:** Claude models have large context windows (200,000 tokens), allowing you to include a significant amount of code and context in your prompts.
* **Pricing:** Refer to the [Anthropic Pricing](https://www.anthropic.com/pricing) page for the latest pricing information.
* **Rate Limits:** Anthropic has strict rate limits based on [usage tiers](https://docs.anthropic.com/en/api/rate-limits#requirements-to-advance-tier). If you're repeatedly hitting rate limits, consider contacting Anthropic sales or accessing Claude through a different provider like [OpenRouter](/providers/openrouter).
# Amazon Bedrock

Roo Code supports accessing models through Amazon Bedrock, a fully managed service that makes a selection of high-performing foundation models (FMs) from leading AI companies available via a single API.

## Prerequisites

* **Bedrock Access:** You must request and be granted access to Amazon Bedrock. See the [AWS Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html) for details on requesting access.
* **Model Access:** Within Bedrock, you need to request access to the specific models you want to use (e.g., Anthropic Claude).
* **Install AWS CLI:** Use the AWS CLI to configure your account for authentication:

```bash
aws configure
```
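
`aws configure` prompts for your access key ID, secret access key, default region, and output format, and writes them to files under `~/.aws/`. A sketch of the resulting files (all values are placeholders):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAEXAMPLEKEY
aws_secret_access_key = exampleSecretKey123

# ~/.aws/config
[default]
region = us-east-1
output = json
```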

## Getting Credentials

You have two main options for configuring AWS credentials:

1. **AWS Access Keys (Recommended for Development):**
    * Create an IAM user with the necessary permissions (at least `bedrock:InvokeModel`).
    * Generate an access key ID and secret access key for that user.
    * *(Optional)* Create a session token if required by your IAM configuration.
2. **AWS Profile:**
    * Configure an AWS profile using the AWS CLI or by manually editing your AWS credentials file. See the [AWS CLI documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html) for details.
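
For the IAM user in option 1, a minimal identity policy might look like the following sketch. `bedrock:InvokeModelWithResponseStream` is the AWS action for streaming responses and is typically needed alongside `bedrock:InvokeModel`; in production, scope `Resource` to specific model ARNs rather than `*`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```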

## Supported Models

Roo Code supports the following models through Bedrock:

* **Anthropic:**
    * `anthropic.claude-3-5-sonnet-20241022-v2:0`
    * `anthropic.claude-3-5-haiku-20241022-v1:0`
    * `anthropic.claude-3-opus-20240229-v1:0`
    * `anthropic.claude-3-sonnet-20240229-v1:0`
    * `anthropic.claude-3-haiku-20240307-v1:0`
* **Meta:**
    * `meta.llama3-3-70b-instruct-v1:0`
    * `meta.llama3-2-90b-instruct-v1:0`
    * `meta.llama3-2-11b-instruct-v1:0`
    * `meta.llama3-2-3b-instruct-v1:0`
    * `meta.llama3-2-1b-instruct-v1:0`
    * `meta.llama3-1-405b-instruct-v1:0`
    * `meta.llama3-1-70b-instruct-v1:0`
    * `meta.llama3-1-8b-instruct-v1:0`
* **Amazon:**
    * `amazon.nova-pro-v1:0`
    * `amazon.nova-lite-v1:0`
    * `amazon.nova-micro-v1:0`

Refer to the [Amazon Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids-arns.html) for the most up-to-date list of available models and their IDs. Make sure to use the *model ID* when configuring Roo Code, not the model name.

## Configuration in Roo Code

1. **Open Roo Code Settings:** Click the gear icon (<Codicon name="gear" />) in the Roo Code panel.
2. **Select Provider:** Choose "Bedrock" from the "API Provider" dropdown.
3. **Select Authentication Method:**
    * **AWS Credentials:**
        * Enter your "AWS Access Key" and "AWS Secret Key."
        * (Optional) Enter your "AWS Session Token" if you're using temporary credentials.
        * Leave "AWS Profile" *blank*.
        * Set "Use AWS Profile" to *unchecked*.
    * **AWS Profile:**
        * Enter your "AWS Profile" name (e.g., "default").
        * Set "Use AWS Profile" to *checked*.
        * Leave the Access Key, Secret Key, and Session Token fields *blank*.
4. **Select Region:** Choose the AWS region where your Bedrock service is available (e.g., "us-east-1").
5. **(Optional) Cross-Region Inference:** Check "Use cross-region inference" if you want to access models in a region different from your configured AWS region.
6. **Select Model:** Choose your desired model from the "Model" dropdown.
+
## Tips and Notes
73
+
74
+
* **Permissions:** Ensure your IAM user or role has the necessary permissions to invoke Bedrock models. The `bedrock:InvokeModel` permission is required.
75
+
* **Pricing:** Refer to the [Amazon Bedrock pricing](https://aws.amazon.com/bedrock/pricing/) page for details on model costs.
76
+
* **Cross-Region Inference:** Using cross-region inference may result in higher latency.
77
+
* **Prompt Caching**: You can enable caching of prompts if you want to use AWS's implementation.
# Google Gemini

## Getting an API Key

1. **Go to Google AI Studio:** Navigate to [https://ai.google.dev/](https://ai.google.dev/).
2. **Sign In:** Sign in with your Google account.
3. **Create API Key:** Click on "Create API key" in the left-hand menu.
4. **Copy API Key:** Copy the generated API key.

## Supported Models

Roo Code supports the following Gemini models:

* `gemini-2.0-flash-001`
* `gemini-2.0-flash-lite-preview-02-05`
* `gemini-2.0-pro-exp-02-05`
* `gemini-2.0-flash-thinking-exp-1219`
* `gemini-2.0-flash-exp`
* `gemini-1.5-flash-002`
* `gemini-1.5-flash-exp-0827`
* `gemini-1.5-flash-8b-exp-0827`
* `gemini-1.5-pro-002`
* `gemini-1.5-pro-exp-0827`
* `gemini-exp-1206`

Refer to the [Gemini documentation](https://ai.google.dev/models/gemini) for more details on each model.

## Configuration in Roo Code

1. **Open Roo Code Settings:** Click the gear icon (<Codicon name="gear" />) in the Roo Code panel.
2. **Select Provider:** Choose "Google Gemini" from the "API Provider" dropdown.
3. **Enter API Key:** Paste your Gemini API key into the "API Key" field.
4. **Select Model:** Choose your desired Gemini model from the "Model ID" dropdown.

## Tips and Notes

* **Pricing:** Gemini API usage is priced based on input and output tokens. Refer to the [Gemini pricing page](https://ai.google.dev/pricing) for detailed information.
* **Free Tier:** As of the last update, Gemini models can be used free of charge by default up to certain request limits, after which pricing is based on prompt size.
# Glama

Glama provides access to a variety of language models through a unified API, including models from Anthropic, OpenAI, and others. It offers features like prompt caching and cost tracking.

## Getting an API Key

1. **Sign Up/Sign In:** Go to the [Glama sign-up page](https://glama.ai/sign-up). Sign up using your Google account or name/email/password.
2. **Get API Key:** After signing up, navigate to the [API Keys](https://glama.ai/settings/gateway/api-keys) page to get an API key.
3. **Copy the Key:** Copy the displayed API key.

## Supported Models

Roo Code will automatically try to fetch a list of available models from the Glama API. Some models that are commonly available through Glama include:

* **Anthropic Claude models:** (e.g., `anthropic/claude-3-5-sonnet`) These are generally recommended for best performance with Roo Code.
* **OpenAI models:** (e.g., `openai/o3-mini-high`)
* **Other providers and open-source models**

Refer to the [Glama documentation](https://glama.ai/models) for the most up-to-date list of supported models and their IDs. Use the complete model ID in the form `{provider}/{model-name}`.
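
The two-part ID format can be picked apart mechanically, which is handy when scripting against the model list. A trivial shell sketch (the model name is illustrative):

```shell
model="anthropic/claude-3-5-sonnet"   # full Glama model ID: {provider}/{model-name}
provider="${model%%/*}"               # text before the first "/"
name="${model#*/}"                    # text after the first "/"
echo "provider=$provider"
echo "model-name=$name"
```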

## Configuration in Roo Code

1. **Open Roo Code Settings:** Click the gear icon (⚙️) in the Roo Code panel.
2. **Select Provider:** Choose "Glama" from the "API Provider" dropdown.
3. **Enter API Key:** Paste your Glama API key into the "API Key" field.
4. **Select Model:** Choose your desired model from the "Model" dropdown.

## Tips and Notes

* **Pricing:** Glama operates on a pay-per-use basis. Pricing varies depending on the model you choose.
* **Prompt Caching:** Glama supports prompt caching, which can significantly reduce costs and improve performance for repeated prompts.
* **Model-Specific Information:** Some models, like Claude 3 models, may support additional features such as prompt caching. This information will appear below the model selector when using the Glama provider.