Commit dc639a3

Update faq.mdx
1 parent b2f2f2e commit dc639a3

File tree

1 file changed: +13 −0 lines changed


docs/cody/faq.mdx

Lines changed: 13 additions & 0 deletions
@@ -137,6 +137,19 @@ cody chat --model '$name_of_the_model' -m 'Hi Cody!'
For example, to use Claude 3.5 Sonnet, you'd pass the following command in your terminal: `cody chat --model 'claude-3.5-sonnet' -m 'Hi Cody!'`
### Is it possible to set separate token usage just for programmatic access (via API) and not via the IDE?

Yes, it's possible to set separate token usage for the API, but only for completions (not chat). To achieve this, use the custom [model configuration](https://sourcegraph.com/docs/cody/enterprise/model-configuration#model-overrides) and configure the following settings:

- `maxInputTokens`: Specifies the maximum number of tokens for the contextual data in the prompt (e.g., question, relevant snippets)
- `maxOutputTokens`: Specifies the maximum number of tokens allowed in the response

Also, remove the following setting in the site configuration under the model configuration:

`"capabilities": ["autocomplete", "chat"]`
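As a rough sketch, such a model override in the site configuration might look like the following. This is an illustration only: the `modelRef`, `modelName`, and token values here are assumptions, and the exact schema should be checked against the model configuration docs linked above.

```json
{
  "modelOverrides": [
    {
      "modelRef": "anthropic::2024-10-22::claude-3-5-sonnet-completions",
      "displayName": "Claude 3.5 Sonnet (API completions)",
      "modelName": "claude-3-5-sonnet-latest",
      "contextWindow": {
        "maxInputTokens": 7000,
        "maxOutputTokens": 256
      },
      "category": "speed",
      "status": "stable"
    }
  ]
}
```

Note that the override deliberately omits the `"capabilities": ["autocomplete", "chat"]` entry, per the answer above.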
## OpenAI o1
### What are OpenAI o1 best practices?
