
Commit 8959b99

Clarify that sampling temperature ranges depend on providers
Co-authored-by: Cliff Hall <[email protected]>
Parent: af0d12d

File tree: 1 file changed (+1, −1)


docs/specification/draft/client/sampling.mdx

Lines changed: 1 addition & 1 deletion
@@ -246,7 +246,7 @@ accordingly.
 
 LLM sampling can be fine-tuned with the following parameters:
 
-- `temperature`: Controls randomness in model responses. Higher values produce higher randomness, and lower values produce more stable output.
+- `temperature`: Controls randomness in model responses. Higher values produce higher randomness, and lower values produce more stable output. Valid range depends upon the model provider.
 - `maxTokens`: Maximum tokens to generate; required.
 - `stopSequences`: Array of sequences that stop generation.
 - `metadata`: Additional provider-specific parameters.
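As a minimal sketch of how these parameters fit together, the following TypeScript builds a `sampling/createMessage` request payload using the fields listed above. The interface is a simplified assumption modeled on the MCP sampling spec (not the full schema), and all parameter values are illustrative:

```typescript
// Simplified sketch of the sampling request parameters described above.
// Assumption: field names follow the MCP sampling spec; the interface
// omits optional fields (modelPreferences, systemPrompt, etc.) for brevity.
interface CreateMessageParams {
  messages: { role: "user" | "assistant"; content: { type: "text"; text: string } }[];
  maxTokens: number;                  // required: cap on generated tokens
  temperature?: number;               // valid range depends on the model provider
  stopSequences?: string[];           // generation halts when any sequence appears
  metadata?: Record<string, unknown>; // additional provider-specific parameters
}

const params: CreateMessageParams = {
  messages: [
    { role: "user", content: { type: "text", text: "Summarize this document." } },
  ],
  maxTokens: 256,
  temperature: 0.7,        // illustrative; check the provider's documented range
  stopSequences: ["\n\n"],
  metadata: { example_hint: "value" }, // hypothetical provider-specific key
};

const request = { method: "sampling/createMessage", params };
console.log(JSON.stringify(request, null, 2));
```

Because `temperature`'s valid range is provider-defined (as the commit clarifies), a client should not hard-code bounds but defer validation to the provider behind the sampling request.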
