
Commit 0ef2d89

Clarify misconceptions about model temperature in coding and enhance documentation for better understanding
1 parent 3159471 commit 0ef2d89

File tree

1 file changed: +21 -9 lines changed


docs/advanced-usage/model-temperature.md

Lines changed: 21 additions & 9 deletions
@@ -1,23 +1,35 @@
 # Adjusting Model Temperature
 
-Temperature controls the randomness of AI model outputs. Adjusting this setting optimizes results for different tasks - from precise code generation to creative brainstorming.
-
-:::info
-Temperature is one of the most powerful parameters for controlling AI behavior. A well-tuned temperature setting can dramatically improve the quality and appropriateness of responses for specific tasks.
-:::
+Temperature controls the randomness of AI model outputs. Adjusting this setting optimizes results for different tasks - from precise code generation to creative brainstorming. Temperature is one of the most powerful parameters for controlling AI behavior. A well-tuned temperature setting can dramatically improve the quality and appropriateness of responses for specific tasks.
 
 ## What is Temperature?
 
 Temperature is a parameter that controls the randomness of the model's predictions. It's typically a value between 0.0 and 2.0, depending on the model.
 
-* **Lower Temperature (0.0-0.3):** The model becomes deterministic and focused, selecting the most probable outputs. Ideal for tasks requiring precision and correctness, like code generation.
-* **Higher Temperature (0.7-1.0+):** The model becomes more creative and diverse, exploring less likely outputs. Useful for brainstorming, generating variations, or exploring different design options.
 
-Technically, temperature affects the probability distribution of the model's next-token predictions by dividing the logits (pre-softmax activation values) before they're converted to probabilities, altering how the model samples from its vocabulary.
+:::info Common Misconceptions About Temperature Specifically Relating to Coding
+Temperature settings in large language models (LLMs) significantly influence coding outputs, primarily by controlling randomness rather than directly affecting code quality. Here are several common misconceptions clarified:
+
+**Lower Temperature Equals Better Code:** A very low temperature (close to zero) leads to deterministic outputs, resulting in predictable, repetitive, and potentially overly simplistic code. It does not inherently improve the quality of solutions.
+
+**Higher Temperature Generates Higher-Quality Code:** Increasing temperature introduces more randomness, which can lead to novel or creative solutions but also heightens the risk of errors, convoluted logic, or nonsensical variable names. Higher randomness does not equate to improved code quality.
+
+**Temperature Directly Impacts Coding Accuracy:** Temperature does not directly affect the accuracy or correctness of programming logic. The accuracy of generated code depends on the model's training and the clarity of the prompt, rather than the randomness introduced by temperature adjustments.
+
+**Temperature Zero is Always Ideal for Coding:** While a temperature of zero is beneficial for consistent, repeatable solutions suitable for basic examples or straightforward tasks, it can limit creativity and exploration for more complex coding challenges.
+:::
+
+Ultimately, selecting an optimal temperature setting involves balancing deterministic responses and creative exploration. Moderate temperatures, typically ranging from 0.3 to 0.7, often offer the best balance for many coding scenarios, though ideal settings may vary based on specific task requirements and desired outcomes.
 
 ## Default Values in Roo Code
 
-Roo Code uses a default temperature of 0.0 for most interactions, optimizing for maximum determinism and precision in code generation. There may be slight variances depending on the provider or model type - for instance, DeepSeek models default to 0.6 for slightly more creative outputs. Models with thinking capabilities (where the AI shows its reasoning process) require a fixed temperature of 1.0 which cannot be changed, as this setting ensures optimal performance of the thinking mechanism.
+Roo Code uses a default temperature of 0.0 for most models, optimizing for maximum determinism and precision in code generation. This applies to OpenAI models, Anthropic models (non-thinking variants), LM Studio models, and most other providers.
+
+Some models use higher default temperatures - DeepSeek R1 models and certain reasoning-focused models default to 0.6, providing a balance between determinism and creative exploration.
+
+Models with thinking capabilities (where the AI shows its reasoning process) require a fixed temperature of 1.0 which cannot be changed, as this setting ensures optimal performance of the thinking mechanism. This applies to any model with the ":thinking" flag enabled.
+
+Some specialized models don't support temperature adjustments at all, in which case Roo Code respects these limitations automatically.
 
 ## When to Adjust Temperature
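A note on the mechanics for readers of this diff: the removed paragraph above described temperature as dividing the logits before the softmax. Below is a minimal sketch of that sampling step, assuming NumPy and the common convention that temperature 0 means greedy argmax decoding; the logit values are made up for illustration and this is not Roo Code's actual sampling code:

```python
import numpy as np

def sample_token(logits: np.ndarray, temperature: float, rng: np.random.Generator) -> int:
    """Sample one token index from raw logits after temperature scaling."""
    if temperature == 0.0:
        # Temperature 0 is conventionally treated as greedy (argmax) decoding.
        return int(np.argmax(logits))
    scaled = logits / temperature                   # divide logits by temperature
    scaled -= scaled.max()                          # shift for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()   # softmax -> probabilities
    return int(rng.choice(len(logits), p=probs))

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.5, -1.0])  # hypothetical logits for four tokens
print(sample_token(logits, 0.0, rng))  # always token 0 (most probable)
print(sample_token(logits, 0.3, rng))  # sharpened distribution: almost always token 0
print(sample_token(logits, 1.5, rng))  # flattened distribution: other tokens likelier
```

Dividing by a temperature below 1 sharpens the distribution toward the top token, while a value above 1 flattens it - exactly the low-versus-high behavior the removed bullets described.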

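For comparison with the defaults the new text describes, this is roughly how a per-request temperature override looks at the provider API level. A hedged sketch using the OpenAI Python SDK rather than Roo Code's internal provider layer; the model name and prompt are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Pin temperature to 0.0 for deterministic, repeatable code generation,
# mirroring the 0.0 default the documentation describes.
response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    temperature=0.0,
)
print(response.choices[0].message.content)
```

Raising `temperature` toward 0.6-0.7 on the same call trades that determinism for the more exploratory behavior discussed in the misconceptions note; for models that reject the parameter entirely, omitting it is the safe equivalent of respecting the model's limits.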
0 commit comments
