
Back temperature=0 for server as default after #32723 (#1038)

Open
iboiko-habana wants to merge 3 commits into vllm-project:releases/v0.15.1 from iboiko-habana:fix32723_0.15.1

Conversation

@iboiko-habana
Collaborator

No description provided.

Signed-off-by: Iryna Boiko <iboiko@habana.ai>
Contributor

Copilot AI left a comment


Pull request overview

This PR restores temperature=0 as the default for the vLLM benchmark server command, a setting that appears to have been dropped in PR #32723. The temperature parameter controls sampling randomness; a value of 0 selects greedy (deterministic) decoding, which is appropriate for benchmarking because it keeps results reproducible across runs.

Changes:

  • Added --temperature 0 flag to the vllm bench serve command in the benchmark template
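Based on the change described above, the benchmark template's serve command would now pin sampling to greedy decoding. A hypothetical sketch of the resulting invocation is below; only the `--temperature 0` flag comes from this PR, while the model and dataset values are illustrative placeholders, not taken from the template:

```shell
# Illustrative vllm bench serve invocation.
# --temperature 0 is the flag added by this PR; all other
# arguments are placeholder examples, not from the template.
vllm bench serve \
  --model meta-llama/Llama-3.1-8B-Instruct \
  --dataset-name random \
  --num-prompts 100 \
  --temperature 0
```

This assumes a vLLM server is already running for the benchmark client to target; without `--temperature 0`, the benchmark would sample stochastically and latency/throughput numbers could vary slightly between runs.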


Collaborator

@kamil-kaczor kamil-kaczor left a comment


lgtm


Labels

None yet

Projects

None yet

Development


4 participants