
Commit 9ea7d6a

Parse seed for vLLM (#602)
When seed is set through model_args, for example: ```bash "pretrained=deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B,seed=123,dtype=bfloat16,max_model_length=38768,gpu_memory_utilization=0.8,tensor_parallel_size=1" `` it is parsed as a string attribute in `VLLMModelConfig`, leading to an error during LLM initialization: ```python model = LLM(**self.model_args) ``` This PR ensures that seed is correctly cast to an integer before passing it to the model, preventing initialization errors. Co-authored-by: Nathan Habib <[email protected]>
1 parent 582d927 commit 9ea7d6a

File tree

1 file changed (+1, −1 lines changed)


src/lighteval/models/vllm/vllm_model.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -182,7 +182,7 @@ def _create_auto_model(self, config: VLLMModelConfig, env_config: EnvConfig) ->
             "pipeline_parallel_size": int(config.pipeline_parallel_size),
             "max_model_len": self._max_length,
             "swap_space": 4,
-            "seed": config.seed,
+            "seed": int(config.seed),
             "max_num_seqs": int(config.max_num_seqs),
             "max_num_batched_tokens": int(config.max_num_batched_tokens),
         }
```
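For context, here is a minimal, self-contained sketch of why the cast is needed. The parsing helper and variable names below are hypothetical illustrations, not lighteval's actual code: values extracted from a comma-separated `model_args` string are always strings, so `seed` has to be converted to an `int` before it is forwarded to vLLM's `LLM(...)` constructor.

```python
# Minimal sketch (hypothetical helper names): why "seed" arrives as a string
# when model_args is a comma-separated key=value string, and why it must be
# cast to int before being forwarded to vLLM.

def parse_model_args(model_args: str) -> dict:
    """Split 'k1=v1,k2=v2,...' into a dict; every value comes out as a str."""
    return dict(pair.split("=", 1) for pair in model_args.split(","))

args = parse_model_args(
    "pretrained=deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B,seed=123,dtype=bfloat16"
)
assert args["seed"] == "123"  # parsed as a string, not an int

# Before this commit the string was passed through unchanged, and vLLM's
# LLM(...) initialization failed on the non-integer seed. The fix casts it:
vllm_kwargs = {
    "dtype": args["dtype"],
    "seed": int(args["seed"]),  # mirrors the one-line change in _create_auto_model
}
print(vllm_kwargs)
```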

Comments (0)