Commit db81577

Update transformers_moe.py
Signed-off-by: PatrykSaffer <[email protected]>
1 parent 482fd0b commit db81577

1 file changed: 1 addition, 1 deletion


vllm/model_executor/models/transformers_moe.py

Lines changed: 1 addition & 1 deletion
@@ -240,7 +240,7 @@ def forward(self, *args, **kwargs):
         # Expert parallel load balancing kwargs
         enable_eplb = self.parallel_config.enable_eplb
         num_redundant_experts = self.parallel_config.eplb_config.num_redundant_experts
-        eplb_record_metrics=self.parallel_config.eplb_config.eplb_record_metrics
+        eplb_record_metrics = self.parallel_config.eplb_config.eplb_record_metrics
 
         # MixtureOfExperts mixin settings
         ep_size = self.ep_group.world_size
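
For context, the changed line only adds spacing around the assignment; the surrounding block reads the expert-parallel load-balancing (EPLB) settings off the parallel config. Below is a minimal, self-contained sketch of that access pattern. EPLBConfig, ParallelConfig, and read_eplb_kwargs here are hypothetical stand-ins built from the field names visible in the hunk, not vLLM's actual classes.

from dataclasses import dataclass, field


# Hypothetical stand-ins for illustration only; vLLM's real config
# classes have many more fields than the three read in the hunk above.
@dataclass
class EPLBConfig:
    num_redundant_experts: int = 0
    eplb_record_metrics: bool = False


@dataclass
class ParallelConfig:
    enable_eplb: bool = False
    eplb_config: EPLBConfig = field(default_factory=EPLBConfig)


def read_eplb_kwargs(parallel_config: ParallelConfig) -> dict:
    # Mirrors the reads in the diff: pull the expert-parallel
    # load-balancing kwargs off the parallel config.
    return {
        "enable_eplb": parallel_config.enable_eplb,
        "num_redundant_experts": parallel_config.eplb_config.num_redundant_experts,
        "eplb_record_metrics": parallel_config.eplb_config.eplb_record_metrics,
    }


if __name__ == "__main__":
    cfg = ParallelConfig(
        enable_eplb=True,
        eplb_config=EPLBConfig(num_redundant_experts=2, eplb_record_metrics=True),
    )
    print(read_eplb_kwargs(cfg))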

0 commit comments