Commit 7e8d685

[Minor] Fix pre-commit error on main (#22579)

Authored and signed-off by: Isotr0py <[email protected]>
Parent: c498483

File tree: 1 file changed (+2, −2 lines)


vllm/model_executor/layers/fused_moe/fused_moe.py

Lines changed: 2 additions & 2 deletions

@@ -1038,9 +1038,9 @@ def inplace_fused_experts(
         w2_zp: Optional[torch.Tensor] = None,
         a1_scale: Optional[torch.Tensor] = None,
         a2_scale: Optional[torch.Tensor] = None,
-        block_shape: Optional[List[int]] = None,
+        block_shape: Optional[List[int]] = None,  #noqa: UP006
         w1_bias: Optional[torch.Tensor] = None,
-        w2_bias: Optional[torch.Tensor] = None) -> None:  #noqa: UP006
+        w2_bias: Optional[torch.Tensor] = None) -> None:
     fused_experts_impl(hidden_states, w1, w2, topk_weights, topk_ids, True,
                        activation, is_act_and_mul,
                        apply_router_weight_on_input, use_fp8_w8a8,
