
Commit b247f77

fix

1 parent ece3d13 commit b247f77

File tree

1 file changed: +0 additions, −1 deletion


lightllm/common/fused_moe/grouped_fused_moe.py

Lines changed: 0 additions & 1 deletion
@@ -941,7 +941,6 @@ def inplace_fused_experts_impl_fake(
     w2_scale: Optional[torch.Tensor] = None,
     a1_scale: Optional[torch.Tensor] = None,
     a2_scale: Optional[torch.Tensor] = None,
-    activate_fn: str = "silu",
     layout: str = "blocked",
     alpha: Optional[float] = None,
     limit: Optional[float] = None,
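The deletion above removes a stray `activate_fn` parameter from the fake (meta) variant of the fused-experts op. A plausible motivation, assuming this function is registered as the fake implementation of a custom op (as PyTorch's `torch.library` fake registration requires), is that the fake's signature must match the real implementation exactly. A minimal, hypothetical sketch of checking that invariant with `inspect` (the function names and parameters below are illustrative stand-ins, not lightllm's actual code):

```python
import inspect
from typing import Optional


# Hypothetical stand-in for the real implementation.
def fused_experts_impl(
    a1_scale: Optional[float] = None,
    a2_scale: Optional[float] = None,
    layout: str = "blocked",
    alpha: Optional[float] = None,
    limit: Optional[float] = None,
) -> None:
    pass


# Hypothetical stand-in for the fake/meta implementation; after the fix
# its parameter list is identical to the real one.
def fused_experts_impl_fake(
    a1_scale: Optional[float] = None,
    a2_scale: Optional[float] = None,
    layout: str = "blocked",
    alpha: Optional[float] = None,
    limit: Optional[float] = None,
) -> None:
    pass


def signatures_match(real, fake) -> bool:
    """Return True when both callables expose identical parameter lists."""
    return inspect.signature(real) == inspect.signature(fake)


# With the extra `activate_fn` parameter removed, the two signatures agree.
assert signatures_match(fused_experts_impl, fused_experts_impl_fake)
```

A check like this can be run in CI so that a real/fake signature drift is caught before registration fails at runtime.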

0 commit comments
