Commit c703bd5

fix
1 parent 174ff55 commit c703bd5

File tree

2 files changed: +2 −2 lines changed

lightllm/models/deepseek2/layer_infer/transformer_layer_infer.py

Lines changed: 2 additions & 2 deletions

@@ -217,8 +217,8 @@ def _context_attention_kernel_with_CC(
     q_nope, q_rope = q[:, :, : -self.qk_rope_head_dim], q[:, :, -self.qk_rope_head_dim :]
     o_tensor = self.alloc_tensor(q_nope.shape, dtype=q_nope.dtype) if out is None else out
     context_attention_fwd_with_v(
-        q_nope.contiguous(),
-        q_rope.contiguous(),
+        q_nope,
+        q_rope,
         k_nope,
         k_rope,
         v,
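The change drops two `.contiguous()` calls, presumably because `context_attention_fwd_with_v` can consume strided inputs directly, making the copies redundant (an assumption; the commit message says only "fix"). For context, slices like `q[:, :, :-d]` are non-contiguous views to begin with, which is why the copies were there. A minimal NumPy sketch of this (analogous to PyTorch's `.contiguous()`; the dimension sizes are hypothetical):

```python
import numpy as np

# Hypothetical shapes: batch of 8 heads x 16 tokens, with the last
# qk_rope_head_dim channels holding the rotary part, as in the diff above.
qk_rope_head_dim = 64
q = np.zeros((8, 16, 128 + qk_rope_head_dim), dtype=np.float32)

# Slicing off part of the last dimension yields strided views, not
# contiguous buffers: each row skips over the other slice's channels.
q_nope = q[:, :, :-qk_rope_head_dim]
q_rope = q[:, :, -qk_rope_head_dim:]

print(q_nope.flags["C_CONTIGUOUS"])  # False: strided view
print(q_rope.flags["C_CONTIGUOUS"])  # False

# The equivalent of PyTorch's .contiguous() is np.ascontiguousarray,
# which materializes the view at the cost of an extra copy.
q_nope_copy = np.ascontiguousarray(q_nope)
print(q_nope_copy.flags["C_CONTIGUOUS"])  # True
```

Skipping that copy saves memory traffic per layer, provided the downstream kernel reads the tensors through explicit strides rather than assuming a packed layout.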

lightllm/tensor_acc_4.bin

Whitespace-only changes.
