
fuse fp8 quant in kv copying and add flashinfer decode mla operator in the attention module #614
Triggered via pull request · February 21, 2025 02:43
Status: Success
Total duration: 29s
Artifacts: –

pre-commit.yml

on: pull_request
Job: pre-commit (18s)
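For context on the PR title: below is a minimal sketch, in plain PyTorch with hypothetical names, of what "fusing fp8 quant into KV copying" can mean conceptually — the new keys/values are quantized to fp8 in the same pass that scatters them into the KV cache, instead of copying first and quantizing in a second pass. The actual PR presumably does this inside a dedicated CUDA/Triton kernel (and separately wires up FlashInfer's MLA decode operator); this is not the PR's code.

```python
import torch


def copy_kv_fused_fp8(k, v, k_cache, v_cache, slot_ids, k_scale, v_scale):
    """Hypothetical fused path: quantize new K/V to fp8 while scattering
    them into the cache slots, rather than copy-then-quantize."""
    # Scale and cast to fp8 (torch.float8_e4m3fn requires torch >= 2.1).
    k_fp8 = (k / k_scale).to(torch.float8_e4m3fn)
    v_fp8 = (v / v_scale).to(torch.float8_e4m3fn)
    # Write the fp8 bytes into the assigned slots; reinterpreting as uint8
    # keeps the scatter a plain byte copy.
    k_cache.view(torch.uint8)[slot_ids] = k_fp8.view(torch.uint8)
    v_cache.view(torch.uint8)[slot_ids] = v_fp8.view(torch.uint8)


if __name__ == "__main__":
    num_slots, num_heads, head_dim = 16, 8, 64
    k_cache = torch.zeros(num_slots, num_heads, head_dim, dtype=torch.float8_e4m3fn)
    v_cache = torch.zeros(num_slots, num_heads, head_dim, dtype=torch.float8_e4m3fn)
    k = torch.randn(4, num_heads, head_dim)
    v = torch.randn(4, num_heads, head_dim)
    slots = torch.tensor([3, 7, 8, 12])
    copy_kv_fused_fp8(k, v, k_cache, v_cache, slots, k_scale=1.0, v_scale=1.0)
```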