This repository was archived by the owner on Sep 10, 2025. It is now read-only.

Add Initial Compile for Llama 3.2 11B: Decoder TransformerSelfAttentionLayer, TransformerCrossAttentionLayer #1287

Merged
Jack-Khuu merged 1 commit into main from compile_mm on Oct 10, 2024
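
The PR title indicates that torch.compile is applied at the level of the decoder's TransformerSelfAttentionLayer and TransformerCrossAttentionLayer modules rather than to the whole multimodal model. A minimal sketch of that per-layer approach is shown below, assuming torchtune-style layer classes; the helper function name and traversal logic are illustrative assumptions, not the PR's actual diff.

```python
# Hedged sketch (not the PR's code): wrap each decoder attention layer with
# torch.compile individually, leaving the rest of the model in eager mode.
# TransformerSelfAttentionLayer / TransformerCrossAttentionLayer are torchtune
# classes; the helper name and recursion strategy are assumptions.
import torch
import torch.nn as nn
from torchtune.modules import (
    TransformerCrossAttentionLayer,
    TransformerSelfAttentionLayer,
)


def compile_decoder_layers(module: nn.Module) -> None:
    """Recursively replace self-/cross-attention decoder layers with
    torch.compile-wrapped versions."""
    for name, child in module.named_children():
        if isinstance(
            child, (TransformerSelfAttentionLayer, TransformerCrossAttentionLayer)
        ):
            # torch.compile returns an optimized nn.Module wrapper, so it can
            # be swapped back into the parent in place of the original layer.
            setattr(module, name, torch.compile(child))
        else:
            compile_decoder_layers(child)
```

Compiling per decoder layer keeps any graph breaks contained to a single layer and avoids touching unrelated components such as the vision encoder, which is a common first step when bringing torch.compile to a new multimodal architecture.
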
Commits

Commits on Oct 10, 2024