Conversation

@jan-service-account

Updates dev branch with latest release (b5997) from ggml-org/llama.cpp

yeahdongcn and others added 3 commits July 26, 2025 10:36
Implement REGLU, GEGLU, SWIGLU ops according to ggml-org#14158
…gml-org#14624)

This commit adds support for MFMA instructions to MMQ. CDNA1/GFX908, CDNA2/GFX90a, and CDNA3/GFX942 are supported by the MFMA-enabled code path added by this commit. The code path and stream-K are only enabled on CDNA3 for now, since they do not consistently outperform BLAS on the other devices.
BLAS is currently only consistently outperformed on CDNA3, due to issues in the AMD-provided BLAS libraries.
This commit also makes MMQ more aware of different warp sizes and, as a side effect, improves performance on GCN GPUs for all quant formats except q4_0 and q4_1, which regress slightly.
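For context on the GLU-variant ops named in the first commit above (REGLU, GEGLU, SWIGLU), here is a minimal sketch of what these activations compute. It is illustrative only, not the ggml kernels themselves: the `glu` helper and the assumption that the gate and value halves are already separated are mine; the actual ops operate on ggml tensors and support split and fused layouts. Each variant applies ReLU, GELU, or SiLU to the gate half and multiplies it elementwise with the value half.

```cpp
// Sketch of the GLU-variant activations (not the ggml implementation).
#include <cmath>
#include <cstddef>
#include <vector>

static float relu_f(float x) { return x > 0.0f ? x : 0.0f; }
// tanh approximation of GELU: 0.5*x*(1 + tanh(sqrt(2/pi)*(x + 0.044715*x^3)))
static float gelu_f(float x) { return 0.5f * x * (1.0f + std::tanh(0.7978845608f * (x + 0.044715f * x * x * x))); }
static float silu_f(float x) { return x / (1.0f + std::exp(-x)); }

enum class GluOp { REGLU, GEGLU, SWIGLU };

// Hypothetical helper: assumes the gate half and value half are already split out.
std::vector<float> glu(GluOp op, const std::vector<float>& gate, const std::vector<float>& value) {
    std::vector<float> out(gate.size());
    for (std::size_t i = 0; i < gate.size(); ++i) {
        float g = gate[i];
        switch (op) {
            case GluOp::REGLU:  g = relu_f(g); break;
            case GluOp::GEGLU:  g = gelu_f(g); break;
            case GluOp::SWIGLU: g = silu_f(g); break;
        }
        out[i] = g * value[i]; // gate(activation) * value, elementwise
    }
    return out;
}
```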
@jan-service-account jan-service-account merged commit 02af438 into dev Jul 27, 2025
9 checks passed
@jan-service-account jan-service-account deleted the update-dev-from-master-2025-07-27-00-14 branch July 27, 2025 00:27