I can't seem to find any way to enable AVX512 support in llama.cpp.
Answered by LostRuins, Apr 15, 2023
Nope, AVX512 is currently not supported, as I don't own any devices that can run it and cannot test for it. It should work if you rebuild from the makefile, though I can't guarantee results.
Answer selected by shoraaa