Looking for mlx-community/llama-3.1-405b-Instruct with bf16 Precision #1549
quebic-source started this conversation in General
Replies: 1 comment · 1 reply
quebic-source (original post):
Hi! I'm looking for an existing version of mlx-community/llama-3.1-405b-Instruct with bf16 precision, or any guidance on how to convert the model to bf16. Any help would be appreciated!
Reply:
For bf16 you can just use the original model.
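
To expand on the reply: the original Meta release is already stored in bf16, so there is no need for a separate mlx-community bf16 upload. Below is a minimal sketch (not from the thread) of one way to produce an MLX-format bf16 checkpoint with mlx-lm's `convert` helper, assuming its `dtype` argument behaves as in current mlx-lm releases; the Hugging Face repo name and local paths are illustrative.

```python
# Sketch: convert the original bf16 Llama 3.1 405B Instruct weights to MLX
# format without quantization, then load and run a quick smoke test.
# Assumes mlx-lm is installed (`pip install mlx-lm`) and you have access to
# the gated meta-llama repository.
from mlx_lm import convert, load, generate

convert(
    hf_path="meta-llama/Llama-3.1-405B-Instruct",  # original weights, already bf16
    mlx_path="llama-3.1-405b-instruct-bf16",       # local output directory
    quantize=False,                                 # keep full precision
    dtype="bfloat16",                               # store weights as bf16
)

model, tokenizer = load("llama-3.1-405b-instruct-bf16")
print(generate(model, tokenizer, prompt="Hello", max_tokens=32))
```

Note that mlx-lm can also load supported Hugging Face repositories directly with `load()`, so conversion is only needed if you want a local MLX-format copy; either way, the 405B model requires very large amounts of memory and disk space.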