How to run Devstral and mistral-vibe with MLX? #859
So, I have downloaded Devstral-2-123B-Instruct-2512-4bit (I am running on a MacBook Pro M4 Max with 128 GB).
Download Devstral-2-123B-Instruct-2512-4bit:

```
python -c "from huggingface_hub import snapshot_download; snapshot_download('mlx-community/Devstral-2-123B-Instruct-2512-4bit', local_dir='/Users/codrut/devstral-2-4bit-mlx')"
```

Run:

```
mlx_lm.chat --model ~/devstral-2-4bit-mlx
```

Start as a server:
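The exact server command is not shown in the post; a minimal sketch, assuming the stock OpenAI-compatible HTTP server that ships with mlx-lm (the port shown is its documented default, not something from this post):

```
# A sketch, not the poster's exact command: mlx-lm bundles an
# OpenAI-compatible HTTP server that binds to 127.0.0.1:8080 by default.
mlx_lm.server --model ~/devstral-2-4bit-mlx --port 8080
```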
In another terminal window I have installed mistral-vibe and added the local server settings to my ~/.vibe/config.toml.
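To rule out the config as the problem, one can smoke-test the endpoint vibe would be talking to; this assumes mlx_lm.server's default bind address of 127.0.0.1:8080:

```
# Hedged example: POST a tiny chat request to the local
# OpenAI-compatible endpoint and check that it answers at all.
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Say hi"}], "max_tokens": 16}'
```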
The problem is that the answers from the LLM are very, very slow, which is also visible in the server terminal output. Is there any solution to this?
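To put a number on "very slow", generation speed can be measured outside of vibe entirely; a sketch using mlx-lm's own generate command, which reports tokens-per-second statistics when the run finishes:

```
# Baseline speed check, independent of mistral-vibe; mlx_lm.generate
# prints prompt and generation tokens-per-second after the run.
mlx_lm.generate --model ~/devstral-2-4bit-mlx --prompt "Hello" --max-tokens 64
```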
Replies:

Anyone here?

working great in v0.30.7