Replies: 1 comment
Answer from @ruben-arts on Discord: Argh, you ran into a pickle. We currently don't have support yet for multi-layer installs/solves with PyPI. You can just run the pip commands inside the pixi env. But because of an inconsistency between the vllm dependencies and the pytorch dependencies on conda-forge, there is currently no easy workaround. I was trying the following, but kept running into snags.
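The config that was being tried isn't included above. Purely as an illustration of the "run the pip commands in the pixi env" idea, here is a minimal pixi.toml sketch; the project name, Python pin, and `requirements-mlx.txt` file are hypothetical placeholders, not anything from the original discussion.

```toml
# Minimal sketch: let pixi solve the conda-forge side, then run pip by hand
# inside the resulting environment via a task.
[project]
name = "vllm-mlx"                # hypothetical project name
channels = ["conda-forge"]
platforms = ["osx-arm64"]

[dependencies]
python = "3.11.*"
pip = "*"

[tasks]
# Runs the extra pip installs inside the pixi environment.
# "requirements-mlx.txt" is a placeholder for whatever the vllm MLX build needs.
install-vllm = "pip install -r requirements-mlx.txt"
```

With something like this, `pixi run install-vllm` executes pip inside the solved environment, which is the manual workaround described above.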
How can I get vLLM to build on macOS for MLX?
After setting up the normal Python environment, these requirements need to be installed as a separate step. Would you encapsulate these in a setup task? Or is there a better approach, e.g. something similar to the normal git-based PyPI packages, but where I can specify the path to a specific pyproject.toml/requirements.txt?
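Purely as an illustration (not a confirmed recipe), the two approaches in the question could look roughly like this in pixi.toml; all paths and task names are placeholders, and task fields such as `depends-on` may be spelled differently depending on the pixi version:

```toml
# Option 1: encapsulate the extra install in a setup task that runs pip
# against a specific requirements file (path is hypothetical).
[tasks]
setup = "pip install -r ./vllm/requirements-mlx.txt"
check = { cmd = "python -c 'import vllm'", depends-on = ["setup"] }  # runs after setup

# Option 2: a path-based PyPI dependency pointing at a local project directory
# that contains the pyproject.toml (directory path is hypothetical).
[pypi-dependencies]
vllm = { path = "./vllm", editable = true }
```

Note that pixi's path dependencies point at a project directory containing a pyproject.toml rather than at a requirements.txt, so the task route is the one that covers a plain requirements file.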