-
I need to use vLLM via its REST API, but …
Replies: 2 comments 1 reply
-
Hey @mkhludnev! Great to see you back, navigating the adventurous world of LlamaIndex with us. Hope your code journey's been more of a smooth sail than a wild llama chase lately! 🦙💻

Based on the information provided, it appears that the vllm package pulls in a CUDA-enabled build of PyTorch when installed. Unfortunately, there doesn't seem to be a published build of the vllm package that avoids this CUDA requirement. However, you might be able to work around this issue by using a version of PyTorch that doesn't require CUDA. PyTorch offers CPU-only builds of its packages, which you can install with a command like the following:

pip install torch==1.8.1+cpu torchvision==0.9.1+cpu torchaudio===0.8.1 -f https://download.pytorch.org/whl/torch_stable.html

Please replace the version numbers with the ones that match your requirements. This command installs PyTorch, torchvision, and torchaudio without CUDA support. After installing the CPU-only version of PyTorch, you should be able to install the remaining dependencies. Please note that this workaround might not work if the vllm package itself strictly requires CUDA at runtime.

For more information, you can refer to the PyTorch website and the LlamaIndex repository. If you encounter any issues or need further assistance, please provide more details about your setup and the exact error messages you're seeing.
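To make the workaround concrete, here is a minimal sketch of pointing LlamaIndex at an already running vLLM server once the CPU-only PyTorch build is in place. The server URL, model prompt, and the assumption that your llama-index version exposes VllmServer with an api_url parameter are all placeholders to adapt to your deployment:

```python
# Minimal sketch: query a remote vLLM server from LlamaIndex without a local GPU.
# Assumes CPU-only torch is installed (see the pip command above) and that a
# vLLM API server is already running at the URL below -- adjust as needed.
from llama_index.llms.vllm import VllmServer

llm = VllmServer(
    api_url="http://localhost:8000/generate",  # assumed address of the running vLLM server
    max_new_tokens=256,
    temperature=0.1,
)

response = llm.complete("What is vLLM?")
print(response.text)
```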
-
Here's a workaround.
I'll try to come up with a fix for VllmServer in a few days.
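One possible shape for such a workaround is to bypass the VllmServer wrapper entirely and call a running vLLM server's HTTP endpoint directly. The URL, payload fields, and response handling below are assumptions based on vLLM's simple api_server, not a confirmed recipe:

```python
# Illustrative sketch only: talk to the vLLM API server over plain HTTP,
# avoiding the VllmServer wrapper (and its local torch/vllm imports).
import requests

VLLM_URL = "http://localhost:8000/generate"  # assumed server address

payload = {
    "prompt": "What is vLLM?",
    "max_tokens": 256,
    "temperature": 0.1,
}

resp = requests.post(VLLM_URL, json=payload, timeout=60)
resp.raise_for_status()
# The simple api_server typically returns the generated strings under "text".
print(resp.json()["text"][0])
```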