Replies: 12 comments
-
You should add a […] I copied your pixi.toml file and managed to install the packages you installed with uv by using […]. Pay attention to the note in the docs: […]
I think I managed to do this despite […]
-
Could you paste any modifications you made to the […]?
-
Sorry, already deleted 😬
-
My current setup is this, which is still not working after following the suggestions:

pixi.toml:

[workspace]
authors = ["sidmadala"]
channels = ["conda-forge", "nvidia"]
name = "test"
platforms = ["linux-64"]
version = "0.1.0"
[tasks]
[dependencies]
python = "3.12.*"
cuda-toolkit = { version = "12.4.*", channel = "nvidia" }
uv = ">=0.7.2,<0.8"
[pypi-options]
no-build-isolation = ["axolotl"]
[pypi-dependencies]
torch = "==2.6.0"
torchvision = "*"
huggingface-hub = { version = ">=0.30.0", extras = ["cli", "torch"] }
transformers = ">=4.51"
evaluate = ">=0.4.0"
datasets = ">=3.5.0"
accelerate = ">=1.6.0"
peft = ">=0.15.0"
bitsandbytes = ">=0.45.0"
trl = ">=0.17.0"
unsloth = "*"
sentencepiece = "*"
openai = "*"
scikit-learn = "*"
scipy = "*"
jupyterlab = "*"
matplotlib = "*"
seaborn = "*"
gradio = "*"
wandb = "*"
nltk = "*"
pandas = "*"
tenacity = "*"
requests = "*"
aiohttp = "*"
ratelimit = "*"
importlib-resources = "*"
mypy = "*"
python-dotenv = ">=1.0.0"
nvitop = "==1.5.0"
gpustat = "==1.1.1"
vllm = "==0.8.5"
# axolotl = { version = "==0.9.0", extras = ["flash-attn", "deepspeed"] }

Commands:

sidmadala@icgpu01:~/hocklab/test-repo$ pixi install # Installs all dependencies listed above
sidmadala@icgpu01:~/hocklab/test-repo$ pixi shell
(test-pixi) sidmadala@icgpu01:~/hocklab/test-pixi$ pixi add --pypi axolotl[flash-attn,deepspeed]==0.9.0
Error: × Failed to update PyPI packages for environment 'default'
├─▶ Failed to prepare distributions
├─▶ Failed to build `flash-attn==2.7.4.post1`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status: 1)
[stderr]
Traceback (most recent call last):
File "<string>", line 14, in <module>
File "/home/sidmadala/.cache/rattler/cache/uv-cache/builds-v0/.tmp9tC27W/lib/python3.12/site-packages/setuptools/build_meta.py", line 331, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sidmadala/.cache/rattler/cache/uv-cache/builds-v0/.tmp9tC27W/lib/python3.12/site-packages/setuptools/build_meta.py", line 301, in _get_build_requires
self.run_setup()
File "/home/sidmadala/.cache/rattler/cache/uv-cache/builds-v0/.tmp9tC27W/lib/python3.12/site-packages/setuptools/build_meta.py", line 512, in run_setup
super().run_setup(setup_script=setup_script)
File "/home/sidmadala/.cache/rattler/cache/uv-cache/builds-v0/.tmp9tC27W/lib/python3.12/site-packages/setuptools/build_meta.py", line 317, in run_setup
exec(code, locals())
File "<string>", line 22, in <module>
ModuleNotFoundError: No module named 'torch'
hint: This error likely indicates that `flash-attn@2.7.4.post1` depends on `torch`, but doesn't declare it as a build dependency. If `flash-attn` is a first-party package, consider adding `torch` to its `build-system.requires`. Otherwise, `uv pip install torch` into the environment and re-run with `--no-build-isolation`.
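Expressed in pixi terms, the hint's `--no-build-isolation` route corresponds to listing flash-attn itself (not only axolotl) under `no-build-isolation`, so that its setup.py runs against the environment where torch is already installed. A sketch, reusing the package names from the config above:

```toml
[pypi-options]
# flash-attn imports torch in its setup.py, so its build must run
# without isolation, against the environment that already has torch
no-build-isolation = ["axolotl", "flash-attn"]
```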
-
For posterity, I seem to have solved the issue. Still not ideal (one command […]).

pixi.toml:

[workspace]
authors = ["sidmadala"]
channels = ["conda-forge", "nvidia"]
name = "test-pixi"
platforms = ["linux-64"]
version = "0.1.0"
[tasks]
[dependencies]
python = "3.12.*"
cuda-toolkit = { version = "12.4.*", channel = "nvidia" }
uv = ">=0.7.2,<0.8"
[pypi-options]
no-build-isolation = ["axolotl", "flash-attn"] # THIS IS THE IMPORTANT SECTION!!!
[pypi-dependencies]
torch = "==2.6.0"
torchvision = "*"
huggingface-hub = { version = ">=0.30.0", extras = ["cli", "torch"] }
transformers = ">=4.51"
evaluate = ">=0.4.0"
datasets = ">=3.5.0"
accelerate = ">=1.6.0"
peft = ">=0.15.0"
bitsandbytes = ">=0.45.0"
trl = ">=0.17.0"
unsloth = "*"
sentencepiece = "*"
openai = "*"
scikit-learn = "*"
scipy = "*"
jupyterlab = "*"
matplotlib = "*"
seaborn = "*"
gradio = "*"
wandb = "*"
nltk = "*"
pandas = "*"
tenacity = "*"
requests = "*"
aiohttp = "*"
ratelimit = "*"
importlib-resources = "*"
mypy = "*"
python-dotenv = ">=1.0.0"
nvitop = "==1.5.0"
gpustat = "==1.1.1"
vllm = "==0.8.5"
# axolotl = { version = "==0.9.0", extras = ["flash-attn", "deepspeed"] }

Commands:

sidmadala@icgpu01:~/hocklab/test-repo$ pixi install # Installs all dependencies listed above
sidmadala@icgpu01:~/hocklab/test-repo$ pixi shell
(test-pixi) sidmadala@icgpu01:~/hocklab/test-pixi$ pixi add --pypi axolotl[flash-attn,deepspeed]==0.9.0
(test-pixi) sidmadala@icgpu01:~/hocklab/test-pixi$ p list | grep -E "torch|flash|deep|axol|vllm"
axolotl 0.9.0 1.2 MiB pypi axolotl-0.9.0-py3-none-any.whl
axolotl_contribs_lgpl 0.0.6 30.7 KiB pypi axolotl_contribs_lgpl-0.0.6-py3-none-any.whl
axolotl_contribs_mit 0.0.3 11.8 KiB pypi axolotl_contribs_mit-0.0.3-py3-none-any.whl
deepspeed 0.15.4 5.7 MiB pypi deepspeed-0.15.4-py3-none-any.whl
deepspeed_kernels 0.0.1.dev1698255861 181 MiB pypi deepspeed_kernels-0.0.1.dev1698255861-py3-none-manylinux1_x86_64.whl
flash_attn 2.7.4.post1 576.7 MiB pypi flash_attn-2.7.4.post1-cp312-cp312-linux_x86_64.whl
torch 2.6.0 1.4 GiB pypi torch-2.6.0-cp312-cp312-manylinux1_x86_64.whl
torchao 0.9.0 24.3 MiB pypi torchao-0.9.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl
torchaudio 2.6.0 12.4 MiB pypi torchaudio-2.6.0-cp312-cp312-manylinux1_x86_64.whl
torchvision 0.21.0 19 MiB pypi torchvision-0.21.0-cp312-cp312-manylinux1_x86_64.whl
vllm 0.8.5 1014 MiB pypi vllm-0.8.5-cp38-abi3-manylinux1_x86_64.whl
-
FYI, you could just run […]
-
Hi, so what is the remaining issue :)? Sorry, it's a bit hard for me to follow.
-
IMO there's no real issue here. There are the subtle points stated in the docs: for PyPI dependencies that require other dependencies at build time, one should either replace the needed dependency with a corresponding conda dependency, or pre-install the needed (PyPI) dependency and then install the package that needs it (i.e., a two-stage installation). See the torch discussion above...
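The first of those options, replacing the build-time dependency with a conda package, can be sketched like this, assuming the conda-forge `pytorch` package is an acceptable stand-in for the PyPI `torch` wheel:

```toml
[dependencies]
# conda dependencies are solved and installed before any PyPI build runs,
# so torch is already importable when a package's setup.py executes
pytorch = "2.6.*"

[pypi-options]
# the package whose build imports torch still has to skip build isolation
no-build-isolation = ["flash-attn"]
```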
-
Thanks, that clarifies it, and that means […]
-
after adding with […]
-
Since […]
-
Hey! So I guess you are asking two things: […]
-
Checks

I have checked that this issue has not already been reported.
I have confirmed this bug exists on the latest version of pixi, using pixi --version.

Reproducible example

Shell Commands:

Testing if axolotl and vllm were installed:

Pixi.toml:
Issue description

Pixi is not able to see axolotl or vllm when installed using uv pip install. I cannot use uv add since I am using pixi.toml and not pyproject.toml due to issue #3499. While the packages seem to be installed correctly when testing whether they can be imported, the result of

(test) sidmadala@icgpu01:~/hocklab/test-repo$ pixi list | grep -E "axol|vllm"

is empty, which confuses me. Any assistance would be much appreciated.

Expected behavior

Since Pixi cannot install axolotl via pixi add --pypi --no-build-isolation or in the pixi.toml itself (#3730), I would prefer if pixi could see what is installed via uv pip install --no-build-isolation, since this behavior works when using pip inside a conda environment.