
Commit bbd6830

Add instructions to install nightly AMD pytorch for windows. (#10190)

* Add instructions to install nightly AMD pytorch for windows.
* Update README.md

1 parent 08726b6 · commit bbd6830

1 file changed: README.md (19 additions, 7 deletions)
@@ -206,7 +206,8 @@ Put your SD checkpoints (the huge ckpt/safetensors files) in: models/checkpoints
 Put your VAE in: models/vae
 
 
-### AMD GPUs (Linux only)
+### AMD GPUs (Linux)
+
 AMD users can install ROCm and PyTorch with pip if you don't have them installed already. This is the command to install the stable version:
 
 ```pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.4```
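The stable/nightly split above can be sketched as a small helper that assembles the pip command line; the function name `rocm_pip_command` and its structure are illustrative, not part of the README — only the package list and index URLs come from the commit:

```python
def rocm_pip_command(nightly: bool = False) -> str:
    """Return the pip command for the stable or nightly ROCm PyTorch build.

    Illustrative helper; the URLs and package names are the ones from the
    README, the function itself is an assumption for demonstration.
    """
    packages = "torch torchvision torchaudio"
    if nightly:
        # Nightly wheels built against ROCm 7.0
        return (f"pip install --pre {packages} "
                "--index-url https://download.pytorch.org/whl/nightly/rocm7.0")
    # Stable wheels built against ROCm 6.4
    return (f"pip install {packages} "
            "--index-url https://download.pytorch.org/whl/rocm6.4")

print(rocm_pip_command())
print(rocm_pip_command(nightly=True))
```

Note the nightly variant needs `--pre`, since nightly wheels are pre-release versions that pip skips by default.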
@@ -215,6 +216,23 @@ This is the command to install the nightly with ROCm 7.0 which might have some p
 
 ```pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm7.0```
 
+
+### AMD GPUs (Experimental: Windows and Linux), RDNA 3, 3.5 and 4 only
+
+These builds have less hardware support than the ones above, but they work on Windows. You also need to install the PyTorch version specific to your hardware.
+
+RDNA 3 (RX 7000 series):
+
+```pip install --pre torch torchvision torchaudio --index-url https://rocm.nightlies.amd.com/v2/gfx110X-dgpu/```
+
+RDNA 3.5 (Strix Halo/Ryzen AI Max+ 365):
+
+```pip install --pre torch torchvision torchaudio --index-url https://rocm.nightlies.amd.com/v2/gfx1151/```
+
+RDNA 4 (RX 9000 series):
+
+```pip install --pre torch torchvision torchaudio --index-url https://rocm.nightlies.amd.com/v2/gfx120X-all/```
+
 ### Intel GPUs (Windows and Linux)
 
 (Option 1) Intel Arc GPU users can install native PyTorch with torch.xpu support using pip. More information can be found [here](https://pytorch.org/docs/main/notes/get_start_xpu.html)
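The RDNA-generation-to-wheel-index mapping added in the hunk above can be summarized as a small lookup; `ROCM_NIGHTLY_INDEX` and `nightly_install_command` are illustrative names, not part of ComfyUI or the README — only the URLs come from the commit:

```python
# Experimental ROCm nightly wheel indexes per RDNA generation (from the README).
ROCM_NIGHTLY_INDEX = {
    "rdna3":   "https://rocm.nightlies.amd.com/v2/gfx110X-dgpu/",  # RX 7000 series
    "rdna3.5": "https://rocm.nightlies.amd.com/v2/gfx1151/",       # Strix Halo APUs
    "rdna4":   "https://rocm.nightlies.amd.com/v2/gfx120X-all/",   # RX 9000 series
}


def nightly_install_command(generation: str) -> str:
    """Build the pip command for the given RDNA generation (illustrative)."""
    try:
        index = ROCM_NIGHTLY_INDEX[generation]
    except KeyError:
        raise ValueError(f"unsupported generation: {generation!r}")
    return f"pip install --pre torch torchvision torchaudio --index-url {index}"


print(nightly_install_command("rdna4"))
```

The point of the mapping is that each wheel index targets a specific gfx architecture, so installing the wrong one for your GPU will not work.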
@@ -270,12 +288,6 @@ You can install ComfyUI in Apple Mac silicon (M1 or M2) with any recent macOS ve
 
 > **Note**: Remember to add your models, VAE, LoRAs etc. to the corresponding Comfy folders, as discussed in [ComfyUI manual installation](#manual-install-windows-linux).
 
-#### DirectML (AMD Cards on Windows)
-
-This is very badly supported and is not recommended. There are some unofficial builds of pytorch ROCm on windows that exist that will give you a much better experience than this. This readme will be updated once official pytorch ROCm builds for windows come out.
-
-```pip install torch-directml``` Then you can launch ComfyUI with: ```python main.py --directml```
-
 #### Ascend NPUs
 
 For models compatible with Ascend Extension for PyTorch (torch_npu). To get started, ensure your environment meets the prerequisites outlined on the [installation](https://ascend.github.io/docs/sources/ascend/quick_install.html) page. Here's a step-by-step guide tailored to your platform and installation method:
