OS: Arch Linux
Kernel: Linux 6.19.9-arch1-1
Python 3.13.12
torch 2.8.0+xpu
torchvision 0.23.0+xpu
Intel Arc B580
from terminal:
File "sd-scripts/library/custom_offloading_utils.py", line 136, in move_blocks
self.swap_weight_devices(block_to_cpu, block_to_cuda)
~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "sd-scripts/library/custom_offloading_utils.py", line 126, in swap_weight_devices
swap_weight_devices_cuda(self.device, block_to_cpu, block_to_cuda)
~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "sd-scripts/library/custom_offloading_utils.py", line 60, in swap_weight_devices_cuda
stream = torch.Stream(device="cuda")
RuntimeError: PyTorch is not linked with support for cuda devices
issue from https://github.com/kohya-ss/sd-scripts/blob/main/library/custom_offloading_utils.py#L124
def swap_weight_devices(self, block_to_cpu: nn.Module, block_to_cuda: nn.Module):
    if self.cuda_available:
        swap_weight_devices_cuda(self.device, block_to_cpu, block_to_cuda)
    else:
        swap_weight_devices_no_cuda(self.device, block_to_cpu, block_to_cuda)
As a workaround, I kept only the "swap_weight_devices_no_cuda" path, and it runs normally.
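For illustration, the branch could dispatch on the backend that is actually compiled into the torch build instead of assuming CUDA. This is only a sketch of the idea, not a patch to sd-scripts: `select_swap_path` and the `torch_mod` parameter are hypothetical, and a fake torch-like namespace stands in for a real `+xpu` build so the logic can be shown without an Intel GPU.

```python
from types import SimpleNamespace

def select_swap_path(torch_mod):
    """Pick a weight-swap implementation based on which accelerator
    backend this torch build actually supports.

    torch_mod mirrors the real torch module's cuda namespace;
    select_swap_path is a hypothetical helper, not sd-scripts code.
    """
    cuda = getattr(torch_mod, "cuda", None)
    if cuda is not None and cuda.is_available():
        return "swap_weight_devices_cuda"
    # XPU-only builds (like torch 2.8.0+xpu) report no CUDA support,
    # so they must never reach torch.Stream(device="cuda").
    return "swap_weight_devices_no_cuda"

# Stand-in for a torch 2.8.0+xpu build: no CUDA, XPU present.
xpu_build = SimpleNamespace(
    cuda=SimpleNamespace(is_available=lambda: False),
    xpu=SimpleNamespace(is_available=lambda: True),
)
print(select_swap_path(xpu_build))  # swap_weight_devices_no_cuda
```

With a real CUDA build the same helper would return the CUDA path, so the crash above only happens when `cuda_available` is set without consulting `torch.cuda.is_available()`.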