Add Intel Arc XPU backend support#443

Open
LaNataliaaa wants to merge 1 commit into Haidra-Org:main from LaNataliaaa:feat/intel-arc-xpu-support

Conversation

@LaNataliaaa

Summary

  • Add a runtime backend abstraction with explicit XPU and oneAPI selector support
  • Add Linux XPU install/run scripts for Intel Arc
  • Route XPU safety checks to the CPU and fix backend-specific torch cache and device handling
  • Document Intel Arc usage and add focused backend tests

Validation

  • python3 -m compileall horde_worker_regen download_models.py run_worker.py tests/test_runtime_backend.py
  • python3 -m unittest tests.test_runtime_backend
  • bash -n runtime-xpu.sh update-runtime-xpu.sh horde-bridge-xpu.sh

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings April 12, 2026 21:26

Copilot AI left a comment


Pull request overview

Adds an explicit runtime backend abstraction to support Intel Arc (PyTorch XPU/oneAPI) alongside existing CUDA/ROCm/DirectML paths, with new install/bridge scripts, documentation, and targeted tests.

Changes:

  • Introduces HordeRuntimeBackend to centralize backend selection, environment setup (ONEAPI selector), ComfyUI args, and torch device/cache handling.
  • Threads the backend configuration through worker startup, process management, inference cache clearing, and model download flows.
  • Adds Intel XPU micromamba environment + Linux scripts, plus README documentation and a focused backend unit test.
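The review above describes the abstraction but not its code. As a rough illustration only, a backend selector along these lines might look as follows; the enum members, flag names, and the CUDA default are assumptions for the sketch, not the PR's actual `HordeRuntimeBackend` API:

```python
from enum import Enum


class RuntimeBackend(Enum):
    """Hypothetical sketch of a backend enum in the spirit of HordeRuntimeBackend."""

    CUDA = "cuda"
    ROCM = "rocm"
    DIRECTML = "directml"
    XPU = "xpu"


def select_backend(
    *, cuda: bool = False, amd: bool = False, directml: bool = False, xpu: bool = False
) -> RuntimeBackend:
    """Map mutually exclusive CLI flags to a single backend, defaulting to CUDA."""
    flags = {"cuda": cuda, "rocm": amd, "directml": directml, "xpu": xpu}
    selected = [name for name, enabled in flags.items() if enabled]
    if len(selected) > 1:
        raise ValueError(
            f"Backend flags are mutually exclusive; got: {', '.join(selected)}"
        )
    return RuntimeBackend(selected[0]) if selected else RuntimeBackend.CUDA
```

Centralizing this choice in one place is what lets the rest of the worker (device discovery, cache clearing, ComfyUI args) branch on a single value instead of scattered flags.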

Reviewed changes

Copilot reviewed 15 out of 15 changed files in this pull request and generated 3 comments.

Show a summary per file
File Description
update-runtime-xpu.sh Adds micromamba-based Intel XPU runtime installer/updater.
runtime-xpu.sh Wrapper to auto-install XPU runtime then run commands in the micromamba env.
horde-bridge-xpu.sh New bridge script to download models + start worker with --xpu (and optional oneAPI selector).
environment.xpu.yaml Minimal conda env definition for the Intel XPU runtime.
horde_worker_regen/runtime_backend.py New backend abstraction + torch device enumeration/cache clearing helpers.
horde_worker_regen/run_worker.py CLI adds --xpu/--oneapi-device-selector; passes backend through startup.
horde_worker_regen/process_management/main_entry_point.py Threads backend into HordeWorkerProcessManager.
horde_worker_regen/process_management/process_manager.py Uses backend for device discovery and safety-on-GPU behavior gating; passes backend to child processes.
horde_worker_regen/process_management/worker_entry_points.py Applies backend env + ComfyUI args when launching inference workers; routes safety behavior via backend.
horde_worker_regen/process_management/inference_process.py Clears torch cache via backend-aware helper (CUDA vs XPU vs DirectML).
horde_worker_regen/download_models.py Applies backend env + ComfyUI args during model download initialization.
download_models.py Adds --xpu/--oneapi-device-selector CLI plumbing for model downloads.
tests/test_runtime_backend.py Adds unit tests for backend validation and torch device/cache routing.
README.md Documents Intel Arc/XPU setup and usage, including oneAPI selector guidance and config notes.
README_advanced.md Adds manual Intel XPU install/run instructions and pip index guidance.


Comment thread update-runtime-xpu.sh
Comment on lines +22 to +30
```shell
CONDA_ENVIRONMENT_FILE=environment.xpu.yaml
PYTORCH_XPU_INDEX=https://download.pytorch.org/whl/xpu
PYPI_INDEX=https://pypi.org/simple

wget -qO- https://github.com/mamba-org/micromamba-releases/releases/latest/download/micromamba-linux-64.tar.bz2 | tar -xvj -C "${SCRIPT_DIR}"
if [ ! -f "$SCRIPT_DIR/conda/envs/linux/bin/python" ]; then
    ${SCRIPT_DIR}/bin/micromamba create --no-shortcuts -r "$SCRIPT_DIR/conda" -n linux -f ${CONDA_ENVIRONMENT_FILE} -y
fi
${SCRIPT_DIR}/bin/micromamba update --no-shortcuts -r "$SCRIPT_DIR/conda" -n linux -f ${CONDA_ENVIRONMENT_FILE} -y
```

Copilot AI Apr 12, 2026


CONDA_ENVIRONMENT_FILE is set to a relative path (environment.xpu.yaml). If this script is invoked from a different working directory (e.g., via an absolute path or from another script), micromamba will fail to find the env file. Prefer building an absolute path using SCRIPT_DIR for the -f argument (and any other repo-local files).
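One way to apply this suggestion (variable names taken from the excerpt; the `cd`/`pwd` idiom is just one common option):

```shell
# Resolve the script's own directory, then anchor repo-local files to it so
# the installer works regardless of the caller's working directory.
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
CONDA_ENVIRONMENT_FILE="${SCRIPT_DIR}/environment.xpu.yaml"

# micromamba then receives an absolute path:
#   ... -f "${CONDA_ENVIRONMENT_FILE}" ...
```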

Comment on lines +28 to +32

```python
if len(selected_backends) > 1:
    raise ValueError(
        "Backend flags are mutually exclusive. "
        f"Choose only one of: {', '.join(selected_backends)}.",
    )
```

Copilot AI Apr 12, 2026


The mutually-exclusive backend error message currently lists only the flags that were already selected (via selected_backends), which can read oddly as “Choose only one of: --amd, --xpu” when both were passed. Consider listing the full set of supported backend flags in the message (and optionally include what was provided) to make the parser error more actionable.
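A sketch of the suggested wording; the flag tuple here is a guess at the supported set, inferred from the CUDA/ROCm/DirectML/XPU paths discussed in this PR, not taken from the diff:

```python
SUPPORTED_BACKEND_FLAGS = ("--cuda", "--amd", "--directml", "--xpu")  # assumed set


def validate_backend_flags(selected_backends: list[str]) -> None:
    """Reject multiple backend flags, naming both what was passed and what is allowed."""
    if len(selected_backends) > 1:
        raise ValueError(
            f"Backend flags are mutually exclusive; got: {', '.join(selected_backends)}. "
            f"Choose exactly one of: {', '.join(SUPPORTED_BACKEND_FLAGS)}.",
        )
```

Including both the offending flags and the full allowed set turns "Choose only one of: --amd, --xpu" into an error the user can act on directly.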

Comment thread README.md
Comment on lines +268 to +269
```markdown
- Not all updates require this, but run it if unsure
- **Advanced users**: see [README_advanced.md](README_advanced.md) for manual options
```

Copilot AI Apr 12, 2026


The nested bullets under “Run update-runtime for your OS…” are mis-indented, so the Markdown list renders inconsistently (the “Not all updates…” and “Advanced users…” bullets don’t align under step 4). Align the indentation so these remain sub-bullets of step 4.

Suggested change (indentation in the extracted text was lost; the fix adds the indentation that nests both lines under step 4):

```diff
-- Not all updates require this, but run it if unsure
-- **Advanced users**: see [README_advanced.md](README_advanced.md) for manual options
+   - Not all updates require this, but run it if unsure
+   - **Advanced users**: see [README_advanced.md](README_advanced.md) for manual options
```
