Merged
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -39,7 +39,7 @@ repos:
- python-dotenv
- aiohttp
- horde_safety==0.2.3
- torch==2.5.0
- torch==2.7.1
- ruamel.yaml
- horde_engine==2.20.12
- horde_sdk==0.17.1
4 changes: 2 additions & 2 deletions README_advanced.md
@@ -104,7 +104,7 @@ exit
- `git clone https://github.com/Haidra-Org/horde-worker-reGen.git`
- `cd .\horde-worker-reGen\`
- Install the requirements:
- CUDA: `pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu124`
- CUDA: `pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu126`
- RoCM: `pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/rocm6.2`

### Run worker
@@ -115,7 +115,7 @@ exit
Pressing control-c will stop the worker, but it will first complete any jobs in progress before exiting. Avoid hard-killing it unless you are seeing many major errors. You can force-kill it by repeatedly pressing control-c or by sending a SIGKILL.

### Important note if you manually manage your venvs
- You should be running `python -m pip install -r requirements.txt -U https://download.pytorch.org/whl/cu124` every time you `git pull`. (Use `/whl/rocm6.2` instead if applicable)
- You should run `python -m pip install -r requirements.txt -U --extra-index-url https://download.pytorch.org/whl/cu126` every time you `git pull`. (Use `/whl/rocm6.2` instead if applicable)
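The install command in the note above can be sketched as a small helper that builds the pip invocation for a given PyTorch wheel index. The function name is hypothetical and not part of horde-worker-reGen; it only illustrates how the `cu126`/`rocm6.2` backends from this PR map onto the same command shape:

```python
def pip_install_command(backend: str = "cu126") -> str:
    """Build the pip command from the docs above for a given PyTorch wheel index.

    `backend` is e.g. "cu126" for CUDA 12.6 or "rocm6.2" for ROCm.
    Illustrative only; this helper does not exist in the worker's codebase.
    """
    index_url = f"https://download.pytorch.org/whl/{backend}"
    return f"python -m pip install -r requirements.txt -U --extra-index-url {index_url}"

print(pip_install_command())
print(pip_install_command("rocm6.2"))
```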


## Advanced users, running on directml
2 changes: 1 addition & 1 deletion horde_worker_regen/__init__.py
@@ -8,7 +8,7 @@

ASSETS_FOLDER_PATH = Path(__file__).parent / "assets"

__version__ = "10.0.7"
__version__ = "10.1.0"


import pkg_resources # noqa: E402
2 changes: 1 addition & 1 deletion horde_worker_regen/_version_meta.json
@@ -1,5 +1,5 @@
{
"recommended_version": "10.0.7",
"recommended_version": "10.1.0",
"required_min_version": "9.0.2",
"required_min_version_update_date": "2024-09-26",
"required_min_version_info": {
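The version fields bumped above are dotted strings, so comparing them numerically requires parsing. A minimal sketch, assuming a hypothetical `parse_version` helper (the worker's actual update check is not shown in this diff):

```python
def parse_version(version: str) -> tuple[int, ...]:
    # Hypothetical helper: turn "10.1.0" into (10, 1, 0) so versions
    # compare numerically rather than lexicographically as strings.
    return tuple(int(part) for part in version.split("."))

# The values from _version_meta.json after this PR.
meta = {"recommended_version": "10.1.0", "required_min_version": "9.0.2"}

installed = "10.0.7"  # the version this PR bumps away from
update_recommended = parse_version(installed) < parse_version(meta["recommended_version"])
meets_minimum = parse_version(installed) >= parse_version(meta["required_min_version"])
print(update_recommended, meets_minimum)
```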
8 changes: 4 additions & 4 deletions horde_worker_regen/bridge_data/data_model.py
@@ -201,10 +201,10 @@ def validate_performance_modes(self) -> reGenBridgeData:
)

if self.high_memory_mode:
if self.max_threads == 1:
logger.warning(
"High memory mode is enabled, you should consider setting max_threads to 2.",
)
# if self.max_threads == 1:
# logger.warning(
# "High memory mode is enabled, you should consider setting max_threads to 2.",
# )

if self.queue_size == 0:
logger.warning(
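The validator change above drops the `max_threads` warning while keeping the `queue_size` one. A minimal sketch of the remaining check, using a stdlib dataclass and `warnings` in place of the real pydantic model and loguru logger (class name, exact nesting, and message wording are assumptions):

```python
import warnings
from dataclasses import dataclass

@dataclass
class BridgeDataSketch:
    """Illustrative stand-in for reGenBridgeData; the real class is a pydantic model."""
    high_memory_mode: bool = False
    max_threads: int = 1
    queue_size: int = 1

    def validate_performance_modes(self) -> "BridgeDataSketch":
        # After this PR, high_memory_mode no longer warns about max_threads == 1;
        # only the queue_size check remains in this branch.
        if self.high_memory_mode and self.queue_size == 0:
            warnings.warn("High memory mode is enabled; consider a queue_size of at least 1.")
        return self
```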
4 changes: 2 additions & 2 deletions horde_worker_regen/process_management/process_manager.py
@@ -1351,8 +1351,8 @@ def __init__(
raise ValueError(
"VRAM heavy models detected. Total RAM is less than 24GB. "
"This is not enough RAM to run the worker."
"Disable `Stable Cascade 1.0` by adding it to your `models_to_skip` or remove it from your "
"`models_to_load`.",
"Disable the large models by adding it to your `models_to_skip` or remove it from your "
"`models_to_load`. Large models include: " + ", ".join(VRAM_HEAVY_MODELS),
)

self.target_ram_overhead_bytes = min(self.target_ram_overhead_bytes, int(20 * 1024 * 1024 * 1024 / 2))
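The guard whose message is reworded above can be sketched as follows. The function name and signature are illustrative; the real check lives inline in `__init__` in `process_manager.py`, and the model names in the test below are taken from the old message, not from the actual `VRAM_HEAVY_MODELS` constant:

```python
from __future__ import annotations

GIB = 1024 ** 3

def check_vram_heavy_models(
    models_to_load: list[str],
    vram_heavy_models: set[str],
    total_ram_bytes: int,
) -> None:
    """Refuse to start when a VRAM-heavy model is selected but total RAM is under 24GB."""
    heavy = [m for m in models_to_load if m in vram_heavy_models]
    if heavy and total_ram_bytes < 24 * GIB:
        raise ValueError(
            "VRAM heavy models detected. Total RAM is less than 24GB. "
            "This is not enough RAM to run the worker. "
            "Disable the large models by adding them to your `models_to_skip` "
            "or removing them from your `models_to_load`. "
            "Large models include: " + ", ".join(heavy)
        )
```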
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "horde_worker_regen"
version = "10.0.7"
version = "10.1.0"
description = "Allows you to connect to the AI Horde and generate images for users."
authors = [
{name = "tazlin", email = "tazlin.on.github@gmail.com"},
2 changes: 1 addition & 1 deletion requirements.rocm.txt
@@ -1,4 +1,4 @@
torch==2.5.0
torch==2.7.1
qrcode==7.4.2 # >8 breaks horde-engine 2.20.12 via the qr code generation nodes

certifi # Required for SSL cert resolution
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,4 +1,4 @@
torch==2.5.0
torch==2.7.1
qrcode==7.4.2 # >8 breaks horde-engine 2.20.12 via the qr code generation nodes

certifi # Required for SSL cert resolution
2 changes: 1 addition & 1 deletion tox.ini
@@ -28,7 +28,7 @@ deps =
pytest-sugar
pytest-cov
requests
-r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu124
-r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu126
commands =
pytest tests {posargs} --cov

6 changes: 3 additions & 3 deletions update-runtime.cmd
@@ -67,16 +67,16 @@ micromamba.exe shell hook -s cmd.exe %MAMBA_ROOT_PREFIX% -v
call "%MAMBA_ROOT_PREFIX%\condabin\mamba_hook.bat"
call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" activate windows

python -s -m pip install torch==2.5.0 --index-url https://download.pytorch.org/whl/cu124 -U
python -s -m pip install torch==2.7.1 --index-url https://download.pytorch.org/whl/cu126 -U

if defined hordelib (
python -s -m pip uninstall -y hordelib horde_engine horde_model_reference
python -s -m pip install horde_engine horde_model_reference --extra-index-url https://download.pytorch.org/whl/cu124
python -s -m pip install horde_engine horde_model_reference --extra-index-url https://download.pytorch.org/whl/cu126
) else (
if defined scribe (
python -s -m pip install -r requirements-scribe.txt
) else (
python -s -m pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu124 -U
python -s -m pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu126 -U
)
)
call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" deactivate
4 changes: 2 additions & 2 deletions update-runtime.sh
@@ -35,7 +35,7 @@ ${SCRIPT_DIR}/bin/micromamba create --no-shortcuts -r "$SCRIPT_DIR/conda" -n lin

if [ "$hordelib" = true ]; then
${SCRIPT_DIR}/bin/micromamba run -r "$SCRIPT_DIR/conda" -n linux python -s -m pip uninstall -y hordelib horde_engine horde_sdk horde_model_reference
${SCRIPT_DIR}/bin/micromamba run -r "$SCRIPT_DIR/conda" -n linux python -s -m pip install horde_engine horde_model_reference --extra-index-url https://download.pytorch.org/whl/cu124
${SCRIPT_DIR}/bin/micromamba run -r "$SCRIPT_DIR/conda" -n linux python -s -m pip install horde_engine horde_model_reference --extra-index-url https://download.pytorch.org/whl/cu126
else
${SCRIPT_DIR}/bin/micromamba run -r "$SCRIPT_DIR/conda" -n linux python -s -m pip install -r "$SCRIPT_DIR/requirements.txt" -U --extra-index-url https://download.pytorch.org/whl/cu124
${SCRIPT_DIR}/bin/micromamba run -r "$SCRIPT_DIR/conda" -n linux python -s -m pip install -r "$SCRIPT_DIR/requirements.txt" -U --extra-index-url https://download.pytorch.org/whl/cu126
fi
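Both update scripts hardcode the `cu126` index this PR switches to. A hedged shell sketch of the URL construction they share; `BACKEND` and `INDEX_URL` are illustrative variable names that the scripts do not actually define:

```shell
# Choose the PyTorch extra index used by the update scripts above.
# The scripts hardcode cu126; ROCm builds would use e.g. rocm6.2 instead.
BACKEND="${BACKEND:-cu126}"
INDEX_URL="https://download.pytorch.org/whl/${BACKEND}"
echo "python -s -m pip install -r requirements.txt --extra-index-url ${INDEX_URL} -U"
```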