A collection of helper nodes for working with LLM APIs in ComfyUI, intended to complement other LLM custom nodes.
```
cd ComfyUI/custom_nodes
git clone https://github.com/bedovyy/ComfyUI-LLM-Helper
cd ComfyUI-LLM-Helper
pip install -r requirements.txt
```

- Set your LLM server's `base_url` and select an API key from environment variables (loaded securely from `ComfyUI/.env`).
- Click "Update model names" to query the real `/models` endpoint and instantly refresh the `model_name` dropdown with the available models, then use the outputs downstream.
- Sends an unload request to `llama-server` when it is running in router mode, freeing the model from memory.
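The "Update model names" behavior described above presumably parses an OpenAI-compatible `GET /models` response into dropdown entries. A minimal sketch of that parsing step, assuming the standard response shape (the `parse_model_names` helper and the sample model ids are illustrative, not part of this repo):

```python
import json

def parse_model_names(models_json: str) -> list[str]:
    """Extract model ids from an OpenAI-compatible GET /models response body."""
    payload = json.loads(models_json)
    # OpenAI-compatible servers return {"object": "list", "data": [{"id": ...}, ...]}
    return [entry["id"] for entry in payload.get("data", [])]

# Hypothetical response from a llama-server /models endpoint:
sample = '{"object": "list", "data": [{"id": "qwen2.5-7b-instruct"}, {"id": "llama-3.1-8b"}]}'
print(parse_model_names(sample))  # → ['qwen2.5-7b-instruct', 'llama-3.1-8b']
```

In practice the node would fetch `{base_url}/models` with the configured API key and feed the resulting list into the `model_name` widget.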