A Redis-based Human-in-the-Loop implementation for Hayhooks / Haystack Agents, integrated with Open WebUI for interactive tool approval workflows.
- Overview
- Architecture
- Prerequisites
- Quick Start with Docker Compose
- Manual Installation (Alternative)
- Usage
- Configuration
- Available Tools
- Adding Custom Tools
- Troubleshooting
- How It Works
- License
## Overview

This implementation allows users to approve or reject tool calls made by a Haystack Agent in real-time through Open WebUI's confirmation dialogs.
## Architecture

The system consists of two main components:

- `pipelines/hitl/pipeline_wrapper.py` - A Hayhooks pipeline containing a Haystack Agent with a Redis-based HITL confirmation strategy
- `open-webui-pipe.py` - A Pipe function deployed in Open WebUI that bridges the frontend with Hayhooks
Redis is used as a message broker to coordinate approval decisions between the Pipe function and the backend pipeline.
```
┌─────────────────┐        ┌─────────────────┐  HTTP/SSE  ┌─────────────────┐
│                 │        │                 │ ─────────▶ │                 │
│   Open WebUI    │ ─────▶ │  Pipe Function  │            │    Hayhooks     │
│   (Frontend)    │ ◀───── │  (open-webui-   │ ◀───────── │    (Server)     │
│                 │        │   pipe.py)      │ SSE Events │                 │
└─────────────────┘        └────────┬────────┘            └────────┬────────┘
                                    │                              │
                                    │      ┌─────────────────┐     │
                                    │      │                 │     │
                                    └────▶ │      Redis      │ ◀───┘
                                           │    (Broker)     │
                                           │                 │
                                           └─────────────────┘
```
- User sends message → Open WebUI frontend calls the Pipe function (`open-webui-pipe.py`)
- Pipe forwards request → Pipe function makes an HTTP POST to Hayhooks (`/hitl/run`)
- Agent processes → Hayhooks streams SSE events back to the Pipe function
- Tool call detected → Agent emits a `tool_call_start` event via SSE
- Approval requested → Pipe function shows a confirmation dialog to the user via Open WebUI
- User decides → Pipe function sends the approval/rejection to Redis (`LPUSH`)
- Pipeline unblocks → Hayhooks reads the decision from Redis (`BLPOP`) and continues
- Result streamed → Pipe function receives the response and streams it back to the Open WebUI frontend
```
Open WebUI                                Hayhooks
Frontend            Pipe Function         /Agent                  Redis
    │                     │                  │                      │
    │ "What's the weather?"                  │                      │
    │────────────────────▶│                  │                      │
    │                     │  POST /hitl/run  │                      │
    │                     │─────────────────▶│                      │
    │                     │                  │                      │
    │                     │ SSE: tool_call_start                    │
    │                     │◀─────────────────│                      │
    │                     │                  │   BLPOP (waiting)    │
    │                     │                  │─────────────────────▶│
    │ 🔧 Approve tool?    │                  │                      │
    │◀────────────────────│                  │                      │
    │                     │                  │                      │
    │ ✅ Yes / ❌ No       │                  │                      │
    │────────────────────▶│                  │                      │
    │                     │        LPUSH "approved"                 │
    │                     │────────────────────────────────────────▶│
    │                     │                  │      (unblocks)      │
    │                     │                  │◀─────────────────────│
    │                     │                  │                      │
    │                     │ SSE: text (result)                      │
    │                     │◀─────────────────│                      │
    │ "Weather is cloudy" │                  │                      │
    │◀────────────────────│                  │                      │
    │                     │                  │                      │
```
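The approval handshake above boils down to a single `LPUSH`/`BLPOP` pair. The sketch below demonstrates the blocking semantics using an in-memory queue as a stand-in for Redis; in the real setup these are redis-py's `lpush()`/`blpop()` calls against the shared broker, and the key name here is illustrative, not the one the pipeline actually uses.

```python
import threading
import queue

# In-memory stand-in for the two Redis calls the system relies on.
# A real deployment would use redis-py against the shared broker.
class ApprovalChannel:
    def __init__(self):
        self._q = queue.Queue()

    def lpush(self, key, value):
        # Pipe side: publish the user's decision.
        self._q.put(value)

    def blpop(self, key, timeout=None):
        # Pipeline side: block until a decision arrives (or time out).
        try:
            return key, self._q.get(timeout=timeout)
        except queue.Empty:
            return None

channel = ApprovalChannel()
results = []

def pipeline_side(out):
    # The agent blocks here until the Pipe function pushes a decision.
    item = channel.blpop("hitl:approval:req-1", timeout=5)  # illustrative key
    out.append(item[1] if item else "timeout")

worker = threading.Thread(target=pipeline_side, args=(results,))
worker.start()

# Pipe side: the user clicked "Confirm" in Open WebUI.
channel.lpush("hitl:approval:req-1", "approved")
worker.join()
# results[0] is now "approved"
```

The key property this demonstrates is that the pipeline thread stays blocked, without polling, until the decision is pushed from the other side.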
## Prerequisites

- Docker and Docker Compose
- OpenAI API key (or compatible LLM endpoint)
## Quick Start with Docker Compose

The easiest way to get started is using Docker Compose, which sets up all services (Redis, Hayhooks, Open WebUI) automatically.
```shell
export OPENAI_API_KEY="your-api-key-here"
docker compose up -d --build
```

This starts:
- Redis on port 6379
- Hayhooks on port 1416 (with the HITL pipeline loaded)
- Open WebUI on port 3000
Then follow the instructions in the Deploy and Configure the Pipe Function section to deploy the Pipe function to Open WebUI.
## Manual Installation (Alternative)

If you prefer to run services manually without Docker Compose:
```shell
# Create virtual environment
python -m venv .venv
source .venv/bin/activate  # Linux/macOS
# .venv\Scripts\activate   # Windows

# Install dependencies for the Hayhooks server
pip install -r requirements.txt
```

Start Redis:

```shell
docker run -d --name redis -p 6379:6379 redis:alpine
```

Start Hayhooks:

```shell
# Set your OpenAI API key
export OPENAI_API_KEY="your-api-key-here"

# Redis defaults to localhost:6379, override if needed:
# export REDIS_HOST="localhost"
# export REDIS_PORT="6379"

# Start Hayhooks and deploy the pipeline
hayhooks run
```

Hayhooks will start on http://localhost:1416 and automatically load `pipelines/hitl/pipeline_wrapper.py`.
Start Open WebUI:

```shell
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -e WEBUI_AUTH=False \
  -v open-webui:/app/backend/data \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/open-webui/open-webui:main
```

## Deploy and Configure the Pipe Function

NOTE: This step could be automated, but that would require enabling authentication in Open WebUI to obtain a JWT token for use in a curl command. Since this is a one-time setup for the demo, we do it manually:
- Open http://localhost:3000 in your browser
- Go to Settings → Admin Settings
- Select Functions from the top bar
- Copy the contents of `open-webui-pipe.py` into the editor
- Give the function a title, name, and description
- Save and enable the function
- Important: Update the Valves for manual setup:
  - `BASE_URL`: `http://host.docker.internal:1416`
  - `REDIS_HOST`: `host.docker.internal`
## Usage

- In Open WebUI, start a new chat
- Select the Hayhooks HITL Pipe as your model (this routes requests through the Pipe function)
- Ask something that triggers a tool call, e.g.:
- "What's the weather in Rome?"
- "What time is it in UTC?"
- The Pipe function will forward your message to Hayhooks
- When the Agent decides to call a tool, a confirmation dialog will appear
- Click Confirm to approve or Cancel to reject the tool execution
- The response will stream back through the Pipe function to your chat
## Configuration

### Environment Variables (Hayhooks)

| Variable | Default | Description |
|---|---|---|
| `OPENAI_API_KEY` | - | Required. Your OpenAI API key |
| `REDIS_HOST` | `localhost` | Redis server hostname |
| `REDIS_PORT` | `6379` | Redis server port |
### Pipe Function Valves

| Valve | Docker Compose | Manual Setup | Description |
|---|---|---|---|
| `BASE_URL` | `http://hayhooks:1416` | `http://host.docker.internal:1416` | Hayhooks server URL |
| `PIPELINE_NAME` | `hitl` | `hitl` | Pipeline endpoint name |
| `REDIS_HOST` | `redis` | `host.docker.internal` | Redis host |
| `REDIS_PORT` | `6379` | `6379` | Redis port |
Note: The defaults in `open-webui-pipe.py` are configured for Docker Compose. For manual setup, update the Valves via Admin Panel → Functions → Hayhooks HITL Pipe → ⚙️
## Available Tools

The default configuration includes two example tools:

| Tool | Description |
|---|---|
| `weather_tool` | Returns weather information for a location |
| `get_time` | Returns the current time in a timezone |
## Adding Custom Tools

Edit `pipelines/hitl/pipeline_wrapper.py` to add your own tools:
```python
from haystack.tools import create_tool_from_function

def my_custom_tool(param: str) -> str:
    """Your tool description."""
    return f"Result for {param}"

custom_tool = create_tool_from_function(
    function=my_custom_tool,
    name="my_custom_tool",
    description="Description shown to the LLM",
)

# Add to agent tools list and confirmation_strategies dict
self.agent = Agent(
    # ...
    tools=[weather_tool, time_tool, custom_tool],
    confirmation_strategies={
        # ...
        custom_tool.name: self.confirmation_strategy,
    },
)
```

## Troubleshooting

Pipe cannot reach Hayhooks:

- Ensure Hayhooks is running: `curl http://localhost:1416/status`
- Docker Compose: use `http://hayhooks:1416` in the Valve settings
- Manual setup: use `http://host.docker.internal:1416` in the Valve settings
Redis connection problems:

- Check Redis is running: `redis-cli ping` (should return `PONG`)
- Docker Compose: set the `REDIS_HOST` valve to `redis`
- Manual setup: set the `REDIS_HOST` valve to `host.docker.internal`
- Verify the host/port configuration in both the pipeline and the pipe
Approvals time out:

- The default timeout is 5 minutes
- Check Redis connectivity between all services
- Verify the pipe is sending approvals to the same Redis instance the pipeline reads from
SSE streaming issues:

- Ensure the `Accept: text/event-stream` header is set
- Check for proxy/firewall issues that might buffer SSE
## How It Works

### RedisConfirmationStrategy (pipeline side)

The `RedisConfirmationStrategy` class implements Haystack's `ConfirmationStrategy` interface:

- When a tool call is initiated, it emits a `tool_call_start` event to the async queue (obtained from `confirmation_strategy_context`)
- It then waits on Redis `BLPOP` for an approval decision (using the Redis client from `confirmation_strategy_context`)
- Once approval is received, it returns a `ToolExecutionDecision` object
- The agent proceeds to execute (or skip) the tool based on the decision
The per-request state (`event_queue`, `redis_client`) is passed via the `confirmation_strategy_context` parameter when calling `agent.run_async()`.
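A minimal sketch of that decision loop is shown below. The class and method names other than those mentioned above are illustrative assumptions, and a plain dict stands in for the real `ToolExecutionDecision`; the actual Haystack signatures differ.

```python
# Illustrative sketch of the BLPOP decision loop, not the real
# Haystack/Hayhooks API. Key name and return shape are assumptions.
class RedisConfirmationStrategy:
    def __init__(self, approval_key, timeout_s=300):
        self.approval_key = approval_key
        self.timeout_s = timeout_s  # default approval timeout: 5 minutes

    def wait_for_decision(self, tool_name, redis_client):
        # The tool_call_start event has already been emitted to the async
        # queue obtained from confirmation_strategy_context. Now block
        # until the Pipe function LPUSHes the user's decision.
        item = redis_client.blpop([self.approval_key], timeout=self.timeout_s)
        if item is None:
            # Timed out waiting for the user: treat as a rejection.
            return {"tool_name": tool_name, "execute": False}
        _, raw = item
        decision = raw.decode() if isinstance(raw, bytes) else raw
        # In the real pipeline this would be a ToolExecutionDecision.
        return {"tool_name": tool_name, "execute": decision == "approved"}
```

Because `BLPOP` blocks server-side, the pipeline consumes no CPU while waiting, and the timeout gives a natural fallback path when the user never responds.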
### Pipe Function (Open WebUI side)

The Pipe function acts as a bridge between the Open WebUI frontend and Hayhooks:

- Receives chat messages from the Open WebUI frontend
- Forwards the request to Hayhooks via HTTP POST (`/hitl/run`)
- Streams SSE events from Hayhooks back to the frontend
- When it receives a `tool_call_start` event, it shows a confirmation dialog to the user via `__event_call__`
- Based on the user's response, it sends `approved` or `rejected` to Redis via `LPUSH`
- Continues streaming the response back to the Open WebUI frontend
## License

MIT