OpenClaw channel plugin for Stream Chat. Connects as a bot user via WebSocket, normalizes inbound messages into OpenClaw envelope format, and delivers agent responses using Stream Chat's AI streaming pattern (partialUpdateMessage + ai_indicator events).
- OpenClaw >= 2026.2.13
- A Stream Chat application (API key + secret from the Stream Dashboard)
- Node.js >= 20
```shell
cd openclaw-channel-streamchat
npm install
```

You have two options depending on your situation:
Option A — Fresh app (recommended for first-time setup)
Use setup-app.ts if you are starting from a new Stream Chat app. It creates the bot and test users, generates their tokens, creates a test channel, and writes both ~/.openclaw/openclaw.json and scripts/.env automatically:
```shell
STREAM_API_KEY=your_api_key STREAM_API_SECRET=your_api_secret npx tsx scripts/setup-app.ts
```

After this, skip to step 4.
Option B — Existing app (bot token only)
Use generate-bot-token.ts if the app and channel already exist and you only need to mint or rotate the bot JWT. It prints the token to stdout — copy it into ~/.openclaw/openclaw.json manually:
```shell
STREAM_API_KEY=your_api_key STREAM_API_SECRET=your_api_secret npx tsx scripts/generate-bot-token.ts
```

Note: Pass the API secret inline as shown above. It is only needed by these two provisioning scripts and should not be stored in scripts/.env.
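For reference, the token these scripts mint is a standard HS256 JWT signed with the API secret, carrying the bot's user id as a claim. A minimal sketch using only Node's built-in crypto (the exact claim set that stream-chat's `createToken` emits may differ; treat this as illustrative, and use the provisioning scripts for real tokens):

```typescript
import { createHmac } from "node:crypto";

// Illustrative sketch of minting a Stream-style user JWT (HS256) with only
// the Node stdlib. Real provisioning should go through generate-bot-token.ts,
// which uses the stream-chat SDK's createToken().
function mintBotToken(apiSecret: string, userId: string): string {
  const b64url = (obj: object) =>
    Buffer.from(JSON.stringify(obj)).toString("base64url");
  const header = b64url({ alg: "HS256", typ: "JWT" });
  const payload = b64url({ user_id: userId }); // claim set is an assumption
  const signature = createHmac("sha256", apiSecret)
    .update(`${header}.${payload}`)
    .digest("base64url");
  return `${header}.${payload}.${signature}`;
}
```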
If you used Option B, add the channel config and plugin entry to ~/.openclaw/openclaw.json manually:
```shell
openclaw gateway restart
```

The plugin will connect to Stream Chat, watch all channels where the bot is a member, and start processing messages.
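The connect-and-watch step boils down to two SDK calls: authenticate as the bot user, then query every channel the bot is a member of with `watch: true`. A sketch against a minimal structural interface whose method names mirror the stream-chat SDK (the exact parameter and return shapes here are assumptions):

```typescript
// Minimal structural slice of the stream-chat client this sketch assumes.
interface ChatClient {
  connectUser(user: { id: string }, token: string): Promise<unknown>;
  queryChannels(
    filter: Record<string, unknown>,
    sort: Record<string, number>,
    options: { watch: boolean },
  ): Promise<unknown[]>;
}

// Connect as the bot and start watching every channel it belongs to.
async function connectAndWatch(
  client: ChatClient,
  botUserId: string,
  botUserToken: string,
): Promise<number> {
  await client.connectUser({ id: botUserId }, botUserToken);
  const channels = await client.queryChannels(
    { members: { $in: [botUserId] } }, // channels where the bot is a member
    { last_message_at: -1 },
    { watch: true }, // subscribe to WebSocket events for each channel
  );
  return channels.length;
}
```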
All test scripts live in scripts/ and load credentials from scripts/.env (populated by setup-app.ts). The plugin itself reads only from ~/.openclaw/openclaw.json — scripts/.env is not used at runtime.
See scripts/.env.example for the expected variables. You can also pass any variable inline to override the file:
```shell
STREAM_API_KEY=... TEST_USER_TOKEN=... npx tsx scripts/chat-client.ts
```

Lists all channels the test user belongs to:

```shell
npx tsx scripts/discover-channels.ts
```

Connects as a test user, watches a channel, and lets you send messages interactively while printing incoming bot responses and AI indicator events:
```shell
# Auto-discover channels and use the first one
npx tsx scripts/chat-client.ts

# Specify a channel
npx tsx scripts/chat-client.ts myChannelId

# Send a single message
npx tsx scripts/chat-client.ts myChannelId "Hello bot"
```

Commands inside the interactive client:
| Command | Description |
|---|---|
| `/thread <parentId> <text>` | Send a thread reply |
| `/quote <messageId> <text>` | Send a quoted reply |
| `/quit` | Disconnect and exit |
Sends a message and waits for the bot to respond, verifying the full streaming lifecycle (placeholder message, AI indicators, partial updates, final update):
```shell
npx tsx scripts/test-roundtrip.ts
```

Expected output:

```
[NEW MSG][chatgpt] [AI]: (no text)   # empty placeholder
[AI INDICATOR] AI_STATE_THINKING     # thinking indicator
[AI INDICATOR] AI_STATE_GENERATING   # generating indicator
[STREAMING] 2 + 2 = 4.               # partial update
[FINAL] 2 + 2 = 4.                   # final update (generating: false)
[AI INDICATOR] cleared               # indicator cleared
✓ Round-trip test PASSED — got bot response.
```
Sends a parent message, waits for the bot's response, then sends a thread reply and verifies the bot responds inside the thread:
```shell
npx tsx scripts/test-thread.ts
```

Inbound flow:
- Bot receives a `message.new` event via WebSocket
- Plugin filters out the bot's own messages and AI-generated messages
- Builds an envelope with thread/reply context wrappers (`[Thread]`, `[Replying]`)
- Dispatches to the OpenClaw agent pipeline
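The filter and envelope steps above can be sketched as pure functions. The `[Thread]`/`[Replying]` wrapper format shown here is illustrative, not the plugin's exact envelope syntax, and the `InboundMessage` shape is a hypothetical simplification:

```typescript
// Hypothetical, simplified view of an inbound Stream Chat message.
interface InboundMessage {
  text: string;
  userId: string;
  aiGenerated: boolean;
  parentText?: string; // set when the message is a thread reply
  quotedText?: string; // set when the message quotes another message
}

// Filter step: drop the bot's own messages and AI-generated ones.
function shouldProcess(msg: InboundMessage, botUserId: string): boolean {
  return msg.userId !== botUserId && !msg.aiGenerated;
}

// Envelope step: wrap thread/quote context around the raw text
// (wrapper format is illustrative).
function buildEnvelopeText(msg: InboundMessage): string {
  let text = msg.text;
  if (msg.quotedText) text = `[Replying to "${msg.quotedText}"] ${text}`;
  if (msg.parentText) text = `[Thread "${msg.parentText}"] ${text}`;
  return text;
}
```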
Outbound flow (streaming):
- Creates an empty placeholder message with `ai_generated: true`
- Sends `ai_indicator.update` with `AI_STATE_THINKING`
- On the first text chunk, switches to `AI_STATE_GENERATING`
- Progressively updates the message via `partialUpdateMessage` with `generating: true`
- On completion, sends a final update with `generating: false` and clears the indicator
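The lifecycle above can be sketched as a small driver. The method names mirror the stream-chat SDK (`sendMessage`, `sendEvent`, `partialUpdateMessage`), but the exact event payload shapes are assumptions based on the pattern as described here:

```typescript
// Structural slice of the Stream Chat surface this sketch assumes.
interface StreamingTarget {
  sendMessage(msg: { text: string; ai_generated: boolean }): Promise<{ id: string }>;
  sendEvent(ev: { type: string; message_id: string; ai_state?: string }): Promise<void>;
  partialUpdateMessage(
    id: string,
    update: { set: { text: string; generating: boolean } },
  ): Promise<void>;
}

async function streamResponse(
  target: StreamingTarget,
  chunks: Iterable<string> | AsyncIterable<string>,
): Promise<string> {
  // 1. Empty placeholder flagged as AI-generated.
  const { id } = await target.sendMessage({ text: "", ai_generated: true });
  // 2. Thinking indicator until the first chunk arrives.
  await target.sendEvent({ type: "ai_indicator.update", message_id: id, ai_state: "AI_STATE_THINKING" });
  let text = "";
  let first = true;
  for await (const chunk of chunks) {
    if (first) {
      // 3. First chunk: switch to the generating indicator.
      await target.sendEvent({ type: "ai_indicator.update", message_id: id, ai_state: "AI_STATE_GENERATING" });
      first = false;
    }
    text += chunk;
    // 4. Progressive partial update while still generating.
    await target.partialUpdateMessage(id, { set: { text, generating: true } });
  }
  // 5. Final update, then clear the indicator.
  await target.partialUpdateMessage(id, { set: { text, generating: false } });
  await target.sendEvent({ type: "ai_indicator.clear", message_id: id });
  return text;
}
```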
Thread handling:
- Thread replies include `parent_id` so the bot's response routes to the correct thread
- The first message in a thread includes the parent message text for context
- Quoted replies are wrapped in `[Replying to ...]` envelopes
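Echoing `parent_id` onto the outgoing message is what routes the reply into the right thread. A minimal sketch of the payload construction (the `parent_id` and `ai_generated` field names follow Stream Chat's message schema; the helper itself is hypothetical):

```typescript
interface OutgoingMessage {
  text: string;
  ai_generated: boolean;
  parent_id?: string;
}

// Build the bot's reply payload; when the inbound message was a thread
// reply, copy its parent_id so the response lands in the same thread.
function buildReplyPayload(text: string, inboundParentId?: string): OutgoingMessage {
  const payload: OutgoingMessage = { text, ai_generated: true };
  if (inboundParentId) payload.parent_id = inboundParentId;
  return payload;
}
```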
```jsonc
{
  "channels": {
    "streamchat": {
      "enabled": true,
      "apiKey": "your_api_key",
      "botUserId": "openclaw-bot",
      "botUserToken": "<token from generate-bot-token.ts>",
      // Optional:
      "ackReaction": "eyes",              // reaction added when message is received (default: "eyes")
      "doneReaction": "white_check_mark", // reaction swapped in when response is done (default: "white_check_mark")
      "streamingThrottle": 15             // partial-update every Nth chunk (default: 15)
    }
  },
  "plugins": {
    "load": { "paths": ["/absolute/path/to/openclaw-channel-streamchat"] },
    "entries": { "streamchat": { "enabled": true } }
  }
}
```
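The `streamingThrottle` setting means only every Nth chunk triggers a `partialUpdateMessage`; the completion step always issues a final update regardless. A sketch of that gate, assuming a 1-based chunk index:

```typescript
// Decide whether a given chunk should trigger a partial update, given the
// streamingThrottle setting from the config above (default 15).
// chunkIndex is assumed 1-based: flush on chunk 15, 30, 45, ...
function shouldFlush(chunkIndex: number, throttle: number = 15): boolean {
  return chunkIndex % throttle === 0;
}
```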