Commit 8e86d26
adds streams
1 parent 7cedaac

1 file changed: docs/english/concepts/message-sending.md (+97, -0)
@@ -5,6 +5,7 @@ Within your listener function, `say()` is available whenever there is an associa
In the case that you'd like to send a message outside of a listener or you want to do something more advanced (like handle specific errors), you can call `client.chat_postMessage` [using the client attached to your Bolt instance](/tools/bolt-python/concepts/web-api).
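
For instance, here is a minimal sketch of posting outside a listener and handling a specific error, assuming `app` is your Bolt `App` instance and using a placeholder channel ID:

```python
from slack_sdk.errors import SlackApiError

# A minimal sketch; the channel ID below is a placeholder
try:
    app.client.chat_postMessage(
        channel="C0123456789",
        text="Hello from outside a listener!",
    )
except SlackApiError as e:
    print(f"Failed to post message: {e.response['error']}")
```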

Refer to [the module document](https://docs.slack.dev/tools/bolt-python/reference/kwargs_injection/args.html) to learn the available listener arguments.

```python
# Listens for messages containing "knock knock" and responds with an italicized "who's there?"
@app.message("knock knock")
@@ -38,4 +39,100 @@ def show_datepicker(event, say):
        blocks=blocks,
        text="Pick a date for me to remind you"
    )
```

## Streaming messages

You can have your app's messages stream in for those AI chatbot vibes. This is done through three methods:

* `chat_startStream`
* `chat_appendStream`
* `chat_stopStream`

### Starting the message stream

First, start the message stream with the `chat_startStream` method.

```python
# Example: Stream a response to any message
@app.message()
def handle_message(message, client):
    channel_id = message["channel"]
    # Reply in the message's thread, or start one if the message isn't threaded
    thread_ts = message.get("thread_ts", message["ts"])

    # Start a new message stream
    stream_response = client.chat_startStream(
        channel=channel_id,
        thread_ts=thread_ts,
    )
    stream_ts = stream_response["ts"]
```

### Appending content to the message stream

With the stream started, you can then append text to it in chunks to convey a streaming effect.

The structure of the text coming in will depend on your source. The following code snippet uses OpenAI's response structure as an example.
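
Here, `returned_message` is assumed to be a streaming response object. Below is a minimal sketch of how it might be produced with the OpenAI Python SDK; the `openai` package, the `OPENAI_API_KEY` environment variable, and the model name are assumptions for illustration.

```python
# A minimal sketch of producing `returned_message`; the model name is a placeholder.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set.
from openai import OpenAI

openai_client = OpenAI()

returned_message = openai_client.responses.create(
    model="gpt-4o-mini",
    input=message["text"],  # the text of the incoming Slack message
    stream=True,  # yields events such as response.output_text.delta
)
```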

```python
# continued from above
for event in returned_message:
    # Append each text delta to the streaming message as it arrives
    if event.type == "response.output_text.delta":
        client.chat_appendStream(
            channel=channel_id,
            ts=stream_ts,
            markdown_text=event.delta
        )
```

### Finishing the message stream

Your app can then end the stream with the `chat_stopStream` method.

```python
# continued from above
client.chat_stopStream(
    channel=channel_id,
    ts=stream_ts
)
```

The method also gives you an opportunity to request user feedback on your app's responses using the [feedback buttons](/reference/block-kit/block-elements/feedback-buttons-element) block element within the [context actions](/reference/block-kit/blocks/context-actions-block) block. The user is presented with thumbs-up and thumbs-down buttons.

```python
from typing import List

# Block Kit model classes; assumes a recent slack_sdk release that ships the
# context actions and feedback buttons models
from slack_sdk.models.blocks import (
    Block,
    ContextActionsBlock,
    FeedbackButtonObject,
    FeedbackButtonsElement,
)

def create_feedback_block() -> List[Block]:
    blocks: List[Block] = [
        ContextActionsBlock(
            elements=[
                FeedbackButtonsElement(
                    action_id="feedback",
                    positive_button=FeedbackButtonObject(
                        text="Good Response",
                        accessibility_label="Submit positive feedback on this response",
                        value="good-feedback",
                    ),
                    negative_button=FeedbackButtonObject(
                        text="Bad Response",
                        accessibility_label="Submit negative feedback on this response",
                        value="bad-feedback",
                    ),
                )
            ]
        )
    ]
    return blocks

@app.message()
def handle_message(message, client):
    # ... previous streaming code ...

    # Stop the stream and add interactive elements
    feedback_block = create_feedback_block()
    client.chat_stopStream(
        channel=channel_id,
        ts=stream_ts,
        blocks=feedback_block
    )
```
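
Your app can then listen for the feedback interaction like any other action. A minimal sketch, assuming the `action_id` of `"feedback"` from the example above:

```python
# A minimal sketch: acknowledge the feedback action and log its payload
@app.action("feedback")
def handle_feedback(ack, body, logger):
    ack()
    logger.info(body)
```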
