Commit b81d947

fix: revert to processing one queued message at a time
The backend should process messages one at a time, allowing each message to trigger a full task cycle with LLM interaction. The UI fix (checking messageQueue.length > 0) prevents race conditions by ensuring that new messages are queued while the drain is in progress.
1 parent 894b4a9 commit b81d947
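
For context on the UI side of the fix, the messageQueue.length > 0 check described in the commit message works roughly as sketched below. This is a minimal, assumed illustration; handleSendMessage, sendingDisabled, and submitToBackend are illustrative names, not the extension's actual webview code.

// Hypothetical UI-side send handler (illustrative names only).
interface QueuedMessage {
	text: string
	images?: string[]
}

const messageQueue: QueuedMessage[] = []
let sendingDisabled = false // assumed flag: true while a task cycle is running

function submitToBackend(text: string, images?: string[]): void {
	// Stand-in for the real submission path to the extension host.
	console.log("submitting:", text, images ?? [])
}

function handleSendMessage(text: string, images?: string[]): void {
	// If the backend is busy OR a drain is already in progress
	// (messageQueue.length > 0), enqueue instead of submitting directly.
	// This is the race-condition guard: a message arriving mid-drain
	// waits its turn rather than bypassing the queue.
	if (sendingDisabled || messageQueue.length > 0) {
		messageQueue.push({ text, images })
		return
	}
	submitToBackend(text, images)
}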

File tree

1 file changed (+3, -5 lines)

src/core/task/Task.ts

Lines changed: 3 additions & 5 deletions
@@ -3040,14 +3040,12 @@ export class Task extends EventEmitter<TaskEvents> implements TaskLike {
 	}
 
 	/**
-	 * Process any queued messages by dequeuing and submitting them.
-	 * This drains ALL queued messages to prevent them from getting stuck.
-	 * Messages are processed with microtask delays to avoid blocking.
+	 * Process any queued messages by dequeuing and submitting them one at a time.
+	 * Each message will trigger a full task cycle with LLM interaction.
 	 */
 	public processQueuedMessages(): void {
 		try {
-			// Drain all queued messages, not just one
-			while (!this.messageQueueService.isEmpty()) {
+			if (!this.messageQueueService.isEmpty()) {
 				const queued = this.messageQueueService.dequeueMessage()
 				if (queued) {
 					setTimeout(() => {
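
The practical effect of the if-instead-of-while change is that the queue still drains completely, but one LLM round-trip at a time, assuming processQueuedMessages is invoked again once the triggered task cycle finishes (that re-entry is not shown in this diff). Below is a minimal sketch of that pattern with illustrative names such as runTaskCycle; it is not the actual Task implementation.

// Illustrative sketch of one-message-per-cycle draining (assumed names).
type Message = string

class QueueSketch {
	private queue: Message[] = []

	enqueue(msg: Message): void {
		this.queue.push(msg)
	}

	processQueuedMessages(): void {
		if (this.queue.length === 0) return
		const next = this.queue.shift()!
		// Defer submission so the current call stack unwinds first.
		setTimeout(() => {
			void this.runTaskCycle(next).then(() => {
				// Re-enter only after the full task cycle (LLM call included)
				// has finished, so remaining messages drain one at a time.
				this.processQueuedMessages()
			})
		}, 0)
	}

	private async runTaskCycle(msg: Message): Promise<void> {
		// Placeholder for the real LLM interaction.
		console.log("processing:", msg)
	}
}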
