This repository was archived by the owner on Jul 22, 2025. It is now read-only.

Conversation

@SamSaffron (Member)
  • Usage was not showing automation or image caption in the LLM list.
  • FIX: reasoning models would incorrectly time out after 60 seconds (timeout raised to 10 minutes).

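The timeout fix above can be sketched as a per-model timeout lookup; this is a hypothetical illustration, not the actual plugin code, and the model-name prefixes are assumptions:

```python
# Hypothetical sketch of the fix described above: reasoning models can
# spend minutes "thinking" before emitting tokens, so they need a much
# longer read timeout than the old 60-second default.
DEFAULT_TIMEOUT_SECONDS = 60
REASONING_TIMEOUT_SECONDS = 600  # 10 minutes

def completion_timeout(model_name: str, reasoning_prefixes=("o1", "o3")) -> int:
    """Return the HTTP read timeout (seconds) to use for a given model."""
    if any(model_name.startswith(prefix) for prefix in reasoning_prefixes):
        return REASONING_TIMEOUT_SECONDS
    return DEFAULT_TIMEOUT_SECONDS
```

Keeping the short default for ordinary models means a hung connection still fails fast; only models known to reason at length get the longer budget.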
@nattsw (Contributor) left a comment


I'm going to approve but please fix the test before merging.

This implements a basic chat streamer. It provides two things:

1. Gives feedback to the user while the LLM is generating
2. Streams content to the client much more efficiently (each call to update the chat may take ~100ms)
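The throttling idea in point 2 can be sketched as follows; the class and callback names are hypothetical, assuming each `publish` call is the expensive (~100ms) chat update:

```python
import time

class ChatStreamer:
    """Hypothetical sketch: buffer incoming tokens and flush them to the
    client at most once per interval, since each chat-update call is
    expensive (~100ms). This batches many small tokens into few updates."""

    def __init__(self, publish, interval=0.1, now=time.monotonic):
        self.publish = publish      # callback that pushes text to the client
        self.interval = interval    # minimum seconds between flushes
        self.now = now              # clock, injectable for testing
        self.buffer = []
        self.last_flush = self.now()

    def push(self, token: str):
        """Accept a token; flush only if the interval has elapsed."""
        self.buffer.append(token)
        if self.now() - self.last_flush >= self.interval:
            self.flush()

    def flush(self):
        """Send any buffered text in a single update."""
        if self.buffer:
            self.publish("".join(self.buffer))
            self.buffer.clear()
        self.last_flush = self.now()
```

A caller would `push` each streamed token as it arrives and call `flush` once at the end so any trailing buffered text is delivered.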
@SamSaffron SamSaffron merged commit 2a5c60d into main Apr 24, 2025
6 checks passed
@SamSaffron SamSaffron deleted the bugfixes branch April 24, 2025 06:22
